Context. Galaxy clusters selected based on overdensities of galaxies in photometric surveys provide the largest cluster samples. However, modeling the selection function of such samples is complicated by noncluster members projected along the line of sight (projection effects) and the potential detection of unvirialized objects (contamination).
Aims. We empirically constrained the magnitude of these effects by cross-matching galaxy clusters selected with the redMaPPer algorithm in Dark Energy Survey data against significant detections in three South Pole Telescope surveys (SZ, pol-ECS, pol-500d).
Methods. For matched clusters, we augmented the redMaPPer catalog with the SPT detection significance. For unmatched objects, we used the SPT detection threshold as an upper limit on the Sunyaev-Zel'dovich effect (SZE) signature. Using a Bayesian population model applied to the collected multiwavelength data, we explored various physically motivated models to describe the relationship between observed richness and halo mass.
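As a schematic illustration of the kind of scaling relation such a population model constrains (the symbols here are generic placeholders, not necessarily the parameterization adopted in this work), the mean observed richness can be related to halo mass and redshift through a power law with log-normal intrinsic scatter,
\[
\langle \ln\lambda \mid M, z \rangle = \ln A_\lambda + B_\lambda \ln\!\left(\frac{M}{M_{\mathrm{piv}}}\right) + C_\lambda \ln\!\left(\frac{1+z}{1+z_{\mathrm{piv}}}\right), \qquad \sigma_{\ln\lambda} = \mathrm{const},
\]
where $A_\lambda$, $B_\lambda$, and $C_\lambda$ denote an amplitude, mass slope, and redshift evolution, and $M_{\mathrm{piv}}$, $z_{\mathrm{piv}}$ are pivot values.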
Results. Our analysis reveals a clear preference for models with an additional skewed scatter component associated with projection effects over a purely log-normal scatter model. We rule out significant contamination by unvirialized objects at the high-richness end of the sample. While dedicated simulations offer a well-fitting calibration of projection effects, our findings suggest the presence of redshift-dependent trends that these simulations may not have captured. These results highlight that modeling the selection function of optically detected clusters remains a complex challenge that requires a combination of simulation- and data-driven approaches.
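For concreteness, one simplified way to express such a skewed scatter component (a sketch under assumed notation, not necessarily the exact model explored here) is as an additive, positively skewed projection term on top of the intrinsic richness,
\[
\lambda^{\mathrm{ob}} = \lambda^{\mathrm{intr}} + \Delta\lambda_{\mathrm{prj}}, \qquad \Delta\lambda_{\mathrm{prj}} \sim \mathrm{Exp}(\tau_{\mathrm{prj}}),
\]
so that the richness distribution at fixed mass acquires a high-richness tail absent in the purely log-normal case; a redshift dependence of projection effects can be encoded in $\tau_{\mathrm{prj}}(z)$.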