{"success":1,"msg":"","color":"rgb(28, 35, 49)","title":"Escaping the curse of dimensionality in Bayesian model based clustering<\/b>","description":"webinar","title2":"","start":"2021-02-08 09:00","end":"2021-02-08 10:00","responsable":"Dan Kowal <\/i><\/a>","speaker":"Antonio Canale","id":"23","type":"webinar","timezone":"America\/Chicago","activity":"Register in advance for this meeting:\r\nhttps:\/\/riceuniversity.zoom.us\/meeting\/register\/tJctdOGgrzwsGNLipbTR_uIQxx37Y9rBOr67\r\n \r\nAfter registering, you will receive a confirmation email containing information about joining the meeting.","abstract":"In many applications, there is interest in clustering very high-dimensional data. A common strategy is first stage dimensionality reduction followed by a standard clustering algorithm, such as k-means. This approach does not target dimension reduction to the clustering objective, and fails to quantify uncertainty. Model-based Bayesian approaches provide an appealing alternative, but often have poor performance in high-dimensions, producing too many or too few clusters. This article provides an explanation for this behavior through studying the clustering posterior in a non-standard setting with fixed sample size and increasing dimensionality. We show that the finite sample posterior tends to either assign every observation to a different cluster or all observations to the same cluster as dimension grows, depending on the kernels and prior specification but not on the true data-generating model. To find models avoiding this pitfall, we define a Bayesian oracle for clustering, with the oracle clustering posterior based on the true values of low-dimensional latent variables. We define a class of LAtent Mixtures for Bayesian (Lamb) clustering that have equivalent behavior to this oracle as dimension grows. Lamb is shown to have good performance in simulation studies and an application to inferring cell types based on scRNAseq. \r\n(Joint project with Noirrit Kiran Chandra and David B. Dunson)"}