Webinar: Generalized mixtures of finite mixtures
Speaker: Gertraud Malsiner-Walli
Host: Michele Guindani
Date: 2020-12-04, 08:00 (America/Los_Angeles)

Join Zoom Meeting
https://uci.zoom.us/j/96364696943

Meeting ID: 963 6469 6943
Passcode: 244487

Abstract: Within a Bayesian framework of model-based clustering, we investigate the model class of mixtures of finite mixtures (MFMs), in which a prior on the number of components is specified. We contribute to MFMs by considering a generalized class containing static and dynamic MFMs, in which the Dirichlet parameter of the component weights is either fixed or depends on the number of components. We emphasize the distinction between the number of components $K$ of a mixture and the number of clusters $K_+$, i.e., the number of filled components given the data. In the MFM model, $K_+$ is a random variable whose prior depends on the prior on the number of components $K$ and on the mixture weights. We characterize the prior on the number of clusters $K_+$ for generalized MFMs and derive computationally feasible formulas to calculate this implicit prior. In addition, we propose a flexible class of prior distributions for the number of components $K$ and link MFMs to Bayesian nonparametric mixtures.

For posterior inference in a generalized MFM, we propose the novel telescoping sampler, which allows Bayesian inference for mixtures with arbitrary component distributions without resorting to RJMCMC methods. The telescoping sampler explicitly samples the number of components but otherwise requires only the usual MCMC steps for estimating a finite mixture model. The ease of its application with different component distributions is demonstrated on several data sets.
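The distinction between the prior on the number of components $K$ and the implied prior on the number of clusters $K_+$ can be illustrated by simulation. The sketch below is a minimal Monte Carlo approximation, not the computationally feasible closed-form formulas derived in the talk; the Poisson prior on $K-1$, the sample size, and the choices of the Dirichlet parameter (fixed for a static MFM, decreasing in $K$ for a dynamic MFM) are illustrative assumptions.

```python
import math
import random
from collections import Counter

def sample_poisson(rng, lam):
    # Knuth's multiplicative algorithm; adequate for small rates as used below.
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def sample_k_plus(n, K, gamma, rng):
    # Component weights ~ symmetric Dirichlet(gamma, ..., gamma),
    # drawn as normalized Gamma(gamma, 1) variates.
    g = [rng.gammavariate(gamma, 1.0) for _ in range(K)]
    total = sum(g)
    w = [x / total for x in g]
    # Allocate n observations to components; K_+ = number of filled components.
    filled = set()
    for _ in range(n):
        u, acc = rng.random(), 0.0
        for j, wj in enumerate(w):
            acc += wj
            if u < acc:
                filled.add(j)
                break
        else:  # guard against floating-point undershoot of the cumulative sum
            filled.add(K - 1)
    return len(filled)

def implicit_prior_k_plus(n, sample_K, gamma_of_K, draws=3000, seed=1):
    """Monte Carlo approximation of the prior on the number of clusters K_+
    induced by the prior on K and the Dirichlet prior on the weights."""
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(draws):
        K = sample_K(rng)
        counts[sample_k_plus(n, K, gamma_of_K(K), rng)] += 1
    return {k: c / draws for k, c in sorted(counts.items())}

# Illustrative choices (not from the talk): K - 1 ~ Poisson(3), n = 100.
# Static MFM: gamma fixed at 1; dynamic MFM: gamma shrinks with K, here 1 / K.
static = implicit_prior_k_plus(100, lambda r: 1 + sample_poisson(r, 3.0),
                               lambda K: 1.0)
dynamic = implicit_prior_k_plus(100, lambda r: 1 + sample_poisson(r, 3.0),
                                lambda K: 1.0 / K)
```

Comparing `static` and `dynamic` shows the point of the generalized class: the same prior on $K$ induces different implicit priors on $K_+$, with the dynamic (shrinking) Dirichlet parameter leaving more components empty and hence concentrating $K_+$ on fewer clusters.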