Mixtures: Estimation and Applications contains a collection of chapters written by international experts in the field, representing the state of the art in mixture modelling, inference and computation. A wide and representative array of applications of mixtures, for instance in biology and economics, is covered. Both Bayesian and non-Bayesian methodologies and both parametric and non-parametric perspectives appear in the book, drawing on the statistics and machine learning communities alike.
- Provides a contemporary account of mixture inference, with Bayesian, non-parametric and learning interpretations.
- Explores recent developments in the EM (expectation maximization) algorithm for maximum likelihood estimation.
- Looks at online algorithms for processing unbounded streams of data, as well as applications to large datasets.
- Compares testing methodologies and details asymptotics in finite mixture models.
- Introduces mixture of experts modeling and mixed membership models with social science applications.
- Addresses exact Bayesian analysis, the label switching debate, and manifold Markov chain Monte Carlo for mixtures.
- Includes coverage of classification and machine learning extensions.
- Features contributions from leading statisticians and computer scientists.
This area of statistics is important to a range of disciplines, including bioinformatics, computer science, ecology, social sciences, signal processing, and finance. This collection will prove useful to active researchers and practitioners in these areas.
List of Contributors
1 The EM algorithm, variational approximations and expectation propagation for mixtures
1.2 The EM algorithm
1.3 Variational approximations
2 Online expectation maximisation
2.2 Model and assumptions
2.3 The EM algorithm and the limiting EM recursion
2.4 Online expectation maximisation
3 The limiting distribution of the EM test of the order of a finite mixture
Jiahua Chen and Pengfei Li
3.2 The method and theory of the EM test
4 Comparing Wald and likelihood regions applied to locally identifiable mixture models
Daeyoung Kim and Bruce G. Lindsay
4.2 Background on likelihood confidence regions
4.3 Background on simulation and visualisation of the likelihood regions
4.4 Comparison between the likelihood regions and the Wald regions
4.5 Application to a finite mixture model
4.6 Data analysis
5 Mixture of experts modelling with social science applications
Isobel Claire Gormley and Thomas Brendan Murphy
5.2 Motivating examples
5.3 Mixture models
5.4 Mixture of experts models
5.5 A mixture of experts model for ranked preference data
5.6 A mixture of experts latent position cluster model
6 Modelling conditional densities using finite smooth mixtures
Feng Li, Mattias Villani and Robert Kohn
6.2 The model and prior
6.3 Inference methodology
Appendix: Implementation details for the gamma and log-normal models
7 Nonparametric mixed membership modelling using the IBP compound Dirichlet process
Sinead Williamson, Chong Wang, Katherine A. Heller, and David M. Blei
7.2 Mixed membership models
7.4 Decorrelating prevalence and proportion
7.5 Related models
7.6 Empirical studies
8 Discovering nonbinary hierarchical structures with Bayesian rose trees
Charles Blundell, Yee Whye Teh, and Katherine A. Heller
8.2 Prior work
8.3 Rose trees, partitions and mixtures
8.4 Greedy construction of Bayesian rose tree mixtures
8.5 Bayesian hierarchical clustering, Dirichlet process models and product partition models
9 Mixtures of factor analyzers for the analysis of high-dimensional data
Geoffrey J. McLachlan, Jangsun Baek, and Suren I. Rathnayake
9.2 Single-factor analysis model
9.3 Mixtures of factor analyzers
9.4 Mixtures of common factor analyzers (MCFA)
9.5 Some related approaches
9.6 Fitting of factor-analytic models
9.7 Choice of the number of factors q
9.9 Low-dimensional plots via the MCFA approach
9.10 Multivariate t-factor analyzers
10 Dealing with label switching under model uncertainty
10.2 Labelling through clustering in the point-process representation
10.3 Identifying mixtures when the number of components is unknown
10.4 Overfitting heterogeneity of component-specific parameters
10.5 Concluding remarks
11 Exact Bayesian analysis of mixtures
Christian P. Robert and Kerrie L. Mengersen
11.2 Formal derivation of the posterior distribution
12 Manifold MCMC for mixtures
Vassilios Stathopoulos and Mark Girolami
12.2 Markov chain Monte Carlo methods
12.3 Finite Gaussian mixture models
13 How many components in a finite mixture?
13.2 The galaxy data
13.3 The normal mixture model
13.4 Bayesian analyses
13.5 Posterior distributions for K under a flat prior
13.6 Conclusions from the Bayesian analyses
13.7 Posterior distributions of the model deviances
13.8 Asymptotic distributions
13.9 Posterior deviances for the galaxy data
14 Bayesian mixture models: a blood-free dissection of a sheep
Clair L. Alston, Kerrie L. Mengersen, and Graham E. Gardner
14.2 Mixture models
14.3 Altering dimensions of the mixture model
14.4 Bayesian mixture model incorporating spatial information
14.5 Volume calculation