The first part of this book presents the foundations of Bayesian inference via simple inferential problems in the social sciences: proportions, cross-tabulations, counts, means, and regression analysis. A review of modern, simulation-based inference follows, with a detailed examination of the suite of computational tools (Markov chain Monte Carlo algorithms) that underlie the Bayesian revolution in contemporary statistics. The book also introduces the general-purpose Bayesian computer programs BUGS and JAGS, with numerous examples and a detailed consideration of the art of using these programs in real-world settings.
The second half of the book focuses on intermediate to advanced applications in the social sciences, including hierarchical or multi-level models, models for discrete responses (binary, ordinal, and multinomial data), measurement models (factor analysis, item-response models, dynamic linear models), mixture models, and hybrids of these. Each model is accompanied by worked examples in BUGS/JAGS, drawing on data from political science, sociology, psychology, education, communications, economics, and anthropology.
Each chapter includes exercises to further the students' understanding of Bayesian methods and applications. Extensive appendices provide important technical background and proofs of key theoretical propositions.
This book presents a forceful argument for the philosophical and practical utility of the Bayesian approach in many social science settings. Graduate and postgraduate students in fields such as political science, sociology, psychology, communications, education, and economics, as well as statisticians, will find much value in this book.
List of Tables.
Part I: Introducing Bayesian Analysis.
1. The foundations of Bayesian inference.
1.1 What is probability?
1.2 Subjective probability in Bayesian statistics.
1.3 Bayes theorem, discrete case.
1.4 Bayes theorem, continuous parameter.
1.5 Parameters as random variables, beliefs as distributions.
1.6 Communicating the results of a Bayesian analysis.
1.7 Asymptotic properties of posterior distributions.
1.8 Bayesian hypothesis testing.
1.9 From subjective beliefs to parameters and models.
1.10 Historical note.
2. Getting started: Bayesian analysis for simple models.
2.1 Learning about probabilities, rates and proportions.
2.2 Associations between binary variables.
2.3 Learning from counts.
2.4 Learning about a normal mean and variance.
2.5 Regression models.
2.6 Further reading.
Part II: Simulation-Based Bayesian Analysis.
3. Monte Carlo methods.
3.1 Simulation consistency.
3.2 Inference for functions of parameters.
3.3 Marginalization via Monte Carlo integration.
3.4 Sampling algorithms.
3.5 Further reading.
4. Markov chains.
4.1 Notation and definitions.
4.2 Properties of Markov chains.
4.3 Convergence of Markov chains.
4.4 Limit theorems for Markov chains.
4.5 Further reading.
5. Markov chain Monte Carlo.
5.1 Metropolis–Hastings algorithm.
5.2 Gibbs sampling.
6. Implementing Markov chain Monte Carlo.
6.1 Software for Markov chain Monte Carlo.
6.2 Assessing convergence and run-length.
6.3 Working with BUGS/JAGS from R.
6.4 Tricks of the trade.
6.5 Other examples.
6.6 Further reading.
Part III: Advanced Applications in the Social Sciences.
7. Hierarchical statistical models.
7.1 Data and parameters that vary by groups: the case for hierarchical modeling.
7.2 ANOVA as a hierarchical model.
7.3 Hierarchical models for longitudinal data.
7.4 Hierarchical models for non-normal data.
7.5 Multi-level models.
8. Bayesian analysis of choice making.
8.1 Regression models for binary responses.
8.2 Ordered outcomes.
8.3 Multinomial outcomes.
8.4 Multinomial probit.
9. Bayesian approaches to measurement.
9.1 Bayesian inference for latent states.
9.2 Factor analysis.
9.3 Item-response models.
9.4 Dynamic measurement models.
Part IV: Appendices.
Appendix A: Working with vectors and matrices.
Appendix B: Probability review.
B.1 Foundations of probability.
B.2 Probability densities and mass functions.
B.3 Convergence of sequences of random variables.
Appendix C: Proofs of selected propositions.
C.1 Products of normal densities.
C.2 Conjugate analysis of normal data.
C.3 Asymptotic normality of the posterior density.