Understanding Computational Bayesian Statistics
ISBN: 978-0-470-04609-8
Hardcover
336 pages
December 2009
Preface xi
1 Introduction to Bayesian Statistics 1
1.1 The Frequentist Approach to Statistics 1
1.2 The Bayesian Approach to Statistics 3
1.3 Comparing Likelihood and Bayesian Approaches to Statistics 6
1.4 Computational Bayesian Statistics 19
1.5 Purpose and Organization of This Book 20
2 Monte Carlo Sampling from the Posterior 25
2.1 Acceptance-Rejection-Sampling 27
2.2 Sampling-Importance-Resampling 33
2.3 Adaptive-Rejection-Sampling from a Log-Concave Distribution 35
2.4 Why Direct Methods Are Inefficient for High-Dimension Parameter Space 42
3 Bayesian Inference 47
3.1 Bayesian Inference from the Numerical Posterior 47
3.2 Bayesian Inference from Posterior Random Sample 54
4 Bayesian Statistics Using Conjugate Priors 61
4.1 One-Dimensional Exponential Family of Densities 61
4.2 Distributions for Count Data 62
4.3 Distributions for Waiting Times 69
4.4 Normally Distributed Observations with Known Variance 75
4.5 Normally Distributed Observations with Known Mean 78
4.6 Normally Distributed Observations with Unknown Mean and Variance 80
4.7 Multivariate Normal Observations with Known Covariance Matrix 85
4.8 Observations from Normal Linear Regression Model 87
Appendix: Proof of Poisson Process Theorem 97
5 Markov Chains 101
5.1 Stochastic Processes 102
5.2 Markov Chains 103
5.3 Time-Invariant Markov Chains with Finite State Space 104
5.4 Classification of States of a Markov Chain 109
5.5 Sampling from a Markov Chain 114
5.6 Time-Reversible Markov Chains and Detailed Balance 117
5.7 Markov Chains with Continuous State Space 120
6 Markov Chain Monte Carlo Sampling from Posterior 127
6.1 Metropolis-Hastings Algorithm for a Single Parameter 130
6.2 Metropolis-Hastings Algorithm for Multiple Parameters 137
6.3 Blockwise Metropolis-Hastings Algorithm 144
6.4 Gibbs Sampling 149
6.5 Summary 150
7 Statistical Inference from a Markov Chain Monte Carlo Sample 159
7.1 Mixing Properties of the Chain 160
7.2 Finding a Heavy-Tailed Matched Curvature Candidate Density 162
7.3 Obtaining an Approximate Random Sample for Inference 168
Appendix: Procedure for Finding the Matched Curvature Candidate Density for a Multivariate Parameter 176
8 Logistic Regression 179
8.1 Logistic Regression Model 180
8.2 Computational Bayesian Approach to the Logistic Regression Model 184
8.3 Modelling with the Multiple Logistic Regression Model 192
9 Poisson Regression and Proportional Hazards Model 203
9.1 Poisson Regression Model 204
9.2 Computational Approach to Poisson Regression Model 207
9.3 The Proportional Hazards Model 214
9.4 Computational Bayesian Approach to Proportional Hazards Model 218
10 Gibbs Sampling and Hierarchical Models 235
10.1 Gibbs Sampling Procedure 236
10.2 The Gibbs Sampler for the Normal Distribution 237
10.3 Hierarchical Models and Gibbs Sampling 242
10.4 Modelling Related Populations with Hierarchical Models 244
Appendix: Proof That Improper Jeffreys' Prior Distribution for the Hypervariance Can Lead to an Improper Posterior 261
11 Going Forward with Markov Chain Monte Carlo 265
A Using the Included Minitab Macros 271
B Using the Included R Functions 289
References 307
Topic Index 313