Bayesian Posterior Comprehension via Message from Monte Carlo
L. J. Fitzgibbon, D. L. Dowe and L. Allison, Second Hawaii International Conference on Statistics and Related Fields, Hawaii, 5-8 June 2003
Abstract. We discuss the problem of producing an epitome, or brief summary, of a Bayesian posterior distribution, and then investigate a general solution based on the Minimum Message Length (MML) principle. Clearly, the optimal criterion for choosing such an epitome is determined by the epitome's intended use. The interesting general case is where this use is unknown; to be practical, the choice of epitome criterion then becomes subjective. We identify a number of desirable properties that an epitome could have: facilitation of point estimation, human comprehension, and fast approximation of posterior expectations. We call these the properties of Bayesian Posterior Comprehension and show that the Minimum Message Length principle can be viewed as an epitome criterion that produces epitomes having these properties. We then present and extend Message from Monte Carlo as a means for constructing instantaneous Minimum Message Length codebooks (and epitomes) using Markov Chain Monte Carlo methods. The Message from Monte Carlo methodology is illustrated for binary regression, generalised linear model, and multiple change-point problems.
Keywords: Bayesian, Minimum Message Length, MML, MCMC, RJMCMC, Message from Monte Carlo, MMC, posterior summary, epitome, Bayesian Posterior Comprehension.
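To make the idea of an epitome concrete, the sketch below summarises a posterior by a small weighted set of points drawn from MCMC samples, and uses that set to approximate a posterior expectation. This is only an illustration of the general notion (a brief, weighted posterior summary supporting fast expectations); it is not the authors' Message from Monte Carlo algorithm, and the toy model, the binning-based summary, and all function names are assumptions for the example.

```python
import math
import random

random.seed(0)

# Toy conjugate model (assumption for illustration):
# prior theta ~ N(0, 1), data y_i ~ N(theta, 1).
data = [1.2, 0.8, 1.5, 0.9, 1.1]

def log_post(theta):
    """Unnormalised log posterior for the toy model."""
    lp = -0.5 * theta ** 2                                # N(0, 1) prior
    lp += sum(-0.5 * (y - theta) ** 2 for y in data)      # N(theta, 1) likelihood
    return lp

def metropolis(n, step=0.5):
    """Plain random-walk Metropolis sampler over theta."""
    theta, samples = 0.0, []
    for _ in range(n):
        prop = theta + random.gauss(0.0, step)
        if math.log(random.random()) < log_post(prop) - log_post(theta):
            theta = prop
        samples.append(theta)
    return samples

samples = metropolis(20000)[2000:]  # discard burn-in

def epitome(samples, k=8):
    """A small weighted point set: bin the samples, keep each
    bin's mean as a representative and its sample fraction as weight."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / k
    bins = [[] for _ in range(k)]
    for s in samples:
        bins[min(int((s - lo) / width), k - 1)].append(s)
    return [(sum(b) / len(b), len(b) / len(samples)) for b in bins if b]

points = epitome(samples)
# Fast approximate posterior expectation from the few weighted points.
approx_mean = sum(w * t for t, w in points)
exact_mean = sum(data) / (len(data) + 1)  # conjugate posterior mean
print(round(approx_mean, 2), round(exact_mean, 2))
```

The eight weighted points stand in for the full sample set: any posterior expectation can then be approximated by a short weighted sum, which is one of the "fast approximation" properties the abstract lists under Bayesian Posterior Comprehension.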
- 1 Introduction
- 2 Bayesian Posterior Comprehension
- 3 Message from Monte Carlo
- 3.1 The Point Estimate
- 4 Unimodal Likelihood Function
- 4.1 Example: Dog Shock Experiment
- 4.2 Example: Beetle Mortality Data
- 5 Multimodal Likelihood Function
- 6 Variable Dimension Posterior
- 6.1 Example: Multiple Change-Point Model