Checking for prior-data conflict using prior-to-posterior divergences

David J. Nott, Department of Statistics and Applied Probability, Operations Research and Analytics Cluster, National University of Singapore, Singapore 117546

Wang Xueou, Department of Statistics and Applied Probability, National University of Singapore, Singapore 117546

Michael Evans, Department of Statistics, University of Toronto, Toronto, Ontario, M5S 3G3, Canada

Berthold-Georg Englert, Department of Physics, Centre for Quantum Technologies, National University of Singapore, Singapore 117542 & MajuLab, CNRS-UNS-NUS-NTU International Joint Research Unit, UMI 3654, Singapore

ABSTRACT

When using complex Bayesian models to combine information, checking the consistency of the information being combined is good statistical practice. Here a new method is developed for detecting prior-data conflicts in Bayesian models, based on comparing the observed value of a prior-to-posterior divergence to its distribution under the prior predictive distribution for the data. The divergence measure used in our model check is a measure of how much beliefs have changed from prior to posterior, and can be thought of as a measure of the overall size of a relative belief function. It is shown that the proposed method is intuitive, has desirable properties, can be extended to hierarchical settings, and is related asymptotically to Jeffreys' and reference prior distributions. In cases where calculations are difficult, the use of variational approximations as a way of relieving the computational burden is suggested. The methods are compared in a number of examples with an alternative but closely related approach in the literature based on the prior predictive distribution of a minimal sufficient statistic.
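The check described in the abstract can be sketched in code for the simplest conjugate setting. The following is a minimal illustration, not the paper's general procedure: it assumes a normal observation with known variance, a normal prior, and uses the Kullback-Leibler divergence from posterior to prior (one member of the Rényi family of divergences) as the check statistic. The function name and default parameter values are hypothetical choices for the example.

```python
import numpy as np

def kl_normal(m1, s1, m0, s0):
    """Closed-form KL divergence KL(N(m1, s1^2) || N(m0, s0^2))."""
    return np.log(s0 / s1) + (s1**2 + (m1 - m0)**2) / (2.0 * s0**2) - 0.5

def prior_data_conflict_pvalue(y_obs, mu0=0.0, tau0=1.0, sigma=1.0,
                               n_sim=100_000, seed=0):
    """Tail probability of the prior-to-posterior divergence under the
    prior predictive, for the model theta ~ N(mu0, tau0^2),
    y | theta ~ N(theta, sigma^2) with a single observation y."""
    rng = np.random.default_rng(seed)
    # Conjugate posterior: precision-weighted combination of prior and data.
    post_var = 1.0 / (1.0 / tau0**2 + 1.0 / sigma**2)
    post_sd = np.sqrt(post_var)

    def divergence(y):
        post_mean = post_var * (mu0 / tau0**2 + y / sigma**2)
        return kl_normal(post_mean, post_sd, mu0, tau0)

    d_obs = divergence(y_obs)
    # Prior predictive for y is N(mu0, tau0^2 + sigma^2).
    y_sim = rng.normal(mu0, np.sqrt(tau0**2 + sigma**2), size=n_sim)
    d_sim = divergence(y_sim)
    # Small values indicate the observed belief change is surprisingly
    # large under the prior, i.e. evidence of prior-data conflict.
    return float(np.mean(d_sim >= d_obs))
```

An observation near the prior mean yields a large tail probability (no conflict), while an observation several prior predictive standard deviations away yields a small one, flagging conflict between prior and data.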