
Flexible Prior Elicitation via the Prior Predictive Distribution

Marcelo Hartmann, Georgi Agiashvili, Paul Bürkner, Arto Klami

Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI), pp. 1129–1138, 2020.


Abstract

The prior distribution for the unknown model parameters plays a crucial role in the process of statistical inference based on Bayesian methods. However, specifying suitable priors is often difficult even when detailed prior knowledge is available in principle. The challenge is to express quantitative information in the form of a probability distribution. Prior elicitation addresses this question by extracting subjective information from an expert and transforming it into a valid prior. Most existing methods, however, require information to be provided on the unobservable parameters, whose effect on the data generating process is often complicated and hard to understand. We propose an alternative approach that only requires knowledge about the observable outcomes, knowledge that is often much easier for experts to provide. Building upon a principled statistical framework, our approach utilizes the prior predictive distribution implied by the model to automatically transform experts' judgements about plausible outcome values into suitable priors on the parameters. We also provide computational strategies to perform inference and guidelines to facilitate practical use.
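
To make the idea concrete, below is a minimal sketch (not the authors' implementation): the expert states a few quantiles of plausible outcome values, and the prior's hyperparameters are chosen so that the quantiles of the implied prior predictive distribution reproduce them. The normal-normal model, the elicited numbers, and the simple least-squares objective are illustrative assumptions; the paper develops a more general and principled framework for this step.

# Sketch of prior elicitation via the prior predictive distribution.
# Illustrative model: y ~ Normal(theta, sigma_y), theta ~ Normal(mu, tau),
# so the prior predictive is y ~ Normal(mu, sqrt(tau^2 + sigma_y^2)).
import numpy as np
from scipy import stats, optimize

sigma_y = 1.0                              # assumed known observation noise
probs = np.array([0.1, 0.5, 0.9])          # probability levels asked of the expert
elicited = np.array([4.0, 10.0, 16.0])     # expert's plausible outcome quantiles (hypothetical)

def loss(params):
    mu, log_tau = params
    tau = np.exp(log_tau)
    # quantiles of the prior predictive implied by hyperparameters (mu, tau)
    pred_q = stats.norm.ppf(probs, loc=mu, scale=np.hypot(tau, sigma_y))
    # discrepancy between implied and elicited outcome quantiles
    return np.sum((pred_q - elicited) ** 2)

res = optimize.minimize(loss, x0=np.array([0.0, 0.0]))
mu_hat, tau_hat = res.x[0], np.exp(res.x[1])
print(f"elicited prior for theta: Normal(mean={mu_hat:.2f}, sd={tau_hat:.2f})")

Here the expert only reasons about observable outcomes (the 10%, 50%, and 90% quantiles of y); the optimization turns those judgements into hyperparameters of the prior on the unobservable theta.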

BibTeX

@inproceedings{hartmann20_uai,
  title     = {Flexible Prior Elicitation via the Prior Predictive Distribution},
  author    = {Hartmann, Marcelo and Agiashvili, Georgi and Bürkner, Paul and Klami, Arto},
  year      = {2020},
  booktitle = {Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI)},
  volume    = {124},
  pages     = {1129--1138},
  url       = {https://proceedings.mlr.press/v124/hartmann20a.html}
}