
Uncertainty Estimation and Calibration with Finite-State Probabilistic RNNs

Cheng Wang, Carolin Lawrence, Mathias Niepert

Proc. of the Ninth International Conference on Learning Representations (ICLR), 2021.


Abstract

Uncertainty quantification is crucial for building reliable and trustworthy machine learning systems. We propose to estimate uncertainty in recurrent neural networks (RNNs) via stochastic discrete state transitions over recurrent timesteps. The uncertainty of the model can be quantified by running a prediction several times, each time sampling from the recurrent state transition distribution, leading to potentially different results if the model is uncertain. Alongside uncertainty quantification, the proposed method offers several advantages in different settings. It can (1) learn deterministic and probabilistic automata from data, (2) learn well-calibrated models on real-world classification tasks, (3) improve the performance of out-of-distribution detection, and (4) control the exploration-exploitation trade-off in reinforcement learning. An implementation is available.
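The following is a minimal sketch (in PyTorch) of the idea summarized in the abstract, not the authors' released implementation: an RNN whose hidden state is snapped to one of a finite set of learned states, chosen by sampling from a state transition distribution at each timestep. Running the forward pass several times and measuring the spread of the sampled predictions then serves as an uncertainty estimate. All names (FiniteStateRNNCell, mc_predict, num_states, etc.) and design details are illustrative assumptions, not the paper's API.

import torch
import torch.nn as nn
import torch.nn.functional as F


class FiniteStateRNNCell(nn.Module):
    """RNN cell with stochastic transitions over a finite set of learned states (sketch)."""

    def __init__(self, input_size, hidden_size, num_states):
        super().__init__()
        self.rnn_cell = nn.GRUCell(input_size, hidden_size)
        # Codebook of learnable finite states.
        self.states = nn.Parameter(torch.randn(num_states, hidden_size))

    def forward(self, x_t, h_prev, temperature=1.0):
        h_cont = self.rnn_cell(x_t, h_prev)        # continuous candidate state
        logits = h_cont @ self.states.t()          # similarity to each finite state
        # Sample one discrete state; straight-through Gumbel-softmax keeps it differentiable.
        one_hot = F.gumbel_softmax(logits, tau=temperature, hard=True)
        return one_hot @ self.states               # snap to the sampled finite state


class FiniteStateClassifier(nn.Module):
    def __init__(self, input_size, hidden_size, num_states, num_classes):
        super().__init__()
        self.cell = FiniteStateRNNCell(input_size, hidden_size, num_states)
        self.readout = nn.Linear(hidden_size, num_classes)
        self.hidden_size = hidden_size

    def forward(self, x):                          # x: (batch, time, input_size)
        h = x.new_zeros(x.size(0), self.hidden_size)
        for t in range(x.size(1)):
            h = self.cell(x[:, t], h)
        return self.readout(h)


def mc_predict(model, x, num_samples=20):
    """Run several stochastic forward passes; disagreement across samples signals uncertainty."""
    with torch.no_grad():
        probs = torch.stack([F.softmax(model(x), dim=-1) for _ in range(num_samples)])
    mean_probs = probs.mean(dim=0)                 # averaged predictive distribution
    entropy = -(mean_probs * mean_probs.clamp_min(1e-12).log()).sum(dim=-1)
    return mean_probs, entropy                     # entropy as an uncertainty score

If the sampled state transitions agree across runs, the averaged prediction is confident and the entropy is low; if different runs visit different states and produce different outputs, the entropy rises, which is the behavior the abstract describes.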

BibTeX

@inproceedings{wang21_iclr,
  title     = {Uncertainty Estimation and Calibration with Finite-State Probabilistic RNNs},
  author    = {Wang, Cheng and Lawrence, Carolin and Niepert, Mathias},
  year      = {2021},
  booktitle = {Proc. of the Ninth International Conference on Learning Representations (ICLR)},
  doi       = {},
  url       = {https://openreview.net/forum?id=9EKHN1jOlA}
}