Towards statistically reliable uncertainty quantification in deep learning

  • Tuesday, 13 May 2025, 15:00
  • Mathias Trabs (Karlsruhe Institute of Technology)
  • Address

    Mathematikon, Im Neuenheimer Feld 205
    Conference room (5/104), 5th floor

Essential features of modern data science, especially in machine learning and high-dimensional statistics, are large sample sizes and high-dimensional parameter spaces. As a consequence, the design of methods for uncertainty quantification is characterized by a tension between numerically feasible, efficient algorithms and approaches with theoretically justified statistical properties.
In this talk we discuss Bayesian uncertainty quantification with a stochastic Metropolis-Hastings sampling algorithm as a potential solution. By calculating acceptance probabilities on batches, a stochastic Metropolis-Hastings step saves computational cost, but reduces the effective sample size. We show that this obstacle can be avoided by a simple correction term. We study the statistical properties of the resulting surrogate posterior distribution in a nonparametric regression setting. Focusing on deep neural network regression, we prove a PAC-Bayes oracle inequality which yields optimal contraction rates for Bayesian neural networks. Moreover, we analyze the diameter of the resulting credible sets and show that they attain a high coverage probability.
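To make the batch-based acceptance step concrete, here is a minimal sketch of a stochastic Metropolis-Hastings sampler on a toy Gaussian mean problem. The acceptance probability is computed from a random minibatch, with the batch log-likelihood ratio rescaled by n/m to approximate the full-data ratio. This rescaling is a common heuristic assumed for illustration; it does not reproduce the talk's specific correction term, and the model, step size, and batch size are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y_i = theta + standard normal noise, true theta = 2.0 (assumption)
n = 1000
data = 2.0 + rng.normal(size=n)

def log_lik(theta, batch):
    # Gaussian log-likelihood of the batch (unit variance), up to constants
    return -0.5 * np.sum((batch - theta) ** 2)

def stochastic_mh(n_iter=5000, batch_size=100, step=0.2):
    """Metropolis-Hastings where the acceptance ratio is estimated on a
    random minibatch; the batch log-likelihood ratio is rescaled by n/m
    to approximate the full-data ratio (a naive heuristic, not the
    corrected sampler discussed in the talk)."""
    theta = 0.0
    samples = []
    for _ in range(n_iter):
        prop = theta + step * rng.normal()          # random-walk proposal
        batch = rng.choice(data, size=batch_size, replace=False)
        # Flat prior assumed: acceptance uses only the rescaled likelihood ratio
        log_alpha = (n / batch_size) * (log_lik(prop, batch) - log_lik(theta, batch))
        if np.log(rng.uniform()) < log_alpha:
            theta = prop
        samples.append(theta)
    return np.array(samples)

samples = stochastic_mh()
# After burn-in, the chain should concentrate near the sample mean of the data
print(samples[2000:].mean())
```

Because the acceptance probability is evaluated on a noisy minibatch estimate, the chain targets a surrogate posterior that is typically more diffuse than the exact one; quantifying and correcting this effect is precisely the point of the correction term studied in the talk.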

Part of the event series 'Symposium Mathematical Statistics'.