A novel framework for benchmarking uncertainty in deep regression

20.10.2022

Uncertainty quantification can help to understand the behavior of a trained neural network and, in particular, foster confidence in its predictions. This is especially true for deep regression, where a single point estimate of the sought function without any information about its accuracy can be largely meaningless. We propose a novel framework for benchmarking uncertainty quantification in deep regression. The framework is based on regression problems in which the regression function is a linear combination of nonlinear functions. Any level of complexity can be realized through the choice of the nonlinear functions and the dimensionality of their domain. Results of uncertainty quantification for deep regression are compared against those obtained by a statistical reference method. The reference method utilizes knowledge about the underlying nonlinear functions and is based on Bayesian linear regression using a reference prior. The flexibility to design test problems of specified complexity, together with the availability of a reference solution, makes the proposed framework a suitable basis for a set of benchmark problems for uncertainty quantification in deep regression. The reliability of uncertainty quantification is assessed in terms of coverage probabilities, and its accuracy in terms of the size of the calculated uncertainties. The proposed framework is illustrated by applying it to current approaches for uncertainty quantification in deep regression. In addition, results for three real-world regression tasks are presented.

The research is published in: Schmähling, F., Martin, J., Elster, C. (2022). A framework for benchmarking uncertainty in deep regression. Applied Intelligence (ISSN 1573-7497).

The paper is available at https://doi.org/10.1007/s10489-022-03908-3.

Contact: Franko Schmähling (8.42)