---
title: Uncertainty Quantification for Metamodels
booktitle: Proceedings of the Thirteenth Symposium on Conformal and Probabilistic Prediction with Applications
year: 2024
volume: 230
series: Proceedings of Machine Learning Research
month: 0
publisher: PMLR
pdf:
url:
abstract: In the realm of computational science, metamodels serve as indispensable tools for approximating complex systems, facilitating the exploration of scenarios where traditional modelling may prove computationally infeasible. However, the inherent uncertainties within these metamodels, particularly those driven by Machine Learning (ML), necessitate rigorous quantification to ensure reliability and robustness in decision-making processes. One way of obtaining uncertainty estimates is to use ML models that have a native notion of uncertainty, such as Bayesian Neural Networks (BNNs); however, the repeated sampling necessary to approximate the output distribution is computationally demanding and might defeat the purpose of building metamodels in the first place. Moreover, in datasets with a multidimensional input space and a limited number of training examples, the error estimates provided by BNNs are often of poor quality. This study explores alternative empirical approaches to uncertainty quantification, based on knowledge extracted from the output space rather than the input space. Leveraging patterns in the magnitude of the error committed by the metamodel in output space, we obtain a significant improvement in the adaptivity of prediction intervals over both pure Conformal Prediction (CP) and BNNs. Our findings underscore the potential of integrating diverse uncertainty quantification methods to fortify the reliability of metamodels, providing robust and quantifiable confidence in model predictions.
layout: inproceedings
issn: 2640-3498
id: okanik24a
tex_title: Uncertainty Quantification for Metamodels
firstpage: 315
lastpage: 344
page: 315-344
order: 315
cycles: false
bibtex_editor: Vantini, Simone and Fontana, Matteo and Solari, Aldo and Bostr\"{o}m, Henrik and Carlsson, Lars
editor:
- given: Simone
  family: Vantini
- given: Matteo
  family: Fontana
- given: Aldo
  family: Solari
- given: Henrik
  family: Boström
- given: Lars
  family: Carlsson
bibtex_author: Ok\'anik, Martin and Trantas, Athanasios and de Bakker, Merijn Pepijn and Lazovik, Elena
author:
- given: Martin
  family: Okánik
- given: Athanasios
  family: Trantas
- given: Merijn Pepijn
  family: Bakker
  prefix: de
- given: Elena
  family: Lazovik
date: 2024-09-10
address:
container-title: Proceedings of the Thirteenth Symposium on Conformal and Probabilistic Prediction with Applications
genre: inproceedings
issued:
  date-parts:
  - 2024
  - 9
  - 10
extras:
---