2024-04-18-batten24a.md

---
title: Tight Verification of Probabilistic Robustness in Bayesian Neural Networks
software: 
abstract: We introduce two algorithms for computing tight guarantees on the probabilistic robustness of Bayesian Neural Networks (BNNs). Computing robustness guarantees for BNNs is a significantly more challenging task than verifying the robustness of standard Neural Networks (NNs) because it requires searching the parameters’ space for safe weights. Moreover, tight and complete approaches for the verification of standard NNs, such as those based on Mixed-Integer Linear Programming (MILP), cannot be directly used for the verification of BNNs because of the polynomial terms resulting from the consecutive multiplication of variables encoding the weights. Our algorithms efficiently and effectively search the parameters’ space for safe weights by using iterative expansion and the network’s gradient, and can be used with any verification algorithm of choice for BNNs. In addition to proving that our algorithms compute tighter bounds than the state of the art (SoA), we also evaluate our algorithms against the SoA on standard benchmarks, such as MNIST and CIFAR10, showing that our algorithms compute bounds up to 40% tighter than the SoA.
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: batten24a
month: 0
tex_title: "Tight Verification of Probabilistic Robustness in {B}ayesian Neural Networks"
firstpage: 4906
lastpage: 4914
page: 4906-4914
order: 4906
cycles: false
bibtex_author: Batten, Ben and Hosseini, Mehran and Lomuscio, Alessio
author:
- given: Ben
  family: Batten
- given: Mehran
  family: Hosseini
- given: Alessio
  family: Lomuscio
date: 2024-04-18
address: 
container-title: Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
volume: '238'
genre: inproceedings
issued:
  date-parts:
  - 2024
  - 4
  - 18
pdf: 
extras: 
---