
Outline

Introduction: Why & where is UQ helpful?

Initial exploration into the critical role of uncertainty quantification (UQ) in computer vision (CV): participants will gain an understanding of why it is essential to consider uncertainty in CV, especially for decision-making in complex environments. We will introduce real-world scenarios where uncertainty can profoundly impact model performance and safety, setting the stage for deeper exploration throughout the tutorial.

From maximum a posteriori to BNNs.

In this part, we will journey through the evolution of UQ techniques, starting from classic approaches such as maximum a posteriori (MAP) estimation and moving to the more elaborate Bayesian Neural Networks (BNNs). Participants will grasp the conceptual foundations of UQ, laying the groundwork for the subsequent discussion of Bayesian methods.
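To make the contrast concrete, here is a minimal sketch (not part of the tutorial materials) using a toy one-weight linear model with a conjugate Gaussian prior: the MAP estimate is a single weight value, while the Bayesian treatment keeps a full posterior over the weight and propagates it into predictive uncertainty. All constants and names below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data from y = w_true * x + noise.
w_true = 0.7
x = rng.uniform(-1.0, 1.0, size=50)
y = w_true * x + rng.normal(0.0, 0.2, size=50)

# Conjugate setup: prior w ~ N(0, 1/alpha), Gaussian noise with precision beta.
alpha, beta = 1.0, 25.0

# Closed-form Gaussian posterior over the single weight w.
post_prec = alpha + beta * np.sum(x ** 2)
post_mean = beta * np.sum(x * y) / post_prec   # equals the MAP estimate here
post_var = 1.0 / post_prec

# MAP gives one prediction; the Bayesian view gives a distribution of
# predictions at x_test, obtained by sampling weights from the posterior.
x_test = 0.5
map_pred = post_mean * x_test
w_samples = rng.normal(post_mean, np.sqrt(post_var), size=2000)
pred_mean = (w_samples * x_test).mean()
pred_std = (w_samples * x_test).std()
```

The point of the sketch is that `pred_std` (absent from the MAP view) quantifies how much the prediction could vary under plausible weights; a BNN generalizes this idea to all the weights of a deep network.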

Strategies for BNN posterior inference.

This is the core part, which dives into the process of estimating the posterior distribution of BNNs. Participants will gain insights into the computational complexities involved in modeling uncertainty through a comprehensive overview of techniques such as Variational Inference (VI), Hamiltonian Monte Carlo (HMC), and Langevin Dynamics.
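As a taste of the sampling-based family, the following self-contained sketch (illustrative, not from the tutorial materials) runs unadjusted Langevin dynamics on a one-dimensional toy posterior; a BNN applies the same gradient-plus-noise update to millions of weights, typically with stochastic gradients (SGLD). Step size and iteration counts are arbitrary choices for the toy problem.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "posterior": a 1-D Gaussian N(mu, sigma^2). Its log-density gradient
# stands in for the gradient of a BNN's log-posterior over weights.
mu, sigma2 = 2.0, 1.0

def grad_log_post(w):
    return -(w - mu) / sigma2

# Unadjusted Langevin dynamics: half a gradient step plus Gaussian noise.
eta = 0.1                       # step size (illustrative)
w, samples = 0.0, []
for t in range(50_000):
    w += 0.5 * eta * grad_log_post(w) + np.sqrt(eta) * rng.normal()
    if t >= 1_000:              # discard burn-in
        samples.append(w)

samples = np.asarray(samples)
# The empirical mean and variance of the chain approach mu and sigma^2.
```

With a small step size the chain's samples approximate draws from the target distribution, which is exactly what posterior inference over BNN weights needs.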


Computationally-efficient BNNs for CV.

Here, we will present recent techniques for improving the computational efficiency of BNNs on computer vision tasks. We will cover different ways of obtaining BNNs, e.g., from intermediate checkpoints or weight trajectories collected during a training run, or from different types of variational subnetworks, along with their main strengths and limitations.
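One of the cheapest tricks in this family, ensembling checkpoints saved along a single training run, can be sketched in a few lines. The logits below are random stand-ins for real model outputs, and all shapes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Stand-in logits from 5 checkpoints of one run, for 8 inputs and 3 classes.
logits = rng.normal(size=(5, 8, 3))

member_probs = softmax(logits)          # (checkpoint, input, class)
ens_probs = member_probs.mean(axis=0)   # average in probability space

# Predictive entropy of the averaged distribution: a simple uncertainty
# score that grows when the checkpoints disagree on an input.
entropy = -(ens_probs * np.log(ens_probs)).sum(axis=-1)
```

Averaging in probability space (rather than averaging logits) is the standard ensemble predictive rule, and the entropy of the averaged distribution is a common, cheap uncertainty signal.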

Convert your DNN into a BNN: post-hoc BNN inference.

This segment covers post-hoc inference techniques, focusing on the Laplace approximation. Participants will learn how the Laplace approximation serves as a computationally efficient method for approximating the posterior distribution of Bayesian Neural Networks.
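For a one-parameter toy model the whole idea fits in a short sketch: train a MAP estimate as usual, then place a Gaussian at it whose variance is the inverse curvature (Hessian) of the negative log-posterior. This illustration uses finite differences for the curvature; real tools compute structured Hessian approximations for deep networks instead. All constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-parameter logistic regression data with true weight 1.5.
x = rng.normal(size=200)
y = (rng.uniform(size=200) < 1.0 / (1.0 + np.exp(-1.5 * x))).astype(float)

def neg_log_post(w, prior_prec=1.0):
    p = 1.0 / (1.0 + np.exp(-w * x))
    nll = -np.sum(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))
    return nll + 0.5 * prior_prec * w ** 2     # Gaussian prior on w

# Step 1: ordinary MAP training (here: crude gradient descent with a
# finite-difference gradient, since the model is one-dimensional).
w = 0.0
for _ in range(500):
    g = (neg_log_post(w + 1e-5) - neg_log_post(w - 1e-5)) / 2e-5
    w -= 0.005 * g

# Step 2: post-hoc Laplace: the curvature at the MAP defines a Gaussian
# posterior N(w_map, 1/H) without any retraining.
h = (neg_log_post(w + 1e-3) - 2 * neg_log_post(w) + neg_log_post(w - 1e-3)) / 1e-6
w_map, w_var = w, 1.0 / h
```

The appeal is exactly this two-step structure: step 1 is the training you would do anyway, and step 2 turns the finished deterministic model into an (approximate) Bayesian one.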

Quality of estimated uncertainty and practical examples.


In the final session, participants will learn how to evaluate the quality of UQ in practical settings. We will develop multiple approaches to assess the reliability and calibration of uncertainty estimates, equipping participants with the tools to gauge the robustness of their models.
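A standard calibration check of this kind is the expected calibration error (ECE), which bins predictions by confidence and compares each bin's average confidence to its empirical accuracy. A minimal sketch on synthetic predictions follows; the function name, binning, and synthetic predictors are illustrative.

```python
import numpy as np

def expected_calibration_error(conf, correct, n_bins=10):
    """Weighted average gap between mean confidence and accuracy per bin."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (conf > lo) & (conf <= hi)
        if in_bin.any():
            gap = abs(conf[in_bin].mean() - correct[in_bin].mean())
            ece += in_bin.mean() * gap
    return ece

rng = np.random.default_rng(0)
conf = rng.uniform(0.5, 1.0, size=20_000)

# Well-calibrated predictor: correctness rate matches stated confidence.
calibrated = (rng.uniform(size=conf.size) < conf).astype(float)
# Overconfident predictor: actual accuracy runs 20 points below confidence.
overconfident = (rng.uniform(size=conf.size) < conf - 0.2).astype(float)
```

On this synthetic data the calibrated predictor scores near zero ECE while the overconfident one scores close to its 0.2 confidence-accuracy gap, which is the behavior the metric is designed to expose.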


Uncertainty Quantification Framework.

This tutorial will also briefly introduce the TorchUncertainty library, an uncertainty-aware open-source framework for training models in PyTorch.


Selected References

Awesome Uncertainty in Deep Learning: https://github.com/ensta-u2is-ai/awesome-uncertainty-deeplearning

Andrei Bursuc is supported by
