@@ -73,26 +73,79 @@

Organizers

-
Andrei Bursuc
Valeo.ai
+
Andrei Bursuc
valeo.ai

Overview

-
TBA
+
This tutorial aims to help researchers understand and handle uncertainty in their models, making them more reliable using Bayesian methods. We will start by discussing different Bayesian approaches and then focus on Bayesian Neural Networks and how to approximate them efficiently for computer vision tasks. We will also use real-world examples and practical methods to show how to put these ideas into practice.

Outline

-
TBA
-

Uncertainty quantification framework

-

In this section, we will quickly introduce the TorchUncertainty library, an uncertainty-aware open-source framework for training models in PyTorch.

+

Introduction: Why & where is UQ helpful?

+

An initial exploration of the critical role of uncertainty quantification (UQ) in computer vision (CV): participants will gain an understanding of why it is essential to consider uncertainty in CV, especially for decision-making in complex environments. We will introduce real-world scenarios where uncertainty can profoundly impact model performance and safety, setting the stage for deeper exploration throughout the tutorial.

+

From maximum a posteriori to BNNs.

+

In this part, we will journey through the evolution of UQ techniques, starting from classic approaches such as maximum a posteriori (MAP) estimation and moving to the more elaborate Bayesian Neural Networks. Participants will grasp the conceptual foundations of UQ, laying the groundwork for the subsequent discussions of Bayesian methods.
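To make the starting point concrete, here is a minimal PyTorch sketch (an illustration, not the tutorial's own material) of the classic observation that MAP estimation with a Gaussian prior over the weights is ordinary maximum-likelihood training with weight decay; the architecture, batch, and prior scale are arbitrary placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# MAP estimation as regularized training: maximizing log p(w | D) =
# log p(D | w) + log p(w) + const with a Gaussian prior N(0, sigma^2 I)
# amounts to minimizing the NLL plus an L2 penalty on the weights.
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
prior_sigma = 1.0
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2,
                            weight_decay=1.0 / prior_sigma ** 2)  # prior as weight decay (up to data-size scaling)

x, y = torch.randn(16, 32), torch.randint(0, 10, (16,))  # stand-in minibatch
optimizer.zero_grad()
loss = F.cross_entropy(model(x), y)                      # negative log-likelihood
loss.backward()
optimizer.step()                                         # one MAP training step
```

The result is a single point estimate of the weights; the rest of the tutorial is about replacing this point with a distribution.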

+

Strategies for BNN posterior inference.

+

This is the core part of the tutorial, diving into the process of estimating the posterior distribution of BNNs. Participants will gain insights into the computational complexities involved in modeling uncertainty through a comprehensive overview of techniques such as Variational Inference (VI), Hamiltonian Monte Carlo (HMC), and Langevin Dynamics. Moreover, we will explore the characteristics and visual representation of posterior distributions, providing a better understanding of Bayesian inference.
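As a taste of the sampling-based family, the sketch below implements stochastic-gradient Langevin dynamics (SGLD) in plain PyTorch; it is a simplified illustration rather than the tutorial's exact recipe, and the step size, prior scale, dataset size, and thinning schedule are all assumptions.

```python
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

# SGLD: each update is a gradient step on the unnormalized log-posterior plus
# Gaussian noise; weights visited late in training serve as approximate samples.

def sgld_step(model, x, y, lr=1e-5, prior_sigma=1.0, n_data=50_000):
    # Potential U(w) = -log p(D|w) - log p(w): minibatch NLL rescaled to the
    # full dataset size plus a Gaussian prior acting as L2 regularization.
    loss = n_data * F.cross_entropy(model(x), y)
    for p in model.parameters():
        loss = loss + 0.5 * (p ** 2).sum() / prior_sigma ** 2
    model.zero_grad()
    loss.backward()
    with torch.no_grad():
        for p in model.parameters():
            noise = torch.randn_like(p) * lr ** 0.5
            p.add_(-0.5 * lr * p.grad + noise)   # w <- w - (lr/2) grad U + N(0, lr)

# Usage inside a training loop (burn-in omitted for brevity):
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
samples = []
for step in range(1000):
    x, y = torch.randn(16, 32), torch.randint(0, 10, (16,))  # stand-in minibatch
    sgld_step(model, x, y)
    if step % 100 == 0:
        samples.append(copy.deepcopy(model.state_dict()))     # approximate posterior sample
```

At test time, predictions from the stored samples are averaged, exactly as for an ensemble.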

+

Computationally-efficient BNNs for CV.

+

Here, we will present recent techniques that improve the computational efficiency of BNNs for computer vision tasks. We will cover different ways of obtaining BNNs from intermediate checkpoints, from weight trajectories during a training run, and from different types of variational subnetworks, along with their main strengths and limitations.
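As one concrete example of the checkpoint-based direction, here is a hedged sketch of a low-cost ensemble built from checkpoints saved along a single training run; `make_model` and the checkpoint paths are placeholders for your own setup, and the variance term is only a crude disagreement proxy.

```python
import torch

# Average the predictive distributions of several checkpoints saved during one
# training run -- a cheap stand-in for a full BNN posterior.
@torch.no_grad()
def checkpoint_ensemble_predict(make_model, ckpt_paths, x):
    probs = []
    for path in ckpt_paths:
        model = make_model()                                   # fresh copy of the architecture
        model.load_state_dict(torch.load(path, map_location="cpu"))
        model.eval()
        probs.append(torch.softmax(model(x), dim=-1))
    probs = torch.stack(probs)                                 # [n_checkpoints, batch, n_classes]
    mean_probs = probs.mean(dim=0)                             # ensemble prediction
    disagreement = probs.var(dim=0).sum(dim=-1)                # rough per-sample uncertainty
    return mean_probs, disagreement
```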

+

Convert your DNN into a BNN: post-hoc BNN inference.

+

This segment covers post-hoc inference techniques, with a focus on the Laplace approximation. Participants will learn how the Laplace approximation serves as a computationally efficient method for approximating the posterior distribution of Bayesian Neural Networks.
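The sketch below illustrates the idea behind a diagonal Laplace approximation in bare PyTorch: fit a Gaussian around the trained (MAP) weights whose precision comes from a squared-gradient curvature proxy, then sample weights at prediction time. It is only an illustration; practical implementations (for instance the laplace-torch library, or GGN/KFAC variants) estimate the curvature more carefully, and the prior precision and sample count here are assumptions.

```python
import torch
import torch.nn.functional as F

# Diagonal Laplace: approximate the weight posterior by N(w_MAP, H^-1) with a
# diagonal precision H built from squared gradients (an empirical-Fisher proxy).

def fit_diag_laplace(model, loader, prior_precision=1.0):
    precision = [torch.full_like(p, prior_precision) for p in model.parameters()]
    for x, y in loader:
        model.zero_grad()
        F.cross_entropy(model(x), y, reduction="sum").backward()
        for h, p in zip(precision, model.parameters()):
            h += p.grad.detach() ** 2                      # accumulate curvature proxy
    return precision

@torch.no_grad()
def laplace_predict(model, precision, x, n_samples=20):
    mean = [p.detach().clone() for p in model.parameters()]
    probs = []
    for _ in range(n_samples):
        for p, m, h in zip(model.parameters(), mean, precision):
            p.copy_(m + torch.randn_like(m) / h.sqrt())    # w ~ N(w_MAP, diag(H)^-1)
        probs.append(torch.softmax(model(x), dim=-1))
    for p, m in zip(model.parameters(), mean):
        p.copy_(m)                                         # restore the MAP weights
    return torch.stack(probs).mean(dim=0)                  # averaged predictive distribution
```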

+

Quality of estimated uncertainty and practical examples.

+

In the final session, participants will learn how to evaluate the quality of UQ in practical settings. We will cover multiple approaches to assess the reliability and calibration of uncertainty estimates, equipping participants with the tools to gauge the robustness of their models. Additionally, we will dive into real-world examples and applications, showcasing how UQ can enhance the reliability and performance of computer vision systems in diverse scenarios. Through interactive discussions and case studies, participants will gain practical insights into deploying uncertainty-aware models in real-world applications.
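One standard calibration check is the expected calibration error (ECE); the sketch below is a minimal equal-width-binning version written for this page, with the number of bins chosen arbitrarily.

```python
import torch

# Expected calibration error: bin predictions by confidence and compare the
# average confidence to the accuracy inside each bin.
def expected_calibration_error(probs, labels, n_bins=15):
    confidences, predictions = probs.max(dim=-1)
    accuracies = predictions.eq(labels).float()
    bin_edges = torch.linspace(0.0, 1.0, n_bins + 1)
    ece = torch.zeros(())
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            gap = (accuracies[in_bin].mean() - confidences[in_bin].mean()).abs()
            ece += in_bin.float().mean() * gap             # weight by bin frequency
    return ece.item()

# probs can come from a single network or from an averaged BNN predictive
# distribution, e.g. ece = expected_calibration_error(torch.softmax(logits, -1), labels)
```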

+

Uncertainty Quantification Framework.

+

This tutorial will also briefly introduce the TorchUncertainty library, an uncertainty-aware open-source framework for training models in PyTorch.

@@ -103,16 +156,15 @@

Uncertainty quantification framework

Relation to prior tutorials and short courses

-

This tutorial is affiliated with the UNCV workshop, which had its inaugural edition at ECCV and the subsequent one at ICCV, although our primary emphasis in this tutorial will be on the theoretical facets.

-

Uncertainty Quantification has received some attention in recent times, as evidenced by its inclusion as sections in the tutorial 'Many Faces of Reliability of Deep Learning for Real-World Deployment'. While this excellent tutorial explored various applications associated with uncertainty, it did not place a specific emphasis on probabilistic models and Bayesian Neural Networks. Our tutorial aims to provide a more in-depth exploration of uncertainty theory, accompanied by the introduction of practical applications, including the presentation of the library, TorchUncertainty.

This tutorial is affiliated with the UNCV Workshop, which had its inaugural edition at ECCV 2022, a subsequent one at ICCV, and is back at ECCV this year. In contrast to the workshop, the tutorial puts its primary emphasis on the theoretical facets.

+

UQ has received some attention in recent times, as evidenced by its inclusion in the tutorial 'Many Faces of Reliability of Deep Learning for Real-World Deployment'. While this tutorial explored various applications associated with uncertainty, it did not place a specific emphasis on probabilistic models and Bayesian Neural Networks. Our tutorial aims to provide a more in-depth exploration of uncertainty theory, accompanied by the introduction of practical applications, including the presentation of the library, TorchUncertainty.

diff --git a/wacv_2024.html b/wacv_2024.html
index e36bc2a..22250b9 100755
--- a/wacv_2024.html
+++ b/wacv_2024.html
@@ -70,7 +70,7 @@

Organizers

-
Andrei Bursuc
Valeo.ai
+
Andrei Bursuc
valeo.ai