---
date: 2023-10-25
excerpts: 2
---

# Relative Information and the Dual Numbers

## Abstract

Relative information (Kullback-Leibler divergence) is a fundamental concept in statistics, machine learning and information theory.
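For reference (this formula is not part of the original abstract, and the notation is chosen here for illustration), the relative information of a distribution $q$ from a distribution $p$ on a discrete sample space is usually written as

$$
D(p \,\|\, q) \;=\; \sum_{x} p(x)\,\log \frac{p(x)}{q(x)},
$$

which is nonnegative and vanishes exactly when $p = q$.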

In the first half of the talk, I will define conditional relative information, list its axiomatic properties, and describe how it is used in machine learning. For example, the generalization error of a learning algorithm depends on the structure of the algebro-geometric singularities of relative information.
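For concreteness, one standard convention for conditional relative information (the axiomatic setup in the talk may be more general) is

$$
D\big(p(y \mid x) \,\big\|\, q(y \mid x)\big) \;=\; \sum_{x} p(x) \sum_{y} p(y \mid x)\,\log \frac{p(y \mid x)}{q(y \mid x)},
$$

that is, the relative information of the conditional distributions averaged over the conditioning variable.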

In the second half of the talk, I will define the rig category InfoRig of random variables and their conditional maps, as well as the rig category R(e) of dual numbers. Relative information can then be constructed, up to a scalar multiple, via rig functors from InfoRig to R(e). If time permits, I may discuss how this construction relates to the information cohomology of Baudot, Bennequin and Vigneaux, and to the operad derivations of Bradley.
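For orientation, the dual numbers are classically the quotient ring

$$
\mathbb{R}[\varepsilon]/(\varepsilon^2) \;=\; \{\, a + b\varepsilon : a, b \in \mathbb{R},\ \varepsilon^2 = 0 \,\};
$$

the particular rig structure R(e) used in the talk, and the rig category InfoRig, are as defined in the slides linked below.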

## Prerequisites

Random variables, probability distributions, conditional probability; category, object, morphism, functor; rig (semiring).


## Details
[San Francisco State University; Algebra, Geometry and Combinatorics Seminar](https://sites.google.com/view/sfsuagc/fall-2023)

[Slides](https://w3id.org/people/shaoweilin/public/20231025-sfsu.pdf)
