Add IPAM talk
shaoweilin committed Oct 15, 2024
1 parent a0bae41 commit 0766c1b
Showing 2 changed files with 18 additions and 1 deletion.
2 changes: 1 addition & 1 deletion posts/2024-03-21-prior-work-on-program-synthesis.md
@@ -17,7 +17,7 @@ Here is a list of prior work I did with my collaborators and Ph.D. students on d

2021: Shaowei Lin. [Proofs as programs: challenges and strategies for program synthesis](https://shaoweilin.github.io/posts/2021-04-22-proofs-as-programs-challenges-and-strategies-for-program-synthesis/).

- 2018: Shaowei Lin. [Machine Reasoning and Deep Spiking Networks](https://shaoweilin.github.io/public/20180526-aisg.pdf).
+ 2018: Shaowei Lin. [Machine reasoning and deep spiking networks](https://shaoweilin.github.io/public/20180526-aisg.pdf).

2017: Shaowei Lin. [Artificial general intelligence for the internet of things](https://shaoweilin.github.io/posts/2017-05-08-artificial-general-intelligence-for-the-internet-of-things/).

17 changes: 17 additions & 0 deletions (new file)
@@ -0,0 +1,17 @@
---
date: 2024-10-16
excerpts: 2
---

# Singular learning, relative information and the dual numbers

## Abstract

Relative information (Kullback-Leibler divergence) is a fundamental concept in statistics, machine learning and information theory. In the first half of the talk, I will define conditional relative information, list its axiomatic properties, and describe how it is used in machine learning. For example, according to Sumio Watanabe's Singular Learning Theory, the generalization error of a learning algorithm depends on the structure of algebraic geometric singularities of relative information. In the second half of the talk, I will define the rig category Info of random variables and their conditional maps, as well as the rig category R(e) of dual numbers. Relative information can then be constructed, up to a scalar multiple, via rig monoidal functors from Info to R(e). If time permits, I may discuss how this construction relates to the information cohomology of Baudot, Bennequin and Vigneaux, and to the operad derivations of Bradley.
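
For readers who want the symbols pinned down, here is a minimal sketch of two standard definitions the abstract relies on; this is textbook background under the usual conventions, not the talk's construction itself. The relative information of $p$ from $q$, and multiplication in the dual numbers $R(e)$ where $e^2 = 0$, are

$$
D(p \,\|\, q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}, \qquad (a + b e)(c + d e) = ac + (ad + bc)\, e.
$$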

## Details

[IPAM Theory and Practice of Deep Learning Workshop](https://www.ipam.ucla.edu/abstract/?tid=20677&pcode=MOIWS2)

[Video]

[Slides]
