
Linear algebra and calculus:

  1. Essence of linear algebra (linear transformations; matrix multiplication)
  2. Essence of calculus (derivatives; chain rule)
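As a quick illustration of the core idea from "Essence of linear algebra" (a sketch we add here, not taken from the videos): a matrix *is* a linear transformation, and composing transformations is matrix multiplication.

```python
import numpy as np

# A matrix is a linear transformation: it maps the basis vectors
# to its columns, and every other vector follows by linearity.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])   # scales x by 2 and y by 3
v = np.array([1.0, 1.0])
print(A @ v)                 # [2. 3.]

# Composing two transformations equals multiplying their matrices:
# applying A, then B, is the same as applying the single matrix B A.
B = np.array([[0.0, -1.0],
              [1.0,  0.0]])  # 90-degree rotation
assert np.allclose(B @ (A @ v), (B @ A) @ v)
```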

We strongly recommend watching the following to understand backpropagation.

  1. Neural Networks (chapter 1 - chapter 4) (animated introduction to neural networks and backpropagation)
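The key takeaway of those chapters is that backpropagation is the chain rule applied layer by layer. A minimal sketch (our own toy example, with made-up numbers, checked against a numerical derivative):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# A one-neuron "network": y = sigmoid(w*x + b), loss = (y - t)^2.
w, b, x, t = 0.5, 0.1, 2.0, 1.0
z = w * x + b
y = sigmoid(z)
loss = (y - t) ** 2

# Chain rule, outermost function first:
# dloss/dw = dloss/dy * dy/dz * dz/dw
dy = 2 * (y - t)          # dloss/dy
dz = dy * y * (1 - y)     # * dsigmoid/dz
dw = dz * x               # * dz/dw
db = dz                   # dz/db = 1

# Sanity check against a finite-difference derivative.
eps = 1e-6
loss_eps = (sigmoid((w + eps) * x + b) - t) ** 2
assert abs(dw - (loss_eps - loss) / eps) < 1e-4
```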

Intro to convolutions.

  1. But what is a convolution? (convolution example; convolutions in image processing; convolutions and polynomial multiplication; FFT)
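The polynomial-multiplication and FFT connections from that video can be verified in a few lines of NumPy (a sketch we add for illustration):

```python
import numpy as np

# Discrete convolution of two coefficient lists is exactly
# polynomial multiplication.
a = [1, 2, 3]              # 1 + 2x + 3x^2
b = [4, 5]                 # 4 + 5x
print(np.convolve(a, b))   # [ 4 13 22 15] -> 4 + 13x + 22x^2 + 15x^3

# The FFT computes the same result in O(n log n): pointwise-multiply
# in the frequency domain, then transform back.
n = len(a) + len(b) - 1
via_fft = np.fft.irfft(np.fft.rfft(a, n) * np.fft.rfft(b, n), n)
assert np.allclose(via_fft, np.convolve(a, b))
```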

A good starting point for understanding Large Language Models.

  1. Neural Networks (chapter 5 - chapter 7) (GPT; visual explanation of attention; LLMs)
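The attention operation those chapters visualize can be sketched in plain NumPy. This is a minimal, single-head version of scaled dot-product attention (shapes and random inputs are our assumptions, not from the videos):

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # how much each query matches each key
    # Numerically stable softmax over the key axis.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                # each output is a weighted mix of values

rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))  # 4 tokens, dim 8
out = attention(Q, K, V)
print(out.shape)   # (4, 8)
```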

Papers explained

A good list of videos presenting classic papers up through state-of-the-art methods in deep learning, which help explain how things work. We recommend watching them in order, as they build on one another.

  1. [Classic] ImageNet Classification with Deep Convolutional Neural Networks
  2. [Classic] Deep Residual Learning for Image Recognition
  3. [Classic] Generative Adversarial Networks
  4. [Classic] Playing Atari with Deep Reinforcement Learning
  5. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
  6. Attention Is All You Need
  7. An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale
  8. Image GPT: Generative Pretraining from Pixels
  9. DINO: Emerging Properties in Self-Supervised Vision Transformers
  10. Perceiver: General Perception with Iterative Attention
  11. Mamba: Linear-Time Sequence Modeling with Selective State Spaces
  12. xLSTM: Extended Long Short-Term Memory