
Deep Learning Crash Course

Early Access - Use Code PREORDER for 25% Off
by Benjamin Midtvedt, Jesús Pineda, Henrik Klein Moberg, Harshith Bachimanchi, Joana B. Pereira, Carlo Manzo, Giovanni Volpe
No Starch Press, San Francisco (CA), 2025
ISBN-13: 9781718503922
https://nostarch.com/deep-learning-crash-course


Deep Learning Crash Course is a comprehensive and up-to-date guide that takes you from simple neural networks all the way to cutting-edge deep learning architectures: no advanced math required, just a basic knowledge of programming. From CNNs and GANs to Transformers and Diffusion Models, each chapter brings you hands-on, real-world projects so you can build and truly master the latest AI breakthroughs. Whether you’re an engineer, scientist, or just curious about AI, you’ll discover how to implement, optimize, and innovate with the full spectrum of modern deep learning techniques.


  1. Dense Neural Networks for Classification
    Introduces single- and multi-layer perceptrons for classification tasks (e.g., MNIST digit recognition).
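To make this concrete, here is a minimal NumPy sketch of what a dense classifier computes at inference time; the layer sizes, the ReLU/softmax choices, and the function name `dense_forward` are illustrative assumptions, not the book's own code:

```python
import numpy as np

def dense_forward(x, weights, biases):
    """Forward pass through a stack of dense (fully connected) layers:
    ReLU on the hidden layers, softmax on the output layer so the
    network emits class probabilities, as in MNIST-style classification."""
    a = x
    for i, (W, b) in enumerate(zip(weights, biases)):
        z = a @ W + b
        if i < len(weights) - 1:
            a = np.maximum(z, 0.0)                      # ReLU
        else:
            z = z - z.max(axis=-1, keepdims=True)       # numerical stability
            a = np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)
    return a

# Toy 784 -> 32 -> 10 network with random (untrained) weights.
rng = np.random.default_rng(0)
weights = [rng.normal(0, 0.1, (784, 32)), rng.normal(0, 0.1, (32, 10))]
biases = [np.zeros(32), np.zeros(10)]
probs = dense_forward(rng.normal(size=(1, 784)), weights, biases)
```

With random weights the probabilities are of course meaningless; training adjusts `weights` and `biases` until the distribution becomes sharp for the correct digit.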

  2. Dense Neural Networks for Regression
    Explores regression problems and digital twins, focusing on continuous-value prediction with multi-layer networks.

  3. Convolutional Neural Networks for Image Analysis
    Covers convolutional neural networks (CNNs) and their application to tasks such as image classification, localization, style transfer, and DeepDream.
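The building block behind all of these applications is the convolution itself. A minimal NumPy sketch (valid padding, single channel; the name `conv2d` and the edge-detection example are illustrative assumptions):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation, the core operation of a CNN layer."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.empty((ih - kh + 1, iw - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = (image[r:r + kh, c:c + kw] * kernel).sum()
    return out

# A [1, -1] filter responds only where neighboring pixels differ: an edge detector.
image = np.tile([0.0, 0.0, 1.0, 1.0], (4, 1))    # image with one vertical edge
response = conv2d(image, np.array([[1.0, -1.0]]))
```

The response is nonzero only at the edge; a trained CNN learns many such filters from data instead of hand-crafting them.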

  4. Encoders–Decoders for Latent Space Manipulation
    Focuses on autoencoders, variational autoencoders, Wasserstein autoencoders, and anomaly detection, enabling data compression and generation.

  5. U-Nets for Image Transformation
    Discusses U-Net architectures for image segmentation, cell counting, and various biomedical imaging applications.

  6. Self-Supervised Learning to Exploit Symmetries
    Explains how to use unlabeled data and the symmetries of a problem to improve model performance, with an application in particle localization.

  7. Recurrent Neural Networks for Time-Series Analysis
    Uses recurrent neural networks (RNNs), GRUs, and LSTMs to forecast time-dependent data and build a simple text translator.
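The recurrence shared by all three architectures can be sketched in a few lines of NumPy; this is the plain (Elman) RNN update, on top of which GRUs and LSTMs add gates that control what the hidden state keeps or forgets. All names and sizes here are illustrative:

```python
import numpy as np

def rnn_forward(inputs, W_xh, W_hh, b_h):
    """Unroll a vanilla RNN cell over a sequence:
    h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h)."""
    h = np.zeros(W_hh.shape[0])
    hidden_states = []
    for x in inputs:
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
        hidden_states.append(h)
    return np.array(hidden_states)

rng = np.random.default_rng(2)
seq = rng.normal(size=(6, 3))    # 6 time steps, 3 input features
H = rnn_forward(seq, rng.normal(0, 0.5, (8, 3)),
                rng.normal(0, 0.5, (8, 8)), np.zeros(8))
```

Because the same weights are reused at every step, the network handles sequences of any length; a readout layer on the final (or every) hidden state turns this into a forecaster or translator.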

  8. Attention and Transformers for Sequence Processing
    Introduces attention mechanisms, transformer models, and vision transformers (ViTs) for natural language processing (NLP), including improved text translation and sentiment analysis, as well as for image classification.
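At the heart of every transformer is scaled dot-product attention, Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V. A minimal single-head NumPy sketch (the shapes and names are illustrative):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each query attends to all keys; the softmax weights mix the values."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # similarity of queries to keys
    scores = scores - scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)
    return weights @ V, weights

rng = np.random.default_rng(1)
Q = rng.normal(size=(3, 4))    # 3 query tokens, key dimension d_k = 4
K = rng.normal(size=(5, 4))    # 5 key tokens
V = rng.normal(size=(5, 8))    # values of width 8
out, attn = scaled_dot_product_attention(Q, K, V)
```

Multi-head attention runs several such maps in parallel with learned projections of Q, K, and V, letting different heads attend to different relationships.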

  9. Generative Adversarial Networks for Image Synthesis
    Demonstrates the training of generative adversarial networks (GANs) for image generation, domain translation (CycleGAN), and virtual staining in microscopy.

  10. Diffusion Models for Data Representation and Exploration
    Presents denoising diffusion models for generating and enhancing images, including text-to-image synthesis and image super-resolution.
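The forward (noising) half of a denoising diffusion model has a closed form that is easy to sketch: x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * noise, where alpha_bar_t is the cumulative product of 1 - beta_t. The linear schedule, step counts, and names below are illustrative assumptions:

```python
import numpy as np

def forward_diffusion(x0, t, betas, rng):
    """Sample x_t ~ q(x_t | x_0) directly, without iterating t noising steps."""
    alpha_bar = np.cumprod(1.0 - betas)[t]
    noise = rng.normal(size=x0.shape)
    return np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * noise, alpha_bar

rng = np.random.default_rng(3)
betas = np.linspace(1e-4, 0.02, 1000)    # linear noise schedule
x0 = rng.normal(size=(32, 32))           # stand-in for an image
x_early, ab_early = forward_diffusion(x0, t=10, betas=betas, rng=rng)
x_late, ab_late = forward_diffusion(x0, t=990, betas=betas, rng=rng)
```

Early steps barely perturb the image (alpha_bar near 1) while late steps reduce it to almost pure noise; the generative model is trained to run this process in reverse.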

  11. Graph Neural Networks for Relational Data Analysis
    Shows how graph neural networks (GNNs) can model graph-structured data (molecules, cell trajectories, physics simulations) using message passing and graph convolutions.
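A single round of message passing is compact enough to sketch directly; here each node averages its neighbors' features before a learned transformation. The mean aggregator, the weight matrices, and the toy triangle graph are illustrative assumptions:

```python
import numpy as np

def message_passing_step(node_features, edges, W_self, W_neigh):
    """One round of mean-aggregation message passing on a directed graph:
    each node combines its own transformed feature with the mean of its
    in-neighbors' transformed features, followed by ReLU."""
    n = node_features.shape[0]
    agg = np.zeros_like(node_features)
    deg = np.zeros(n)
    for src, dst in edges:
        agg[dst] += node_features[src]
        deg[dst] += 1
    deg = np.maximum(deg, 1)                 # avoid division by zero
    mean_neigh = agg / deg[:, None]
    return np.maximum(node_features @ W_self + mean_neigh @ W_neigh, 0.0)

# Triangle graph (edges in both directions) with 2-dimensional node features.
x = np.eye(3, 2)
edges = [(0, 1), (1, 2), (2, 0), (1, 0), (2, 1), (0, 2)]
h = message_passing_step(x, edges, np.eye(2), np.eye(2))
```

Stacking several such rounds lets information propagate across multi-hop neighborhoods, which is how GNNs capture molecular or trajectory structure.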

  12. Active Learning for Continuous Learning
    Describes techniques to iteratively select the most informative samples to label, improving model performance efficiently.
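The classic query strategy is uncertainty sampling: rank the unlabeled pool by the entropy of the model's predicted class distribution and send the top-k samples to an annotator. A minimal sketch (the helper name and the mock probabilities are illustrative):

```python
import numpy as np

def select_most_informative(probabilities, k):
    """Return indices of the k samples whose predicted class distribution
    has the highest entropy, i.e. where the model is least certain."""
    p = np.clip(probabilities, 1e-12, 1.0)
    entropy = -(p * np.log(p)).sum(axis=1)
    return np.argsort(entropy)[::-1][:k]

# Three mock predictive distributions over 3 classes.
probs = np.array([
    [0.98, 0.01, 0.01],   # confident
    [0.34, 0.33, 0.33],   # very uncertain
    [0.70, 0.20, 0.10],   # somewhat uncertain
])
query = select_most_informative(probs, k=2)
```

Labeling the uncertain samples first typically improves accuracy faster per label than random selection.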

  13. Reinforcement Learning for Strategy Optimization
    Explains Q-learning and Deep Q-learning by teaching an agent to master games such as Tetris.
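Tabular Q-learning itself fits in a dozen lines: after each transition the agent nudges Q(s, a) toward r + gamma * max_a' Q(s', a'). Below is a sketch on a toy four-state corridor rather than Tetris; the environment, hyperparameters, and names are illustrative assumptions:

```python
import random

def q_learning(n_states, n_actions, step, episodes=500,
               alpha=0.1, gamma=0.9, epsilon=0.1, seed=0):
    """Tabular Q-learning with epsilon-greedy exploration:
    Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    rng = random.Random(seed)
    Q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            if rng.random() < epsilon:                          # explore
                a = rng.randrange(n_actions)
            else:                                               # exploit
                a = max(range(n_actions), key=lambda a: Q[s][a])
            s_next, r, done = step(s, a)
            target = r + (0.0 if done else gamma * max(Q[s_next]))
            Q[s][a] += alpha * (target - Q[s][a])
            s = s_next
    return Q

# Toy corridor: states 0..3; action 1 moves right, action 0 moves left.
# Reaching state 3 yields reward 1 and ends the episode.
def corridor_step(s, a):
    s_next = min(s + 1, 3) if a == 1 else max(s - 1, 0)
    return s_next, (1.0 if s_next == 3 else 0.0), s_next == 3

Q = q_learning(n_states=4, n_actions=2, step=corridor_step)
policy = [max(range(2), key=lambda a: Q[s][a]) for s in range(3)]   # greedy policy
```

Deep Q-learning replaces the table with a neural network that maps states (e.g. Tetris boards) to action values, since a table over all board configurations would be astronomically large.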

  14. Reservoir Computing for Predicting Chaos
    Covers reservoir computing methods for forecasting chaotic systems such as the Lorenz attractor.
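The defining trick of reservoir computing is that only a linear readout is trained; the recurrent weights stay fixed and random, rescaled so the dynamics neither explode nor die out. A minimal echo-state sketch predicting the next value of a sine wave (a simple stand-in for a chaotic series; sizes and names are illustrative assumptions):

```python
import numpy as np

def run_reservoir(inputs, n_reservoir=100, spectral_radius=0.9, seed=0):
    """Drive a fixed random tanh reservoir with a scalar input sequence.
    Rescaling W to spectral radius < 1 gives the echo state property."""
    rng = np.random.default_rng(seed)
    W_in = rng.uniform(-0.5, 0.5, n_reservoir)
    W = rng.normal(0, 1, (n_reservoir, n_reservoir))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    x = np.zeros(n_reservoir)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in * u)
        states.append(x.copy())
    return np.array(states)

# Train only the linear readout, by least squares, to predict the next value.
u = np.sin(np.linspace(0, 8 * np.pi, 400))
S = run_reservoir(u[:-1])
W_out, *_ = np.linalg.lstsq(S, u[1:], rcond=None)
pred = S @ W_out
```

Because training reduces to one linear regression, reservoir computers are cheap to fit, which makes them attractive for forecasting chaotic systems like the Lorenz attractor.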

