Fixing broken image links from my automated English pass
profvjreddi committed Sep 22, 2023
1 parent 328f8d9 commit c05df19
Showing 1 changed file with 6 additions and 6 deletions.
12 changes: 6 additions & 6 deletions dl_primer.qmd
@@ -6,7 +6,7 @@

Deep learning, a specialized area within machine learning and artificial intelligence (AI), utilizes algorithms modeled after the structure and function of the human brain, known as artificial neural networks. This field is a foundational element in AI, driving progress in diverse sectors such as computer vision, natural language processing, and self-driving vehicles. Its significance in embedded AI systems is highlighted by its capability to handle intricate calculations and predictions, optimizing the limited resources in embedded settings.

- ![](Image URL)
+ ![](https://1394217531-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LvBP1svpACTB1R1x_U4%2F-LvCh0IFvnfX-S1za_GI%2F-LvD0gbfAKEIMXcVxdqQ%2Fimage.png?alt=media&token=d6ca58f0-ebe3-4188-a90a-dc68256e1b0a)

### Brief History of Deep Learning

@@ -16,7 +16,7 @@ The term "deep learning" became prominent in the 2000s, characterized by advance

In recent times, deep learning has seen exponential growth, transforming various industries. Computational growth followed an 18-month doubling pattern from 1952 to 2010, which then accelerated to a 6-month cycle from 2010 to 2022. Concurrently, we saw the emergence of large-scale models between 2015 and 2022, appearing 2 to 3 orders of magnitude faster and following a 10-month doubling cycle.

- ![Growth of deep learning models.](Image URL)
+ ![Growth of deep learning models.](https://epochai.org/assets/images/posts/2022/compute-trends.png){#fig-trends}

Multiple factors have contributed to this surge, including advancements in computational power, the abundance of big data, and improvements in algorithmic designs. First, the growth of computational capabilities, especially the arrival of Graphics Processing Units (GPUs) and Tensor Processing Units (TPUs), has significantly sped up the training and inference times of deep learning models. These hardware improvements have enabled the construction and training of more complex, deeper networks than what was possible in earlier years.

@@ -50,27 +50,27 @@ Neural networks serve as the foundation of deep learning, inspired by the biolog

The perceptron is the basic unit or node that serves as the foundation for more complex structures. A perceptron takes various inputs, applies weights and a bias to these inputs, and then uses an activation function to produce an output.

- ![Perceptron](Image URL)
+ ![Perceptron](https://upload.wikimedia.org/wikipedia/commons/thumb/f/ff/Rosenblattperceptron.png/500px-Rosenblattperceptron.png){#fig-perceptron}

Conceived in the 1950s, perceptrons paved the way for the development of more intricate neural networks and have been a fundamental building block in the field of deep learning.
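The weighted-sum-plus-activation behavior described above can be sketched in a few lines of pure Python. This is an illustrative stand-in, not code from the book's repository; the `perceptron` and `and_gate` names are hypothetical, and a simple step function serves as the activation:

```python
def perceptron(inputs, weights, bias):
    """Apply weights and a bias to the inputs, then a step activation."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if z > 0 else 0

# With hand-picked weights, a single perceptron can act as a logical AND gate:
# the weighted sum only crosses the threshold when both inputs are 1.
def and_gate(a, b):
    return perceptron([a, b], weights=[1.0, 1.0], bias=-1.5)
```

Here `and_gate(1, 1)` fires (weighted sum 0.5 > 0) while `and_gate(1, 0)` does not (−0.5 ≤ 0), showing how the weights and bias together define the decision boundary.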

### Multi-layer Perceptrons

Multi-layer perceptrons (MLPs) are an evolution of the single-layer perceptron model, featuring multiple layers of nodes connected in a feedforward manner. These layers include an input layer for data reception, several hidden layers for data processing, and an output layer for final result generation. MLPs are skilled at identifying non-linear relationships and use a backpropagation technique for training, where weights are optimized through a gradient descent algorithm.

- ![Multilayer Perceptron](Image URL)
+ ![Multilayer Perceptron](https://www.nomidl.com/wp-content/uploads/2022/04/image-7.png){width=70%}
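The feedforward structure of an MLP can be sketched as layers applied in sequence. This is a minimal pure-Python illustration with hand-picked weights (the `layer_forward` and `mlp_forward` names are hypothetical, not from any library), showing only the forward pass, not backpropagation:

```python
def relu(z):
    """Rectified linear unit: passes positive values, zeroes out negatives."""
    return max(0.0, z)

def layer_forward(inputs, weights, biases, activation):
    """One fully connected layer: each output node takes a weighted
    sum of every input, adds its bias, and applies the activation."""
    return [activation(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

def mlp_forward(x, layers):
    """Feed the input through each layer in turn (the feedforward pass)."""
    for weights, biases, activation in layers:
        x = layer_forward(x, weights, biases, activation)
    return x

# A tiny 2-input, 2-hidden-node, 1-output network with fixed weights.
hidden = ([[1.0, -1.0], [0.5, 0.5]], [0.0, 0.0], relu)
output = ([[1.0, 1.0]], [0.0], lambda z: z)
result = mlp_forward([2.0, 1.0], [hidden, output])
```

In training, the weights above would be adjusted by backpropagation and gradient descent rather than set by hand.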

### Activation Functions

Activation functions are crucial components in neural networks, providing the mathematical equations that determine a network's output. These functions introduce non-linearity into the network, enabling the learning of complex patterns. Popular activation functions include the sigmoid, tanh, and ReLU (Rectified Linear Unit) functions.

- ![Activation Function](Image URL)
+ ![Activation Function](https://1394217531-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LvBP1svpACTB1R1x_U4%2F-LvNWUoWieQqaGmU_gl9%2F-LvO3qs2RImYjpBE8vln%2Factivation-functions3.jpg?alt=media&token=f96a3007-5888-43c3-a256-2dafadd5df7c){width=70%}
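The three activation functions named above have simple closed forms, sketched here with Python's standard `math` module:

```python
import math

def sigmoid(z):
    """Squashes any real input into (0, 1); historically used for probabilities."""
    return 1.0 / (1.0 + math.exp(-z))

def tanh(z):
    """Squashes input into (-1, 1); zero-centered, unlike sigmoid."""
    return math.tanh(z)

def relu(z):
    """Rectified Linear Unit: identity for positive inputs, zero otherwise."""
    return max(0.0, z)
```

All three are non-linear, which is what lets stacked layers represent functions that a single linear map cannot.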

### Computational Graphs

Deep learning uses computational graphs to represent the various operations and their interactions within a neural network. This subsection explores the key phases of computational graph processing.

- ![TensorFlow Computational Graph](Image URL)
+ ![TensorFlow Computational Graph](https://github.com/tensorflow/docs/blob/master/site/en/guide/images/intro_to_graphs/two-layer-network.png?raw=1){width=70%}
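The idea of a graph of operations can be sketched in pure Python. This is a toy stand-in, not TensorFlow's actual API (the `Node` class is hypothetical): each node stores an operation and its input nodes, and evaluating the output node walks the graph, which is exactly what a forward pass does:

```python
class Node:
    """One operation in a computational graph: an op plus its input nodes."""
    def __init__(self, op, *parents):
        self.op = op
        self.parents = parents
        self.value = None

    def forward(self):
        # Evaluate input nodes first, then apply this node's operation.
        args = [p.forward() for p in self.parents]
        self.value = self.op(*args) if args else self.op()
        return self.value

# Build the graph for y = (a + b) * c, then evaluate it.
a = Node(lambda: 2.0)
b = Node(lambda: 3.0)
c = Node(lambda: 4.0)
s = Node(lambda u, v: u + v, a, b)   # s = a + b
y = Node(lambda u, v: u * v, s, c)   # y = s * c
```

Frameworks like TensorFlow build a far richer version of this structure, recording each operation so that gradients can later flow backward along the same edges.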

#### Forward Pass

