danpiwastaken/simple-NN-from-scratch
Simple NN from scratch trained on MNIST dataset

The network has a simple two-layer architecture. The input layer a[0] has 784 units, one per pixel of each flattened 28x28 input image. The hidden layer a[1] has 10 units with ReLU activation, and the output layer a[2] has 10 units with softmax activation, one per digit class (0-9). Inspired by Samson Zhang.
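As a minimal sketch of the forward pass this architecture describes, assuming NumPy and column-vector conventions (the actual repo may use different parameter names, initialization, and batching):

```python
import numpy as np

def relu(z):
    # ReLU activation: elementwise max(0, z)
    return np.maximum(0, z)

def softmax(z):
    # Numerically stable softmax over the class axis
    e = np.exp(z - z.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

def init_params(rng=np.random.default_rng(0)):
    # Hypothetical small-random initialization; the repo's scheme may differ
    W1 = rng.standard_normal((10, 784)) * 0.01  # hidden layer weights
    b1 = np.zeros((10, 1))
    W2 = rng.standard_normal((10, 10)) * 0.01   # output layer weights
    b2 = np.zeros((10, 1))
    return W1, b1, W2, b2

def forward(X, W1, b1, W2, b2):
    # X: (784, m) batch of flattened 28x28 images, one column per example
    a1 = relu(W1 @ X + b1)        # a[1]: hidden layer, 10 units, ReLU
    a2 = softmax(W2 @ a1 + b2)    # a[2]: output layer, 10 class probabilities
    return a1, a2
```

Each column of the output sums to 1, so a2 can be read directly as per-class probabilities for each image in the batch.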
