Our NN will have a simple two-layer architecture. The input layer a[0] has 784 units, one for each of the 784 pixels in a 28x28 input image. The hidden layer a[1] has 10 units with ReLU activation, and the output layer a[2] has 10 units corresponding to the ten digit classes, with softmax activation. Inspired by Samson Zhang.
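
Below is a minimal NumPy sketch of the forward pass through this 784 -> 10 -> 10 architecture. The function names (`init_params`, `forward`), the weight initialization scale, and the column-major data layout (each example is a column of `X`) are assumptions for illustration, not necessarily the exact code in this repo.

```python
import numpy as np

def init_params():
    # Small random weights and zero biases; shapes follow the 784 -> 10 -> 10 layout.
    W1 = np.random.randn(10, 784) * 0.01
    b1 = np.zeros((10, 1))
    W2 = np.random.randn(10, 10) * 0.01
    b2 = np.zeros((10, 1))
    return W1, b1, W2, b2

def relu(Z):
    return np.maximum(0, Z)

def softmax(Z):
    # Subtract the column-wise max for numerical stability.
    exp = np.exp(Z - Z.max(axis=0, keepdims=True))
    return exp / exp.sum(axis=0, keepdims=True)

def forward(W1, b1, W2, b2, X):
    # X has shape (784, m), where m is the number of examples.
    Z1 = W1 @ X + b1   # a[0] -> hidden pre-activation
    A1 = relu(Z1)      # a[1]: hidden layer with ReLU
    Z2 = W2 @ A1 + b2  # hidden -> output pre-activation
    A2 = softmax(Z2)   # a[2]: per-example class probabilities
    return Z1, A1, Z2, A2
```

Each column of `A2` sums to 1, so `A2.argmax(axis=0)` gives the predicted digit for each example.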