Multilayer-Perceptron-Module

Neural Network implementation from scratch

The file "DL_Assignment1_19301.ipynb" contains an example implementation of a simple ANN with the Adam optimizer.

The module currently supports:

  • Adam optimizer with batch size 1

  • Fully connected layers of any size

  • Activation layers:

    • Tanh
    • ReLU

  • Softmax activation layer

  • Loss functions:

    • Cross-entropy loss
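With batch size 1, the Adam optimizer updates parameters from a single sample's gradient while tracking running first and second moment estimates. A minimal sketch of that update rule in NumPy (not the module's actual code; class and parameter names are illustrative):

```python
import numpy as np

class Adam:
    """Adam update for one parameter array, applied per sample (batch size 1)."""
    def __init__(self, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
        self.lr, self.beta1, self.beta2, self.eps = lr, beta1, beta2, eps
        self.m = self.v = None  # first / second moment estimates
        self.t = 0              # timestep, used for bias correction

    def update(self, param, grad):
        if self.m is None:
            self.m = np.zeros_like(param)
            self.v = np.zeros_like(param)
        self.t += 1
        # Exponential moving averages of the gradient and its square
        self.m = self.beta1 * self.m + (1 - self.beta1) * grad
        self.v = self.beta2 * self.v + (1 - self.beta2) * grad ** 2
        # Correct the initialization bias toward zero
        m_hat = self.m / (1 - self.beta1 ** self.t)
        v_hat = self.v / (1 - self.beta2 ** self.t)
        return param - self.lr * m_hat / (np.sqrt(v_hat) + self.eps)
```

Because the moments are per-parameter, the same class can be instantiated once per weight matrix and bias vector in a layer.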

More options for optimizers and activation functions will be added soon. If you would like a specific custom optimizer or custom activation function added to the module, you can mail me at [email protected]
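For illustration, here is a minimal sketch of how the building blocks listed above (a fully connected layer, a Tanh activation, and softmax cross-entropy) can fit together. All names are hypothetical, not the module's API, and plain gradient descent is used in place of Adam for brevity:

```python
import numpy as np

class Dense:
    """Fully connected layer: y = x @ W + b (single sample, 1-D input)."""
    def __init__(self, n_in, n_out):
        rng = np.random.default_rng(0)
        self.W = rng.normal(0.0, np.sqrt(2.0 / n_in), (n_in, n_out))
        self.b = np.zeros(n_out)

    def forward(self, x):
        self.x = x  # cache input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out, lr=0.01):
        grad_W = np.outer(self.x, grad_out)  # batch size 1
        grad_x = grad_out @ self.W.T
        self.W -= lr * grad_W                # plain SGD step for brevity
        self.b -= lr * grad_out
        return grad_x

class Tanh:
    def forward(self, x):
        self.y = np.tanh(x)
        return self.y

    def backward(self, grad_out):
        return grad_out * (1 - self.y ** 2)  # d/dx tanh(x) = 1 - tanh(x)^2

def softmax_xent(logits, target):
    """Softmax + cross-entropy; returns (loss, gradient w.r.t. logits)."""
    z = logits - logits.max()                # shift for numerical stability
    p = np.exp(z) / np.exp(z).sum()
    return -np.log(p[target] + 1e-12), p - np.eye(len(p))[target]
```

Fusing softmax with cross-entropy gives the simple gradient `p - onehot(target)`, which is why the module lists softmax as its own layer paired with the cross-entropy loss.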


