This repository contains the `nn_from_scratch` module, a pet project that implements basic neural-network functionality from scratch using only the `numpy` module.
The repository was created for the Introduction to Computer Vision course at Innopolis University.
The repository is structured as follows:

```
├── lab1_rescaling            # Labs of the course; also examples & testing of the module functionality
│   ...
├── lab6_neural_network
├── nn_from_scratch
│   ├── LICENSE
│   ├── nn_from_scratch
│   │   ├── examples          # Examples of usage
│   │   │   └── ...
│   │   ├── __init__.py
│   │   ├── interfaces.py     # Abstract classes
│   │   ├── nodes.py          # Implementation of different nodes
│   │   ├── neurons.py        # Implementation of different neurons
│   │   └── optimizers.py     # Implementation of different optimizers
│   └── setup.py
└── README.md
```
To build and install the module:

```bash
pip install --upgrade build   # PyPA `build` package, needed for `python3 -m build`
cd nn_from_scratch
python3 -m build
pip install -e .
```
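After installation, a quick smoke test (assuming the steps above succeeded) is to import the package:

```python
# Minimal smoke test: the package should import without errors.
import nn_from_scratch
print(nn_from_scratch.__file__)  # shows where the package was installed from
```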
The module currently supports:
- Nodes:
  - `SoftMax`
  - `NormalizedSoftMax` (equivalent to `SoftMax(x / x_max)`; see the sketch after this list)
  - `ReLU`
  - `SoftMaxLoss`
- Neurons:
  - `Linear`
- Optimizers:
  - `GradientDescent`
- Networks:
  - `NeuralNetwork`: a simple sample wrapper illustrating neural-network training and prediction
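To give a flavour of what these nodes compute, here is a minimal numpy sketch of `SoftMax` and the `NormalizedSoftMax` equivalence (illustrative only; the module's actual implementations live in `nodes.py`, and the function names below are not the module's API):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: shifting by the max does not change the result.
    shifted = x - x.max(axis=axis, keepdims=True)
    e = np.exp(shifted)
    return e / e.sum(axis=axis, keepdims=True)

def normalized_softmax(x, axis=-1):
    # NormalizedSoftMax as described above: SoftMax(x / x_max).
    # (Only meaningful when the maximum is nonzero.)
    return softmax(x / x.max(axis=axis, keepdims=True), axis=axis)

x = np.array([1.0, 2.0, 3.0])
print(softmax(x))              # [0.09003057 0.24472847 0.66524096]
print(normalized_softmax(x))   # identical to softmax(x / 3.0)
print(np.allclose(normalized_softmax(x), softmax(x / x.max())))  # True
```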
- `lab4_backprop` contains tests and usage examples of the `SoftMax` and `NormalizedSoftMax` nodes.
- `lab5_gradient` contains tests and usage examples of the `ReLU` node and the `Linear` neuron.
- `lab6_neural_network` contains tests and usage examples of the `GradientDescent` optimizer and the `SoftMaxLoss` node, and illustrates the performance of the network on the MNIST dataset.
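For reference, plain gradient descent applies the update `p <- p - lr * dL/dp`; a tiny illustrative sketch (not the module's `GradientDescent` class, which lives in `optimizers.py`):

```python
import numpy as np

def gradient_descent_step(params, grads, lr=0.1):
    """One plain gradient-descent update: p <- p - lr * dL/dp."""
    return [p - lr * g for p, g in zip(params, grads)]

# Toy usage: minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
w = np.array([0.0])
for _ in range(100):
    (w,) = gradient_descent_step([w], [2 * (w - 3)])
print(w)  # converges to ~3.0
```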
- Interfaces that minimize the amount of code needed to create a new node (a sketch of the idea follows this list)
- All nodes support vector and matrix inputs, with behaviour defined node-wise
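A hypothetical sketch of such an interface (the actual abstract classes are defined in `interfaces.py`; the class and method names below are assumptions for illustration):

```python
from abc import ABC, abstractmethod
import numpy as np

class NodeInterface(ABC):
    """Hypothetical minimal node interface: a new node only has to
    define its forward pass and its local gradient."""

    @abstractmethod
    def forward(self, x: np.ndarray) -> np.ndarray: ...

    @abstractmethod
    def backward(self, grad_output: np.ndarray) -> np.ndarray: ...

class ReLUNode(NodeInterface):
    def forward(self, x):
        self.mask = x > 0                    # remember which inputs were positive
        return np.where(self.mask, x, 0.0)

    def backward(self, grad_output):
        return grad_output * self.mask       # gradient flows only where x > 0
```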
Planned improvements:
- Reconsider the `Neuron` abstract class; for now it requires writing too much new code
- Implement an analogue of `nn.Sequential` (a rough sketch follows this list)
- Make initialization of nodes batch-free
- Make inner dimensions more transparent and convenient to use; for now they often require explicit treatment, and sometimes even workarounds are needed for things to work properly
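A rough illustration of what the planned `nn.Sequential` analogue might look like (hypothetical; not part of the module yet):

```python
class Sequential:
    """Hypothetical container chaining nodes, in the spirit of torch.nn.Sequential."""

    def __init__(self, *layers):
        self.layers = layers

    def forward(self, x):
        for layer in self.layers:
            x = layer.forward(x)
        return x

    def backward(self, grad):
        # Propagate the gradient through the layers in reverse order.
        for layer in reversed(self.layers):
            grad = layer.backward(grad)
        return grad
```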