A GitHub repository implementing the paper "The Lottery Ticket Hypothesis" by Jonathan Frankle and Michael Carbin.
> **The lottery ticket hypothesis:** dense, randomly-initialized, feed-forward and/or convolutional networks contain subnetworks ("winning tickets") that, when trained in isolation, reach test accuracy comparable to the original network in a similar number of iterations. The winning tickets we find have won the initialization lottery: their connections have initial weights that make training particularly effective.
The paper can be downloaded from arXiv: [The Lottery Ticket Hypothesis](https://arxiv.org/abs/1803.03635)
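The "train, prune, rewind to the original initialization, retrain" loop that the paper uses to find winning tickets can be sketched in a few lines of numpy. This is a minimal, self-contained sketch (the `train_fn` training loop is a stand-in, not the repository's actual code):

```python
import numpy as np

def prune_by_magnitude(weights, mask, prune_frac):
    """Zero out the prune_frac fraction of surviving weights with the
    smallest absolute value (unstructured magnitude pruning)."""
    surviving = np.abs(weights[mask.astype(bool)])
    k = int(prune_frac * surviving.size)
    if k == 0:
        return mask
    threshold = np.sort(surviving)[k - 1]
    return mask * (np.abs(weights) > threshold)

def find_winning_ticket(init_weights, train_fn, rounds=3, prune_frac=0.2):
    """Iterative magnitude pruning: train, prune, rewind to init_weights."""
    mask = np.ones_like(init_weights)
    weights = init_weights.copy()
    for _ in range(rounds):
        trained = train_fn(weights * mask)   # train the masked network
        mask = prune_by_magnitude(trained, mask, prune_frac)
        weights = init_weights.copy()        # rewind to the original init
    return weights * mask, mask              # the candidate winning ticket
```

With `prune_frac=0.2`, each round removes 20% of the surviving weights, so after three rounds roughly 0.8³ ≈ 51% of the connections remain.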
Also includes an implementation of the paper "Comparing Rewinding and Fine-tuning in Neural Network Pruning" by Alex Renda et al.
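The three retraining strategies that Renda et al. compare differ only in which weights and which learning-rate schedule are used after pruning. The sketch below illustrates the distinction; `train`, the argument names, and the dispatch function are hypothetical stand-ins, not the repository's API:

```python
import numpy as np

def retrain_after_pruning(final_w, early_w, mask, lr_schedule, strategy, train):
    """Dispatch among the retraining strategies compared by Renda et al.

    final_w     -- weights at the end of the original training run
    early_w     -- weights saved at an early iteration (the rewind point)
    lr_schedule -- the original learning-rate schedule, as a list of LRs
    train       -- a (hypothetical) training loop: train(weights, lrs)
    """
    if strategy == "fine-tune":
        # keep the trained weights, continue at the final (small) LR
        return train(final_w * mask, [lr_schedule[-1]] * len(lr_schedule))
    if strategy == "weight-rewind":
        # reset surviving weights to the rewind point, replay the schedule
        return train(early_w * mask, list(lr_schedule))
    if strategy == "lr-rewind":
        # keep the trained weights, but replay the early LR schedule
        return train(final_w * mask, list(lr_schedule))
    raise ValueError(f"unknown strategy: {strategy}")
```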
- Winning-ticket identification on the MNIST dataset using a 300-100-10 dense, fully connected network.
- Winning-ticket identification on the MNIST dataset using the LeNet-5 convolutional neural network.
- Validation of the winning tickets identified for the MNIST and CIFAR-10 datasets using the corresponding networks.
- Conv-2/4/6 convolutional neural networks (CNNs) on the CIFAR-10 dataset; the network is pruned until 0.5% of the original connections remain, while training and test accuracies and losses are tracked.
- Pruning algorithm implementation: numpy-based unstructured, layer-wise, absolute-magnitude pruning, plus pruning via the tensorflow_model_optimization toolkit (not the focus of most of the code).
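The numpy-based pruning scheme listed above can be sketched as follows: for each layer independently, zero out the fraction of weights with the smallest absolute values. This is a minimal sketch under that description, not the repository's exact code:

```python
import numpy as np

def layerwise_magnitude_masks(layers, sparsity):
    """Unstructured, layer-wise, absolute-magnitude pruning.

    layers   -- dict mapping layer name to its weight array
    sparsity -- fraction of weights to prune in each layer
    Returns a dict of 0/1 masks, one per layer.
    """
    masks = {}
    for name, w in layers.items():
        k = int(sparsity * w.size)
        if k == 0:
            masks[name] = np.ones_like(w)
            continue
        # k-th smallest absolute value in this layer is the cutoff;
        # ties at the threshold are pruned as well
        threshold = np.partition(np.abs(w).ravel(), k - 1)[k - 1]
        masks[name] = (np.abs(w) > threshold).astype(w.dtype)
    return masks
```

Because the threshold is computed per layer, every layer ends up at roughly the same sparsity, in contrast to global pruning where small layers can be wiped out entirely.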
- Python 3.x
- numpy 1.17 or above
- TensorFlow 2.0
- PyTorch 2.x
- tensorflow_model_optimization (optional; not the focus)