This repository implements SiCoVa (Self-Supervised Learning with Variance-Invariance-Covariance and Cross-Correlation Regularization), extending the existing VICReg framework with an additional cross-correlation regularization term. SiCoVa supports self-supervised pretraining, linear evaluation, and fine-tuning on downstream classification tasks. The implementation uses PyTorch and Torchvision, featuring a ResNet50 backbone, MLP projector, and linear classifier.
- SiCoVa Implementation:
  - ResNet50 encoder with an MLP expander for self-supervised learning.
  - Additional cross-correlation regularization term for improved representation learning (see the loss sketch after this list).
- Linear Evaluation:
  - Freezes the pretrained encoder and trains a linear classification layer.
- Fine-Tuning:
  - Option to unfreeze parts of the encoder for end-to-end optimization.
- Training and Validation:
  - Configurable training loop with support for checkpointing and detailed metrics.
- Top-k Accuracy:
  - Reports top-k classification accuracy during evaluation.
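The exact objective and its coefficients live in `pretraining.py`; the sketch below is only an illustration of the general idea behind SiCoVa's loss, combining VICReg's invariance, variance, and covariance terms with a Barlow Twins-style cross-correlation penalty. The coefficient names and default values here are assumptions, not the repository's settings.

```python
import torch
import torch.nn.functional as F


def off_diagonal(m: torch.Tensor) -> torch.Tensor:
    # Flattened view of the off-diagonal elements of a square matrix.
    n, _ = m.shape
    return m.flatten()[:-1].view(n - 1, n + 1)[:, 1:].flatten()


def sicova_loss(z1, z2, sim_coeff=25.0, std_coeff=25.0, cov_coeff=1.0, cc_coeff=1.0):
    n, d = z1.shape

    # Invariance: mean-squared error between the two views' embeddings.
    inv_loss = F.mse_loss(z1, z2)

    # Variance: hinge loss keeping each embedding dimension's std above 1.
    std_z1 = torch.sqrt(z1.var(dim=0) + 1e-4)
    std_z2 = torch.sqrt(z2.var(dim=0) + 1e-4)
    var_loss = torch.mean(F.relu(1.0 - std_z1)) + torch.mean(F.relu(1.0 - std_z2))

    # Covariance: push off-diagonal entries of each view's covariance matrix to zero.
    z1c = z1 - z1.mean(dim=0)
    z2c = z2 - z2.mean(dim=0)
    cov_z1 = (z1c.T @ z1c) / (n - 1)
    cov_z2 = (z2c.T @ z2c) / (n - 1)
    cov_loss = off_diagonal(cov_z1).pow(2).sum() / d + off_diagonal(cov_z2).pow(2).sum() / d

    # Cross-correlation: penalize the normalized cross-correlation matrix between
    # the two views for deviating from the identity (Barlow Twins-style).
    z1n = z1c / (z1.std(dim=0) + 1e-4)
    z2n = z2c / (z2.std(dim=0) + 1e-4)
    cc = (z1n.T @ z2n) / n
    cc_loss = (torch.diagonal(cc) - 1).pow(2).sum() + off_diagonal(cc).pow(2).sum()

    return sim_coeff * inv_loss + std_coeff * var_loss + cov_coeff * cov_loss + cc_coeff * cc_loss
```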
- `pretraining/`: Dedicated to the pretraining stage using the SiCoVa method.
  - `pretraining.py`: The main script for the pretraining process, implementing SiCoVa with ResNet-50 (see the model sketch after this list).
- `linear_evaluation/`: Contains scripts and resources for evaluating the pretrained model via linear evaluation.
  - `linear_eval.py`: Performs linear evaluation on the pretrained model to assess its learned representations.
- `fine_tuning/`: Includes resources for fine-tuning the pretrained model on a downstream task.
  - `fine_tune.py`: Fine-tunes the pretrained model on a specific dataset.
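For orientation, here is a minimal sketch of the kind of model the pretraining stage builds: a torchvision ResNet-50 with its classification head removed, followed by an MLP expander. The projector width and depth below follow common VICReg-style defaults and are assumptions; check `pretraining.py` for the actual architecture.

```python
import torch.nn as nn
from torchvision.models import resnet50


def build_sicova_model(proj_dim: int = 8192) -> nn.Module:
    # ResNet-50 backbone with the classification head removed; outputs 2048-d features.
    backbone = resnet50()
    backbone.fc = nn.Identity()

    # MLP expander mapping the 2048-d features to the embedding space
    # where the SiCoVa loss is applied.
    projector = nn.Sequential(
        nn.Linear(2048, proj_dim),
        nn.BatchNorm1d(proj_dim),
        nn.ReLU(inplace=True),
        nn.Linear(proj_dim, proj_dim),
        nn.BatchNorm1d(proj_dim),
        nn.ReLU(inplace=True),
        nn.Linear(proj_dim, proj_dim),
    )
    return nn.Sequential(backbone, projector)
```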
- Navigate to the `pretraining/` directory.
- Run the `pretraining.py` script:
  ```bash
  python pretraining.py
  ```
- Checkpoints will be saved in the same directory for further use.
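How the downstream stages consume a saved checkpoint depends on how `pretraining.py` writes it; the snippet below is a hypothetical example of restoring encoder weights from a saved `state_dict`, with the file name and key layout as assumptions to adjust.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet50

# Hypothetical checkpoint path and key layout; adjust to match what
# pretraining.py actually saves.
encoder = resnet50()
encoder.fc = nn.Identity()  # drop the classification head, keep the 2048-d features

checkpoint = torch.load("pretraining/checkpoint.pth", map_location="cpu")
state_dict = checkpoint.get("model_state_dict", checkpoint)  # unwrap if stored inside a dict
missing, unexpected = encoder.load_state_dict(state_dict, strict=False)
print(f"Missing keys: {len(missing)}, unexpected keys: {len(unexpected)}")
```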
- Navigate to the `linear_evaluation/` directory.
- Run the `linear_eval.py` script:
  ```bash
  python linear_eval.py
  ```
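As a reference for what linear evaluation does, here is a minimal sketch assuming the frozen encoder outputs 2048-d ResNet-50 features; the optimizer settings and the `topk_accuracy` helper are illustrative, not taken from `linear_eval.py`.

```python
import torch
import torch.nn as nn


def linear_evaluation(encoder: nn.Module, train_loader, num_classes: int,
                      epochs: int = 10, device: str = "cuda") -> nn.Module:
    encoder.to(device).eval()
    for p in encoder.parameters():
        p.requires_grad = False  # freeze the pretrained encoder

    classifier = nn.Linear(2048, num_classes).to(device)
    optimizer = torch.optim.SGD(classifier.parameters(), lr=0.1, momentum=0.9)
    criterion = nn.CrossEntropyLoss()

    for _ in range(epochs):
        for images, labels in train_loader:
            images, labels = images.to(device), labels.to(device)
            with torch.no_grad():
                features = encoder(images)  # frozen feature extraction
            logits = classifier(features)
            loss = criterion(logits, labels)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return classifier


def topk_accuracy(logits: torch.Tensor, targets: torch.Tensor, k: int = 5) -> float:
    # Fraction of samples whose true label appears among the k highest-scoring classes.
    topk = logits.topk(k, dim=1).indices
    return (topk == targets.unsqueeze(1)).any(dim=1).float().mean().item()
```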
- Navigate to the `fine_tuning/` directory.
- Run the `fine_tune.py` script:
  ```bash
  python fine_tune.py
  ```
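The partial-unfreezing option can be pictured with the sketch below, which assumes a torchvision ResNet-50 encoder: only the last residual stage (`layer4`) and a fresh classification head are trained, with a smaller learning rate on the pretrained layers. The hyperparameters are illustrative, not the repository's defaults.

```python
import torch
import torch.nn as nn


def configure_fine_tuning(encoder: nn.Module, num_classes: int):
    for p in encoder.parameters():
        p.requires_grad = False        # freeze the whole backbone first
    for p in encoder.layer4.parameters():
        p.requires_grad = True         # unfreeze only the last residual stage

    head = nn.Linear(2048, num_classes)

    optimizer = torch.optim.SGD(
        [
            {"params": encoder.layer4.parameters(), "lr": 1e-3},  # gentle updates for pretrained layers
            {"params": head.parameters(), "lr": 1e-2},            # larger steps for the new head
        ],
        momentum=0.9,
        weight_decay=1e-4,
    )
    return head, optimizer
```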
- Python >= 3.8
- PyTorch >= 1.10
- Install additional dependencies:
  ```bash
  pip install -r requirements.txt
  ```
This project is licensed under the MIT License. See the LICENSE file for details.
Happy training! 🚀