DINO tricks, Imagenette benchmarks, API improvements
DINO tricks
DINOHead now supports freezing the last layer, which stabilizes model performance.
DINOHead now also supports normalizing the last layer.
This was implemented by @Atharva-Phatak. Thank you very much!
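The freezing trick follows the original DINO recipe: the last layer of the head receives no gradient updates during the first epochs of training. A minimal sketch of that logic in plain Python, with illustrative names (not Lightly's actual API):

```python
def cancel_last_layer_gradients(gradients, epoch, freeze_last_layer=1):
    """DINO trick: zero the last layer's gradients for the first
    `freeze_last_layer` epochs so the output layer stays fixed early on."""
    if epoch >= freeze_last_layer:
        return gradients
    frozen = dict(gradients)
    frozen["last_layer"] = [0.0] * len(gradients["last_layer"])
    return frozen


# During epoch 0 the last layer receives no updates:
grads = {"hidden": [0.1, -0.2], "last_layer": [0.3, 0.4]}
print(cancel_last_layer_gradients(grads, epoch=0))
```

Keeping the output layer fixed early on prevents the projection head from collapsing before the backbone has learned useful features.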
Imagenette benchmarks
We now include benchmarks of all models on Imagenette.
API Improvements
Better documentation of custom metadata
The CLI command to upload custom metadata is now included in the command line tool examples.
Better dataset upsizing
Adding more samples to an existing dataset in the Lightly Platform can no longer happen accidentally: you now have to specify append=True explicitly. Furthermore, bugs regarding appending new custom metadata have been fixed.
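The new behavior can be pictured as a guard on the upload path: a non-empty dataset only grows when append=True is passed. A toy sketch of that semantics (function name and signature are illustrative, not the actual client API):

```python
def add_samples(dataset, new_samples, append=False):
    """Refuse to grow a non-empty dataset unless append=True is explicit."""
    if dataset and not append:
        raise ValueError(
            "Dataset already contains samples; pass append=True to add more."
        )
    dataset.extend(new_samples)
    return dataset


existing = ["img_0.png"]
add_samples(existing, ["img_1.png"], append=True)
print(existing)
```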
Create ApiWorkflowClient with token from env
When creating an ApiWorkflowClient, you can now pass the token via the environment variable LIGHTLY_TOKEN instead of as an argument.
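For example (the client call is commented out and MY_TOKEN is a placeholder; only the environment-variable side is demonstrated):

```python
import os

# Set the token in the environment instead of passing it as an argument.
# In practice you would export LIGHTLY_TOKEN in your shell or CI config.
os.environ["LIGHTLY_TOKEN"] = "MY_TOKEN"  # placeholder value

# from lightly.api import ApiWorkflowClient
# client = ApiWorkflowClient(dataset_id="...")  # token read from LIGHTLY_TOKEN

print(os.environ.get("LIGHTLY_TOKEN"))
```

This keeps the token out of scripts and notebooks, which is especially useful in CI pipelines.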
Bugfixes in check_embeddings()
When checking embedding files, columns like masked and selected are now accounted for properly.
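Lightly embedding files are CSVs; a minimal example of a file containing such columns, parsed with the standard library (the column layout shown here is illustrative):

```python
import csv
import io

# Hypothetical embedding file with the optional `masked` and `selected` columns
# alongside the filename, embedding, and label columns.
raw = """filenames,embedding_0,embedding_1,labels,masked,selected
img_0.png,0.1,0.2,0,0,1
img_1.png,0.3,0.4,1,1,0
"""

rows = list(csv.DictReader(io.StringIO(raw)))
# Filter out masked samples, keeping only rows where masked == "0".
print([r["filenames"] for r in rows if r["masked"] == "0"])
```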
Models
- Bootstrap your own latent: A new approach to self-supervised Learning, 2020
- Barlow Twins: Self-Supervised Learning via Redundancy Reduction, 2021
- DINO: Emerging Properties in Self-Supervised Vision Transformers, 2021
- SimSiam: Exploring Simple Siamese Representation Learning, 2020
- MoCo: Momentum Contrast for Unsupervised Visual Representation Learning, 2019
- SimCLR: A Simple Framework for Contrastive Learning of Visual Representations, 2020
- NNCLR: Nearest-Neighbor Contrastive Learning of Visual Representations, 2021
- SwAV: Unsupervised Learning of Visual Features by Contrasting Cluster Assignments, 2020