From 17510809dcc41b1011e3036e4ed67587663ad250 Mon Sep 17 00:00:00 2001
From: Ross Wightman
Date: Thu, 11 May 2023 15:28:42 -0700
Subject: [PATCH] Update README.md

---
 README.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/README.md b/README.md
index 9116bafbaf..78a03bbf9a 100644
--- a/README.md
+++ b/README.md
@@ -31,6 +31,7 @@ And a big thanks to all GitHub sponsors who helped with some of my costs before
 * The pretrained_tag is the specific weight variant (different head) for the architecture.
 * Using just using `architecture` uses the 'default' pretrained tag (first instance in default_cfgs for that arch).
 * In adding pretrained tags, many model names that existed to differentiate were renamed to use the tag (ex: `vit_base_patch16_224_in21k` -> `vit_base_patch16_224.augreg_in21k`). There are deprecation mappings for these.
+* A number of models had their checkpoints remapped to match architecture changes needed to better support `features_only=True`. There are `checkpoint_filter_fn` methods in any model module that was remapped; these can be passed to `timm.models.load_checkpoint(..., filter_fn=timm.models.swin_transformer_v2.checkpoint_filter_fn)` to remap your existing checkpoint.
 * The Hugging Face Hub (https://huggingface.co/timm) is now the primary source for `timm` weights. Model cards include link to papers, original source, license.
 * Previous 0.6.x can be cloned from [0.6.x](https://github.com/rwightman/pytorch-image-models/tree/0.6.x) branch or installed via pip with version.
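The added line describes per-module `checkpoint_filter_fn` functions that rewrite old state-dict keys to the new layout. A minimal sketch of what such a filter does, with a hypothetical key rename (the real mappings live in each remapped model module, e.g. `timm.models.swin_transformer_v2`):

```python
# Sketch of a checkpoint filter function: it takes an old state dict and
# returns one with keys renamed to match the new module layout.
# The 'attn_block.' -> 'blocks.0.attn.' rename below is hypothetical,
# purely to illustrate the shape of the remapping.
def checkpoint_filter_fn(state_dict, model=None):
    out = {}
    for k, v in state_dict.items():
        # Rename keys matching the old layout; pass everything else through.
        k = k.replace('attn_block.', 'blocks.0.attn.')
        out[k] = v
    return out

old_ckpt = {'attn_block.qkv.weight': [0.1], 'head.weight': [0.2]}
new_ckpt = checkpoint_filter_fn(old_ckpt)
```

In `timm` itself you would not call the filter by hand; as the line above notes, it is passed via `timm.models.load_checkpoint(..., filter_fn=...)` so the rename happens before the state dict is loaded into the model.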