I tried to compare swin_base_patch4_window7_224.pth with the synapse_pretrain.model mentioned on your website, but I could not find the correspondence between their weights.
For example,
model_down.layers.0.blocks.0.attn.proj.weight with shape torch.Size([192, 192]) exists in synapse_pretrain.model but does not appear at all in the Swin Transformer checkpoint I found.
Could you tell me where you got the pretrained Swin Transformer? A more detailed explanation would be highly appreciated. (My understanding is that you take the attn and fcn weights from the Swin Transformer and assign them to your pretrained model.)
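For reference, this is a minimal sketch of how I compared the two checkpoints. It assumes both files load as plain PyTorch state dicts via torch.load, and that the official Swin weights are nested under a "model" key; the file paths are placeholders for wherever the checkpoints are stored locally.

```python
import torch

# Load both checkpoints onto the CPU (paths are placeholders).
swin_ckpt = torch.load("swin_base_patch4_window7_224.pth", map_location="cpu")
synapse_ckpt = torch.load("synapse_pretrain.model", map_location="cpu")

# Official Swin checkpoints nest the weights under a "model" key;
# fall back to the raw dict if that key is absent.
swin_state = swin_ckpt.get("model", swin_ckpt)
synapse_state = synapse_ckpt.get("model", synapse_ckpt)

swin_keys = set(swin_state.keys())
synapse_keys = set(synapse_state.keys())

# Parameters present in synapse_pretrain.model but missing from the Swin checkpoint.
for key in sorted(synapse_keys - swin_keys):
    print(key, tuple(synapse_state[key].shape))
```

Running this prints keys such as model_down.layers.0.blocks.0.attn.proj.weight that I could not match to any key in the Swin checkpoint.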