I don't know why the accuracy is 0 even though I have set the path to the pre-trained weights:

parser.add_argument('--finetune', default='/mnt/d/My experimets on mamba/Vim-main/vim_t_midclstok_76p1acc.pth', help='finetune from checkpoint')
parser.add_argument('--resume', default='/mnt/d/My experimets on mamba/Vim-main/vim_t_midclstok_76p1acc.pth', help='resume from checkpoint')
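Before suspecting the weights themselves, it can help to load the checkpoint by hand and confirm its keys match the model. This is a minimal sketch; the nesting of the weights under a 'model' key is an assumption based on common DeiT-style checkpoints, and `model` at the end stands for the network built the same way main.py builds it.

import torch

ckpt_path = '/mnt/d/My experimets on mamba/Vim-main/vim_t_midclstok_76p1acc.pth'
checkpoint = torch.load(ckpt_path, map_location='cpu')

# DeiT-style checkpoints usually nest the weights under a 'model' key (assumption).
state_dict = checkpoint['model'] if isinstance(checkpoint, dict) and 'model' in checkpoint else checkpoint
print(len(state_dict), 'tensors in the checkpoint')
print(list(state_dict)[:5])  # spot-check a few parameter names

# After building the model as main.py does, a non-strict load reports mismatches;
# non-empty missing/unexpected lists usually explain a 0 accuracy.
# missing, unexpected = model.load_state_dict(state_dict, strict=False)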
I found that I was testing the code on the wrong dataset. After switching to the correct dataset (the ImageNet-1K validation set), everything works as expected now.
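For anyone hitting the same issue: a quick way to confirm the folder actually contains the ImageNet-1K validation split is the check below. It assumes the standard ImageFolder layout that DeiT-style loaders expect (data_path/val/<class>/*.JPEG), with the data path taken from the log.

import os
from torchvision.datasets import ImageFolder

val_dir = os.path.join('/mnt/d/My experimets on mamba/Vim-main/imagenet', 'val')
dataset = ImageFolder(val_dir)
print(len(dataset.classes), 'classes,', len(dataset), 'images')
# A proper ImageNet-1K validation split has 1000 class folders and 50000 images;
# anything else means the labels will not line up with the classifier head.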
Not using distributed mode
Namespace(gpu=0, batch_size=64, epochs=300, bce_loss=False, unscale_lr=False, model='vim_tiny_patch16_224_bimambav2_final_pool_mean_abs_pos_embed_with_midclstok_div2', input_size=224, drop=0.0, drop_path=0.1, model_ema=True, model_ema_decay=0.99996, model_ema_force_cpu=False, opt='adamw', opt_eps=1e-08, opt_betas=None, clip_grad=None, momentum=0.9, weight_decay=0.05, sched='cosine', lr=0.0005, lr_noise=None, lr_noise_pct=0.67, lr_noise_std=1.0, warmup_lr=1e-06, min_lr=1e-05, decay_epochs=30, warmup_epochs=5, cooldown_epochs=10, patience_epochs=10, decay_rate=0.1, color_jitter=0.3, aa='rand-m9-mstd0.5-inc1', smoothing=0.1, train_interpolation='bicubic', repeated_aug=True, train_mode=True, ThreeAugment=False, src=False, reprob=0.25, remode='pixel', recount=1, resplit=False, mixup=0.8, cutmix=1.0, cutmix_minmax=None, mixup_prob=1.0, mixup_switch_prob=0.5, mixup_mode='batch', teacher_model='regnety_160', teacher_path='', distillation_type='none', distillation_alpha=0.5, distillation_tau=1.0, cosub=False, finetune='/mnt/d/My experimets on mamba/Vim-main/vim_t_midclstok_76p1acc.pth', attn_only=False, data_path='/mnt/d/My experimets on mamba/Vim-main/imagenet', data_set='IMNET', inat_category='name', output_dir='', device='cuda', seed=0, resume='/mnt/d/My experimets on mamba/Vim-main/vim_t_midclstok_76p1acc.pth', start_epoch=0, eval='False', eval_crop_ratio=0.875, dist_eval=False, num_workers=1, pin_mem=True, distributed=False, world_size=1, dist_url='env://', if_amp=False, if_continue_inf=False, if_nan2num=False, if_random_cls_token_position=False, if_random_token_rank=False, local_rank=0)
Creating model: vim_tiny_patch16_224_bimambav2_final_pool_mean_abs_pos_embed_with_midclstok_div2
number of params: 7148008
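For reference, the "number of params" line in DeiT-style scripts is the usual sum over trainable tensors. A minimal sketch to reproduce it is below; importing the repo's models_mamba module so the Vim architectures are registered with timm is an assumption about how the code is laid out.

import timm
import models_mamba  # noqa: F401  (assumption: run from the repo directory so the Vim models register with timm)

model = timm.create_model('vim_tiny_patch16_224_bimambav2_final_pool_mean_abs_pos_embed_with_midclstok_div2')
n_parameters = sum(p.numel() for p in model.parameters() if p.requires_grad)
print('number of params:', n_parameters)  # should match the 7148008 printed in the log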