I have trained a base model on 4 female datasets using the following learning rate:
optD = torch.optim.Adam(netD.parameters(), lr=1e-4, betas=(0.5, 0.9))
Now I want to finetune the model on a new, small female dataset. Should I change the learning rate, maybe to lr=1e-5 without decay?
Has anyone run these experiments and reached any conclusions?
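
For reference, a minimal sketch of the finetuning setup being asked about, assuming netD is the pretrained discriminator reloaded from a checkpoint; the 1e-5 value (1/10 of the pretraining lr) and the optional gentle decay are common transfer-learning heuristics, not confirmed settings from this repo:

import torch
import torch.nn as nn

# Placeholder stand-in for the pretrained discriminator; in practice you
# would construct your actual netD and load the base checkpoint instead.
netD = nn.Linear(80, 1)
# netD.load_state_dict(torch.load("base_netD.pth"))  # hypothetical checkpoint path

# Finetune at ~1/10 of the pretraining lr (1e-4 -> 1e-5); illustrative only.
optD = torch.optim.Adam(netD.parameters(), lr=1e-5, betas=(0.5, 0.9))

# Optional: a very gentle exponential decay in case the small dataset
# starts to overfit; call scheduler.step() once per epoch.
scheduler = torch.optim.lr_scheduler.ExponentialLR(optD, gamma=0.999)

Keeping the lower lr fixed with no decay, as suggested above, is also a reasonable choice for short finetuning runs.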