Have you implemented a way to freeze training for certain layers of the network?
I have already trained the backbone and the geometric feature portion of the network, and I would like to fine-tune only the Patch-PnP module and the head.
If you have not implemented this, would you recommend editing the model itself, or doing it some other way?
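One common approach, in case it helps, is to disable gradients on the already-trained submodules and then build the optimizer over the remaining trainable parameters only. Below is a minimal PyTorch sketch; the model and its submodule names (`backbone`, `geo_head`, `pnp_head`) are placeholders, not the repository's actual attributes:

```python
import torch
import torch.nn as nn

class DummyPoseNet(nn.Module):
    """Stand-in for the real network; the real submodule names will differ."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
        self.geo_head = nn.Conv2d(16, 8, 1)   # geometric-feature branch (placeholder)
        self.pnp_head = nn.Linear(8, 9)       # Patch-PnP / pose head (placeholder)

def freeze(module: nn.Module) -> None:
    """Disable gradients and fix BatchNorm running statistics for a submodule."""
    for p in module.parameters():
        p.requires_grad = False
    module.eval()

model = DummyPoseNet()
freeze(model.backbone)
freeze(model.geo_head)

# Build the optimizer over trainable parameters only, so the frozen weights
# are neither updated nor tracked in the optimizer state.
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.Adam(trainable, lr=1e-4)
```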
However, this does not work if I first train to a checkpoint with all layers trainable, then freeze some layers and resume training: I think the optimizer finds a mismatch between its saved state and the new set of parameters when loading from the checkpoint.
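Continuing the sketch above, one way to avoid that mismatch is to restore only the model weights from the checkpoint and rebuild the optimizer from scratch over the now-trainable parameters, rather than loading the old optimizer state. The checkpoint path and key layout below are assumptions and may differ from what this repository actually saves:

```python
import torch

# Hypothetical checkpoint layout: {"model": state_dict, "optimizer": opt_state}.
ckpt = torch.load("checkpoint.pth", map_location="cpu")

# Restore the network weights; strict=False tolerates minor key differences.
missing, unexpected = model.load_state_dict(ckpt["model"], strict=False)

# After freezing layers, the saved optimizer state no longer matches the new
# parameter groups, so skip optimizer.load_state_dict(ckpt["optimizer"]) and
# create a fresh optimizer over the currently trainable parameters instead.
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.Adam(trainable, lr=1e-5)
```

The trade-off is that momentum and other optimizer statistics from the original run are discarded, which is usually acceptable when starting a fine-tuning stage at a lower learning rate.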