Hi, I'm trying to migrate this sample to TL2.
However, I found in train.py:116 that the optimizer is applied only to the Decoder's variables, and there is code that restores weights from a .npz file into the Encoders before training (those weights are never updated during training).
To migrate to TL2 without pretrained Encoder weights at hand, I would need to write a model composed of 3 Encoders and 1 Decoder, then configure a tf.keras.optimizers.Adam optimizer to apply gradients to all trainable variables of the model. Am I right?
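For what it's worth, a minimal sketch of that idea in plain tf.keras (the layer shapes, names like `Composite`/`make_encoder`, and the loss are all placeholders — the real architectures come from the sample being migrated):

```python
import tensorflow as tf

# Hypothetical stand-ins for the three Encoders and the Decoder;
# replace with the actual architectures from the sample.
def make_encoder(name):
    return tf.keras.Sequential(
        [tf.keras.layers.Dense(32, activation="relu")], name=name)

class Composite(tf.keras.Model):
    def __init__(self):
        super().__init__()
        self.encoders = [make_encoder(f"encoder_{i}") for i in range(3)]
        self.decoder = tf.keras.Sequential(
            [tf.keras.layers.Dense(1)], name="decoder")

    def call(self, inputs):
        # Concatenate the three encoder outputs and decode.
        feats = tf.concat([enc(inputs) for enc in self.encoders], axis=-1)
        return self.decoder(feats)

model = Composite()
optimizer = tf.keras.optimizers.Adam(learning_rate=1e-3)

x = tf.random.normal([8, 16])   # dummy batch for illustration
y = tf.random.normal([8, 1])

with tf.GradientTape() as tape:
    loss = tf.reduce_mean(tf.square(model(x) - y))

# Gradients flow to ALL trainable variables -- encoders included --
# unlike the original train.py:116, which optimized only the Decoder.
grads = tape.gradient(loss, model.trainable_variables)
optimizer.apply_gradients(zip(grads, model.trainable_variables))
```

The key point is passing `model.trainable_variables` (rather than only the decoder's variables) to `tape.gradient` and `apply_gradients`, so the encoders are trained from scratch instead of being frozen at restored weights.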