I've stared at these lines in your excellent tutorial for a while now:
enc_padding_mask = tf.keras.layers.Lambda(
    create_padding_mask, output_shape=(1, 1, None),
    name='enc_padding_mask')(inputs)

# mask the future tokens for decoder inputs at the 1st attention block
look_ahead_mask = tf.keras.layers.Lambda(
    create_look_ahead_mask,
    output_shape=(1, None, None),
    name='look_ahead_mask')(dec_inputs)

# mask the encoder outputs for the 2nd attention block
dec_padding_mask = tf.keras.layers.Lambda(
    create_padding_mask, output_shape=(1, 1, None),
    name='dec_padding_mask')(inputs)
enc_padding_mask and dec_padding_mask will always be equal. Is this intentional? It seems weird to create two different padding masks that are the same.
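To make the question concrete, here is a minimal sketch, assuming create_padding_mask follows the usual TensorFlow Transformer tutorial definition (1 where the token id equals the pad id 0, reshaped for broadcasting over the attention logits). Since both Lambda layers above wrap the same function and are called on the same inputs tensor, the two masks come out identical:

import tensorflow as tf

# assumed definition: 1 at padding positions (token id 0), 0 elsewhere,
# shaped (batch_size, 1, 1, seq_len) so it broadcasts over attention heads
def create_padding_mask(x):
    mask = tf.cast(tf.math.equal(x, 0), tf.float32)
    return mask[:, tf.newaxis, tf.newaxis, :]

# hypothetical example batch: two real tokens followed by padding
inputs = tf.constant([[7, 5, 3, 0, 0]])
enc_padding_mask = create_padding_mask(inputs)
dec_padding_mask = create_padding_mask(inputs)
print(tf.reduce_all(enc_padding_mask == dec_padding_mask).numpy())  # True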