The prompt is a full sentence, and we don't need to predict the next token within the prompt, so why does the text encoder use a causal mask that prevents each token from seeing the tokens to its right?
x = self.attention(x, causal_mask=True)
I had the same query. This is the answer I found in the CLIP paper by OpenAI:
"Masked self-attention was used in the text encoder to preserve the ability to initialize with a pre-trained language model or add language modeling as an auxiliary objective, though exploration of this is left as future work."
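So the causal mask isn't strictly needed for CLIP's contrastive objective; it keeps the text encoder compatible with autoregressive language modeling. For reference, here is a minimal sketch of what a causal mask does in single-head self-attention. This is a simplified illustration, not CLIP's actual implementation: there are no learned query/key/value projections, and `causal_self_attention` is a hypothetical helper name.

```python
import numpy as np

def causal_self_attention(x):
    """Single-head self-attention with a causal mask (illustrative sketch).

    Queries, keys, and values are just x itself, with no learned
    projections, to keep the example minimal.
    """
    seq_len, dim = x.shape
    scores = (x @ x.T) / np.sqrt(dim)  # (seq_len, seq_len) attention logits
    # Causal mask: token i may only attend to tokens j <= i,
    # so positions above the diagonal are set to -inf before softmax.
    mask = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
    scores[mask] = -np.inf
    # Row-wise softmax; exp(-inf) = 0, so masked positions get zero weight.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x

x = np.random.randn(4, 8)
out = causal_self_attention(x)
# The first token can only attend to itself, so its output equals its input.
```

With the mask in place, the representation at each position depends only on earlier tokens, which is exactly the property an autoregressive language-modeling head would need.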