In one of the recent releases of the transformers library, the model_max_length warning for oversized requests was removed. Previously, I was splitting larger requests at model_max_length (512 tokens) before sending them to the model server.
I would love to split into more manageable sizes, like groups of 20-40 sentences, but wanted to make sure I can now safely ignore model_max_length (specifically for the opus-mt models).
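For context, here's a minimal sketch of the kind of sentence batching I mean, assuming a standard opus-mt checkpoint; the checkpoint name, batch size, and helper function are illustrative choices on my end, not anything the library prescribes:

```python
# Sketch: translate sentences in fixed-size batches with an opus-mt model.
# Assumptions: "Helsinki-NLP/opus-mt-en-de" as a stand-in checkpoint, and a
# batch_size of 20 (the low end of the 20-40 sentence grouping above).
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-en-de"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

def translate_in_batches(sentences, batch_size=20):
    """Translate a list of sentences in groups of batch_size."""
    outputs = []
    for i in range(0, len(sentences), batch_size):
        batch = sentences[i:i + batch_size]
        # truncation=True clips any sentence exceeding
        # tokenizer.model_max_length (512 tokens for opus-mt models),
        # which is the limit the removed warning used to flag.
        inputs = tokenizer(batch, return_tensors="pt",
                           padding=True, truncation=True)
        generated = model.generate(**inputs)
        outputs.extend(
            tokenizer.batch_decode(generated, skip_special_tokens=True))
    return outputs
```

The open question is whether the explicit pre-splitting on model_max_length is still necessary, or whether batching like this (with truncation as a backstop) is enough.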