After the recent change we back off on openai.APIError. This also means backing off on openai.BadRequestError. I don't know all the cases where we can get a BadRequestError, but one of them is an exceeded context length:

```
[2023-11-14 11:09:32,241] [_common.py:105] Backing off openai_completion_create_retrying(...) for 17.0s (openai.BadRequestError: Error code: 400 - {'error': {'message': "This model's maximum context length is 8001 tokens, however you requested 8184 tokens (7672 in your prompt; 512 for the completion). Please reduce your prompt; or completion length.", 'type': 'invalid_request_error', 'param': None, 'code': None}})
```
And in this case we should definitely avoid repeating the request.
(I found this on #1407, but I don't think this is related to this particular PR)
NOTE: the logic for backing off on openai.APIError lives in a few different places; I think all of them need the same fix.
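A minimal sketch of the fix being suggested, using stand-in exception classes (`APIError`, `BadRequestError`) that mirror the openai library's hierarchy, where BadRequestError subclasses APIError; the `with_backoff` helper is hypothetical, not the actual retry code in this repo:

```python
import time

# Stub hierarchy mirroring openai's: BadRequestError (HTTP 400) subclasses
# APIError, so a handler that retries on APIError also retries 400s unless
# they are excluded explicitly.
class APIError(Exception): ...
class BadRequestError(APIError): ...

def with_backoff(fn, max_tries=3, base_delay=0.01):
    """Retry fn on transient APIError with exponential backoff, but re-raise
    BadRequestError immediately: a 400 such as an exceeded context length
    will fail identically on every retry."""
    for attempt in range(max_tries):
        try:
            return fn()
        except BadRequestError:
            raise  # non-retryable: the request itself is invalid
        except APIError:
            if attempt == max_tries - 1:
                raise  # out of retries
            time.sleep(base_delay * 2 ** attempt)
```

With the `backoff` library (which the log line above comes from), the same effect can be had by passing a `giveup` predicate like `lambda e: isinstance(e, BadRequestError)` to `backoff.on_exception`.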