remove max_tokens from request in case of o3 and o1 models #911

Open
narengogi opened this issue Feb 2, 2025 · 0 comments
@narengogi
Collaborator

narengogi commented Feb 2, 2025

o3 and o1 models do not support max_tokens; they expect max_completion_tokens instead, which also counts the tokens used for reasoning. Though I don't think it is right to map max_tokens to max_completion_tokens when the user only sends max_tokens, if the user sends both max_tokens and max_completion_tokens we should at least remove max_tokens so that the request doesn't throw an error.

Since max_tokens is deprecated in favour of max_completion_tokens, we should just map it to max_completion_tokens.
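As a rough illustration of the mapping described above, here is a minimal TypeScript sketch. The function name `normalizeTokenLimits`, the `ChatRequest` shape, and the model-prefix check are all hypothetical, not the gateway's actual transform logic:

```typescript
// Hypothetical request shape covering only the fields relevant here.
interface ChatRequest {
  model: string;
  max_tokens?: number;
  max_completion_tokens?: number;
  [key: string]: unknown;
}

// Sketch: for reasoning models (o1/o3 families), map max_tokens to
// max_completion_tokens when the latter is absent, and always drop
// max_tokens so the OpenAI API does not reject the request.
function normalizeTokenLimits(req: ChatRequest): ChatRequest {
  const out: ChatRequest = { ...req };
  const isReasoningModel = /^o[13]/.test(out.model);
  if (isReasoningModel && out.max_tokens !== undefined) {
    if (out.max_completion_tokens === undefined) {
      out.max_completion_tokens = out.max_tokens;
    }
    delete out.max_tokens;
  }
  return out;
}
```

If the user sends both parameters, this keeps their explicit max_completion_tokens and silently drops max_tokens, which matches the "at least remove max_tokens" fallback suggested above.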
