
Bug: The latest library of SemanticKernel does not support Azure o1 series models #9165

Closed
2023pfm opened this issue Oct 9, 2024 · 3 comments


2023pfm commented Oct 9, 2024

The latest library of SemanticKernel does not support Azure o1 series models.
Reason for the error: Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.
Additionally, calling tools may also result in an error stating that tool calling is not supported.
What should I do now? Thank you.
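
For context, a minimal repro sketch of how this error can arise (the deployment name, endpoint, and key below are placeholders, not taken from the report); on older Semantic Kernel connector releases the MaxTokens setting is serialized as 'max_tokens', which the o1 models reject:

```csharp
// Illustrative repro sketch; endpoint, key, and deployment name are placeholders.
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI;

var kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion(
        deploymentName: "o1-preview",                       // assumed Azure o1 deployment
        endpoint: "https://my-resource.openai.azure.com/",  // placeholder endpoint
        apiKey: Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY")!)
    .Build();

// On older connector releases this limit is sent as 'max_tokens',
// which the o1 models reject with the error quoted above.
var settings = new OpenAIPromptExecutionSettings { MaxTokens = 1024 };

var result = await kernel.InvokePromptAsync(
    "Say hello.",
    new KernelArguments(settings));

Console.WriteLine(result);
```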

RogerBarreto (Member) commented:

@2023pfm, can you provide more context for this issue?

Which version were you using, and can you share some code to reproduce the issue?

Currently we are using the new property from the OpenAI SDK, which already supports the new max_completion_tokens.

The OpenAI SDK uses the new name behind the scenes, as you can see here:
https://github.com/openai/openai-dotnet/blob/c49dd7065215bc0d094c7f79ccd634a38f0d7b66/src/Custom/Chat/ChatCompletionOptions.cs#L156
There you can also see their comments on the deprecated max_tokens.
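
For reference, a minimal sketch of that SDK surface (assuming the OpenAI .NET SDK 2.x API; the model name and key are placeholders), where MaxOutputTokenCount is the property that is serialized as 'max_completion_tokens':

```csharp
// Minimal sketch against the OpenAI .NET SDK; model name and key are placeholders.
using OpenAI.Chat;

ChatClient client = new("o1-preview",
    Environment.GetEnvironmentVariable("OPENAI_API_KEY")!);

ChatCompletionOptions options = new()
{
    // Serialized on the wire as 'max_completion_tokens'; the old MaxTokens is deprecated.
    MaxOutputTokenCount = 1024
};

ChatCompletion completion = client.CompleteChat(
    new ChatMessage[] { new UserChatMessage("Say hello.") },
    options);

Console.WriteLine(completion.Content[0].Text);
```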

So everything points to you using an older release of our packages; that problem should not happen when using SK 1.22 and later.

Let me know if you still have problems with the latest 1.22, along with more details about your error.


2023pfm commented Oct 10, 2024


How can MaxOutputTokenCount be implemented in the AzureOpenAIPromptExecutionSettings class?
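
A hedged sketch based on the maintainer's comment above: on SK 1.22+ there should be no need to add a MaxOutputTokenCount setting yourself, because the existing MaxTokens value is handed to the OpenAI SDK, which emits 'max_completion_tokens'. Deployment name, endpoint, and key below are placeholders, and the assumption that MaxTokens is the setting that gets translated comes from that comment, not from this user's report:

```csharp
// Hedged sketch, assuming SK 1.22+ with the Microsoft.SemanticKernel.Connectors.AzureOpenAI package.
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.AzureOpenAI;

var kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion(
        deploymentName: "o1-preview",
        endpoint: "https://my-resource.openai.azure.com/",
        apiKey: Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY")!)
    .Build();

// MaxTokens is the setting the connector is expected to translate to the SDK's
// MaxOutputTokenCount (per the maintainer comment above).
var settings = new AzureOpenAIPromptExecutionSettings { MaxTokens = 1024 };

var answer = await kernel.InvokePromptAsync("Say hello.", new KernelArguments(settings));
Console.WriteLine(answer);
```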


fwaris commented Nov 21, 2024

Running into the same issue on Azure.

MaxTokens is not accepted for o1 models.

o1 supports a new parameter instead, called 'max_completion_tokens'.
