Replies: 1 comment
-
Once I upgraded to version 1.25.0 of Semantic Kernel, it worked.
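For reference, the fix was just a NuGet package bump, e.g. `dotnet add package Microsoft.SemanticKernel --version 1.25.0`, plus the matching connector package; the connector packages are normally versioned in lockstep with the core package, so keep them on the same version.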
-
I am getting an HTTP 400 error, "response_format value as json_schema is enabled only for api versions 2024-08-01-preview and later", even though I am using the gpt-4o model, version 2024-08-06 (full error message below).
Any ideas on what I am doing wrong here?
I am following the steps in this post, but using Azure OpenAI instead:
https://devblogs.microsoft.com/semantic-kernel/using-json-schema-for-structured-output-in-net-for-openai-models/#structured-outputs-with-system.type
Here is my source code (I am using version 1.21.1 of the SemanticKernel and SemanticKernel.Connectors.OpenAI NuGet packages):
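In outline it is the System.Type pattern from that post, just pointed at an Azure OpenAI deployment. A trimmed-down sketch of the setup (the deployment name, endpoint, key, prompt, and the MovieResult/Movie types are placeholders, not my real values):

```csharp
using System;
using System.Collections.Generic;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// Build a kernel backed by an Azure OpenAI chat deployment
// (placeholder deployment name, endpoint, and key).
var builder = Kernel.CreateBuilder();
builder.AddAzureOpenAIChatCompletion(
    deploymentName: "gpt-4o",
    endpoint: "https://<my-resource>.openai.azure.com/",
    apiKey: "<api-key>");
var kernel = builder.Build();

// Request structured output by passing a System.Type as the response format,
// as described in the blog post.
var settings = new OpenAIPromptExecutionSettings
{
    ResponseFormat = typeof(MovieResult)
};

// The HTTP 400 is thrown here, when the prompt is invoked.
var result = await kernel.InvokePromptAsync(
    "What are the top 3 science fiction movies of all time?",
    new(settings));

Console.WriteLine(result);

// Placeholder result types describing the JSON schema I want back.
public sealed class MovieResult
{
    public List<Movie> Movies { get; set; } = [];
}

public sealed class Movie
{
    public string Title { get; set; } = string.Empty;
    public string Director { get; set; } = string.Empty;
    public int ReleaseYear { get; set; }
}
```

The exception surfaces when the prompt is invoked (Program.cs line 29 in the stack trace below).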
Here is the full error message:
Microsoft.SemanticKernel.HttpOperationException
HResult=0x80131500
Message=HTTP 400 (BadRequest)
response_format value as json_schema is enabled only for api versions 2024-08-01-preview and later
Source=Microsoft.SemanticKernel.Connectors.OpenAI
StackTrace:
   at Microsoft.SemanticKernel.Connectors.OpenAI.ClientCore.d__69`1.MoveNext()
   at Microsoft.SemanticKernel.Connectors.OpenAI.ClientCore.<GetChatMessageContentsAsync>d__15.MoveNext()
   at Microsoft.SemanticKernel.KernelFunctionFromPrompt.<GetChatCompletionResultAsync>d__20.MoveNext()
   at Microsoft.SemanticKernel.KernelFunctionFromPrompt.<InvokeCoreAsync>d__3.MoveNext()
   at System.Threading.Tasks.ValueTask`1.get_Result() in System.Threading.Tasks\ValueTask.cs:line 484
   at Microsoft.SemanticKernel.KernelFunction.<>c__DisplayClass21_0.<b__0>d.MoveNext()
   at Microsoft.SemanticKernel.Kernel.d__34.MoveNext()
   at Microsoft.SemanticKernel.Kernel.d__33.MoveNext()
   at Microsoft.SemanticKernel.KernelFunction.d__21.MoveNext()
   at Program.<<Main>$>d__0.MoveNext() in C:\Users\mark.jones\source\repos\SK_Sample_MJ\Program.cs:line 29
   at Program.<Main>(String[] args)
This exception was originally thrown at this call stack:
Azure.AI.OpenAI.ClientPipelineExtensions.ProcessMessageAsync(System.ClientModel.Primitives.ClientPipeline, System.ClientModel.Primitives.PipelineMessage, System.ClientModel.Primitives.RequestOptions)
System.Threading.Tasks.ValueTask.Result.get() in ValueTask.cs
Azure.AI.OpenAI.Chat.AzureChatClient.CompleteChatAsync(System.ClientModel.BinaryContent, System.ClientModel.Primitives.RequestOptions)
OpenAI.Chat.ChatClient.CompleteChatAsync(System.Collections.Generic.IEnumerable<OpenAI.Chat.ChatMessage>, OpenAI.Chat.ChatCompletionOptions, System.Threading.CancellationToken)
Microsoft.SemanticKernel.Connectors.OpenAI.ClientCore.RunRequestAsync(System.Func<System.Threading.Tasks.Task>)
Inner Exception 1:
ClientResultException: HTTP 400 (BadRequest)
response_format value as json_schema is enabled only for api versions 2024-08-01-preview and later