Describe the bug
The AzureAIInferenceChatCompletionService class is currently marked as deprecated. However, for scenarios where we want to deploy open-source models (or other models that need custom handling), having a dedicated, fully supported Azure AI Inference chat completion service is crucial. The recommended alternative, Azure.AI.Inference.ChatCompletionsClient.AsChatClient().AsChatCompletionService(), does not work as expected in my environment: there is no AsChatClient overload that accepts a model/deployment parameter, so the service cannot be configured correctly (see the sketch below).
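For reference, here is a minimal repro sketch of the wiring I attempted per the deprecation guidance. The endpoint, key, and deployment name are placeholders, and the failing chain is left as a comment because it does not compile for me:

```csharp
using Azure;
using Azure.AI.Inference;

// Placeholder endpoint and key for an Azure AI Inference (model catalog) deployment.
var client = new ChatCompletionsClient(
    new Uri("https://<your-resource>.services.ai.azure.com/models"),
    new AzureKeyCredential("<api-key>"));

// Per the obsolete message I expected something like the chain below, but no
// AsChatClient overload that takes a model/deployment id is found in my environment,
// so this does not compile:
//
// IChatCompletionService service = client
//     .AsChatClient("<deployment-name>")   // no overload accepting a model id
//     .AsChatCompletionService();
```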
Expected behavior
It would be ideal if one of the following were addressed:
The AzureAIInferenceChatCompletionService should be re-enabled (or at least not marked obsolete) so that it can be used directly.
Alternatively, the recommended pathway using Azure.AI.Inference.ChatCompletionsClient should be updated (or documented) so that a model/deployment parameter can be specified and the client can be properly wrapped into an IChatCompletionService without the extension-method issues. Either way, the end state I am after is sketched below.
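A minimal sketch of that end state, assuming the pre-deprecation AddAzureAIInferenceChatCompletion builder extension (and its modelId/apiKey/endpoint parameters) remains available; endpoint, key, and deployment name are placeholders:

```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

// Register the dedicated Azure AI Inference connector with an explicit
// model/deployment id (placeholders below), then resolve and call it.
var kernel = Kernel.CreateBuilder()
    .AddAzureAIInferenceChatCompletion(
        modelId: "<deployment-name>",
        apiKey: "<api-key>",
        endpoint: new Uri("https://<your-resource>.services.ai.azure.com/models"))
    .Build();

var chat = kernel.GetRequiredService<IChatCompletionService>();
var reply = await chat.GetChatMessageContentAsync("Hello from Azure AI Inference!");
Console.WriteLine(reply);
```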
Screenshots
If applicable, add screenshots to help explain your problem.
github-actions bot changed the title from "Bug: Re-enable AzureAIInferenceChatCompletionService" to ".Net: Bug: Re-enable AzureAIInferenceChatCompletionService" on Feb 25, 2025.
Platform
https://github.com/microsoft/semantic-kernel/blob/6d0c6148cb0bbdb81f2e7f2430e69d325652cdb6/dotnet/src/Connectors/Connectors.AzureAIInference/Services/AzureAIInferenceChatCompletionService.cs