
Added support for AzureAI client service #188

Open · wants to merge 16 commits into main
Conversation

@adityasugandhi (Contributor) commented Aug 21, 2024

Added support for Azure AI client service #92

You can now authenticate either with an OPEN_AI_KEY or with the Azure Identity service, which lets you authenticate directly against your endpoints using credentials.
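For reference, a minimal sketch of the two authentication paths; the endpoint, API version, and environment-variable names below are placeholders, not taken from this PR:

```python
import os

from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import AzureOpenAI

# Option 1: authenticate with an API key from the environment.
key_client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_version="2024-02-01",
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
)

# Option 2: authenticate with Azure Identity credentials instead of a key.
token_provider = get_bearer_token_provider(
    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)
ad_client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_version="2024-02-01",
    azure_ad_token_provider=token_provider,
)
```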

@hmehta92

@adityasugandhi It seems that lines 183 to 200 are a duplicate of lines 163 to 180. Also, token on line 178 didn't work for me; I used azure_ad_token_provider instead.

fixed azure_ad_token provider in AzureOpenAI
@adityasugandhi (Contributor, Author) commented Aug 28, 2024

@hmehta92 I have fixed the azure_ad_token_provider call. Lines 163 to 180 serve AzureOpenAI, while lines 183 to 200 are for AsyncAzureOpenAI.
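A rough sketch of why the two blocks look alike but are both needed, assuming the sync and async clients share the same endpoint and token provider (values are placeholders):

```python
import os

from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import AsyncAzureOpenAI, AzureOpenAI

token_provider = get_bearer_token_provider(
    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)

# Synchronous client (roughly what lines 163 to 180 set up).
sync_client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_version="2024-02-01",
    azure_ad_token_provider=token_provider,
)

# Asynchronous client (roughly what lines 183 to 200 set up).
async_client = AsyncAzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_version="2024-02-01",
    azure_ad_token_provider=token_provider,
)
```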

@chainyo commented Sep 6, 2024

Hi, what's the status of this PR? Do you need help with anything, @adityasugandhi?

I would like to see this feature implemented ASAP, along with AWS Bedrock support; they could serve more companies, since teams usually choose the cloud provider where the rest of their stack is already hosted.

@adityasugandhi (Contributor, Author)

@chainyo I am still waiting for a review from the repo owner, @liyin2015.

Sure, I will start looking into AWS Bedrock.

@liyin2015 (Member) left a comment

(1) Please add azure as an optional package in lazy_import (see the sketch below)
(2) Please add a test file in adalflow/tests
(3) Add azure to pyproject.toml as an extra package
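A sketch of what item (1) might look like; the AZURE entry, its module name, and its install message are hypothetical and only mirror how the existing OPENAI entry pairs a module name with an install hint:

```python
# Sketch only: the AZURE entry below is illustrative, not the final API.
from enum import Enum


class OptionalPackages(Enum):
    # ... existing entries such as OPENAI ...
    AZURE = (
        "azure.identity",
        "Please install Azure support with: pip install adalflow[azure]",
    )
```

Item (3) would then add a matching azure extra (for example azure-identity) to pyproject.toml so that pip install adalflow[azure] pulls in the dependency.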

@adityasugandhi

chat_completion_parser: Callable[[Completion], Any] = None,
input_type: Literal["text", "messages"] = "text",
):
r"""It is recommended to set the OPENAI_API_KEY environment variable instead of passing it as an argument.
Member

So it doesn't take only api_key? Why use openai_api_key?
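For reference, a minimal sketch of an api_key-only signature with an environment-variable fallback; the class skeleton and the environment-variable name are illustrative, not the PR's final code:

```python
import os
from typing import Optional


class AzureAIClient:  # illustrative skeleton only
    def __init__(self, api_key: Optional[str] = None):
        # Accept a single api_key argument and fall back to the environment,
        # matching the docstring's advice to prefer the environment variable.
        self._api_key = api_key or os.getenv("AZURE_OPENAI_API_KEY")
```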

Contributor Author

@liyin2015 I have made the changes mentioned in the comments.

@adityasugandhi (Contributor, Author)

@liyin2015 I have made the changes.

log_probs = []
for c in completion.choices:
    content = c.logprobs.content
    print(content)

@adityasugandhi I think you forgot to remove the print statement here on line 90.
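A hedged sketch of what the loop might look like once the print is removed, assuming TokenLogProb from adalflow.core.types takes token and logprob fields:

```python
from adalflow.core.types import TokenLogProb


def collect_log_probs(completion):
    """Gather per-token log probabilities from a ChatCompletion, without debug prints."""
    log_probs = []
    for c in completion.choices:
        # c.logprobs.content is a list of per-token entries with .token and .logprob.
        log_probs.append(
            [TokenLogProb(token=t.token, logprob=t.logprob) for t in c.logprobs.content]
        )
    return log_probs
```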

@chainyo commented Oct 3, 2024

Hey @adityasugandhi, where are we at with this PR?

@@ -0,0 +1,450 @@
"""AzureOpenAI ModelClient integration."""

import os
@liyin2015 (Member) commented Oct 6, 2024

This feels almost the same as the OpenAI client; maybe we should just subclass the OpenAI client and override a few functions. [Will accept for now, but it will need a lot more work to simplify.]
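A sketch of the subclassing idea; it assumes OpenAIClient exposes init_sync_client and init_async_client hooks, and the environment-variable names are placeholders, so verify against the actual base class before relying on it:

```python
import os

from openai import AsyncAzureOpenAI, AzureOpenAI

from adalflow.components.model_client.openai_client import OpenAIClient


class AzureAIClient(OpenAIClient):
    """Reuse OpenAIClient's parsing and conversion logic; only swap the SDK clients."""

    def init_sync_client(self):
        return AzureOpenAI(
            azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
            api_version=os.getenv("AZURE_OPENAI_API_VERSION", "2024-02-01"),
            api_key=os.environ["AZURE_OPENAI_API_KEY"],
        )

    def init_async_client(self):
        return AsyncAzureOpenAI(
            azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
            api_version=os.getenv("AZURE_OPENAI_API_VERSION", "2024-02-01"),
            api_key=os.environ["AZURE_OPENAI_API_KEY"],
        )
```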



openai = safe_import(OptionalPackages.OPENAI.value[0], OptionalPackages.OPENAI.value[1])
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
Member

Please change this to a safe import.
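A sketch of the requested change, assuming safe_import and OptionalPackages live in adalflow.utils.lazy_import and that an AZURE entry is added to OptionalPackages (hypothetical, as in the earlier sketch); it mirrors how the openai import above is guarded:

```python
from adalflow.utils.lazy_import import OptionalPackages, safe_import

# Guard the Azure dependency the same way openai is guarded above.
# The AZURE entry is hypothetical until it is actually added to OptionalPackages.
azure_identity = safe_import(
    OptionalPackages.AZURE.value[0], OptionalPackages.AZURE.value[1]
)
from azure.identity import DefaultAzureCredential, get_bearer_token_provider  # noqa: E402
```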

@liyin2015 (Member) left a comment

Approving for now; will test more later and release in the next version.

from openai.types.chat import ChatCompletionChunk, ChatCompletion
from adalflow.core.model_client import ModelClient
from adalflow.core.types import ModelType, EmbedderOutput, TokenLogProb, CompletionUsage, GeneratorOutput
from adalflow.components.model_client.openai_client import AzureAIClient
Member

All the tests are wrong.

@liyin2015 (Member) left a comment

This PR is nowhere near ready to be accepted; it will need more serious investigation, tests, and a code refactor.

@adityasugandhi (Contributor, Author)

> This PR is nowhere near ready to be accepted; it will need more serious investigation, tests, and a code refactor.

Hi @liyin2015, I have updated the PR and made the fixes suggested earlier; it was tested successfully with test_azure_client.py.

Labels: None yet · Projects: None yet · 5 participants