
separate llm and embedder #454

Merged: 17 commits merged into main on Jul 3, 2024
Conversation

cyyeh
Member

@cyyeh cyyeh commented Jun 27, 2024

With this PR, users can choose the LLM and the embedding model from different providers; for example, the LLM from Ollama and the embedding model from OpenAI.

updated corresponding documentation: https://github.com/Canner/docs.getwren.ai/pull/22

@cyyeh cyyeh added the module/ai-service ai-service related label Jun 27, 2024
@cyyeh cyyeh force-pushed the feature/ai-service/seperate-llm-provider branch from 2dcb8a7 to 7f553eb Compare June 27, 2024 17:54
@cyyeh cyyeh requested a review from paopa June 27, 2024 18:06
@cyyeh cyyeh force-pushed the feature/ai-service/seperate-llm-provider branch 2 times, most recently from 9140908 to 40a23da Compare June 28, 2024 07:46
@cyyeh cyyeh changed the title seperate llm and embedder separate llm and embedder Jun 28, 2024
@cyyeh cyyeh force-pushed the feature/ai-service/seperate-llm-provider branch 3 times, most recently from 0ed56b9 to 7adc61c Compare July 1, 2024 05:31
Member

@paopa paopa left a comment


Reviewed part of the changes; the remaining code will be reviewed tomorrow.

Could @onlyjackfrost also take a look at the section on the launcher and Docker? Thanks a lot.


- return llm_provider, document_store_provider, engine
+ return llm_provider, embedder_provider, document_store_provider, engine
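The diff above adds `embedder_provider` as an independent member of the loader's return tuple. A minimal sketch of the idea, with hypothetical env-var names and string placeholders standing in for real provider objects (this is not the actual wren-ai-service code):

```python
import os

def init_providers():
    """Resolve each provider role independently from its own env var.

    Hypothetical sketch: because the embedder is resolved separately
    from the LLM, each can come from a different backend (e.g. LLM via
    Ollama, embedder via OpenAI).
    """
    llm_provider = os.getenv("LLM_PROVIDER", "openai_llm")
    embedder_provider = os.getenv("EMBEDDER_PROVIDER", "openai_embedder")
    document_store_provider = os.getenv("DOCUMENT_STORE_PROVIDER", "qdrant")
    engine = os.getenv("ENGINE", "wren_ui")
    # After this PR, the embedder is a fourth, independent return value.
    return llm_provider, embedder_provider, document_store_provider, engine
```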
Member
IMO, we could consider taking time to survey IoC and DI libraries to avoid having to change so many files every time we add a new kind of provider. Alternatively, using a dict instead of the tuple might be a quick solution. 🤔

Member
In addition, I don't think we have to do it in this PR; we might list it in the backlog or issues.
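The dict-based alternative suggested in this thread could be sketched as follows (illustrative only; the keys and placeholder string values are assumptions, not the actual wren-ai-service code):

```python
def init_providers() -> dict:
    """Return providers keyed by role instead of as a positional tuple.

    Hypothetical sketch: adding a new provider kind then only requires
    adding a key here, rather than changing every call site that
    unpacks a growing tuple.
    """
    return {
        "llm": "openai_llm",            # placeholder for an LLM provider instance
        "embedder": "openai_embedder",  # placeholder for an embedder provider instance
        "document_store": "qdrant",     # placeholder for a document store provider
        "engine": "wren_ui",            # placeholder for an engine instance
    }

# Call sites look up providers by name, so new keys don't break them.
providers = init_providers()
llm = providers["llm"]
```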

wren-ai-service/README.md (outdated, resolved)
wren-ai-service/src/providers/llm/openai.py (resolved)
wren-ai-service/src/utils.py (resolved)
wren-ai-service/src/providers/llm/ollama.py (resolved)
Member

@paopa paopa left a comment


Overall LGTM. Thanks for your hard work!

OPENAI_API_KEY: ${OPENAI_API_KEY}
AZURE_CHAT_KEY: ${AZURE_CHAT_KEY}
AZURE_EMBED_KEY: ${AZURE_EMBED_KEY}
LLM_OPENAI_API_KEY: ${LLM_OPENAI_API_KEY}
Contributor
@cyyeh could you help me understand the difference between LLM_OPENAI_API_KEY and EMBEDDER_OPENAI_API_KEY, and how they are used?

Member Author
@onlyjackfrost After this PR, the LLM and embedder configurations are fully separated, so users can flexibly choose the LLM and embedder they need from different providers; for example, the LLM from OpenAI and the embedder from Fireworks.ai (OpenAI-API compatible).
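As an illustration of that separation, a docker-compose environment block might pair each role with its own provider and key. Only LLM_OPENAI_API_KEY and EMBEDDER_OPENAI_API_KEY appear in this PR's diff; the `*_PROVIDER` names and values below are hypothetical:

```yaml
# Hypothetical excerpt: LLM and embedder credentials are configured
# independently, so each role can point at a different backend.
environment:
  LLM_PROVIDER: openai_llm                # e.g. OpenAI for generation
  LLM_OPENAI_API_KEY: ${LLM_OPENAI_API_KEY}
  EMBEDDER_PROVIDER: openai_embedder      # e.g. Fireworks.ai (OpenAI-compatible)
  EMBEDDER_OPENAI_API_KEY: ${EMBEDDER_OPENAI_API_KEY}
```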

Contributor
Thanks for sharing

@cyyeh cyyeh force-pushed the feature/ai-service/seperate-llm-provider branch from 05db4e9 to 4793c72 Compare July 2, 2024 05:13
Contributor

@onlyjackfrost onlyjackfrost left a comment


LGTM

@cyyeh cyyeh merged commit a8dded6 into main Jul 3, 2024
7 checks passed
@cyyeh cyyeh deleted the feature/ai-service/seperate-llm-provider branch July 3, 2024 02:52
Labels
module/ai-service ai-service related

3 participants