separate llm and embedder #454
Conversation
Force-pushed from 2dcb8a7 to 7f553eb
Force-pushed from 9140908 to 40a23da
Force-pushed from 0ed56b9 to 7adc61c
Reviewed part of the changes; the remaining code will be reviewed tomorrow.
Could @onlyjackfrost also take a look at the section on the launcher and Docker? Thanks a lot.
-    return llm_provider, document_store_provider, engine
+    return llm_provider, embedder_provider, document_store_provider, engine
IMO, we could consider taking some time to survey IoC and DI libraries, to avoid having to change so many files every time a new kind of provider is added. Alternatively, using a dict instead of the tuple might be a quick solution. 🤔
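To illustrate the dict-instead-of-tuple suggestion, here is a minimal sketch. The function and provider names are hypothetical placeholders, not the project's actual API; the point is only that call sites stop depending on positional unpacking, so adding a new provider kind later does not ripple through every caller.

```python
# Hypothetical sketch: return providers keyed by name rather than as a tuple.
# String values stand in for real provider instances for illustration only.

def init_providers() -> dict:
    providers = {
        "llm": "openai_llm",              # e.g. an LLM provider instance
        "embedder": "openai_embedder",    # e.g. an embedder provider instance
        "document_store": "qdrant",       # e.g. a document store provider
        "engine": "wren_engine",          # e.g. an engine instance
    }
    return providers


# Call sites look up only what they need; adding a new key to the dict
# does not break existing callers, unlike widening a returned tuple.
providers = init_providers()
llm_provider = providers["llm"]
embedder_provider = providers["embedder"]
```

A tuple-returning `init_providers` forces every caller to change when a new element is appended; a dict (or a small dataclass) localizes that change to the construction site.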
In addition, I don't think we have to do it in this PR; we might list it in the backlog or as an issue.
Overall LGTM. Thanks for your hard work!
-      OPENAI_API_KEY: ${OPENAI_API_KEY}
-      AZURE_CHAT_KEY: ${AZURE_CHAT_KEY}
-      AZURE_EMBED_KEY: ${AZURE_EMBED_KEY}
+      LLM_OPENAI_API_KEY: ${LLM_OPENAI_API_KEY}
@cyyeh could you help me understand the difference between LLM_OPENAI_API_KEY and EMBEDDER_OPENAI_API_KEY, and how they are used?
@onlyjackfrost After this PR, all LLM and EMBEDDER configurations are separated, so users can flexibly choose which LLM and Embedder providers they need; for example, OpenAI for the LLM and Fireworks.ai (OpenAI-API-compatible) for the Embedder.
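The separated configuration described above might look like the following environment setup. LLM_OPENAI_API_KEY and EMBEDDER_OPENAI_API_KEY appear in this PR; the remaining variable names and the Fireworks.ai base URL are illustrative assumptions, not the project's confirmed configuration keys.

```shell
# Hypothetical sketch of mixing providers after this PR.
# Only LLM_OPENAI_API_KEY / EMBEDDER_OPENAI_API_KEY are taken from the PR;
# the *_PROVIDER and *_API_BASE names below are assumed for illustration.
export LLM_PROVIDER=openai_llm
export LLM_OPENAI_API_KEY=sk-llm-placeholder

export EMBEDDER_PROVIDER=openai_embedder
export EMBEDDER_OPENAI_API_KEY=sk-embed-placeholder
# Point the OpenAI-compatible embedder at a different vendor, e.g. Fireworks.ai:
export EMBEDDER_OPENAI_API_BASE=https://api.fireworks.ai/inference/v1
```

Because the embedder speaks the OpenAI API, swapping vendors is just a matter of changing the key and base URL, without touching the LLM settings.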
Thanks for sharing.
Force-pushed from 05db4e9 to 4793c72
LGTM
With this PR, users can choose the LLM and the Embedding Model from different providers; for example, Ollama for the LLM and OpenAI for the Embedding Model.
Updated the corresponding documentation: https://github.com/Canner/docs.getwren.ai/pull/22