LiteLLM ModelClient Integration #269
Comments
The LiteLLM team is not responsive. @PhiBrandon do you want to create a PR? Do users really find LiteLLM useful? Let's wait to see if people upvote for it. I feel it is more secure to use each provider's SDK (and providers are often more supportive of the integration) and easier to maintain than introducing a 3rd-party proxy. Note: do not create a PR until we put up the "ready-for-PR" label.
Based on the Discord vote, this wouldn't be the best use of time at the moment.
Hi @liyin2015 - could you elaborate? Did we not respond to you? (Sorry about that.) If it's of interest, I can make a PR to this repo. - LiteLLM maintainer
Closing, as we don't use LiteLLM but integrate each provider on our own for a better developer experience.
Description & Motivation
Given the landscape of inference providers, it seems like a good idea to have a way to interact with them through a relatively proven library. Developing a ModelClient for LiteLLM would allow users to interact with 100+ LLMs, with more coming online every day.
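For illustration, a LiteLLM-backed client might look roughly like the sketch below. The AdalFlow import paths and the `ModelClient` method names (`convert_inputs_to_api_kwargs`, `call`) are assumptions modeled on the existing provider clients, while `litellm.completion` follows LiteLLM's documented interface; treat this as a sketch, not a finished implementation.

```python
from typing import Any, Dict, Optional

import litellm  # pip install litellm

# Assumed AdalFlow import paths, modeled on the existing provider clients.
from adalflow.core.model_client import ModelClient
from adalflow.core.types import ModelType


class LiteLLMClient(ModelClient):
    """Route AdalFlow generator calls through LiteLLM's unified completion API."""

    def __init__(self, **default_kwargs: Any) -> None:
        super().__init__()
        # Defaults such as api_base or api_key that should apply to every call.
        self._default_kwargs = default_kwargs

    def convert_inputs_to_api_kwargs(
        self,
        input: Optional[Any] = None,
        model_kwargs: Optional[Dict[str, Any]] = None,
        model_type: ModelType = ModelType.LLM,
    ) -> Dict[str, Any]:
        # LiteLLM expects OpenAI-style chat messages plus a provider-prefixed
        # model name in model_kwargs, e.g. "anthropic/claude-3-haiku-20240307".
        api_kwargs = {**self._default_kwargs, **(model_kwargs or {})}
        api_kwargs["messages"] = [{"role": "user", "content": str(input)}]
        return api_kwargs

    def call(
        self,
        api_kwargs: Optional[Dict[str, Any]] = None,
        model_type: ModelType = ModelType.LLM,
    ) -> Any:
        # litellm.completion dispatches to the matching provider SDK based on
        # the model name and returns an OpenAI-style completion response.
        return litellm.completion(**(api_kwargs or {}))
```

The main design question is how much provider-specific response parsing stays in this client versus being delegated to LiteLLM's normalization, since LiteLLM already returns OpenAI-style responses for every backend.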
Pitch
There was a previous attempt at this, which has since been updated: https://github.com/SylphAI-Inc/AdalFlow/tree/litellm. It was implemented prior to the optimization updates. I am opening this issue for communication on this ModelClient / provider POC.
Needs active outreach.
Additional context
No response