Proxy multiple Ollama instances without configuring every model in config.yml? #4025
arthurGrigo asked this question in Q&A
Unanswered · 0 replies
Is it possible to simply provide the host names of running Ollama instances so that LiteLLM fetches the deployed LLMs from each host and exposes them through its API, instead of listing every model in config.yml?
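
For context, today each deployment has to be listed explicitly in config.yml, roughly like this (the host names here are placeholders for my actual instances):

```yaml
model_list:
  - model_name: llama3
    litellm_params:
      model: ollama/llama3
      api_base: http://ollama-1:11434
  - model_name: mistral
    litellm_params:
      model: ollama/mistral
      api_base: http://ollama-2:11434
```

What I'd like to avoid is maintaining that list by hand as models come and go. As a stopgap, I could imagine generating config.yml by querying each instance's `/api/tags` endpoint (which lists the models pulled on that Ollama host). A rough, untested sketch of that idea — the host list is hypothetical, and this is just a workaround, not a built-in LiteLLM feature as far as I know:

```python
import requests
import yaml

# Hypothetical list of Ollama hosts; replace with your own.
OLLAMA_HOSTS = [
    "http://ollama-1:11434",
    "http://ollama-2:11434",
]

def build_model_list(hosts):
    """Query each Ollama instance's /api/tags endpoint and build
    one LiteLLM model_list entry per deployed model."""
    model_list = []
    for host in hosts:
        resp = requests.get(f"{host}/api/tags", timeout=5)
        resp.raise_for_status()
        for m in resp.json().get("models", []):
            name = m["name"]  # e.g. "llama3:latest"
            model_list.append({
                "model_name": name,
                "litellm_params": {
                    "model": f"ollama/{name}",
                    "api_base": host,
                },
            })
    return model_list

if __name__ == "__main__":
    config = {"model_list": build_model_list(OLLAMA_HOSTS)}
    with open("config.yml", "w") as f:
        yaml.safe_dump(config, f, sort_keys=False)
```

But regenerating and reloading the config feels clumsy — is there a supported way to just point the proxy at the hosts and have it discover the models itself?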