-
Can anyone point me in the right direction? I'm attempting to use LiteLLM with Ollama, but I'm not receiving a model list. I think something is wrong with my config.yaml. Any help is much appreciated!
I would try to configure it via the new UI, but it looks like Ollama is not listed there.
Replies: 2 comments 1 reply
-
I figured it out, dummy move on my side.
-
Try checking the model name in the LiteLLM documentation, because some Ollama models have different names in LiteLLM.
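To illustrate, here is a minimal config.yaml sketch for pointing the LiteLLM proxy at a local Ollama server. LiteLLM routes to Ollama via the `ollama/` prefix; the specific model name (`llama3`) and the `api_base` below are illustrative assumptions, not taken from the thread:

```yaml
model_list:
  - model_name: llama3                    # alias the proxy will expose (your choice)
    litellm_params:
      model: ollama/llama3                # "ollama/" + the model tag as Ollama knows it
      api_base: http://localhost:11434    # Ollama's default endpoint; adjust if remote
```

Note the model tag after `ollama/` must match what `ollama list` shows locally, which is where the naming mismatch usually bites.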
The issue was that I did not mount the config.yaml file correctly.
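For anyone hitting the same wall: when running the LiteLLM proxy in Docker, the config file must be bind-mounted into the container and passed with `--config`. A sketch of what that might look like (the mount path and image tag are assumptions for illustration, not confirmed by the thread):

```shell
# Bind-mount the local config.yaml into the container,
# then point litellm at the in-container path.
docker run \
  -v "$(pwd)/config.yaml:/app/config.yaml" \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-latest \
  --config /app/config.yaml
```

If the `-v` mount is missing or the path after `--config` doesn't match the in-container mount target, the proxy starts without any models, which matches the empty model list described above.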