Please include Msty as a default local LLM provider out of the box #23848
scotws started this conversation in LLMs and Zed Assistant
I have been trying to get Msty (https://msty.app/) to work as a local LLM provider, with no luck. The AIs I've asked for help are as confused as I am about why it isn't working; even presenting it as an OpenAI-compatible endpoint, as the docs suggest, doesn't work. Since Msty is far easier to get up and running than, say, Ollama, I would be grateful if it could be added to the list of default providers.
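For concreteness, this is roughly the override I tried in Zed's settings.json, following the OpenAI-compatible route from the docs. Treat it as a sketch rather than a known-good config: the api_url port and the llama3 model name are guesses based on my setup and may not match yours.

```json
{
  "language_models": {
    "openai": {
      // Point Zed's OpenAI provider at Msty's local endpoint.
      // Port 10000 is a guess from my install; check Msty's
      // Local AI settings for the actual address.
      "api_url": "http://localhost:10000/v1",
      "available_models": [
        {
          // The model name must match one you have downloaded in Msty.
          "name": "llama3",
          "display_name": "Llama 3 (Msty)",
          "max_tokens": 8192
        }
      ]
    }
  }
}
```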
As a stop-gap measure, I would be grateful if somebody who has gotten it working could share the relevant parts of their configuration file.
Thank you!