Can or will it run with local models like WizardLM or Vicuna?
The title says it all. Thanks!

Replies: 4 comments
-
I've been meaning to add StarCoder's StarChat: https://twitter.com/RickLamers/status/1659901363076190209. I'm open to adding WizardLM or Vicuna, but what we effectively need is an endpoint-based inference method, not something running locally (that would increase the complexity of this project too much). Ideally I'd tap into existing OSS abstractions that the community has built for plugging in different models. Probably just LangChain. Any recommendations?
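If LangChain is the abstraction, swapping models could come down to pointing its OpenAI-compatible chat wrapper at a different endpoint. A minimal sketch, assuming LangChain's `ChatOpenAI` wrapper; the endpoint URL and model name below are placeholders, not anything this project ships:

```python
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

# Any server that speaks the OpenAI chat API can be plugged in here.
llm = ChatOpenAI(
    openai_api_base="http://localhost:8080/v1",  # placeholder endpoint
    openai_api_key="not-needed-locally",         # most self-hosted servers ignore the key
    model_name="wizardlm-13b",                   # placeholder model name
)

reply = llm([HumanMessage(content="Write a Python one-liner that sums 1..100.")])
print(reply.content)
```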
-
Someone mentioned https://github.com/go-skynet/LocalAI. Since it exposes a drop-in OpenAI-compatible API, it should be possible to use it by replacing the OpenAI API base URL.
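Concretely, that switch can be as small as overriding the client's base URL. A minimal sketch with the `openai` Python package (>=1.0), assuming LocalAI on its default port 8080 with a model configured under the name used below (both are assumptions):

```python
from openai import OpenAI

# Point the standard OpenAI client at a LocalAI instance instead of api.openai.com.
client = OpenAI(
    base_url="http://localhost:8080/v1",  # LocalAI's default address; adjust if changed
    api_key="sk-local",                   # LocalAI does not require a real key by default
)

resp = client.chat.completions.create(
    model="wizardlm",  # placeholder: must match a model name configured in LocalAI
    messages=[{"role": "user", "content": "Hello from a local model!"}],
)
print(resp.choices[0].message.content)
```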
-
Oobabooga's text-generation-webui also supports an OpenAI-compatible API. Open the Session tab and check the box next to "openai", then apply and restart the UI. You can then set your URL to http://localhost:5001/v1.
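A quick way to verify that endpoint is up is to ask it for its model list. A minimal sketch, again with the `openai` Python package (>=1.0); the port matches the URL above:

```python
from openai import OpenAI

# Smoke test: list whatever models the local Oobabooga endpoint reports.
client = OpenAI(base_url="http://localhost:5001/v1", api_key="dummy")  # key is ignored locally

for model in client.models.list():
    print(model.id)
```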
-
Another way to achieve this: https://openrouter.ai/docs (not local, but it includes OSS models).
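OpenRouter speaks the same OpenAI wire format, so the client pattern above carries over; only the base URL, a real API key, and a model slug change. A minimal sketch; the model slug is just an example, check the docs for the current list:

```python
import os
from openai import OpenAI

# OpenRouter is a hosted gateway with an OpenAI-compatible API.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],  # unlike local servers, a real key is required
)

resp = client.chat.completions.create(
    model="mistralai/mistral-7b-instruct",  # example slug; see openrouter.ai/docs for options
    messages=[{"role": "user", "content": "Summarize this thread in one sentence."}],
)
print(resp.choices[0].message.content)
```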