Local LLMs
#173
Replies: 2 comments 2 replies
-
When will SK support offline/local LLMs, instead of only the remote models we have to access through OpenAI?
-
Hi @lilhoser, multi-model support is in a PR right now - you should see it in less than a week. @dluc, @dmytrostruk
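Not an official Semantic Kernel sample, but as a rough sketch of where this usually ends up: many local runtimes (llama.cpp server, LM Studio, Ollama, etc.) expose an OpenAI-compatible HTTP endpoint, so running offline mostly means pointing an OpenAI-style client at a local base URL instead of api.openai.com. The host, port, and model name below are placeholders, not SK configuration.

```python
# Minimal sketch: talk to a locally hosted, OpenAI-compatible endpoint
# instead of OpenAI's hosted API.
# Assumes a local server (e.g. llama.cpp server, LM Studio, Ollama) is
# listening on localhost:8000 and serving a model called "local-model" --
# both are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # local endpoint, not api.openai.com
    api_key="not-needed",                 # most local servers ignore the key
)

response = client.chat.completions.create(
    model="local-model",
    messages=[{"role": "user", "content": "Hello from an offline LLM!"}],
)
print(response.choices[0].message.content)
```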