Hello!
Regarding this issue, I am currently using LM Studio. When using local-model, it does not work. As far as I can see in the code, in interpreter_lib.py, line 324, the variable custom_llm_provider is set to 'openai', so it expects the OpenAI API key. What should the value of this variable be when using open-source LLMs such as Mistral?
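For context: custom_llm_provider is a litellm parameter, which suggests the project routes requests through litellm. Below is a minimal sketch (not the project's actual code) of how a litellm call can be pointed at LM Studio's OpenAI-compatible local server; the port (1234) is LM Studio's default, and the placeholder API key is an assumption since LM Studio does not validate it:

```python
from litellm import completion

# Sketch only: LM Studio exposes an OpenAI-compatible API, so the request can be
# routed through litellm's OpenAI client by setting api_base to the local server.
response = completion(
    model="openai/local-model",            # "openai/" prefix selects the OpenAI-compatible route
    api_base="http://localhost:1234/v1",   # LM Studio's default local server address
    api_key="not-needed",                  # placeholder; no real OpenAI key is required
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```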
Thanks! In my case, I solved the issue by setting:
model = "ollama/llama2"
and removing the custom_llm_provider variable entirely.
Just in case it helps! 😄
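For reference, here is a minimal sketch of what that configuration amounts to, assuming the project ultimately calls litellm's completion(): with the "ollama/" model prefix, litellm infers the provider from the model name, so neither custom_llm_provider nor an OpenAI API key is needed.

```python
from litellm import completion

# Sketch: the "ollama/" prefix lets litellm pick the Ollama provider on its own,
# so no custom_llm_provider (and no OpenAI API key) is required.
response = completion(
    model="ollama/llama2",
    api_base="http://localhost:11434",  # Ollama's default local endpoint
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```

This assumes Ollama is running locally with the llama2 model already pulled (ollama pull llama2).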