I second this issue. When I add Ollama servers, I often get a connection error, or:
**[ERROR] You did not use a tool in your previous response! Please retry with a tool use.**
Which API Provider are you using?
Ollama
Which Model are you using?
qwen2.5-coder:32b
What happened?
I can connect to the local Ollama server at 192.168.1.110:11434 from my desktop PC with the VS Code extension Continue, but not with Roo Cline or Cline. Roo Cline sees the models at that address in its settings, but any attempt to use one of the Ollama models fails with "API Request Failed" and a connection error.
Steps to reproduce
1. I select a model from the list; any interaction gives a connection error.
2. The same server connects fine and works from the Continue extension, so the problem seems to be in how Roo Cline issues its API calls to the local Ollama server.
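As a sanity check (not part of the original report), the difference between model listing and inference can be probed directly. Extensions like Continue and Roo Cline typically list models via Ollama's GET /api/tags endpoint and run inference via POST /api/chat, so hitting both from the same machine narrows down which call fails. A minimal sketch, assuming the server address from this report; the model name and prompt are placeholders:

```python
import json
import urllib.request

OLLAMA_BASE = "http://192.168.1.110:11434"  # address from the report

def tags_url(base: str) -> str:
    # Endpoint used to list installed models (the part that works in settings).
    return f"{base}/api/tags"

def chat_payload(model: str, prompt: str) -> dict:
    # Request body for Ollama's native /api/chat endpoint.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def probe_chat(base: str, model: str, prompt: str) -> dict:
    # POST to /api/chat; a connection error here, while /api/tags succeeds,
    # would reproduce the behaviour described in this issue.
    req = urllib.request.Request(
        f"{base}/api/chat",
        data=json.dumps(chat_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(tags_url(OLLAMA_BASE))
    # Requires a live server on the LAN, so left commented out:
    # probe_chat(OLLAMA_BASE, "qwen2.5-coder:32b", "hello")
```

If /api/tags responds but /api/chat does not from the same host, the issue is specific to the inference request rather than general network reachability.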
Relevant API REQUEST output
No response
Additional context