The issue is that when OpenHands has to work around the lack of native tool use, the prompts in its API calls exceed Ollama's default 2048-token context limit. That limit needs to be set higher — preferably without building a custom model with a larger context length, since most modern LLMs support context windows of 128K or more — and Ollama can accept parameters on API calls so that this does not happen: https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-specify-the-context-window-size
Regarding tool use, Ollama added support in a July update, but the system flags these models as unable to use tools natively and falls back to emulation through prompting: https://ollama.com/blog/tool-support
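Per the Ollama FAQ linked above, the context window can be raised per request by sending `num_ctx` in the `options` field of the API call, rather than baking it into a custom Modelfile. A minimal sketch of building such a request body (model name and values are placeholders, not OpenHands code):

```python
import json

def build_generate_request(model: str, prompt: str, num_ctx: int = 8192) -> str:
    """Build a JSON body for Ollama's /api/generate endpoint.

    "options" overrides Modelfile parameters for this call only, so no
    custom model with a larger context length is needed.
    """
    payload = {
        "model": model,
        "prompt": prompt,
        "options": {"num_ctx": num_ctx},
    }
    return json.dumps(payload)

# The resulting body would be POSTed to http://localhost:11434/api/generate.
body = build_generate_request("llama3.1", "Hello", num_ctx=32768)
```

This is just the request shape; a caller would still need to confirm the model actually supports the requested window.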
OpenHands Installation
Docker command in README
OpenHands Version
0.14
Operating System
None
Logs, Errors, Screenshots, and Additional Context
No response
Hey Tom, at the moment "tool use" is enabled from our side for Anthropic models and a couple of OpenAI models. The reason is that tests with several others gave pretty bad results. If we make it user-configurable, though, that should cover it.
Why do you say that the inability to use tools has something to do with the 2048 limit? Just wondering; I could be wrong. I don't think the difference is large, but I can double-check.
@enyst thanks for the reply. If LiteLLM + Ollama has no tool use (please share the FOSS equivalents if any exist), then the current system will do.
Also, thanks bot, because this part is important to note: #2927 (comment)
I mean, it is true that OpenHands relies on LiteLLM, but there is definitely something analogous to "num_ctx" that can be passed through LiteLLM to Ollama as well: https://docs.litellm.ai/docs/providers/ollama https://docs.litellm.ai/docs/completion/input#translated-openai-params
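The LiteLLM docs linked above describe translating OpenAI-style parameters into Ollama options (e.g. `max_tokens` becomes `num_predict`). A hypothetical helper illustrating that kind of translation, with `num_ctx` riding along the same way — the function name and passthrough behavior are assumptions for illustration, not LiteLLM's actual internals:

```python
def to_ollama_options(max_tokens=None, temperature=None, num_ctx=None):
    """Map OpenAI-style call parameters onto an Ollama "options" dict.

    Hypothetical sketch: only max_tokens -> num_predict is a mapping
    documented by LiteLLM; num_ctx is shown as a provider-specific
    passthrough that a wrapper like this could support.
    """
    options = {}
    if max_tokens is not None:
        options["num_predict"] = max_tokens  # LiteLLM's documented mapping
    if temperature is not None:
        options["temperature"] = temperature
    if num_ctx is not None:
        options["num_ctx"] = num_ctx  # assumed provider-specific passthrough
    return options
```

If LiteLLM forwards such extra kwargs to the Ollama backend, OpenHands would not need Ollama-specific code to raise the context window.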