
[Bug]: Ollama failing to set context and use tools #5166

Open
TomLucidor opened this issue Nov 21, 2024 · 2 comments
Labels
bug Something isn't working

Comments

@TomLucidor

Is there an existing issue for the same bug?

  • I have checked the existing issues.

Describe the bug and reproduction steps

  1. Use the Docker command (with WSL) to set up OpenHands 0.14 with the proper params
  2. Use Ollama Desktop for Windows from the official site instead of running Ollama inside WSL
  3. Set the Advanced settings as in [Bug]: ollama not wroking #3960 (comment)
  4. Run some test commands

The issue is that when the model has to compensate for its inability to use tools, the API calls exceed Ollama's default 2048-token context limit. That limit needs to be raised (ideally without building a custom model with a larger context length, since most modern LLMs already support 128K or more), and OpenHands should be able to add params to the API calls so that this does not happen: https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-specify-the-context-window-size
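
For reference, the Ollama FAQ linked above describes raising the context window per request via the `options.num_ctx` field of the API. A minimal sketch (the model name and context size are placeholders):

```python
# Sketch: raise Ollama's per-request context window with options.num_ctx,
# per the Ollama FAQ linked above. Model name and num_ctx value are placeholders.
import requests

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3.1",  # placeholder model
        "messages": [{"role": "user", "content": "hello"}],
        "options": {"num_ctx": 8192},  # override Ollama's 2048 default
        "stream": False,
    },
    timeout=120,
)
print(resp.json()["message"]["content"])
```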

It is true that OpenHands relies on LiteLLM, but there is almost certainly something analogous to "num_ctx" that can be passed through LiteLLM to Ollama as well: https://docs.litellm.ai/docs/providers/ollama https://docs.litellm.ai/docs/completion/input#translated-openai-params
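
If LiteLLM forwards Ollama-specific params as its provider docs suggest, something like the sketch below should work; whether `num_ctx` actually ends up in Ollama's `options` is an assumption to verify against those docs (model name, api_base, and context size are placeholders):

```python
# Sketch, assuming LiteLLM forwards num_ctx to Ollama's "options".
# Model name, api_base, and context size are placeholders.
from litellm import completion

response = completion(
    model="ollama/llama3.1",
    messages=[{"role": "user", "content": "hello"}],
    api_base="http://localhost:11434",
    num_ctx=8192,  # assumed to map to Ollama's options.num_ctx
)
print(response.choices[0].message.content)
```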

Regarding tool use, Ollama added native tool support in July (https://ollama.com/blog/tool-support), but the system still flags these models as unable to use tools natively and requires emulation through prompting.
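
For context, that blog post demonstrates native tool calling by passing a `tools` list to the `ollama` Python client's `chat()` call; roughly (the weather tool schema is illustrative only):

```python
# Sketch of Ollama's native tool calling, following the blog post above.
# The weather tool schema is illustrative only.
import ollama

response = ollama.chat(
    model="llama3.1",  # a model with tool support
    messages=[{"role": "user", "content": "What is the weather in Toronto?"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "Name of the city"},
                },
                "required": ["city"],
            },
        },
    }],
)
print(response["message"])  # contains tool_calls when the model invokes a tool
```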

OpenHands Installation

Docker command in README

OpenHands Version

0.14

Operating System

None

Logs, Errors, Screenshots, and Additional Context

No response

@TomLucidor TomLucidor added the bug Something isn't working label Nov 21, 2024
@enyst
Collaborator

enyst commented Nov 21, 2024

Hey Tom, at the moment "tool use" is enabled on our side for Anthropic models and a couple of OpenAI models. The reason is that tests with several others gave pretty bad results. If we make it user-configurable, though, that should cover it.

Why do you say that the inability to use tools has something to do with the 2048 limit? Just wondering. I could be wrong; I think the difference isn't that big... but I can double-check.

@TomLucidor
Author

@enyst thanks for the reply. If LiteLLM + Ollama has no tool use (please share FOSS equivalents if possible), then the current system will do.
Also thanks to the bot, because this part is important to note: #2927 (comment)
