Bug:
So I am running a Docker Compose stack on my NAS that runs Ollama alongside Open WebUI. I downloaded your model from Hugging Face using the command:
ollama run hf.co/acon96/Home-1B-v3-GGUF:Q4_K_M
The model downloaded successfully, and I can talk to it through the Open WebUI frontend.
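You can also verify the model responds outside of a frontend by calling the Ollama API directly. A minimal sketch, assuming Ollama is listening on its default port 11434 (adjust the host to match your NAS):

# Basic generation request against the Ollama REST API (no tools involved)
curl http://localhost:11434/api/generate -d '{
  "model": "hf.co/acon96/Home-1B-v3-GGUF:Q4_K_M",
  "prompt": "Hello, are you there?",
  "stream": false
}'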
I am also able to connect to the model through the Ollama integration on my Home Assistant Yellow and ask it questions from my Home Assistant instance.
However, as soon as I configure the conversation agent to be allowed to control my devices, I get this response:
Sorry, I had a problem talking to the Ollama server: hf.co/acon96/Home-1B-v3-GGUF:Q4_K_M does not support tools
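For what it's worth, the same error can be reproduced against the Ollama API directly by including a tools array in a chat request. A sketch only; the function definition below is a made-up example, not anything Home Assistant actually sends:

# Chat request with a tools array; a model without a tool-aware
# chat template makes Ollama return a "does not support tools" error
curl http://localhost:11434/api/chat -d '{
  "model": "hf.co/acon96/Home-1B-v3-GGUF:Q4_K_M",
  "messages": [{"role": "user", "content": "Turn on the kitchen light"}],
  "tools": [{
    "type": "function",
    "function": {
      "name": "turn_on_light",
      "description": "Turn on a light by name",
      "parameters": {
        "type": "object",
        "properties": {"name": {"type": "string"}},
        "required": ["name"]
      }
    }
  }]
}'

Getting the same error here would confirm the rejection happens on the Ollama side rather than in Home Assistant.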
Expected behavior:
I expect the model to answer the request and execute it by controlling my devices.
Logs:
I think the error message speaks for itself, but if any logs are needed, please let me know.
Home Assistant version:
core: 2025.1.4
supervisor: 2024.12.3
Ollama version: 0.5.7
The Home models actually predate tool support in Ollama. That means they only work with the integration provided in this repo (Local LLM Conversation) and not the built-in Home Assistant Ollama integration, which relies on tool support.
I'm converting this into a feature request because it would be good to support Ollama tool usage natively.
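As a quick check on the Ollama side: my understanding is that Ollama infers tool support from the model's chat template, so inspecting the template of the downloaded model should show that it never references tools:

# Print only the chat template of the model
ollama show hf.co/acon96/Home-1B-v3-GGUF:Q4_K_M --template

If the template has no tool handling, Ollama rejects any request that includes a tools array, which is what the built-in Home Assistant integration sends once device control is enabled.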