Using a model from Ollama in ChatOpenAI doesn't invoke the tools with bind_tools #21907
9 comments · 14 replies
-
To address the issue of invoking tools with bind_tools when pointing ChatOpenAI at Ollama's OpenAI-compatible endpoint, you can convert your LangChain tools to the OpenAI function format and bind them to the model:
from langchain_openai import ChatOpenAI
from langchain_community.tools import MoveFileTool
from langchain_core.utils.function_calling import convert_to_openai_function

# ChatOpenAI pointed at the local Ollama server's OpenAI-compatible API
llm = ChatOpenAI(
    api_key="ollama",
    model="llama3:8b-instruct-fp16",
    base_url="http://localhost:11434/v1",
)

tools = [MoveFileTool()]
# Convert the tools to OpenAI function schemas and bind them to the model
functions = [convert_to_openai_function(t) for t in tools]
llm_with_tools = llm.bind_tools(tools=functions)
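To check whether the bound tools are actually invoked, you can call the model and inspect the tool_calls attribute on the response (the prompt below is only an illustration):
# Invoke the tool-bound model and inspect any tool calls it produced
ai_msg = llm_with_tools.invoke("Move the file ./notes.txt into the ./archive directory")
print(ai_msg.tool_calls)  # an empty list means the model did not emit a tool call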
If following these steps doesn't resolve your issue, please provide more details about the errors or behavior you're experiencing for further assistance.
-
Hi @AtmehEsraa, You cannot use tool calling through ChatOpenAI with a model served by Ollama this way.
-
See this guide to add ad-hoc tool calling capability to models that do not support it natively: https://python.langchain.com/v0.2/docs/how_to/tools_prompting/
Keep in mind that the quality gap between a model that has been fine-tuned for tool calling and one that hasn't can be very large.
For models with native tool calling, see https://python.langchain.com/docs/how_to/tool_calling/ and the list of such models at https://python.langchain.com/docs/integrations/chat/
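For illustration, here is a minimal sketch of the ad-hoc approach from that guide, assuming a locally served model with no native tool support (the multiply tool and the model name are only examples):
from langchain_core.output_parsers import JsonOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.tools import tool
from langchain_ollama import ChatOllama

@tool
def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

# Describe the available tools in the system prompt and ask for JSON output
rendered_tools = "\n".join(f"{t.name}: {t.description}, args: {t.args}" for t in [multiply])
prompt = ChatPromptTemplate.from_messages([
    ("system",
     "You have access to the following tools:\n{rendered_tools}\n"
     "Respond with a JSON object containing 'name' and 'arguments' keys for the tool to call."),
    ("user", "{input}"),
])

# Parse the model's JSON reply into a dict describing the tool call
chain = prompt | ChatOllama(model="llama3", temperature=0) | JsonOutputParser()
print(chain.invoke({"rendered_tools": rendered_tools, "input": "what is 3 times 7?"}))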
-
Getting a similar error when using
-
Is there any good solution?
-
Same here, it never calls the functions. Should we use the OpenAI API or switch to OllamaFunctions?
-
You can use it like this: https://gist.github.com/x51xxx/4d61e8c675681d165f012a7231d06976
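For reference, here is a minimal sketch of the OllamaFunctions route mentioned above, based on the langchain_experimental integration docs at the time; the gist may differ, and the model name and weather tool here are only illustrative:
# Requires the langchain_experimental package; this wrapper adds function
# calling on top of Ollama-served models.
from langchain_experimental.llms.ollama_functions import OllamaFunctions

llm = OllamaFunctions(model="llama3", format="json")
llm_with_tools = llm.bind_tools(
    tools=[
        {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                },
                "required": ["location"],
            },
        }
    ],
    # Force the model to call this function
    function_call={"name": "get_current_weather"},
)
print(llm_with_tools.invoke("What is the weather in Singapore?"))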
-
Now Ollama with llama3.1 can use tool calling (source), but llama3.1:8b is pretty bad at it. I use
-
I found some similar discussions and issues that might help you.
To use the model from Ollama in LangChain with tool calling, use the ChatOllama class from the langchain-ollama package and bind your tools to it. Here is the complete code snippet:
# Step 1: Install the package
pip install -U langchain-ollama
# Step 2: Instantiate the ChatOllama class
from langchain_ollama import ChatOllama
llm = ChatOllama(model="llama3.1", temperature=0)
# Step 3: Define the tools
from pydantic import BaseModel, Field
class GetWeather(BaseModel):
    """Get the current weather in a given location"""

    location: str = Field(..., description="The city and state, e.g. San Francisco, CA")
# Step 4: Bind the tools to the model
llm_with_tools = llm.bind_tools([GetWeather])
# Step 5: Invoke the model with a message
ai_msg = llm_with_tools.invoke("what is the weather like in San Francisco")
# Step 6: Access the tool calls
ai_msg.tool_calls
This should allow you to use the model from Ollama in LangChain and have it invoke your tools through bind_tools.
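As a follow-up, each entry in ai_msg.tool_calls is a dict with "name", "args", and "id" keys, so you can route it to your own handler; the sketch below just prints the arguments (the handling shown is hypothetical, not part of the snippet above):
# Step 7 (optional): Inspect the structured tool calls produced by the model
for call in ai_msg.tool_calls:
    if call["name"] == "GetWeather":
        print(f"Model requested GetWeather with args: {call['args']}")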
-
Description
Using a model from Ollama in ChatOpenAI doesn't invoke the tools with bind_tools.
System Info
..