-
Example Code

```python
async def get_response(self, messages, model):
    llm = OllamaLLM(model=model)
    tools = self.tools
    memory = self.memory
    agent_executor = create_react_agent(llm, tools=tools, checkpointer=memory)
    config = {"configurable": {"thread_id": "abc123"}}
    response = await agent_executor.run(messages, config)
    return response
```

Description

Whenever this code runs, it fails with: AttributeError: 'OllamaLLM' object has no attribute 'bind_tools'

System Info

langchain==0.3.8
macOS Sequoia 15.1.1
Python 3.12.5
Replies: 3 comments
-
For other readers, make sure to upgrade Ollama to the latest version! For example, on macOS check …
-
Use ChatOllama instead of OllamaLLM; the plain OllamaLLM completion wrapper does not implement bind_tools, which create_react_agent calls when wiring up the tools:

from langchain_ollama import ChatOllama
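A minimal sketch of the method from the question rewritten around ChatOllama, assuming the agent comes from langgraph's create_react_agent and that self.tools, self.memory, and the "abc123" thread_id are the same attributes shown in the original snippet:

```python
from langchain_ollama import ChatOllama
from langgraph.prebuilt import create_react_agent


async def get_response(self, messages, model):
    # ChatOllama is a chat model and implements bind_tools(),
    # which create_react_agent needs when binding the tools.
    llm = ChatOllama(model=model)
    agent_executor = create_react_agent(
        llm, tools=self.tools, checkpointer=self.memory
    )
    config = {"configurable": {"thread_id": "abc123"}}
    # The compiled graph exposes ainvoke() rather than run(); it takes a
    # dict with a "messages" key and returns the updated agent state.
    response = await agent_executor.ainvoke({"messages": messages}, config)
    return response["messages"][-1].content
```

Note that tool calling also needs a reasonably recent Ollama server and a model that supports tools (for example llama3.1), which is why the upgrade advice above matters.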
@qubit999 similar issue #20439