OpenLLM can't connect to local server with error: "module 'openllm' has no attribute 'client'" #25379
-
Hey there, @SmolPandaDev! I'm a bot designed to help you solve bugs, answer questions, and become a contributor. Let me know how I can assist you with the OpenLLM issue. 🤖

To resolve the error "module 'openllm' has no attribute 'client'" and connect to the local LLM through the LangChain community library for OpenLLM, make sure you are using the correct module and attribute names. Based on the provided code, the correct class to import is `OpenLLM` from `langchain_community.llms`. Here is the relevant part of the code that shows the correct import and usage:

```python
from langchain_community.llms import OpenLLM

llm = OpenLLM(
    model_name='flan-t5',
    model_id='google/flan-t5-large',
)
llm.invoke("What is the difference between a duck and a goose?")
```

Make sure you have the package installed (`pip install langchain_community`). Then ensure your code uses this `OpenLLM` class rather than the nonexistent `openllm.client` attribute.
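Since the thread title concerns an already-running local server, here is a minimal sketch of the server-connection variant, assuming the server listens at http://localhost:3000 (adjust the URL to whatever your own server reports):

```python
from langchain_community.llms import OpenLLM

# Point the wrapper at a separately started OpenLLM server instead of
# loading the model in-process. The URL below is an assumption; use the
# address your own `openllm start` command prints on startup.
llm = OpenLLM(server_url="http://localhost:3000", server_type="http")

print(llm.invoke("What is the difference between a duck and a goose?"))
```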
-
Hi, I'm also facing this issue in version 0.3.5 (checked with `python3 -m pip show langchain-community`).
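For anyone comparing environments, a small sketch that prints the installed versions from inside Python (the package names are assumed to match the PyPI distribution names):

```python
from importlib.metadata import PackageNotFoundError, version

# Report the installed versions of the packages discussed in this thread.
for pkg in ("langchain-community", "openllm"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "is not installed")
```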
-
Checked other resources
Commit to Help
Example Code
Description
I'm trying to use the LangChain community library for OpenLLM to connect to a locally running LLM and then ask it a simple question.
System Info
Output of `pip3 freeze | grep langchain`:

Platform: macOS 14.5 (23F79)

Output of `python3 --version`: Python 3.11.6