【v0.2】'str' object has no attribute 'model_dump' #22234
Replies: 4 comments 6 replies
-
Hi @sichenyong! I'm here to help you with any bugs, questions, or contributions you have. 😊 To resolve the `'str' object has no attribute 'model_dump'` error, here's an example of how to correctly use `ChatOpenAI`:

```python
import os

from langchain_core.messages import HumanMessage, SystemMessage
from langchain_openai import ChatOpenAI

os.environ["OPENAI_API_KEY"] = "sk-xxx"

model = ChatOpenAI(openai_api_base="proxy_website")
messages = [
    SystemMessage(content="Translate the following from English into Chinese"),
    HumanMessage(content="hi!"),
]
response = model.invoke(messages)
print(response)
```

Ensure that you have the correct versions of the dependencies installed. Additionally, make sure your imports are correct and that you are using the message classes from `langchain_core.messages`.
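If the error persists with this setup, a minimal LangChain-free sketch can show where the `AttributeError` comes from: something in the call path receives a plain `str` where a message object with a `model_dump()` method is expected, which is typical when a misconfigured proxy returns raw text instead of a proper chat completion. The `FakeMessage` class below is hypothetical, standing in for LangChain's Pydantic message models:

```python
# Minimal sketch without LangChain: `FakeMessage` is a hypothetical stand-in
# for a chat message object that, like LangChain's Pydantic models, exposes
# a model_dump() method.
class FakeMessage:
    def __init__(self, content: str):
        self.content = content

    def model_dump(self) -> dict:
        return {"content": self.content}


def serialize(msg):
    # Callers expect a message object; a bare str has no model_dump(),
    # which raises exactly the AttributeError reported in this thread.
    return msg.model_dump()


print(serialize(FakeMessage("hi")))  # {'content': 'hi'}
try:
    serialize("hi")  # a plain string, e.g. from a bad proxy response
except AttributeError as exc:
    print(exc)       # 'str' object has no attribute 'model_dump'
```

In other words, the traceback usually points at the response handling, not at your own code: the fix is to make the endpoint return real message objects, not to change how you call `invoke`.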
-
Hm, it seems like you're using a proxy, is that right? Do you see this issue when requests are sent to OpenAI directly (without the proxy)?
-
Hi, can you elaborate further on the solution? I'm using the correct
-
When you use an OpenAI proxy, try appending to the end of the URL
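The suffix in the comment above is cut off in the original. For OpenAI-compatible proxies the base URL conventionally ends in `/v1`, so, assuming that is what was meant, a small hypothetical helper can normalize the URL before passing it to `openai_api_base`:

```python
def normalize_base_url(url: str) -> str:
    """Hypothetical helper: ensure an OpenAI-compatible base URL ends with /v1.

    Many proxies mirror the official API under the /v1 path; without it the
    server may answer with a plain-text error body, which can surface
    downstream as "'str' object has no attribute 'model_dump'".
    """
    url = url.rstrip("/")
    if not url.endswith("/v1"):
        url += "/v1"
    return url


print(normalize_base_url("https://proxy.example.com"))      # https://proxy.example.com/v1
print(normalize_base_url("https://proxy.example.com/v1/"))  # https://proxy.example.com/v1
```

This is only a sketch of the commenter's suggestion; whether `/v1` is the right suffix depends on the specific proxy you are using.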
-
Checked other resources
Commit to Help
Example Code
Description
I followed the official tutorial to run a simple chat example like this:
url: https://python.langchain.com/v0.2/docs/tutorials/llm_chain/#detailed-walkthrough
I ran into some trouble:
System Info
Python 3.9.13
langchain 0.2.1
langchain-core 0.2.1
langchain-openai 0.1.7
langchain-text-splitters 0.2.0
langsmith 0.1.63
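To double-check the versions listed above in your own environment, a short stdlib-only snippet (no assumptions beyond the package names already listed) can read the installed versions:

```python
from importlib import metadata


def installed_versions(packages):
    # Return {package: version} for each package, or "not installed"
    # when importlib.metadata cannot find a distribution by that name.
    versions = {}
    for pkg in packages:
        try:
            versions[pkg] = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            versions[pkg] = "not installed"
    return versions


print(installed_versions(
    ["langchain", "langchain-core", "langchain-openai", "langsmith"]
))
```

Comparing this output against the release notes helps rule out a version mismatch between `langchain-core` and `langchain-openai` as the cause.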