Replies: 5 comments 6 replies
-
The error you're encountering indicates that the inputs parameter for the graph.ainvoke method is not in the expected format. The inputs parameter should be a dictionary (Dict[str, Any]) containing all the named inputs specified in Chain.input_keys, including any inputs added by memory [1] <https://github.com/langchain-ai/langchain/blob/master/libs/langchain/langchain/chains/base.py>.
In your case, it seems that the message variable is not correctly formatted. The message should be a dictionary with specific keys and values. Here is how you can correctly format it:
1. Ensure that message is a dictionary with the correct structure.
2. Use the convert_message_to_dict function to convert the HumanMessage to the expected dictionary format.
Here's an example of how you can modify your code:

from langchain.schema import HumanMessage, convert_message_to_dict

async def get_graph(self):  # must be async def, since it awaits graph.ainvoke
    websocket = self.websocket
    app_keys = self.app_keys
    app_params = self.app_params
    chat = LLMService(app_keys, app_params, websocket)
    self.streaming = False
    chat.update_streaming(self.streaming)

    graph_builder = MessageGraph()
    graph_builder.add_node("chatbot", self.agent_runnable)
    graph_builder.add_node("tools", self.tool_node)
    graph_builder.add_edge("tools", "chatbot")
    graph_builder.add_conditional_edges("chatbot", tools_condition)
    graph_builder.set_entry_point("chatbot")
    graph = graph_builder.compile()

    # Convert HumanMessage to dictionary
    message = HumanMessage(content=input, additional_kwargs={
        "input": input,
        "dia_da_semana": self.contexto["dia_da_semana"],
        "idioma": self.contexto["idioma"],
        "email_usuario": self.contexto["email_usuario"],
        "data_hoje": self.contexto["data_hoje"]
    })
    message_dict = convert_message_to_dict(message)

    inputs = {
        "input": input,
        "dia_da_semana": self.contexto["dia_da_semana"],
        "idioma": self.contexto["idioma"],
        "email_usuario": self.contexto["email_usuario"],
        "data_hoje": self.contexto["data_hoje"],
        "role": "user",
        "content": input,
        "messages": message_dict,
    }
    result = await graph.ainvoke(inputs, debug=True)

This should ensure that the inputs parameter is correctly formatted as a dictionary, and the message variable is properly converted to the expected structure [2] <https://github.com/langchain-ai/langchain/blob/master/libs/community/langchain_community/llms/chatglm3.py>.
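One detail worth flagging: await graph.ainvoke(...) is only valid inside an async def. A minimal sketch of the calling pattern, using a hypothetical stub in place of the compiled graph (the real LLMService/graph setup is not reproduced here):

```python
import asyncio

class StubGraph:
    # Hypothetical stand-in for a compiled LangGraph graph;
    # it just echoes the "input" key back.
    async def ainvoke(self, inputs, debug=False):
        return {"echo": inputs["input"]}

async def main():
    graph = StubGraph()
    # awaiting ainvoke is only legal inside an async function
    return await graph.ainvoke({"input": "o que é um velociraptor"}, debug=True)

result = asyncio.run(main())
print(result)  # {'echo': 'o que é um velociraptor'}
```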
-
Thank you @dosu.
Where is convert_message_to_dict?
Thank you!
-
I'm sorry, I would like to know how to import it?
On Tue, Jul 2, 2024, at 14:47, dosubot[bot] ***@***.***> wrote:

You can find the convert_message_to_dict function in the libs/community/langchain_community/chat_models/litellm.py file. Here is the function:

def _convert_message_to_dict(message: BaseMessage) -> dict:
    if isinstance(message, ChatMessage):
        message_dict = {"role": message.role, "content": message.content}
    elif isinstance(message, HumanMessage):
        message_dict = {"role": "user", "content": message.content}
    elif isinstance(message, AIMessage):
        message_dict = {"role": "assistant", "content": message.content}
        if "function_call" in message.additional_kwargs:
            message_dict["function_call"] = message.additional_kwargs["function_call"]
    elif isinstance(message, SystemMessage):
        message_dict = {"role": "system", "content": message.content}
    elif isinstance(message, FunctionMessage):
        message_dict = {
            "role": "function",
            "content": message.content,
            "name": message.name,
        }
    else:
        raise ValueError(f"Got unknown type {message}")
    if "name" in message.additional_kwargs:
        message_dict["name"] = message.additional_kwargs["name"]
    return message_dict

This function converts a HumanMessage to the expected dictionary format [1] <https://github.com/langchain-ai/langchain/blob/master/libs/community/langchain_community/chat_models/litellm.py>.
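Since _convert_message_to_dict is a private helper, here is a minimal standalone sketch of the same role/content mapping, using plain dataclasses as hypothetical stand-ins for the LangChain message classes, so the output shape can be seen without importing litellm:

```python
from dataclasses import dataclass, field

# Hypothetical stand-ins for LangChain's message classes, used only
# to illustrate the mapping; they are not the real implementations.
@dataclass
class HumanMessage:
    content: str
    additional_kwargs: dict = field(default_factory=dict)

@dataclass
class AIMessage:
    content: str
    additional_kwargs: dict = field(default_factory=dict)

def convert_message_to_dict(message) -> dict:
    # Mirrors the role/content mapping of the litellm helper quoted above
    if isinstance(message, HumanMessage):
        message_dict = {"role": "user", "content": message.content}
    elif isinstance(message, AIMessage):
        message_dict = {"role": "assistant", "content": message.content}
    else:
        raise ValueError(f"Got unknown type {message}")
    if "name" in message.additional_kwargs:
        message_dict["name"] = message.additional_kwargs["name"]
    return message_dict

print(convert_message_to_dict(HumanMessage(content="o que é um velociraptor")))
# {'role': 'user', 'content': 'o que é um velociraptor'}
```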
-
@dosu, let me ask differently. I need to pass this data:

inputs = {
    "input": input,
    "dia_da_semana": self.contexto["dia_da_semana"],
    "idioma": self.contexto["idioma"],
    "email_usuario": self.contexto["email_usuario"],
    "data_hoje": self.contexto["data_hoje"],
}

to the graph's ainvoke:

result = await self.graph.ainvoke(inputs, debug=True)

But it raises an error, and I can't find in the documentation how the input should be formatted. Could you please show me exactly how I should pass the data to ainvoke?
Thank you!
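A likely source of the confusion here: a MessageGraph keeps its state as a list of chat messages, so each step appends to that list, whereas StateGraph is the variant whose state is a dict that node writes update. A plain-Python sketch of that difference, using hypothetical stand-ins rather than the real langgraph API:

```python
from dataclasses import dataclass, field

# Hypothetical stand-in for LangChain's HumanMessage, for illustration only.
@dataclass
class HumanMessage:
    content: str
    additional_kwargs: dict = field(default_factory=dict)

def message_graph_step(state: list, writes) -> list:
    # MessageGraph-style: writes are appended to the running message list
    return state + (writes if isinstance(writes, list) else [writes])

def state_graph_step(state: dict, writes: dict) -> dict:
    # StateGraph-style: writes are merged into the state dict
    return {**state, **writes}

msgs = message_graph_step([], HumanMessage(content="o que é um velociraptor"))
assert isinstance(msgs, list)  # a node in a MessageGraph receives a list, not a dict

st = state_graph_step({"input": "oi"}, {"idioma": "Inglês"})
assert st == {"input": "oi", "idioma": "Inglês"}
```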
-
@dosu, I think I made some progress!
Let me explain, follow along. I passed this data:

input = "o que é um velociraptor"
inputs = {
    "input": input,
    "dia_da_semana": self.contexto["dia_da_semana"],
    "idioma": self.contexto["idioma"],
    "email_usuario": self.contexto["email_usuario"],
    "data_hoje": self.contexto["data_hoje"],
    "role": "user",
    "content": input,
}
result = await self.graph.ainvoke(inputs, debug=True)
It accepted and started executing LangGraph and now gave this error:
[0:tasks] Starting step 0 with 1 task:
- __start__ -> {'content': 'o que é o aidan',
'data_hoje': '2024-07-02',
'dia_da_semana': 'terça',
'email_usuario': ***@***.***',
'idioma': 'Inglês',
'input': 'o que é o aidan',
'role': 'user'}
[0:writes] Finished step 0 with writes to 1 channel:
- __root__ -> {'content': 'o que é o aidan',
'data_hoje': '2024-07-02',
'dia_da_semana': 'terça',
'email_usuario': ***@***.***',
'idioma': 'Inglês',
'input': 'o que é o aidan',
'role': 'user'}
[1:tasks] Starting step 1 with 1 task:
- chatbot -> [HumanMessage(content='o que é o aidan',
additional_kwargs={'input': 'o que é o aidan', 'dia_da_semana': 'terça',
'idioma': 'Inglês', 'email_usuario': ***@***.***', 'data_hoje':
'2024-07-02'}, id='58e7c4ea-ef66-4df5-b2b3-dfa0736111dc')]
[0:tasks] Starting step 0 with 1 task:
- __start__ -> [HumanMessage(content='o que é o aidan',
additional_kwargs={'input': 'o que é o aidan', 'dia_da_semana': 'terça',
'idioma': 'Inglês', 'email_usuario': ***@***.***', 'data_hoje':
'2024-07-02'}, id='58e7c4ea-ef66-4df5-b2b3-dfa0736111dc')]
I couldn't answer your question due to internal problems or connection
issues. Please try again
Error (websocket_endpoint): Expected dict, got [HumanMessage(content='o que
é o aidan', additional_kwargs={'input': 'o que é o aidan', 'dia_da_semana':
'terça', 'idioma': 'Inglês', 'email_usuario': ***@***.***',
'data_hoje': '2024-07-02'}, id='58e7c4ea-ef66-4df5-b2b3-dfa0736111dc')]
INFO: connection closed
Now, I will show you how my LangGraph is configured, to see if you can identify the cause of this error:

graph_builder = MessageGraph()
graph_builder.add_node("chatbot", self.agent_runnable)
graph_builder.add_node("tools", self.tool_node)
graph_builder.add_edge("tools", "chatbot")
graph_builder.add_conditional_edges("chatbot", tools_condition)
graph_builder.set_entry_point("chatbot")
graph = graph_builder.compile()

Notice that it is standard LangGraph code.
Thank you!
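The log above ("Expected dict, got [HumanMessage(...)]") suggests the compiled MessageGraph hands the chatbot node a list of messages, while agent_runnable still expects the AgentExecutor-style dict. One possible workaround is to adapt the node so it rebuilds that dict from the last message. This is a hedged sketch with a hypothetical wrapper and a stub agent, not the thread's actual code:

```python
from dataclasses import dataclass, field

# Hypothetical stand-in for LangChain's HumanMessage, for illustration only.
@dataclass
class HumanMessage:
    content: str
    additional_kwargs: dict = field(default_factory=dict)

def wrap_for_message_graph(agent_fn):
    """Adapt a dict-expecting runnable so it accepts a message list."""
    def node(messages):
        last = messages[-1]
        # Rebuild the AgentExecutor-style inputs dict from the last message
        inputs = {"input": last.content, **last.additional_kwargs}
        return agent_fn(inputs)
    return node

# Stub standing in for self.agent_runnable:
def fake_agent(inputs: dict) -> str:
    return f"asked: {inputs['input']} (idioma={inputs.get('idioma')})"

node = wrap_for_message_graph(fake_agent)
msg = HumanMessage(content="o que é o aidan", additional_kwargs={"idioma": "Inglês"})
print(node([msg]))  # asked: o que é o aidan (idioma=Inglês)
```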
-
Checked other resources
Commit to Help
Example Code
Description
Dear friends,
I am trying to migrate my project from AgentExecutor to LangGraph. I am using very simple code, relying on all the default resources, and I am getting this error when calling graph.ainvoke. I am probably passing the data incorrectly. Could you tell me the correct input format that ainvoke expects in this case?
ERROR AT THIS LINE:
Expected dict, got [HumanMessage(content='o que é o aidan', additional_kwargs={'input': 'o que é o aidan', 'dia_da_semana': 'terça', 'idioma': 'Inglês', 'email_usuario': '[email protected]', 'data_hoje': '2024-07-02', 'messages': {'role': 'user', 'content': 'o que é o aidan'}}, id='119b4442-3c6d-4dcc-90e9-72c401477c77')]
INFO: connection closed
System Info
langchain==0.2.6
langchain-community==0.2.6
langchain-core==0.2.10
langchain-experimental==0.0.62
langchain-openai==0.1.13
langchain-qdrant==0.1.0
langchain-text-splitters==0.2.1