BaseMessageChunk cannot merge function key when using OpenAI tools and streaming. #14853

Closed
JonatanMedinilla opened this issue Dec 18, 2023 · 3 comments
Labels
🤖:bug (Related to a bug, vulnerability, unexpected error with an existing feature), Ɑ: models (Related to LLMs or chat model modules)

Comments

@JonatanMedinilla

System Info

langchain: latest (0.0.350)
python: 3.10.12

Who can help?

@hwchase17

Information

  • The official example notebooks/scripts
  • My own modified scripts

Related Components

  • LLMs/Chat Models
  • Embedding Models
  • Prompts / Prompt Templates / Prompt Selectors
  • Output Parsers
  • Document Loaders
  • Vector Stores / Retrievers
  • Memory
  • Agents / Agent Executors
  • Tools / Toolkits
  • Chains
  • Callbacks/Tracing
  • Async

Reproduction

Code to reproduce (based on code from the docs):

import openai
from langchain.agents import AgentExecutor
from langchain.agents.format_scratchpad.openai_tools import (
    format_to_openai_tool_messages,
)
from langchain.agents.output_parsers.openai_tools import OpenAIToolsAgentOutputParser
from langchain.chat_models import AzureChatOpenAI, ChatOpenAI
from langchain.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain.tools import BearlyInterpreterTool, DuckDuckGoSearchRun
from langchain.tools.render import format_tool_to_openai_tool
from settings import AppSettings

openai.api_type = AppSettings.OPENAI_API_TYPE or "azure"
openai.api_version = AppSettings.OPENAI_API_VERSION or "2023-03-15-preview"
openai.api_base = AppSettings.AZURE_OPENAI_API_ENDPOINT
openai.api_key = AppSettings.AZURE_OPENAI_API_KEY

lc_tools = [DuckDuckGoSearchRun()]
oai_tools = [format_tool_to_openai_tool(tool) for tool in lc_tools]

prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant"),
        ("user", "{input}"),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ]
)
llm = AzureChatOpenAI(
    openai_api_version=AppSettings.OPENAI_API_VERSION,  # type: ignore TODO: I don't know why this is an error despite being in the class
    azure_deployment=AppSettings.AZURE_OPENAI_DEPLOYMENT,
    temperature=0,
    streaming=True,
    verbose=True,
)

agent = (
    {
        "input": lambda x: x["input"],
        "agent_scratchpad": lambda x: format_to_openai_tool_messages(
            x["intermediate_steps"]
        ),
    }
    | prompt
    | llm.bind(tools=oai_tools)
    | OpenAIToolsAgentOutputParser()
)

agent_executor = AgentExecutor(agent=agent, tools=lc_tools, verbose=True)

agent_executor.invoke(
    {"input": "What's the average of the temperatures in LA, NYC, and SF today?"}
)

Logs:

> Entering new AgentExecutor chain...
ic| merged[k]: {'arguments': '{"qu', 'name': 'duckduckgo_search'}
    v: <OpenAIObject at 0x7fdb750c7920> JSON: {
         "arguments": "ery\":"
       }
    type(merged[k]): <class 'dict'>
    type(v): <class 'openai.openai_object.OpenAIObject'>
    isinstance(merged[k], dict): True
    isinstance(v, dict): True
Traceback (most recent call last):
  File "/home/jonatan-medinilla/dev/team-macanudo-ai/backend/test-issue.py", line 56, in <module>
    agent_executor.invoke(
  File "/home/jonatan-medinilla/dev/team-macanudo-ai/backend/.venv/lib/python3.10/site-packages/langchain/chains/base.py", line 89, in invoke
    return self(
  File "/home/jonatan-medinilla/dev/team-macanudo-ai/backend/.venv/lib/python3.10/site-packages/langchain/chains/base.py", line 312, in __call__
    raise e
  File "/home/jonatan-medinilla/dev/team-macanudo-ai/backend/.venv/lib/python3.10/site-packages/langchain/chains/base.py", line 306, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/home/jonatan-medinilla/dev/team-macanudo-ai/backend/.venv/lib/python3.10/site-packages/langchain/agents/agent.py", line 1312, in _call
    next_step_output = self._take_next_step(
  File "/home/jonatan-medinilla/dev/team-macanudo-ai/backend/.venv/lib/python3.10/site-packages/langchain/agents/agent.py", line 1038, in _take_next_step
    [
  File "/home/jonatan-medinilla/dev/team-macanudo-ai/backend/.venv/lib/python3.10/site-packages/langchain/agents/agent.py", line 1038, in <listcomp>
    [
  File "/home/jonatan-medinilla/dev/team-macanudo-ai/backend/.venv/lib/python3.10/site-packages/langchain/agents/agent.py", line 1066, in _iter_next_step
    output = self.agent.plan(
  File "/home/jonatan-medinilla/dev/team-macanudo-ai/backend/.venv/lib/python3.10/site-packages/langchain/agents/agent.py", line 461, in plan
    output = self.runnable.invoke(inputs, config={"callbacks": callbacks})
  File "/home/jonatan-medinilla/dev/team-macanudo-ai/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1514, in invoke
    input = step.invoke(
  File "/home/jonatan-medinilla/dev/team-macanudo-ai/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2937, in invoke
    return self.bound.invoke(
  File "/home/jonatan-medinilla/dev/team-macanudo-ai/backend/.venv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 160, in invoke
    self.generate_prompt(
  File "/home/jonatan-medinilla/dev/team-macanudo-ai/backend/.venv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 491, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
  File "/home/jonatan-medinilla/dev/team-macanudo-ai/backend/.venv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 378, in generate
    raise e
  File "/home/jonatan-medinilla/dev/team-macanudo-ai/backend/.venv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 368, in generate
    self._generate_with_cache(
  File "/home/jonatan-medinilla/dev/team-macanudo-ai/backend/.venv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 524, in _generate_with_cache
    return self._generate(
  File "/home/jonatan-medinilla/dev/team-macanudo-ai/backend/.venv/lib/python3.10/site-packages/langchain_community/chat_models/openai.py", line 428, in _generate
    return generate_from_stream(stream_iter)
  File "/home/jonatan-medinilla/dev/team-macanudo-ai/backend/.venv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 65, in generate_from_stream
    generation += chunk
  File "/home/jonatan-medinilla/dev/team-macanudo-ai/backend/.venv/lib/python3.10/site-packages/langchain_core/outputs/chat_generation.py", line 62, in __add__
    message=self.message + other.message,
  File "/home/jonatan-medinilla/dev/team-macanudo-ai/backend/.venv/lib/python3.10/site-packages/langchain_core/messages/ai.py", line 52, in __add__
    additional_kwargs=self._merge_kwargs_dict(
  File "/home/jonatan-medinilla/dev/team-macanudo-ai/backend/.venv/lib/python3.10/site-packages/langchain_core/messages/base.py", line 128, in _merge_kwargs_dict
    merged[k][i] = self._merge_kwargs_dict(merged[k][i], e)
  File "/home/jonatan-medinilla/dev/team-macanudo-ai/backend/.venv/lib/python3.10/site-packages/langchain_core/messages/base.py", line 114, in _merge_kwargs_dict
    raise TypeError(
TypeError: additional_kwargs["function"] already exists in this message, but with a different type.

Expected behavior

No errors and the same result as without streaming=True.

Last week PR #14613 fixed issue #13442. I tested the fix using the same scenario I shared above and it worked as expected. However, when I tested it again today, merging the kwargs failed because the types don't match, even though both values are instances of dict.
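For reference, the check that raises is in BaseMessageChunk._merge_kwargs_dict (see the traceback above): it compares concrete types with type(merged[k]) != type(v), which trips when one streamed value is a plain dict and the other is the pre-1.0 openai SDK's OpenAIObject, a dict subclass. A minimal sketch of the mismatch, using a hypothetical FakeOpenAIObject as a stand-in for the real class:

class FakeOpenAIObject(dict):
    # Hypothetical stand-in for openai.openai_object.OpenAIObject,
    # which subclasses dict in openai<1.0.
    pass

existing = {"arguments": '{"qu', "name": "duckduckgo_search"}  # already-merged chunk
incoming = FakeOpenAIObject(arguments='ery":')                 # next streamed delta

print(isinstance(existing, dict), isinstance(incoming, dict))  # True True
print(type(existing) != type(incoming))                        # True -> the TypeError path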


dosubot bot commented Dec 18, 2023

🤖

It's great to see that you've been actively working with LangChain and testing out the fixes for the issues you've encountered. Your dedication to testing and providing feedback is valuable to the LangChain community.

If you believe that the fix from PR #14613 resolved the issue #13442 and that the recent error you encountered is related to the previous fix, you might consider creating a new PR to address this regression. Your insights and contributions can help improve the framework for everyone.

Keep up the good work, and thank you for your commitment to making LangChain better!


This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

On Dec 18, 2023, the dosubot bot added the labels Ɑ: models (Related to LLMs or chat model modules) and 🤖:bug (Related to a bug, vulnerability, unexpected error with an existing feature).
@JonatanMedinilla
Author

As far as I can see, this issue happens when we try to merge the dictionaries from tool_calls. Although both values are instances of dict, the type of the existing value is dict while the type of the new value is OpenAIObject.

However, I ran the script that reproduces the issue again and it still fails, but according to the logs, now both types are OpenAIObject.

ic| merged[k]: <OpenAIObject at 0x7fd3a34f0220> JSON: {
                 "name": "duckduckgo_search",
                 "arguments": ""
               }
    v: <OpenAIObject at 0x7fd3a34f0d10> JSON: {
         "arguments": "{\"qu"
       }
    type(merged[k]): <class 'openai.openai_object.OpenAIObject'>
    type(v): <class 'openai.openai_object.OpenAIObject'>
    isinstance(merged[k], dict): True
    isinstance(v, dict): True
    type(merged[k]) != type(v): False

I haven't worked with Python for years (8 years, to be exact), so I don't really know whether the following workaround makes sense. To work around this issue I've changed the line

elif type(merged[k]) != type(v):

to

elif (type(merged[k]) != type(v)) and (
    isinstance(merged[k], dict) != isinstance(v, dict)
):

I will proceed with creating a PR and wait for feedback.
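A quick way to sanity-check the proposed condition outside LangChain (a minimal sketch; FakeOpenAIObject is again a hypothetical stand-in for the old SDK's dict subclass, and should_raise mirrors only the patched elif condition, not the full merge logic):

class FakeOpenAIObject(dict):
    # Hypothetical stand-in for openai.openai_object.OpenAIObject (a dict subclass).
    pass

def should_raise(existing, incoming):
    # Original check: any mismatch of concrete types raises TypeError.
    # Patched check: raise only when the two sides also differ in dict-likeness.
    return (type(existing) != type(incoming)) and (
        isinstance(existing, dict) != isinstance(incoming, dict)
    )

print(should_raise({"arguments": ""}, FakeOpenAIObject(arguments='{"qu')))  # False: dict-likes still merge
print(should_raise("text", FakeOpenAIObject()))  # True: genuine type mismatch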

@JonatanMedinilla
Author

After further analysis, I discovered that this problem was caused by an old version of the openai package. Once I upgraded it, the problem disappeared.
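For anyone landing here with the same traceback: upgrade the SDK with pip install --upgrade openai, then confirm the running interpreter actually picks up the new version, e.g.:

import openai
print(openai.__version__)  # should now report the upgraded version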
