
core[patch]: Remove autoupgrade to addable dict in Runnable/RunnableLambda/RunnablePassthrough transform #20677

Merged
merged 4 commits into master from eugene/fix_streaming_issue on Apr 23, 2024

Conversation

eyurtsev
Collaborator

@eyurtsev eyurtsev commented Apr 19, 2024

This causes an issue for the following code:

```python
from langchain.chat_models.openai import ChatOpenAI
from langchain.output_parsers.openai_tools import JsonOutputToolsParser
from langchain.schema import SystemMessage

prompt = SystemMessage(content="You are a nice assistant.") + "{question}"

llm = ChatOpenAI(
    model_kwargs={
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "web_search",
                    "description": "Searches the web for the answer to the question.",
                    "parameters": {
                        "type": "object",
                        "properties": {
                            "query": {
                                "type": "string",
                                "description": "The question to search for.",
                            },
                        },
                    },
                },
            }
        ],
    },
    streaming=True,
)

parser = JsonOutputToolsParser(first_tool_only=True)

llm_chain = prompt | llm | parser | (lambda x: x)


for chunk in llm_chain.stream({"question": "tell me more about turtles"}):
    print(chunk)

# message = llm_chain.invoke({"question": "tell me more about turtles"})

# print(message)
```
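To see why this chain breaks: `JsonOutputToolsParser` streams a progressively more complete parse on each chunk, not a diff. Auto-upgrading those chunks to an addable dict and summing them concatenates overlapping string values. A minimal sketch of that failure mode (the `addable_merge` helper is a hypothetical illustration of addable-dict accumulation, not langchain's actual class):

```python
def addable_merge(a: dict, b: dict) -> dict:
    """Hypothetical sketch of addable-dict accumulation:
    overlapping values are combined with '+', so strings concatenate."""
    out = dict(a)
    for k, v in b.items():
        out[k] = out[k] + v if k in out else v
    return out

# JsonOutputToolsParser-style stream: each chunk is a progressively
# more complete parse, not a diff.
chunks = [{"query": "tell me"}, {"query": "tell me more about turtles"}]

summed = chunks[0]
for c in chunks[1:]:
    summed = addable_merge(summed, c)

print(summed)      # {'query': 'tell metell me more about turtles'}  <- garbled
print(chunks[-1])  # {'query': 'tell me more about turtles'}         <- intended
```

The last chunk alone is already the complete value, which motivates the last-chunk fallback below.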

Instead, by definition, we'll assume that RunnableLambdas consume the entire stream, and that if the stream isn't addable, the last chunk of the stream is the one in the usable format.
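That behavior can be sketched roughly as follows (a hypothetical `consume_stream` helper illustrating the intended semantics, not the actual implementation):

```python
from typing import Any, Iterator


def consume_stream(chunks: Iterator[Any]) -> Any:
    """Sketch of the intended input handling: try to accumulate
    chunks with '+'; if they aren't addable, fall back to keeping
    only the last chunk."""
    final = None
    got_first = False
    for chunk in chunks:
        if not got_first:
            final = chunk
            got_first = True
        else:
            try:
                final = final + chunk
            except TypeError:
                final = chunk  # not addable: last chunk wins
    return final


print(consume_stream(iter(["Hel", "lo"])))                 # 'Hello' (strings add)
print(consume_stream(iter([{"a": 1}, {"a": 1, "b": 2}])))  # {'a': 1, 'b': 2} (last dict)
```

Addable streams (strings, message chunks) still accumulate as before; plain dicts, which raise `TypeError` on `+`, now resolve to the final chunk instead of being silently upgraded.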


If users want addable dicts, they can wrap the dict in the AddableDict class.
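A minimal stand-in for `AddableDict` (the real class lives in `langchain_core.runnables.utils`; this sketch assumes its merge-and-add semantics) showing how wrapping opts a dict back into additive accumulation:

```python
class AddableDict(dict):
    """Stand-in for langchain_core.runnables.utils.AddableDict:
    '+' merges keys, and overlapping values are added when possible."""

    def __add__(self, other: dict) -> "AddableDict":
        merged = AddableDict(self)
        for k, v in other.items():
            if k in merged and merged[k] is not None and v is not None:
                try:
                    merged[k] = merged[k] + v  # add overlapping values
                except TypeError:
                    merged[k] = v              # not addable: take the new value
            else:
                merged[k] = v
        return merged


print(AddableDict({"answer": "Turtles "}) + AddableDict({"answer": "are reptiles."}))
# {'answer': 'Turtles are reptiles.'}
```

A lambda that yields `AddableDict` instances therefore keeps the old accumulate-by-`+` behavior, explicitly rather than by auto-upgrade.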


We'll likely need to follow up with the same change in other places in the code that perform this upgrade.


@dosubot dosubot bot added the size:M This PR changes 30-99 lines, ignoring generated files. label Apr 19, 2024
@eyurtsev eyurtsev changed the title core[patch]: Remove autoupgrade to addable dict in RunnableLambda transofrm core[patch]: Remove autoupgrade to addable dict in RunnableLambda transform Apr 19, 2024
@dosubot dosubot bot added Ɑ: core Related to langchain-core Ɑ: Runnables Related to Runnables 🤖:improvement Medium size change to existing code to handle new use-cases labels Apr 19, 2024
@eyurtsev eyurtsev self-assigned this Apr 22, 2024
@dosubot dosubot bot added size:L This PR changes 100-499 lines, ignoring generated files. and removed size:M This PR changes 30-99 lines, ignoring generated files. labels Apr 22, 2024
@eyurtsev eyurtsev changed the title core[patch]: Remove autoupgrade to addable dict in RunnableLambda transform core[patch]: Remove autoupgrade to addable dict in Runnable/RunnableLambda/RunnablePassthrough transform Apr 22, 2024
@eyurtsev eyurtsev requested a review from baskaryan April 22, 2024 17:01
@dosubot dosubot bot added the lgtm PR looks good. Use to confirm that a PR is ready for merging. label Apr 22, 2024
@eyurtsev eyurtsev merged commit a2cc9b5 into master Apr 23, 2024
96 checks passed
@eyurtsev eyurtsev deleted the eugene/fix_streaming_issue branch April 23, 2024 14:35
hinthornw pushed a commit that referenced this pull request Apr 26, 2024
…ambda/RunnablePassthrough transform (#20677)
