
langgraph: fix add_node input schema error #1332

Merged · 3 commits into langchain-ai:main on Aug 22, 2024

Conversation

gbaian10 (Contributor) commented Aug 13, 2024

fix: #1331

I used inspect.signature to read the name of the first parameter and determine whether that parameter has a type hint.

I also catch StopIteration to handle the case where the callable has no parameters at all, although in that situation executing invoke will still result in an error.

If we want to catch this error before running the graph, maybe we could check that the number of parameters is not zero?
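
For illustration, here is a minimal sketch of that approach (not the actual langgraph source; the helper name infer_input_schema and the default fallback parameter are made up for this example):

import inspect
from typing import Any, Optional, Type


def infer_input_schema(action: Any, default: Optional[Type] = None) -> Optional[Type]:
    # Hypothetical helper, for illustration only.
    parameters = inspect.signature(action).parameters
    try:
        # Only the first parameter matters; a later `config` parameter
        # must not be mistaken for the node's input schema.
        first_parameter_name = next(iter(parameters.keys()))
    except StopIteration:
        # The callable takes no parameters: fall back to the default schema.
        # Invoking such a node would still fail at runtime anyway.
        return default
    annotation = parameters[first_parameter_name].annotation
    # Use the annotation only if the first parameter actually has one.
    return annotation if annotation is not inspect.Parameter.empty else default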

gbaian10 (Contributor, Author) commented Aug 13, 2024

Regarding first_parameter_name = next(iter(inspect.signature(action).parameters.keys())):

we could also use list() and then check the length before accessing [0].

I'm not sure which one is better.
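
For comparison, here are the two variants side by side (action is only a placeholder callable here, not real langgraph code):

import inspect


def action(state, config):
    ...


parameters = inspect.signature(action).parameters

# Variant 1: take the first key lazily and catch StopIteration
# when the callable has no parameters at all.
try:
    first_parameter_name = next(iter(parameters.keys()))
except StopIteration:
    first_parameter_name = None

# Variant 2: materialize the keys and check the length before indexing.
names = list(parameters.keys())
first_parameter_name = names[0] if names else None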

nfcampos (Contributor) left a comment


Thanks, do you want to add a test case?

gbaian10 (Contributor, Author) commented Aug 14, 2024

> Thanks, do you want to add a test case?

I plan to add tests using this logic. Do you have any better suggestions?

from typing import TypedDict

from langchain_core.runnables import RunnableConfig
from langgraph.graph import StateGraph

# from libs.langgraph.langgraph.graph.state import StateGraph


class InputState(TypedDict):
    question: str


class OutputState(TypedDict):
    input_state: InputState


def foo(state: InputState) -> OutputState:
    return {"input_state": state}


def bar(state, config: RunnableConfig) -> OutputState:
    # `state` is expected to be `InputState`; with the bug it received `RunnableConfig` instead.
    return {"input_state": state}


def baz(state, config) -> OutputState:
    # `state` is expected to be `InputState`; with the bug it received `OutputState` instead.
    return {"input_state": state}


graph = StateGraph(input=InputState, output=OutputState)
graph.add_node(foo)
graph.add_node(bar)
graph.add_node(baz)

graph.set_entry_point(foo.__name__)
graph.add_edge(foo.__name__, bar.__name__)
graph.add_edge(bar.__name__, baz.__name__)
graph.set_finish_point(baz.__name__)

graph = graph.compile()

for v in graph.stream({"question": "Hello"}, stream_mode="values"):
    print(v)
    if "input_state" in v:
        assert v["input_state"] == {"question": "Hello"}

gbaian10 requested a review from nfcampos on August 16, 2024 04:40
vbarda (Collaborator) commented Aug 22, 2024

Thank you @gbaian10 -- going to merge!

vbarda merged commit 22f5367 into langchain-ai:main on Aug 22, 2024
43 checks passed
gbaian10 deleted the fix/#1331 branch on August 22, 2024 16:44
Closes: The graph incorrectly obtained the input schema (#1331)