
langchain[minor]: openai tools structured_output_chain #17296

Merged
merged 8 commits into master from bagatur/openai_tool_structured_output_chain on Feb 22, 2024

Conversation

baskaryan
Collaborator

No description provided.

vercel bot commented Feb 9, 2024: 1 ignored deployment (langchain, updated Feb 22, 2024 11:34pm).

@dosubot dosubot bot added size:L This PR changes 100-499 lines, ignoring generated files. Ɑ: parsing Related to output parser module 🤖:improvement Medium size change to existing code to handle new use-cases 🔌: openai Primarily related to OpenAI integrations labels Feb 9, 2024
@eyurtsev eyurtsev self-assigned this Feb 13, 2024
ccurme (Collaborator) commented Feb 20, 2024

Should we deprecate this? https://github.com/langchain-ai/langchain/blob/master/libs/langchain/langchain/chains/openai_tools/extraction.py

baskaryan (Collaborator, Author) replied

Probably. I think we should first land #17302 and figure out what, if any, other extraction chains/utils we're adding, and then deprecate the various existing extraction chains.

@@ -299,33 +306,46 @@ class Dog(BaseModel):
chain = create_structured_output_runnable(Dog, llm, prompt, mode="openai-json")
chain.invoke({"input": "Harry was a chubby brown beagle who loved chicken"})
""" # noqa: E501
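The docstring example above depends on langchain internals. As a rough, self-contained sketch of what `mode="openai-json"` does conceptually (parse the model's JSON reply into a typed schema), using a stdlib dataclass as a stand-in for the Pydantic `Dog` model:

```python
import json
from dataclasses import dataclass


# Stand-in for the Pydantic ``Dog`` schema from the docstring example.
@dataclass
class Dog:
    name: str
    color: str
    fav_food: str


def parse_json_output(raw: str) -> Dog:
    """Parse a JSON-mode model reply into the target schema."""
    data = json.loads(raw)
    return Dog(**data)


# A model reply in JSON mode might look like this:
reply = '{"name": "Harry", "color": "brown", "fav_food": "chicken"}'
dog = parse_json_output(reply)
```

This is only the parsing half of the chain; the real runnable also binds the schema into the prompt or tool definition so the model knows what JSON to emit.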
# for backwards compatibility
baskaryan (Collaborator, Author) commented on the diff

I don't think we want to support this outside of openai-functions?

Reviewer (Collaborator) replied

I can push a fix inside, but I usually handle this at top scope; even better would be to fix it via a function-level decorator that adds a parameter renamer.
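The decorator-based renamer suggested here could look like the following sketch. The helper name `rename_parameter` and the example function are hypothetical, not the PR's actual code:

```python
import functools
import warnings


def rename_parameter(old: str, new: str):
    """Decorator: accept a deprecated kwarg name and forward it to the new one."""

    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            if old in kwargs:
                if new in kwargs:
                    raise TypeError(f"got both {old!r} and {new!r}")
                warnings.warn(
                    f"{old!r} is deprecated; use {new!r} instead",
                    DeprecationWarning,
                    stacklevel=2,
                )
                # Forward the old kwarg under its new name.
                kwargs[new] = kwargs.pop(old)
            return func(*args, **kwargs)

        return wrapper

    return decorator


# Hypothetical usage mirroring the ``return_single`` rename discussed above.
@rename_parameter("return_single", "first_tool_only")
def make_parser(*, first_tool_only: bool = False) -> bool:
    return first_tool_only
```

The advantage over an in-`__init__` branch is that the same decorator can be reused on every function or constructor that renames an argument.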

def __init__(self, key_name: str, **kwargs: Any) -> None:
"""Allow init with positional args."""
# Backwards compatibility for old argument name.
if "return_single" in kwargs:
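The excerpt above is cut off mid-branch. One plausible completion of the backwards-compatibility check, as a sketch (the class name `KeyedParser` and the new kwarg name are assumptions for illustration, not the PR's actual code):

```python
import warnings
from typing import Any


class KeyedParser:
    """Minimal stand-in illustrating the kwarg-renaming pattern."""

    def __init__(self, key_name: str, **kwargs: Any) -> None:
        """Allow init with positional args."""
        # Backwards compatibility for old argument name.
        if "return_single" in kwargs:
            warnings.warn(
                "return_single is deprecated; use first_tool_only instead",
                DeprecationWarning,
                stacklevel=2,
            )
            kwargs["first_tool_only"] = kwargs.pop("return_single")
        self.key_name = key_name
        self.first_tool_only = kwargs.get("first_tool_only", False)
```

Old call sites keep working while emitting a deprecation warning, and new call sites use only the new name.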
Reviewer (Collaborator) commented on the diff

Adding backwards compatibility.

Comment on lines +180 to +185
prompt = ChatPromptTemplate.from_messages(
[
("system", "You are an extraction algorithm. Please extract every possible instance"),
('human', '{input}')
]
)
baskaryan (Collaborator, Author) commented

Would be nice to have an example without a prompt.
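Without pulling in langchain, the `ChatPromptTemplate` excerpt above behaves roughly like this stdlib sketch: a list of `(role, text)` message tuples whose `{input}` placeholder is filled via `str.format` (the helper name `format_messages` is an assumption, not the library API):

```python
def format_messages(template, **variables):
    """Fill template variables in a list of (role, text) message tuples."""
    return [(role, text.format(**variables)) for role, text in template]


# Mirrors the prompt from the diff excerpt above.
TEMPLATE = [
    ("system", "You are an extraction algorithm. Please extract every possible instance"),
    ("human", "{input}"),
]

messages = format_messages(TEMPLATE, input="Harry was a chubby brown beagle")
```

The real `ChatPromptTemplate` also validates variable names and produces message objects rather than tuples, but the substitution step is the same idea.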

@baskaryan baskaryan merged commit b0cfb86 into master Feb 22, 2024
40 checks passed
@baskaryan baskaryan deleted the bagatur/openai_tool_structured_output_chain branch February 22, 2024 23:42
al1p pushed a commit to al1p/langchain that referenced this pull request Feb 27, 2024
3 participants