
core: update load chat prompt template multi messages #27584

Conversation

@jhpiedrahitao (Contributor) commented Oct 23, 2024


@jhpiedrahitao jhpiedrahitao marked this pull request as ready for review October 23, 2024 15:56
@dosubot dosubot bot added the size:S This PR changes 10-29 lines, ignoring generated files. label Oct 23, 2024
@baskaryan (Collaborator) left a comment


Could we add a unit test for the change?

@baskaryan baskaryan self-assigned this Oct 24, 2024
@baskaryan baskaryan added the needs test PR needs to be updated with tests label Oct 24, 2024
@dosubot dosubot bot added size:M This PR changes 30-99 lines, ignoring generated files. and removed size:S This PR changes 10-29 lines, ignoring generated files. labels Oct 24, 2024
@jhpiedrahitao (Contributor, Author) commented
@baskaryan tests and examples for loading have been added.

@jhpiedrahitao (Contributor, Author) commented
@baskaryan please take a look when you have the time

@ccurme (Collaborator) commented Dec 18, 2024

Closing as I believe this functionality exists:

from langchain_core.prompts import ChatPromptTemplate
from langchain_core.load import dumpd
from langchain_core.load.load import load
import json

prompt = ChatPromptTemplate(
    [
        ("system", "system message is {system_message}"),
        ("user", "user message is {user_message}"),
    ]
)

# Serialize the prompt to a JSON file on disk.
with open("foo.json", "w") as f:
    json.dump(dumpd(prompt), f)

# Reload the serialized prompt back into a ChatPromptTemplate.
with open("foo.json", "r") as f:
    reloaded = load(json.load(f))

See guide here: https://python.langchain.com/docs/how_to/serialization/

Let me know if this does not cover your use case. Thank you!

@ccurme ccurme closed this Dec 18, 2024
@jhpiedrahitao (Contributor, Author) commented
@ccurme I don't think this covers loading a full conversation (multiple turns with roles in a single template); the current implementation loads only one message into the chat template.
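
For illustration only, a minimal sketch of the multi-turn case described above, reusing the same dumpd/load round-trip from the earlier example. The variable names, file name, and message contents here are hypothetical, not taken from the PR:

from langchain_core.prompts import ChatPromptTemplate
from langchain_core.load import dumpd
from langchain_core.load.load import load
import json

# A template holding a full conversation: several turns with explicit roles.
multi_turn_prompt = ChatPromptTemplate(
    [
        ("system", "system message is {system_message}"),
        ("user", "first user turn is {first_user_message}"),
        ("assistant", "assistant reply is {assistant_message}"),
        ("user", "follow-up user turn is {second_user_message}"),
    ]
)

# Round-trip through JSON; the question is whether every message survives.
with open("multi_turn.json", "w") as f:
    json.dump(dumpd(multi_turn_prompt), f)

with open("multi_turn.json", "r") as f:
    reloaded = load(json.load(f))

# Illustrative check: all four messages should be present after reloading.
assert len(reloaded.messages) == len(multi_turn_prompt.messages)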
