core: update load chat prompt template multi messages #27584
Conversation
could we add a unit test for the change
@baskaryan tests and examples for loading added.
@baskaryan please take a look when you have the time.
Closing as I believe this functionality exists:

```python
import json

from langchain_core.prompts import ChatPromptTemplate
from langchain_core.load import dumpd
from langchain_core.load.load import load

prompt = ChatPromptTemplate(
    [
        ("system", "system message is {system_message}"),
        ("user", "user message is {user_message}"),
    ]
)

with open("foo.json", "w") as f:
    json.dump(dumpd(prompt), f)

with open("foo.json", "r") as f:
    reloaded = load(json.load(f))
```

See the guide here: https://python.langchain.com/docs/how_to/serialization/

Let me know if this does not cover your use case. Thank you!
@ccurme I think this does not cover loading a full conversation (multiple turns with roles in a template). The current implementation loads only one message into the chat template, rather than multiple messages (roles), when using file loading (load from config).