Mechanism to load prompt templates from a file (e.g. yaml) #21672
-
Hi @arthurGrigo, if you want to use versioned prompts, you can use the LangSmith Hub to host your prompts:

```python
from langchain import hub

prompt = hub.pull("maximeperrindev/react-chat")  # replace with your prompt name
```

This will allow you to store your prompts dynamically and use versioning too.
-
Thank you for your suggestion, but I think many developers don't want to upload their prompts to the hub because of privacy concerns. I will try to post some code here in the next few days to outline my proposed approach.
-
+1 I'm also looking for this. I think there is a way to load simple prompts from YAML, but it doesn't work when you have something complex like a ChatPromptTemplate that's composed of multiple messages, such as a system message prompt, chat_history, and a human message prompt. The loading code only looks at the first message in the list and assumes that's the entire prompt template.
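For reference, the kind of multi-message template that the simple YAML loader can't round-trip looks roughly like this (the wording and variable names are illustrative, not from the discussion):

```python
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder

# A chat prompt built from several message templates. A loader that only
# reads the first message would drop the placeholder and the human message.
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant. Answer in {language}."),
        MessagesPlaceholder(variable_name="chat_history"),
        ("human", "{input}"),
    ]
)
```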
-
Are there any new developments on this?
-
Is there a suggested way to manage your prompt template strings in your code?
Often I see people writing their templates as string variables inside the Python script.
I personally think it would make sense to separate code and prompt templates from each other. This would also make it easy to switch between prompt versions without blowing up your code.
Is there a mechanism in LangChain to load prompt templates from a file? (I would suggest YAML.)
If there is no such mechanism yet, I could write one and create a pull request.
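For illustration only, here is a minimal sketch of what such a loader could look like. The YAML layout, the helper name `load_chat_prompt`, and the file path `prompts/chat_v2.yaml` are hypothetical, not an existing LangChain API:

```python
import yaml  # PyYAML
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder


def load_chat_prompt(path: str) -> ChatPromptTemplate:
    """Build a ChatPromptTemplate from a YAML file (hypothetical format).

    Expected layout, for example:
        messages:
          - role: system
            template: "You are a helpful assistant."
          - role: placeholder
            variable: chat_history
          - role: human
            template: "{input}"
    """
    with open(path) as f:
        spec = yaml.safe_load(f)

    messages = []
    for msg in spec["messages"]:
        if msg["role"] == "placeholder":
            # Placeholders inject prior conversation turns at render time.
            messages.append(MessagesPlaceholder(variable_name=msg["variable"]))
        else:
            # (role, template) tuples are accepted by from_messages.
            messages.append((msg["role"], msg["template"]))
    return ChatPromptTemplate.from_messages(messages)


prompt = load_chat_prompt("prompts/chat_v2.yaml")  # hypothetical file path
```

Switching prompt versions would then just mean pointing the loader at a different file, with no changes to the application code.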