Expose prompt variables in callback #17080
AveshCSingh started this conversation in Ideas
Feature request
Consider this simple prompt with the variables `context` and `question`:
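(The wording here is only illustrative; the point is the two input variables.)

```python
from langchain_core.prompts import PromptTemplate

prompt = PromptTemplate.from_template(
    "Answer the question using only the context below.\n\n"
    "Context: {context}\n\n"
    "Question: {question}"
)
```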
When this prompt is used within a chain and a CallbackHandler is provided, the callback interface exposes the prompt passed to the LLM, but only after the variables have been interpolated. For example, suppose you define:
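(A minimal sketch; the handler body, model choice, and input values are illustrative.)

```python
from langchain_core.callbacks import BaseCallbackHandler
from langchain_openai import ChatOpenAI


class MyCallbackHandler(BaseCallbackHandler):
    def on_chat_model_start(self, serialized, messages, **kwargs):
        # Today this only receives the fully interpolated prompt text.
        print(messages)


chain = prompt | ChatOpenAI()
chain.invoke(
    {
        "context": "LangChain is a framework for building LLM applications.",
        "question": "What is LangChain?",
    },
    config={"callbacks": [MyCallbackHandler()]},
)
```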
MyCallbackHandler then sees only the interpolated LLM input (with the example values above filled in):
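```text
Answer the question using only the context below.

Context: LangChain is a framework for building LLM applications.

Question: What is LangChain?
```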
This feature request is to provide callbacks with the PromptTemplate and the variable values separately; for example, something like the following could be passed to on_chat_model_start:
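(A sketch of one possible shape; the `prompt_template` and `prompt_variables` arguments do not exist today, and the names are only a suggestion.)

```python
class MyCallbackHandler(BaseCallbackHandler):
    def on_chat_model_start(
        self,
        serialized,
        messages,
        *,
        prompt_template=None,   # hypothetical: the un-interpolated PromptTemplate
        prompt_variables=None,  # hypothetical: {"context": ..., "question": ...}
        **kwargs,
    ):
        # A RAG-tuning tool could diff prompt_variables between runs instead of
        # re-parsing the interpolated prompt string.
        ...
```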
I am new to LangChain, so please let me know if there's a more natural interface for passing this information to the CallbackHandler.
Motivation
Tuning RAG applications involves prompt engineering, updating chunking, updating retrieval, re-ranking, etc.
By separating the variable values out of the interpolated prompt, applications that aid in RAG tuning can use LangChain callback handlers to identify which variables (for example, the retrieved context) are being modified between experimental "branches".
Proposal (If applicable)
No response