How to build a query pipeline that can take multiple keys in the prompt template?

prompt_str = "answer the query according to the context. ##context:{context},##query:{query}"
prompt_tmpl = PromptTemplate(prompt_str)

Replies: 1 comment
To build a query pipeline that can take multiple keys in the prompt template, you can define a PromptTemplate with several placeholders and format them together:

from llama_index.core.prompts.base import PromptTemplate
from llama_index.core.prompts.prompt_type import PromptType
# Define the prompt template
prompt_str = "answer the query according to the context. ##context:{context},##query:{query}"
prompt_tmpl = PromptTemplate(template=prompt_str, prompt_type=PromptType.CUSTOM)
# Example data to fill the template
data = {
    "context": "This is the context information.",
    "query": "What is the answer to the query?",
}

# Fill both keys at once; .template is the raw format string
formatted_prompt = prompt_tmpl.template.format(**data)
print(formatted_prompt)

This will output:

answer the query according to the context. ##context:This is the context information.,##query:What is the answer to the query?
This approach allows you to dynamically insert multiple keys into your prompt template using the format method.
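If you want to go a step further and actually run the multi-key prompt inside a query pipeline, here is a minimal sketch. It assumes QueryPipeline from llama_index.core.query_pipeline and the OpenAI LLM integration (the model name is just an illustrative choice), and it assumes that when the prompt template is the first component in the chain, its template variables can be passed as keyword arguments to run; verify that routing against your llama_index version.

# A minimal sketch, assuming QueryPipeline routes run() kwargs to the
# first component's input keys; the model name is an arbitrary example.
from llama_index.core.prompts.base import PromptTemplate
from llama_index.core.query_pipeline import QueryPipeline
from llama_index.llms.openai import OpenAI

prompt_str = "answer the query according to the context. ##context:{context},##query:{query}"
prompt_tmpl = PromptTemplate(prompt_str)
llm = OpenAI(model="gpt-3.5-turbo")

# The prompt template is the first component in the chain, so its
# template variables ("context" and "query") become the pipeline inputs.
p = QueryPipeline(chain=[prompt_tmpl, llm], verbose=True)

output = p.run(
    context="This is the context information.",
    query="What is the answer to the query?",
)
print(output)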