-
To include a simple prompt template in the citation fuzzy match runnable, you can define a ChatPromptTemplate for the context and question, then invoke the chain with those values. Here's an example based on your provided code:

```python
from langchain.chains import create_citation_fuzzy_match_runnable
from langchain_openai import ChatOpenAI
from langchain_core.prompts import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    SystemMessagePromptTemplate,
)

# Define your prompt (use message prompt templates so {context} and {input}
# are treated as template variables)
prompt = ChatPromptTemplate.from_messages(
    [
        SystemMessagePromptTemplate.from_template(
            "Answer the user's questions based on the below context:\n\n{context}"
        ),
        HumanMessagePromptTemplate.from_template("{input}"),
    ]
)

# Create the fuzzy match runnable chain
llm = ChatOpenAI()  # Ensure you have an instance of your language model
chain = create_citation_fuzzy_match_runnable(llm)

# Invoke the chain with your question and context
answer = chain.invoke({"question": question, "context": context})
print(str(answer))
```

This setup uses the create_citation_fuzzy_match_runnable chain, invoked with your question and context.
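If the main goal is simply to have a custom prompt drive the answer (rather than relying on the runnable's built-in citation prompt), one alternative is to compose the prompt directly with the model using LCEL. This is only a minimal sketch of that generic pattern, not the citation fuzzy match API; the context and question values below are made-up placeholders:

```python
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

# Hypothetical placeholder inputs -- substitute your own values.
context = "LangChain is a framework for building LLM applications."
question = "What is LangChain?"

# Build the same prompt with (role, template) tuples, then pipe it into the
# model with LCEL. This generic prompt | llm chain does not produce citations;
# it only shows how a custom prompt template feeds into a runnable.
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "Answer the user's questions based on the below context:\n\n{context}"),
        ("human", "{input}"),
    ]
)
llm = ChatOpenAI()
simple_chain = prompt | llm | StrOutputParser()

answer_text = simple_chain.invoke({"context": context, "input": question})
print(answer_text)
```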
-
My goal is to include a simple prompt template in fuzzy_match_runnable. Is that possible?
My prompt:
My goal is to bring the prompt into the fuzzy chain.