Langchain for document chat with references #7305
Answer generated by a 🤖

I understand that you're looking to use LangChain for a chat application that can answer questions based on specific documents and cite the documents those answers come from. Additionally, you're interested in self-hosting the model used with LangChain, rather than relying on a proprietary service.

Yes, this is absolutely possible. LangChain is a Python framework for developing applications powered by language models. It allows you to assemble language model components into chains, which can be used for applications like the one you're describing. You can use LangChain's retriever components to fetch relevant passages from your documents; the extracted information can then drive the answers in your chat application, along with references to the document each piece of information came from (a retrieval sketch follows the example below).

Regarding self-hosting, LangChain is designed to be modular and flexible, so you can host your own model and use it with LangChain, eliminating the need to rely on a proprietary service (see the self-hosted sketch further below).

Here's a basic example of how you might set this up:

```python
from langchain.chains import create_citation_fuzzy_match_chain
from langchain.chat_models import ChatOpenAI
# Initialize LangChain with your model
llm = ChatOpenAI(temperature=0, model="gpt-3.5-turbo-0613")
# Create a citation fuzzy match chain
chain = create_citation_fuzzy_match_chain(llm)
# Now you can use LangChain to answer questions based on your documents
question = "What did the author do during college?"
context = """
My name is Jason Liu, and I grew up in Toronto Canada but I was born in China.
I went to an arts highschool but in university I studied Computational Mathematics and physics.
As part of coop I worked at many companies including Stitchfix, Facebook.
I also started the Data Science club at the University of Waterloo and I was the president of the club for 2 years.
"""
result = chain.run(question=question, context=context)
print(result)
```

Please note that this is a simplified example, and the actual implementation may vary depending on your specific requirements and the specifics of your model and documents. For more detailed examples and use cases, I recommend referring to the LangChain documentation and the end-to-end examples provided.

If you find this solution helpful and believe it could benefit others, feel free to make a pull request to update the documentation, so that other users with similar questions can benefit from your contribution.
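To actually surface the citations from the example above: per the LangChain citation example, the chain returns a structured object whose facts each carry the exact substrings of the context that support them. A short sketch of reading it (attribute names as in that example; treat them as assumptions if your version differs):

```python
# result is a structured QuestionAnswer object; each fact in the answer
# carries the substring(s) of the context that support it.
for fact in result.answer:
    print("Statement:", fact.fact)
    print("Citation:", fact.substring_quote)
```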
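For answering over a whole corpus rather than a single pasted context, a retrieval chain that tracks sources is the usual pattern. Here's a minimal sketch using RetrievalQAWithSourcesChain with a FAISS index; the file path, chunk sizes, and question are illustrative placeholders, not something from the original answer:

```python
from langchain.chains import RetrievalQAWithSourcesChain
from langchain.chat_models import ChatOpenAI
from langchain.document_loaders import TextLoader
from langchain.embeddings import OpenAIEmbeddings
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.vectorstores import FAISS

# Load and chunk your documents; the path is a placeholder.
docs = TextLoader("my_document.txt").load()
chunks = RecursiveCharacterTextSplitter(
    chunk_size=1000, chunk_overlap=100
).split_documents(docs)

# Build a vector index so the retriever can fetch relevant passages.
vectorstore = FAISS.from_documents(chunks, OpenAIEmbeddings())

# The chain answers the question and reports which sources it used.
chain = RetrievalQAWithSourcesChain.from_chain_type(
    llm=ChatOpenAI(temperature=0),
    retriever=vectorstore.as_retriever(),
)

result = chain({"question": "What did the author study in university?"})
print(result["answer"])
print(result["sources"])  # document names the answer was drawn from
```

The `sources` field is what gives you the per-document citation in chat: it names the document(s) the retrieved passages came from.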
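For the self-hosting requirement, any of LangChain's LLM wrappers can replace ChatOpenAI in a chain like the one above. A minimal sketch using HuggingFacePipeline to run an open model locally (the model id is a placeholder; substitute whatever model you host):

```python
from langchain.llms import HuggingFacePipeline

# Runs the model locally via Hugging Face transformers; swap in any
# model you host yourself.
llm = HuggingFacePipeline.from_model_id(
    model_id="google/flan-t5-large",  # placeholder model id
    task="text2text-generation",
    model_kwargs={"temperature": 0, "max_length": 512},
)

# This llm can then be passed to, e.g.,
# RetrievalQAWithSourcesChain.from_chain_type(llm=llm, retriever=...).
```

One caveat: create_citation_fuzzy_match_chain relies on OpenAI's function-calling API, so with a self-hosted model the retrieval-with-sources approach above is the more practical route to citations.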
I would like to use LangChain for a chat that answers based on documents we make available to a model through LangChain. However, I would also like the answers in the chat to include references/citations to the respective document the information was taken from.
Ideally, I would also like to self-host the model used by LangChain, i.e., not rely on a proprietary service.
Is such a thing possible? If so, how?