This project demonstrates how to build a question-and-answer chatbot using Langflow, with a document loaded from your local machine. Instead of OpenAI, we use Mistral AI as the LLM.
Langflow is a visual framework for building multi-agent and RAG applications. It is open-source, Python-powered, fully customizable, and LLM- and vector-store-agnostic.
Important note: detailed instructions on how to install and build on Langflow are available on their GitHub repository here.
- Langflow installed and running
# Make sure you have >=Python 3.10 installed on your system.
python -m pip install langflow -U
Then, run Langflow with:
python -m langflow run
A successful install and run will display an access link, as shown below, which will open in your default browser. I ran the installation through VS Code, but feel free to use your favorite IDE.
- Mistral AI API key created
You will need to create a Mistral AI account to generate an API key. Please note that Mistral is only free for a limited time after you create an account. Access the console through the link here.
From the Langflow dashboard, click New Project and select Document QA from the template options. You can also select a blank canvas and add each of the components listed below yourself.
This creates a basic chatbot flow with the following components:
- Chat Input
- Prompt
- Mistral AI (replacing the OpenAI component)
- Chat Output
- Files
The Files component loads a file from your local machine into the Prompt component as {Document}. The Prompt component is instructed to answer questions based on the contents of {Document}. Including a file with the prompt gives the Mistral AI component context it may not otherwise have access to.
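Conceptually, the Prompt component performs simple template interpolation: the file's text replaces the {Document} placeholder before the prompt is sent to the model. A minimal sketch of that idea (the template wording here is illustrative, not Langflow's exact default):

```python
# Illustrative only: mimics how the Prompt component fills the {Document} placeholder.
TEMPLATE = (
    "Answer the user's question using only the document below.\n\n"
    "Document:\n{Document}\n\n"
    "Question: {question}"
)

def build_prompt(document_text: str, question: str) -> str:
    """Interpolate the document text and user question into the prompt template."""
    return TEMPLATE.format(Document=document_text, question=question)

prompt = build_prompt(
    "Intangible property includes patents and trademarks.",
    "What is an intangible?",
)
```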
- To create an environment variable for the Mistral AI component, in the Mistral AI API Key field, click the Globe button, and then click Add New Variable.
- In the Variable Name field, enter mistral_api_key.
- In the Value field, paste your Mistral AI API Key.
- Click Save Variable.
- In the Files component, click within the Path field.
- Select a local file, and then click Open.
- The file name appears in the field.
- Click the Run button. The Interaction Panel opens, where you can converse with your bot.
- Type a message and press Enter.
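As an alternative to pasting the key into the UI, the same value can be kept in the process environment and read from there. A minimal sketch, reusing the variable name `mistral_api_key` from the steps above (the helper is hypothetical, and whether Langflow picks up environment variables automatically depends on your configuration):

```python
import os

def get_mistral_api_key() -> str:
    """Hypothetical helper: read the Mistral key from the environment instead of hard-coding it."""
    key = os.environ.get("mistral_api_key", "")
    if not key:
        raise RuntimeError("Set the mistral_api_key environment variable first.")
    return key
```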
For this example, we loaded a document on US 482 Regulations regarding methods to determine taxable income in connection with a transfer of intangible property. We asked, "What is an intangible? can you give me some examples?" The bot responded with an accurate summary based on the contents of the document.
Builders have a couple of options to share workflows:
- Export the workflow in JSON format. Once exported, the workflow can be run with Langflow's `run_flow_from_json`:
from langflow.load import run_flow_from_json
results = run_flow_from_json("path/to/flow.json", input_value="What is an intangible?")
This project showcases how to leverage Langflow and Mistral AI to create a functional Document QA chatbot that provides context-aware answers based on the content of a loaded document, in this case the US 482 regulations on methods to determine taxable income in connection with a transfer of intangible property.
Feel free to reach out if you have any questions/suggestions or ideas for a project. Thanks for reading!