LLMChat is a minimalist application designed to test and interact with your LLM in a user-friendly way. Seamlessly integrate local and GitHub-based knowledge to enhance your AI's contextual capabilities. 🌟
- Interactive Interface: Use LLMChat like ChatGPT but tailored to your specific knowledge base. 💬
- Custom Knowledge Sources: Link local folders and GitHub repositories to create a dynamic, up-to-date context for your LLM. 📂
- Privacy-Friendly: Runs locally, ensuring complete control over your data. 🔒
To simplify deployment, you can use Docker Compose to run both the frontend and backend.
- Install Docker and Docker Compose.
- Ensure that Ollama is running locally on your machine and accessible at `http://localhost:11434` (the default configuration); a quick connectivity check is sketched after this list.
- Clone the repository:

  ```bash
  git clone https://github.com/Bessouat40/LLMChat.git
  cd LLMChat
  ```

- Start the application with Docker Compose:

  ```bash
  docker-compose up --build
  ```
The application will be accessible at:

- Frontend: `http://localhost:3000`
- Backend API: `http://localhost:8000`
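If the containers start but chat requests fail, the usual cause is that Ollama is not reachable from your machine. The snippet below is a minimal sketch of a connectivity check using Ollama's `/api/tags` endpoint, which lists the models pulled locally; `llama3` is simply the model name used in the examples further down.

```python
# check_ollama.py - minimal sketch: verify Ollama is reachable and a model is pulled.
# Assumes the default Ollama address http://localhost:11434; adjust if you changed it.
import json
import sys
import urllib.request

OLLAMA_URL = "http://localhost:11434"

try:
    # /api/tags returns the models available locally,
    # e.g. {"models": [{"name": "llama3:latest", ...}]}
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=5) as resp:
        models = [m["name"] for m in json.load(resp).get("models", [])]
except OSError as exc:
    sys.exit(f"Ollama is not reachable at {OLLAMA_URL}: {exc}")

print("Ollama is up. Models available:", ", ".join(models) or "none pulled yet")
if not any(name.startswith("llama3") for name in models):
    print("Tip: pull the model used in the examples with `ollama pull llama3`.")
```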
Alternatively, you can run the backend and frontend directly without Docker.

- Install dependencies and start the backend:

  ```bash
  python -m pip install -r api_example/requirements.txt
  python api_example/main.py
  ```

- Install dependencies and start the frontend:

  ```bash
  npm i && npm run start
  ```
LLMChat leverages RAGLight to index and process knowledge bases, making them available for your LLM to query. It supports:
- GitHub repositories 🧑‍💻
- Local folders with PDFs, code, and more 📄
- Setting Up a Pipeline:

  ```python
  from raglight.rag.simple_rag_api import RAGPipeline
  from raglight.models.data_source_model import FolderSource, GitHubSource

  # Declare the knowledge sources to index: a local folder and a GitHub repository.
  pipeline = RAGPipeline(knowledge_base=[
      FolderSource(path="<path to folder>/knowledge_base"),
      GitHubSource(url="https://github.com/Bessouat40/RAGLight")
  ], model_name="llama3")

  # Build the knowledge base index, then query it through the configured LLM.
  pipeline.build()
  response = pipeline.generate("What is LLMChat and how does it work?")
  print(response)
  ```
You can find an API example in the `api_example/main.py` file; it shows how the backend handles requests and interacts with the LLM.
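For orientation, the sketch below shows what such a backend could look like. It is not a copy of `api_example/main.py`: the use of FastAPI, the `/generate` route, and the `Query` request model are assumptions made for illustration, while the RAGLight calls mirror the pipeline example above.

```python
# backend_sketch.py - illustrative only; see api_example/main.py for the real implementation.
# Assumptions: FastAPI as the web framework, a single /generate endpoint, and the
# RAGPipeline configuration shown earlier in this README.
from fastapi import FastAPI
from pydantic import BaseModel

from raglight.rag.simple_rag_api import RAGPipeline
from raglight.models.data_source_model import FolderSource, GitHubSource

app = FastAPI()

# Build the knowledge base once at startup so every request reuses the same index.
pipeline = RAGPipeline(
    knowledge_base=[FolderSource(path="./knowledge_base")],
    model_name="llama3",
)
pipeline.build()


class Query(BaseModel):
    question: str


@app.post("/generate")
def generate(query: Query) -> dict:
    # Retrieve relevant context and let the LLM answer the question.
    return {"response": pipeline.generate(query.question)}


if __name__ == "__main__":
    import uvicorn

    # Serve on port 8000 to match the "Backend API" address above.
    uvicorn.run(app, host="0.0.0.0", port=8000)
```

With a backend like this running, a POST to `http://localhost:8000/generate` with a body such as `{"question": "What is LLMChat?"}` would return the pipeline's answer as JSON.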
🚀 Get started with LLMChat today and enhance your LLM with custom knowledge bases!