CDoc lets you chat with your documents using local LLMs, combining Ollama, ChromaDB, and LangChain for offline, secure, and efficient information extraction. Perfect for researchers, developers, and professionals seeking quick insights from their documents.


CDoc: Chat with Your Document

CDoc empowers you to have a conversation with your documents using local large language models (LLMs) and the power of Ollama, ChromaDB, and LangChain.

Key Features:

  • Chat with Documents: Ask questions and get answers directly from your documents.
  • Local LLM Support: Leverage the capabilities of local LLMs for offline document interaction.
  • ChromaDB Support: Store and query document embeddings and metadata efficiently with ChromaDB.
  • LangChain Integration: Streamline information extraction from documents through LangChain.
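Together, these features follow the usual retrieval-augmented flow: split documents into chunks, store them in ChromaDB, retrieve the chunks most similar to a question, and hand them to the LLM. A toy sketch of the chunk-and-retrieve step — word overlap stands in here for real embedding similarity, which CDoc gets from Ollama embeddings and ChromaDB:

```python
def chunk(text, size=40):
    """Split a document into fixed-size word chunks (CDoc uses LangChain splitters for this)."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def retrieve(chunks, question, k=1):
    """Rank chunks by word overlap with the question (a stand-in for vector similarity)."""
    q = set(question.lower().split())
    scored = sorted(chunks, key=lambda c: len(q & set(c.lower().split())), reverse=True)
    return scored[:k]
```

The retrieved chunk(s) would then be placed into the LLM prompt as context, which is what keeps answers grounded in your own documents.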

Target Users:

  • Researchers and students seeking an efficient way to interact with research papers.
  • Developers and programmers looking to analyze code documentation.
  • Professionals wanting to extract key information from contracts and legal documents (optionally with the OpenAI API).

Installation

Prerequisites:

  • Python 3 and pip
  • Git (to clone the repository)
  • Ollama

Installation Steps:

  1. Clone the repository:

    git clone https://github.com/ChatDocDev/CDoc
    
  2. Navigate to the project directory:

    cd CDoc
    
  3. Open the project directory in VSCode (or any other code editor):

    code .
    

  4. Install dependencies from requirements.txt:

    pip install -r requirements.txt
    
  5. Pull the required models from Ollama:

    • Download & install Ollama if not installed

    • Open a terminal & run these commands to pull the required models onto your local machine

      For llama3

      ollama pull llama3:latest
      

      For nomic-embed-text

      ollama pull nomic-embed-text:latest
      
    • Ensure both models are downloaded

      ollama ls
      


    • Serve Ollama

      ollama serve
      

      Go to localhost:11434 and you should see "Ollama is running"

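You can also verify this from Python using only the standard library — this helper is illustrative, not part of CDoc:

```python
import urllib.request

def ollama_is_running(base_url="http://localhost:11434"):
    """Return True if the Ollama server answers with its usual banner at base_url."""
    try:
        with urllib.request.urlopen(base_url, timeout=3) as resp:
            return b"Ollama is running" in resp.read()
    except OSError:
        # Connection refused / timeout: the server is not up.
        return False
```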

  6. Backend

    Go to the backend directory:

    cd backend
    

    Create a db folder for storing ChromaDB files:

    mkdir db
    

    Start Chromadb server:

    chroma run --path db --port 8001
    

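To confirm the Chroma server is reachable on port 8001, you can hit its heartbeat endpoint. The `/api/v1/heartbeat` path below matches recent Chroma releases but may differ across versions; the helper itself is illustrative, not part of CDoc:

```python
import urllib.request

def chroma_heartbeat_url(host="localhost", port=8001):
    """Heartbeat endpoint of a Chroma server (path may vary by Chroma version)."""
    return f"http://{host}:{port}/api/v1/heartbeat"

def chroma_is_up(host="localhost", port=8001):
    """Return True if the Chroma server answers its heartbeat."""
    try:
        with urllib.request.urlopen(chroma_heartbeat_url(host, port), timeout=3) as resp:
            return resp.status == 200
    except OSError:
        return False
```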

    Open a new terminal, go into the backend folder (cd backend), and run the backend server:

    python backend.py
    


  7. Frontend

    Open a new terminal and go to the frontend folder:

    cd frontend
    

    Run frontend.py with Streamlit:

    streamlit run frontend.py
    

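Under the hood, the Streamlit frontend sends each question to the backend over HTTP. The backend's actual route, port, and payload format aren't documented in this README, so the address, `/ask` route, and `question` field below are hypothetical — a stdlib-only sketch of what such a client call looks like:

```python
import json
import urllib.request

# Hypothetical: backend.py's real address and routes aren't shown in this README.
BACKEND_URL = "http://localhost:8000"

def build_question_payload(question):
    """JSON body the frontend could POST to the backend (field name is illustrative)."""
    return json.dumps({"question": question}).encode()

def ask(question, url=BACKEND_URL + "/ask"):  # /ask is a hypothetical route
    """POST a question and return the decoded JSON answer."""
    req = urllib.request.Request(
        url,
        data=build_question_payload(question),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.loads(resp.read())
```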

