
Development

Welcome to the Earth Defenders Assistant development documentation. This guide provides essential information for developers contributing to our platform, which leverages whatsapp-web.js for creating customizable WhatsApp bots. Our system employs a flexible plugin architecture to support a wide range of applications, enabling the deployment of personalized bots tailored to diverse community needs.

Getting Started · Architecture · Contributing · Deployment

Getting Started

This project uses Turborepo to manage multiple packages and applications in a monorepo. The codebase is split between TypeScript (for core processing, database interactions, and frontend applications) and Python (for AI/ML services and natural language processing).

Prerequisites

  • Node.js (v20 or later)
  • Bun (v1.1.33) - JavaScript runtime and package manager
  • Python (v3.11)
  • uv - Python package and project manager
  • Git
  • Docker (optional, for containerized deployment)

Installing

Make sure you have the necessary dependencies. The following commands will install the required development tools:

curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.40.1/install.sh | bash # install Node Version Manager
curl -fsSL https://bun.sh/install | bash # Install Bun NodeJS package manager and runtime
curl -LsSf https://astral.sh/uv/install.sh | sh # Install uv Python package manager

Install Node.js v20 and pin it using nvm:

nvm install 20
nvm use 20
nvm alias default 20

Install Python 3.11 and pin it using uv:

uv python install 3.11
uv python pin 3.11
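As a quick sanity check that the pinned versions are the ones actually in use, you can print each tool's version (the expected values below assume the pins from this guide):

node --version           # expect v20.x
bun --version            # expect 1.1.33
uv --version
uv run python --version  # expect Python 3.11.x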

External Services

The platform's core services can be run either locally or accessed via cloud providers during development. While local setup instructions are available in the deployment section, running all services locally can be resource-intensive. Each service provider offers a free tier that you can use by creating an account, making cloud-based development a viable alternative for resource-constrained environments.

| Service     | Purpose                                 | Required   |
| ----------- | --------------------------------------- | ---------- |
| Supabase    | Backend-as-a-Service (BaaS) platform    | Yes        |
| Trigger.dev | Workflow automation and job scheduling  | Yes        |
| Langtrace   | LLM observability and monitoring        | For AI Dev |

Optional Monitoring Services

| Service   | Purpose                                    |
| --------- | ------------------------------------------ |
| OpenPanel | Analytics and data visualization           |
| Upstash   | Redis-compatible database and caching      |
| Sentry    | Error tracking and performance monitoring  |

Installation

Clone this repo locally with the following command:

git clone https://github.com/digidem/earth-defenders-assistant.git
cd earth-defenders-assistant
  1. Install dependencies using bun:
bun i
  2. Copy each .env.example to .env and update the variables (a loop covering all of them is sketched after the commands below).
# Copy .env.example to .env for each app
cp packages/simulator/.env.example packages/simulator/.env
cp packages/jobs/.env.example packages/jobs/.env
cp apps/api/.env.example apps/api/.env
cp apps/ai_api/.env.example apps/ai_api/.env
cp apps/dashboard/.env.example apps/dashboard/.env
cp apps/landingpage/.env.example apps/landingpage/.env
cp apps/whatsapp/.env.example apps/whatsapp/.env
cp deploy/trigger-stack/.env.example deploy/trigger-stack/.env
cp deploy/langtrace-stack/.env.example deploy/langtrace-stack/.env
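If you prefer a single command, a minimal shell loop such as the one below copies the same set of files; it assumes each .env.example listed above exists at the stated path:

# Copy every .env.example listed above to a sibling .env (existing .env files are left untouched)
for dir in packages/simulator packages/jobs apps/api apps/ai_api apps/dashboard apps/landingpage apps/whatsapp deploy/trigger-stack deploy/langtrace-stack; do
  cp -n "$dir/.env.example" "$dir/.env"
done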
  3. Before starting the development server, ensure the required environment variables are in place (an illustrative packages/jobs/.env is sketched after this list):
  • Connect to Trigger instance by setting the correct TRIGGER_PROJECT_ID and TRIGGER_API_URL variables in the packages/jobs/.env file from Local Trigger or Cloud Trigger
  • Add the correct TRIGGER_SECRET_KEY to apps/whatsapp/.env and packages/simulator/.env from Local Trigger or Cloud Trigger (docs)
  • Add the correct SUPABASE_SERVICE_ROLE_KEY to the packages/jobs/.env from Local Supabase or Cloud Supabase
  • Add valid CEREBRAS_API_KEY or OPENAI_API_KEY to apps/ai_api/.env from Cerebras or OpenAI
  • (optional) Add the correct LANGTRACE_API_KEY to apps/ai_api/.env and packages/simulator/.env
  • Add a valid GROQ_API_KEY to packages/simulator/.env from Groq Console
  • Add the correct SERPER_API_KEY to apps/ai_api/.env
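For illustration only, a filled-in packages/jobs/.env might look like the sketch below; every value shown is a placeholder, and the real keys come from your own Trigger.dev and Supabase instances:

# packages/jobs/.env (illustrative placeholders only)
TRIGGER_PROJECT_ID=proj_xxxxxxxxxxxxxxxx
TRIGGER_API_URL=https://api.trigger.dev   # or the URL of your local Trigger.dev instance
SUPABASE_SERVICE_ROLE_KEY=your-service-role-key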
  4. Finally, start the development server using either bun or turbo:
# Basic development
bun prepare:db          # prepares database and pre-commit hooks
bun dev                 # starts the simulator, Supabase API and Trigger.dev jobs

# Other available commands
bun dev:all             # starts all services in development mode
bun dev:simulator       # starts a user simulation using an LLM
bun dev:api             # starts the Supabase API service in development mode
bun dev:jobs            # starts the Trigger.dev service in development mode
bun dev:ai              # starts the AI API service in development mode (uses Python uv)
bun dev:whatsapp        # starts the WhatsApp service in development mode
bun dev:dashboard       # starts the dashboard in development mode
bun dev:landingpage     # starts the landing page in development mode
bun dev:email           # starts the email service in development mode
bun dev:docs            # starts the documentation service in development mode
bun deploy:trigger      # deploys local Trigger.dev instance using Docker
bun deploy:langtrace    # deploys local LangTrace instance using Docker

# Database
bun migrate             # run Supabase migrations
bun seed                # run Supabase seed
  5. Running bun dev starts the following development services in parallel:
  • The simulator (@eda/simulator), which simulates a community member interacting with the system
  • The API service (@eda/api), which handles core backend functionality using Supabase (runs a local Supabase instance using Docker)
  • The jobs service (@eda/jobs), which processes background tasks using Trigger.dev
  • The AI API service (@eda/ai-api), which handles AI-related functionality, exposing plugins through a FastAPI server

Access the Applications:

Development should be done primarily through the simulator, which creates realistic scenarios of community members interacting with the system. The simulator enables testing conversations between two AI agents: one representing a community member and another representing the assistant.

If using a local Trigger.dev instance for job processing, first run bun deploy:trigger before bun dev. For local LangTrace observability, deploy with bun deploy:langtrace.
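Putting the local-stack pieces together, a typical first session might look like the sketch below (the LangTrace step is only needed if you want local observability):

bun deploy:trigger     # bring up the local Trigger.dev stack with Docker
bun deploy:langtrace   # optional: bring up the local LangTrace stack
bun dev                # then start the simulator, Supabase API and Trigger.dev jobs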

Architecture

This project follows a modular architecture, organized as a Turborepo monorepo. Key components include:

  • Apps: Front-end applications such as the landing page, user dashboard, and WhatsApp web interface.
  • Packages: Shared libraries and utilities used across applications, including analytics, email templates, background jobs, key-value storage, logging, simulation tools, database clients, TypeScript types, and UI components.
  • Plugins: Extensible AI service modules and plugins that can be integrated into the core applications, currently including grant analysis and evaluation tools with potential for additional domain-specific plugins.
  • Deploy: Docker Compose files and scripts for deploying the services used by the applications.

Tech Stack

Turborepo - Build system
Biome - Linter, formatter
Supabase - Authentication, database, storage
Trigger.dev - Background jobs
FastAPI - Python web framework
Langtrace - LLM monitoring and evaluation
Starlight - Documentation
Upstash - Cache and rate limiting
React Email - Email templates
Sentry - Error handling/monitoring
OpenPanel - Analytics

Directory Structure

.
├── apps                         # App workspace
│    ├── ai_api                  # Python FastAPI for exposing and calling AI plugins
│    ├── api                     # Supabase (API, Auth, Storage, Realtime, Edge Functions)
│    ├── dashboard               # User dashboard
│    ├── landingpage             # Product Landing Page
│    ├── whatsapp                # WhatsApp Web instance
│    └── docs                    # Product Documentation
├── packages                     # Shared packages between apps
│    ├── analytics               # OpenPanel analytics
│    ├── email                   # React email library
│    ├── jobs                    # Trigger.dev background jobs
│    ├── kv                      # Upstash rate-limited key-value storage
│    ├── logger                  # Logger library
│    ├── simulator               # Simulates scraping and user interactions
│    ├── supabase                # Supabase - Queries, Mutations, Clients
│    ├── types                   # Shared TypeScript type definitions
│    ├── typescript-config       # Shared TypeScript configuration
│    └── ui                      # Shared UI components (Shadcn)
├── deploy                       # Deploy workspace
│    ├── langtrace-stack         # Langtrace stack components
│    ├── briefer-stack           # Briefer stack components
│    ├── supabase-stack          # Supabase (API, Auth, Storage, Realtime, Edge Functions)
│    ├── training-stack          # LLM-training framework stack
│    ├── trigger-stack           # Trigger.dev stack components
│    └── zep-stack               # Zep stack components
├── plugins                      # Plugin workspace
│    └── grant_plugin            # The AI grant plugin for EDA
├── tooling                      # Shared configurations used by the apps and packages
│    └── typescript              # Shared TypeScript configuration
├── .cursorrules                 # Cursor rules specific to this project
├── biome.json                   # Biome configuration
├── turbo.json                   # Turbo configuration
├── LICENSE
└── README.md

Project Diagram

    flowchart TD

    %% Define styles for different components
    classDef userStyle fill:#A0C4FF,stroke:#003F5C,stroke-width:1.5px,color:#000
    classDef externalStyle fill:#BDB2FF,stroke:#6A4C93,stroke-width:1.5px,color:#000
    classDef botStyle fill:#C9F0FF,stroke:#69C0FF,stroke-width:1.5px,color:#000
    classDef processStyle fill:#FFC6FF,stroke:#9D4EDD,stroke-width:1.5px,color:#000
    classDef databaseStyle fill:#FFFFB5,stroke:#FFCA3A,stroke-width:1.5px,color:#000
    classDef serviceStyle fill:#B9FBC0,stroke:#43AA8B,stroke-width:1.5px,color:#000
    classDef llmStyle fill:#FFD6A5,stroke:#F9844A,stroke-width:1.5px,color:#000
    classDef externalAppStyle fill:#FFADAD,stroke:#D00000,stroke-width:1.5px,color:#000

    %% User Interaction
    subgraph UserInteraction["User Interaction"]
        direction LR
        User["User"]:::userStyle
        WhatsApp["WhatsApp"]:::externalStyle
        Bot["Earth Defenders Assistant"]:::botStyle
    end

    %% Orchestration & Processing
    subgraph Orchestration["Orchestration & Processing"]
        direction TB
        MessageProcessor["Message Processor"]:::processStyle
        TriggerDev["Trigger.dev<br>(Events Monitoring)"]:::externalAppStyle

        subgraph Supabase["Supabase DB"]
            direction TB
            receivedMessages["receivedMessages"]:::databaseStyle
            toSendMessages["toSendMessages"]:::databaseStyle
            MemoryDB["Graph Memory<br>(Supabase DB)"]:::databaseStyle
        end
    end

    %% Memory Management
    subgraph MemoryLayer["Long Term Memory"]
        direction TB
        Zep["Zep"]:::externalAppStyle
    end

    %% LLMTraining
    subgraph LLMTraining["Training and Fine-Tuning"]
        direction TB
        LLM-training["LLM-training"]:::externalAppStyle
    end


    %% Plugins
    subgraph LLMProcessing["Plugins"]
        direction TB
        Plugins["Plugins<br>(Intent Classification,<br>Tool Calling)"]:::processStyle
        LLMOperations["LLM Operations"]:::llmStyle
        Tools["Tools"]:::serviceStyle
        Briefer["Briefer<br>(Human Readable DB)"]:::externalAppStyle
        ExternalTools["External Tools"]:::serviceStyle
        Langtrace["Langtrace<br>(Monitoring and Evaluation)"]:::externalAppStyle
    end

    %% Connections
    User -->|Communicates| WhatsApp
    WhatsApp -->|Routes| Bot
    Bot -->|Orchestrates| TriggerDev
    TriggerDev -->|Adds Messages to| receivedMessages
    receivedMessages -->|Read by| MessageProcessor
    MessageProcessor -->|Processes| Plugins
    Plugins -->|Call| LLMOperations
    Plugins -->|Call| Tools
    LLMOperations -->|Call| Tools
    LLMOperations -->|Feedback to| Langtrace
    MessageProcessor -->|Memory Update| Zep
    Zep <-->|Stores and Retrieves| MemoryDB
    Zep -->|Add to prompt|LLMOperations
    LLM-training -->|Inference|LLMOperations
    Tools -->|Call|Briefer
    Tools <-->|Call|ExternalTools
    Tools -->|Queue message|toSendMessages
    Briefer -->|Feed data|LLM-training
    Langtrace -->|Feed data| LLM-training
    Bot -->|Responds| WhatsApp
    WhatsApp -->|Delivers| User

Contributing

We welcome contributions from the community. To get started, follow the steps below; a consolidated command sketch appears after the list:

  1. Fork the repository.
  2. Create a new feature branch (git checkout -b feature/your-feature-name).
  3. Commit your changes (git commit -m 'Add your feature message').
  4. Push to the branch (git push origin feature/your-feature-name).
  5. Open a Pull Request.
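For reference, the same workflow as a single shell sequence; your-username, the branch name, and the commit message are placeholders to replace with your own:

git clone https://github.com/your-username/earth-defenders-assistant.git   # your fork
cd earth-defenders-assistant
git checkout -b feature/your-feature-name
# ...make your changes...
git add -A
git commit -m 'Add your feature message'
git push origin feature/your-feature-name
# then open a Pull Request on GitHub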

Please adhere to the project's Code of Conduct and ensure all tests pass before submitting your pull request.

Deployment

For detailed deployment instructions, please refer to our comprehensive Deployment Guide. This guide provides step-by-step instructions for setting up and deploying each component of our stack.