This repository contains the backend code for an email automation system powered by AI. The system is designed to automate email processing, summarization, and sentiment analysis using machine learning models. It leverages the capabilities of AI to improve email communication efficiency.
- Flask: A micro web framework for Python used to create the RESTful API.
- Python: The programming language used for development.
- PostgreSQL: A powerful, open-source object-relational database system.
- Docker: Used for containerizing PostgreSQL to ensure consistent environments.
- Ollama: A tool for running the LLAMA model used in the application.
The application is architected into several components to ensure modularity, scalability, and efficiency:
- API Layer (Flask): Handles HTTP requests and routes them to appropriate services.
- Database (PostgreSQL): Stores emails, summaries, and other related data.
- AI Models: Uses models such as GPT-4o and LLAMA for AI tasks like summarization and sentiment analysis.
- Cron Jobs: Scheduled tasks that perform background processing of emails, such as sentiment analysis and automated replies.
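Since the system routes different tasks to different models (GPT-4o, LLAMA), one way the AI-model component could dispatch them is a simple task-to-model lookup. The mapping below is purely illustrative, not the repository's actual configuration:

```python
# Hypothetical task-to-model routing for the AI layer.
# The assignments here are assumptions for illustration only.
TASK_MODELS = {
    "summarization": "gpt-4o",
    "sentiment_analysis": "gpt-4o",
    "auto_reply": "llama3.2",
}

def model_for_task(task: str) -> str:
    """Return the model configured for a task, failing loudly on unknown tasks."""
    try:
        return TASK_MODELS[task]
    except KeyError:
        raise ValueError(f"No model configured for task: {task}")
```

Keeping the routing in one place makes it easy to swap models per task without touching the services that call them.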
- Python 3.x
- Docker
- Ollama (to run the LLAMA model)
- PostgreSQL (run as a Docker container)
$ python3 -m venv flask_env
$ source flask_env/bin/activate
$ pip3 install -r requirements.txt
Run the following script to start the PostgreSQL container, initialize the database schema, and start the Flask application:
$ sh ./scripts/start-dev.sh
Note: Make sure Docker is running on your machine. If not, download and install Docker from the official Docker website.
To use the open-source LLAMA model for AI tasks, ensure the Ollama tool is installed; it can be downloaded from the Ollama website. Then pull and run the LLAMA model with:
$ ollama run llama3.2
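Once Ollama is running, it serves a local HTTP API (by default on port 11434). A minimal sketch of calling its `/api/generate` endpoint from Python, assuming the default host and the `llama3.2` model pulled above:

```python
import json
import urllib.request

# Ollama's default local endpoint; adjust if the server runs elsewhere.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_payload(prompt: str, model: str = "llama3.2") -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    # stream=False asks for a single JSON response instead of a token stream.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt: str, model: str = "llama3.2") -> str:
    """Send a prompt to the local Ollama server and return the response text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_generate_payload(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

`generate()` requires the Ollama server to be up; `build_generate_payload()` can be used on its own when batching requests through another HTTP client.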
Although using ./scripts/start-dev.sh is recommended, you can also start the application manually:
$ python3 run.py
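The contents of run.py are not shown here; a minimal sketch of what a Flask entry point like it typically looks like (the route name, factory pattern, and debug setting are assumptions):

```python
from flask import Flask, jsonify

def create_app() -> Flask:
    """Application factory; the real run.py may also wire up blueprints and the DB."""
    app = Flask(__name__)

    @app.route("/health")
    def health():
        # Simple liveness probe, handy for checking the server started.
        return jsonify(status="ok")

    return app

if __name__ == "__main__":
    # Debug mode is for local development only.
    create_app().run(debug=True)
```

The factory pattern keeps the app importable for tests without starting the server.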
Warning: Running start-dev.sh restarts the PostgreSQL container, which wipes its data. Use it only for initial setup.
The application includes several background processes managed by cron jobs to handle ongoing tasks like sentiment analysis:
- Sentiment Analysis: Uses a GPT model to analyze email sentiment.
- Email Summarization: Generates concise summaries for emails.
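The sentiment-analysis step boils down to prompting a model and normalizing its reply into a label. A hedged sketch, in which the prompt wording, label set, and fallback behavior are all assumptions:

```python
SENTIMENT_LABELS = ("positive", "neutral", "negative")

def build_sentiment_prompt(email_body: str) -> str:
    """Build a prompt asking the model to answer with exactly one label."""
    return (
        "Classify the sentiment of the following email as one of "
        f"{', '.join(SENTIMENT_LABELS)}. Reply with the label only.\n\n"
        f"Email:\n{email_body}"
    )

def parse_sentiment(model_reply: str) -> str:
    """Normalize a free-text model reply to a known label, defaulting to neutral."""
    reply = model_reply.strip().lower()
    for label in SENTIMENT_LABELS:
        if label in reply:
            return label
    return "neutral"
```

Defaulting to "neutral" keeps the cron job from crashing when the model returns something unexpected.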
To run these cron jobs, use the make command. For example, to start the sentiment analysis cron job:
$ make run_sentiment_analysis_gpt
These scripts are located in the cron_job directory and are designed to run indefinitely, rechecking and processing data every 5 minutes.
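The "run indefinitely, recheck every 5 minutes" pattern can be sketched as a simple polling loop. The function names below are hypothetical; the injectable sleep and iteration cap exist only so the loop is testable:

```python
import time

POLL_INTERVAL_SECONDS = 5 * 60  # matches the 5-minute recheck described above

def run_cron_loop(process_batch, max_iterations=None, sleep=time.sleep):
    """Repeatedly run process_batch, pausing between passes.

    The real cron scripts would call this with max_iterations=None (forever);
    tests can cap iterations and inject a no-op sleep.
    """
    iterations = 0
    while max_iterations is None or iterations < max_iterations:
        process_batch()  # e.g. fetch unprocessed emails and analyze them
        iterations += 1
        if max_iterations is None or iterations < max_iterations:
            sleep(POLL_INTERVAL_SECONDS)
    return iterations
```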
ER-diagram of the DDL schema: