A standalone Python Flask web application that bridges Strapi CMS and local Ollama LLM models to provide content internationalization. It lets you:
- Integrate with Strapi v5+ using the REST API
- Leverage local Ollama LLM models for translation
- Configure different models for different target languages
- Batch translate content with a user-friendly UI
- Monitor translation job progress in real-time
The translation system consists of three main components:
- Flask Web App: Provides a web UI and endpoints for configuration and triggering translations
- Strapi CMS: The content source and destination
- Ollama LLM Service: Hosts language models locally for translation
Before installing, make sure you have:

- Python 3.8+
- Strapi v5+ with internationalization enabled
- Ollama with at least one model installed
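You can optionally confirm that both services are reachable before continuing. This stdlib-only check is a sketch; the default localhost URLs are assumptions and should match your own setup:

```python
# check_services.py - quick reachability check for Strapi and Ollama (stdlib only).
# The default localhost URLs below are assumptions; adjust them to your setup.
import json
import urllib.error
import urllib.request

STRAPI_BASE_URL = "http://localhost:1337"
OLLAMA_BASE_URL = "http://localhost:11434"

def reachable(url: str) -> bool:
    """Return True if the server answers at all (any HTTP status counts)."""
    try:
        urllib.request.urlopen(url, timeout=5)
        return True
    except urllib.error.HTTPError:
        return True   # server responded, just not with 2xx
    except OSError:
        return False  # connection refused, timeout, DNS failure, ...

print("Strapi reachable:", reachable(STRAPI_BASE_URL))

# Ollama's /api/tags endpoint lists the locally installed models.
try:
    with urllib.request.urlopen(f"{OLLAMA_BASE_URL}/api/tags", timeout=5) as resp:
        models = [m["name"] for m in json.load(resp).get("models", [])]
        print("Ollama models:", models or "none installed")
except OSError:
    print("Ollama not reachable")
```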
To install and run the application:

- Clone this repository:

      git clone https://github.com/yourusername/strapi-ollama-translator.git
      cd strapi-ollama-translator

- Create a virtual environment and install dependencies:

      python -m venv venv
      source venv/bin/activate  # On Windows: venv\Scripts\activate
      pip install -r requirements.txt

- Configure environment variables: create a `.env` file with the following values (a sketch of how the app might load them follows these steps):

      STRAPI_BASE_URL=http://localhost:1337
      STRAPI_API_TOKEN=your_strapi_api_token
      STRAPI_SOURCE_LOCALE=en
      OLLAMA_BASE_URL=http://localhost:11434

- Run the application:

      python app.py

- Access the web UI at `http://localhost:5000`.
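The variable names above come from this README; how the app consumes them is an implementation detail, but a typical approach with `python-dotenv` (the dependency and attribute names here are assumptions, not the app's actual code) looks roughly like this:

```python
# config.py - minimal sketch of loading the .env values with python-dotenv.
# The dependency and variable handling are assumptions, not the app's actual code.
import os
from dotenv import load_dotenv

load_dotenv()  # reads the .env file in the current working directory

STRAPI_BASE_URL = os.getenv("STRAPI_BASE_URL", "http://localhost:1337")
STRAPI_API_TOKEN = os.getenv("STRAPI_API_TOKEN", "")
STRAPI_SOURCE_LOCALE = os.getenv("STRAPI_SOURCE_LOCALE", "en")
OLLAMA_BASE_URL = os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")

if not STRAPI_API_TOKEN:
    raise RuntimeError("STRAPI_API_TOKEN must be set in .env")
```

With the app running, open the web UI and configure which Ollama model handles each target language: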
- In the web UI, go to the "Configuration" page.
- For each target language, select an appropriate Ollama model.
- Click "Save Configuration" to apply the changes.
- Go to the "Translate" page in the web UI.
- Select a content type to translate.
- Choose whether to translate all entries or specific ones.
- Select target languages for translation.
- Click "Start Translation" to begin the process.
- Monitor the progress on the status page.
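These steps can also be scripted against the `/translate` and `/status` endpoints documented below. The request fields used here are illustrative assumptions, so check the actual API before relying on them:

```python
# Hypothetical script that triggers a translation job and polls its status.
# The payload keys (content_type, entry_ids, target_locales) are assumptions.
import time
import requests

APP_URL = "http://localhost:5000"

job = {
    "content_type": "api::article.article",
    "entry_ids": [],                # an empty list meaning "all entries" is an assumption
    "target_locales": ["fr", "de"],
}
requests.post(f"{APP_URL}/translate", json=job, timeout=10).raise_for_status()

# Poll the status endpoint until the job reports completion.
while True:
    status = requests.get(f"{APP_URL}/status", timeout=10).json()
    print(status)
    if status.get("state") in ("completed", "failed"):
        break
    time.sleep(5)
```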
The application exposes the following endpoints:

- `GET /models`: List available Ollama models
- `GET/POST /config`: Get or update model configurations
- `GET /content-types`: List content types from Strapi
- `GET /entries/<content_type>`: List entries for a content type
- `POST /translate`: Trigger a translation job
- `GET /status`: Get current job status
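The read-only endpoints can be queried directly. Response shapes are not documented here, so this example only prints the raw JSON, and the content-type identifier is an assumption:

```python
# Querying the read-only endpoints; the app is assumed to run on localhost:5000.
import requests

APP_URL = "http://localhost:5000"

print(requests.get(f"{APP_URL}/models", timeout=10).json())         # available Ollama models
print(requests.get(f"{APP_URL}/content-types", timeout=10).json())  # Strapi content types
# The <content_type> identifier format below is an assumption.
print(requests.get(f"{APP_URL}/entries/api::article.article", timeout=10).json())
```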
When a translation job runs:

- The system fetches content from Strapi in the source language
- For each target language, it sends the content to the selected Ollama model
- The model generates translations for each text field
- The system saves the translated content back to Strapi using the appropriate locale
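A rough sketch of that loop, assuming Strapi v5's document-based REST API and Ollama's `/api/generate` endpoint; the content type, field names, model mapping, and prompt wording are placeholders, and the real app may structure this differently:

```python
# Sketch of the fetch -> translate -> write-back loop described above.
# Content type, field names, model mapping, and prompt are illustrative assumptions.
import os
import requests

STRAPI = os.getenv("STRAPI_BASE_URL", "http://localhost:1337")
OLLAMA = os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")
HEADERS = {"Authorization": f"Bearer {os.getenv('STRAPI_API_TOKEN', '')}"}

def ollama_translate(model: str, text: str, target: str) -> str:
    """Ask a local Ollama model for a translation via its /api/generate endpoint."""
    resp = requests.post(f"{OLLAMA}/api/generate", json={
        "model": model,
        "prompt": f"Translate the following text into {target}. "
                  f"Return only the translation:\n\n{text}",
        "stream": False,
    }, timeout=300)
    resp.raise_for_status()
    return resp.json()["response"].strip()

# 1. Fetch the source-locale entries (Strapi v5 documents carry a documentId).
entries = requests.get(f"{STRAPI}/api/articles", params={"locale": "en"},
                       headers=HEADERS, timeout=30).json()["data"]

for entry in entries:
    for target, model in {"fr": "mistral", "de": "llama3"}.items():
        # 2. Translate each configured text field with the model chosen for this language.
        translated = {field: ollama_translate(model, entry[field], target)
                      for field in ("title", "body") if entry.get(field)}
        # 3. Write the result back as the target-locale version of the same document.
        requests.put(f"{STRAPI}/api/articles/{entry['documentId']}",
                     params={"locale": target}, json={"data": translated},
                     headers=HEADERS, timeout=30).raise_for_status()
```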
Licensed under the MIT License.
- Strapi: https://strapi.io/
- Ollama: https://ollama.ai/