This Ansible role automates the installation and configuration of Docker, the NVIDIA Container Toolkit, the Ollama LLM server, Open-WebUI with CUDA support, and an Nginx reverse proxy. It includes GPU detection, Docker container management, and SSL configuration for the Nginx reverse proxy so the WebUIs are exposed securely.
When complete, you should be able to reach the Open-WebUI interface at https://IP.ADDRESS.HERE.
- Ubuntu (or a compatible distribution)
- NVIDIA GPU
- Ansible with the `community.docker` collection
To install a basic Ansible distribution with the `community.docker` collection from which to run the project:

```shell
cd ~
sudo apt install pipx
pipx install --include-deps ansible
pipx ensurepath   # ensure ansible is on your PATH, then re-open your shell
ansible-galaxy collection install community.docker
```
The Ansible `./inventory` file assumes the server is called "docker-server". If you want to keep using that alias, edit your Ansible control node's `/etc/hosts` file so the host is reachable, and copy your Ansible server's SSH public key to it.
Example:

- Edit `/etc/hosts`:

  ```shell
  sudo vi /etc/hosts
  ```

  ```
  # IP to Open-WebUI alias on the Ansible server
  192.168.x.x ai docker-server
  ```

- Generate an SSH key (accept the defaults at the prompts):

  ```shell
  cd ~
  ssh-keygen
  ```

- Copy your newly generated public key to docker-server:

  ```shell
  ssh-copy-id docker-server
  ```
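Once the key is in place, you can optionally confirm that Ansible can reach the host before running the playbook (this assumes the `inventory` file shipped with this repo):

```shell
# Should return "pong" if SSH access and the inventory alias are set up correctly
ansible -i inventory docker-server -m ping
```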
Popular models can be configured in `vars/main.yml`. Uncomment the models you want to run or add new ones:

```yaml
ollama_models:
  - ollama run llama3.1
  - ollama run dolphin-llama3
  # Add more models as needed
```
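Each entry is a command executed inside the Ollama container. Pulling a model by hand looks roughly like this (the container name `ollama` is an assumption based on the compose stack, not confirmed by the role):

```shell
# Illustrative: download a model into the running Ollama container
docker exec ollama ollama pull llama3.1
```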
To deploy the setup, run the following command:

```shell
ansible-playbook -i inventory playbook.yml
```
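If you want to preview what would change before deploying, Ansible's standard flags work here too; for example, a dry run with diff output:

```shell
ansible-playbook -i inventory playbook.yml --check --diff
```

Note that `--check` cannot fully simulate tasks that depend on earlier changes (e.g. starting containers that were never created).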
The role includes handlers to restart the Docker service and the Nginx container:

- `Restart_Docker`: Restarts the Docker service.
- `Restart Nginx container`: Restarts the Nginx container.
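As a sketch, handlers of this kind typically look like the following (the module choices and container name here are assumptions for illustration, not a copy of the role's actual `handlers/main.yml`):

```yaml
# handlers/main.yml (illustrative sketch)
- name: Restart_Docker
  ansible.builtin.service:
    name: docker
    state: restarted

- name: Restart Nginx container
  community.docker.docker_container:
    name: nginx
    state: started
    restart: true
```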
The main tasks performed by this role include:
- Adding Docker GPG key and repository.
- Installing required Ubuntu packages.
- Setting up NVIDIA GPU support if detected.
- Downloading and verifying Docker Compose.
- Ensuring required directories exist.
- Generating self-signed SSL certificates for Nginx.
- Copying Nginx and Docker Compose configuration files.
- Deploying the Docker Compose stack.
- Waiting for services to start.
- Installing LLMs into the Ollama container, if specified (no models are loaded by default).
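For reference, a self-signed certificate like the one the role generates for Nginx can be produced with a single `openssl` command (the file names and subject below are illustrative assumptions, not the paths the role uses):

```shell
# Illustrative only: generate a self-signed certificate/key pair for the proxy
openssl req -x509 -nodes -days 365 -newkey rsa:2048 \
  -keyout /tmp/nginx-selfsigned.key \
  -out /tmp/nginx-selfsigned.crt \
  -subj "/CN=docker-server"
```

Browsers will warn about self-signed certificates; for a trusted setup, swap in certificates from your own CA.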
This project is licensed under the MIT License.
Contributions are welcome! Please submit a pull request or open an issue to discuss any changes.
Special thanks to the contributors and the open-source community for their support.