docs: update docker image urls (#243)
* Change old docker image url to the one that is relevant to this repo

* Change old docker image url to the one that is relevant to this repo in docker_images.md

* Fixed mix-up in GPU and CPU versions
DelovoiDC authored Jan 7, 2025
1 parent 49fa9b9 commit d8c2224
Showing 2 changed files with 7 additions and 7 deletions.
README.md (1 addition, 1 deletion)
@@ -170,7 +170,7 @@ You can also try out Coqui TTS without installation with the docker image.
Simply run the following command and you will be able to run TTS:

```bash
-docker run --rm -it -p 5002:5002 --entrypoint /bin/bash ghcr.io/coqui-ai/tts-cpu
+docker run --rm -it -p 5002:5002 --entrypoint /bin/bash ghcr.io/idiap/coqui-tts-cpu
python3 TTS/server/server.py --list_models #To get the list of available models
python3 TTS/server/server.py --model_name tts_models/en/vctk/vits # To start a server
```
docs/source/docker_images.md (6 additions, 6 deletions)
@@ -7,11 +7,11 @@ You can use premade images built automatically from the latest TTS version.

### CPU version
```bash
-docker pull ghcr.io/coqui-ai/tts-cpu
+docker pull ghcr.io/idiap/coqui-tts-cpu
```
### GPU version
```bash
-docker pull ghcr.io/coqui-ai/tts
+docker pull ghcr.io/idiap/coqui-tts
```

## Building your own image
@@ -25,14 +25,14 @@ You can pass any tts argument after the image name.

### CPU version
```bash
-docker run --rm -v ~/tts-output:/root/tts-output ghcr.io/coqui-ai/tts-cpu --text "Hello." --out_path /root/tts-output/hello.wav
+docker run --rm -v ~/tts-output:/root/tts-output ghcr.io/idiap/coqui-tts-cpu --text "Hello." --out_path /root/tts-output/hello.wav
```
### GPU version
For the GPU version, you need to have the latest NVIDIA drivers installed.
With `nvidia-smi` you can check the supported CUDA version; it must be >= 11.8.

```bash
-docker run --rm --gpus all -v ~/tts-output:/root/tts-output ghcr.io/coqui-ai/tts --text "Hello." --out_path /root/tts-output/hello.wav --use_cuda
+docker run --rm --gpus all -v ~/tts-output:/root/tts-output ghcr.io/idiap/coqui-tts --text "Hello." --out_path /root/tts-output/hello.wav --use_cuda
```
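The GPU image requires a driver supporting CUDA >= 11.8, which you can read off `nvidia-smi`'s header line. A minimal sketch of that check (the sample header line and the `cuda_ok` helper are illustrative, not part of the TTS tooling):

```shell
#!/bin/sh
# Check whether an nvidia-smi header line reports a CUDA version
# new enough for the GPU image (>= 11.8).
cuda_ok() {
  # $1: a line containing "CUDA Version: X.Y"
  version=$(printf '%s' "$1" | grep -o 'CUDA Version: [0-9.]*' | awk '{print $3}')
  # sort -V orders version strings; if 11.8 sorts first (or ties),
  # the reported version is >= 11.8.
  [ "$(printf '%s\n' 11.8 "$version" | sort -V | head -n1)" = "11.8" ]
}

# Example header line as printed by nvidia-smi (illustrative values):
line="| NVIDIA-SMI 535.104.05   Driver Version: 535.104.05   CUDA Version: 12.2 |"
if cuda_ok "$line"; then
  echo "CUDA version OK"
fi
```

On a real machine you would pass `"$(nvidia-smi | grep 'CUDA Version')"` instead of the hard-coded sample line.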

## Start a server
@@ -41,14 +41,14 @@ Start the container and get a shell inside it.

### CPU version
```bash
-docker run --rm -it -p 5002:5002 --entrypoint /bin/bash ghcr.io/coqui-ai/tts-cpu
+docker run --rm -it -p 5002:5002 --entrypoint /bin/bash ghcr.io/idiap/coqui-tts-cpu
python3 TTS/server/server.py --list_models #To get the list of available models
python3 TTS/server/server.py --model_name tts_models/en/vctk/vits
```

### GPU version
```bash
-docker run --rm -it -p 5002:5002 --gpus all --entrypoint /bin/bash ghcr.io/coqui-ai/tts
+docker run --rm -it -p 5002:5002 --gpus all --entrypoint /bin/bash ghcr.io/idiap/coqui-tts
python3 TTS/server/server.py --list_models #To get the list of available models
python3 TTS/server/server.py --model_name tts_models/en/vctk/vits --use_cuda
```
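Once the server is running on port 5002, it can be queried over HTTP. A sketch of building the request URL (the `/api/tts` endpoint and `text` parameter are assumed from the demo server's interface, and the `tts_url` helper is hypothetical; verify against your TTS version):

```shell
#!/bin/sh
# Build a request URL for the demo server's /api/tts endpoint.
tts_url() {
  # $1: host:port, $2: text to synthesize
  # URL-encode spaces only, for brevity; real input may need full encoding.
  encoded=$(printf '%s' "$2" | sed 's/ /%20/g')
  printf 'http://%s/api/tts?text=%s' "$1" "$encoded"
}

url=$(tts_url localhost:5002 "Hello there")
echo "$url"  # http://localhost:5002/api/tts?text=Hello%20there
```

With a live server, the synthesized audio could then be fetched with something like `curl -o hello.wav "$url"`.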
