See the live demo here.
This repo serves a pre-trained character-level RNN model that synthesizes text from a user prompt.
The model runs on CPU and can be deployed on a custom Flex instance on Google Cloud Platform (GCP). Model serving and the backend are built with Starlette, an async Python ASGI framework, served by the uvicorn ASGI server.
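For orientation, here is a minimal sketch of what such a Starlette app might look like; the route, module names, and the `generate_text` helper are placeholders for illustration, not the repo's actual code.

```python
# Minimal Starlette sketch (illustrative only; the repo's real routes,
# model loading, and RNN inference code will differ).
import uvicorn
from starlette.applications import Starlette
from starlette.responses import JSONResponse
from starlette.routing import Route


def generate_text(prompt: str) -> str:
    # Placeholder for the pre-trained character-level RNN inference call.
    return prompt + "..."


async def generate(request):
    # Read the user prompt from the query string and synthesize a continuation.
    prompt = request.query_params.get("prompt", "")
    return JSONResponse({"generated": generate_text(prompt)})


app = Starlette(routes=[Route("/generate", generate)])

if __name__ == "__main__":
    # The GCP Flex runtime routes traffic to port 8080 inside the container.
    uvicorn.run(app, host="0.0.0.0", port=8080)
```

With a server like this running, `curl "http://localhost:8080/generate?prompt=hello"` would return the synthesized text as JSON.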
To build and run locally with Docker:

- navigate to the app root directory
- activate the conda environment with Python 3
- build and run the Docker image (the command below assumes a Dockerfile at the app root; a sketch follows this list):
docker build -t rnnbot . && docker run --rm -it -p 8080:8080 rnnbot
- go to http://localhost:8080
(Note: the container must listen on port 8080, which is the port the GCP Flex runtime expects.)
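A hypothetical layout for the Dockerfile the build step expects is sketched below; the base image, file names, and entrypoint are assumptions and may not match the repo's actual Dockerfile.

```dockerfile
# Hypothetical Dockerfile sketch; the repo's actual Dockerfile may differ.
FROM python:3.8-slim

WORKDIR /app

# Install Python dependencies first so they are cached between builds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the app code and the pre-trained model weights.
COPY . .

# The GCP Flex runtime expects the container to listen on port 8080.
EXPOSE 8080
CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "8080"]
```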
To deploy on Google App Engine (Flex):

- navigate to the app root directory
- set the gcloud configuration and deploy the app (deployment assumes an app.yaml at the app root; a sketch follows these steps):
gcloud config set project YOUR-PROJECT-ID
gcloud app deploy
- once it's deployed, you can open the webapp by running this in the terminal:
gcloud app browse
or by visiting https://[YOUR-PROJECT-NAME].appspot.com
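For a custom Flex instance, the app.yaml that `gcloud app deploy` picks up would look roughly like the sketch below; the resource values are illustrative, not the repo's actual configuration.

```yaml
# Sketch of an app.yaml for a custom runtime on the Flexible environment;
# resource values are illustrative only.
runtime: custom
env: flex

# Optional: size the Flex VM (adjust to the model's CPU/memory needs).
resources:
  cpu: 1
  memory_gb: 2
```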