- Ensure you have Python 3.6/3.7, Node.js, and npm installed.
Make sure you are in the backend folder:
cd backend/
Create and activate a virtual environment:
# If using venv
python3 -m venv venv
. venv/bin/activate
# If using conda
conda create -n write-with-gpt2 python=3.7
conda activate write-with-gpt2
# On Windows, I use Conda to install PyTorch separately
conda install pytorch cpuonly -c pytorch
# Once the environment is activated
pip install -r requirements.txt
python app.py
To run in hot module reloading mode:
uvicorn app:app --host 0.0.0.0 --reload
Runs on http://localhost:8000. You can browse the interactive API docs at http://localhost:8000/docs.
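Once the server is up, you can also call it programmatically. Below is a minimal sketch using only the standard library; the `/autocomplete` path and the `{"text": ...}` payload are assumptions for illustration, so check the interactive docs for the actual route and schema:

```python
import json
import urllib.request

API_URL = "http://localhost:8000"

def build_generation_request(prompt, url=API_URL):
    """Build a POST request for the text-generation endpoint.

    The endpoint path and payload keys below are assumptions;
    see http://localhost:8000/docs for the real schema.
    """
    payload = json.dumps({"text": prompt}).encode("utf-8")
    return urllib.request.Request(
        url + "/autocomplete",  # hypothetical endpoint path
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# req = build_generation_request("Once upon a time")
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```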
Configuration is done via environment variables or a `.env` file. Available options are:
- `MODEL_NAME`:
  - to use a custom model, point it to the location of the `pytorch_model.bin` file. You will also need to pass `config.json` through `CONFIG_FILE`.
  - otherwise, a model name from Hugging Face's repository of models; defaults to `distilgpt2`.
- `CONFIG_FILE`: path to the JSON file describing the model architecture.
- `USE_GPU`: set to `True` to generate text on the GPU.
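For example, a `.env` file pointing at a converted custom model might look like this (the paths below are placeholders matching the conversion output described further down):

```shell
# Either a model name from Hugging Face's repository...
# MODEL_NAME=distilgpt2
# ...or the path to a custom model, plus its config:
MODEL_NAME=pytorch/pytorch_model.bin
CONFIG_FILE=pytorch/config.json
USE_GPU=False
```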
To convert a gpt-2-simple model to PyTorch, see Importing from gpt-2-simple:
transformers-cli convert --model_type gpt2 --tf_checkpoint checkpoint/run1 --pytorch_dump_output pytorch --config checkpoint/run1/hparams.json
This will put a `pytorch_model.bin` and a `config.json` in the `pytorch` folder, which are the paths you'll need to set in the `.env` file to load the model.
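A quick way to sanity-check the conversion is to verify that both files landed where expected. A small helper sketch (the default folder name matches the `--pytorch_dump_output` argument above; the function itself is hypothetical, not part of the project):

```python
from pathlib import Path

REQUIRED = ("pytorch_model.bin", "config.json")

def missing_model_files(folder="pytorch"):
    """Return the list of expected model files missing from `folder`."""
    root = Path(folder)
    return [name for name in REQUIRED if not (root / name).is_file()]

# missing = missing_model_files("pytorch")
# if missing:
#     print("Conversion incomplete, missing:", missing)
```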
Make sure you are in the frontend folder, and ensure the backend API is running.
cd frontend/
npm install # Install npm dependencies
npm run start # Start Webpack dev server
The web app is now available at http://localhost:3000.
To create a production build:
npm run build
serve -s build # not working: redirection to the backend API is not set up
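During development, the dev server can forward unknown requests to the backend. If the frontend is a create-react-app project (an assumption based on the `npm run start`/`npm run build` scripts), adding a `proxy` field to `frontend/package.json` forwards API calls to the FastAPI server:

```json
{
  "proxy": "http://localhost:8000"
}
```

Note that this only applies to the dev server; a production `serve -s build` would still need a reverse proxy in front of both the static files and the API.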
Miniconda/Anaconda is recommended on Windows.
Conda command: `conda install pytorch cudatoolkit=10.2 -c pytorch`.
If you install manually, you can check the currently installed CUDA toolkit version with `nvcc --version`. Once the CUDA toolkit is installed, you can verify the driver and GPU status by running `nvidia-smi`.
Beware: after installing CUDA, it seems you shouldn't update the GPU driver through GeForce Experience, or you may have to reinstall the CUDA toolkit.
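If you want to check programmatically whether the CUDA toolchain is on the PATH before enabling `USE_GPU`, here is a small standard-library sketch (the helper function is hypothetical, for illustration only):

```python
import shutil
import subprocess

def cuda_toolkit_on_path():
    """Return the `nvcc --version` output if the CUDA compiler is found, else None."""
    nvcc = shutil.which("nvcc")
    if nvcc is None:
        return None
    return subprocess.run(
        [nvcc, "--version"], capture_output=True, text=True
    ).stdout

# print(cuda_toolkit_on_path() or "CUDA toolkit not found on PATH")
```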
- Write With Transformer
- React-Quill-Demo
- How To Create a React + Flask Project
- How to Deploy a React + Flask Project
- Interactive Playground - Autosave
- Mentions implementation
- Cloning Medium with Parchment
- gpt-2-cloud-run
- How To Make Custom AI-Generated Text With GPT-2
- How to generate text without finetune?
- aitextgen
- Setting up your PC/Workstation for Deep Learning: Tensorflow and PyTorch — Windows
- CUDA Installation Guide for Microsoft Windows