Code for *Finetuning Pretrained Transformers into Variational Autoencoders*, presented at the Insights Workshop @ EMNLP 2021.
- Download all data (penn, snli, yahoo, yelp) from this repository.
- Change the data path in `base_models.py` accordingly.
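Before launching training, it can save a failed run to verify that the expected dataset folders actually exist under the configured path. A minimal sketch (the `DATA_DIR` name and `./data` layout are assumptions, not necessarily how `base_models.py` names them):

```python
from pathlib import Path

# Hypothetical data root; match this to the path you set in base_models.py.
DATA_DIR = Path("./data")

def check_datasets(data_dir, names=("penn", "snli", "yahoo", "yelp")):
    """Return the expected dataset directories that are missing under data_dir."""
    return [n for n in names if not (Path(data_dir) / n).exists()]

missing = check_datasets(DATA_DIR)
if missing:
    print("Missing datasets:", ", ".join(missing))
```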
- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```
- Run phase 1 (encoder-only training):

  ```bash
  ./run_encoder_training snli
  ```
- Run phase 2 (full training):

  ```bash
  ./run_training snli <path_to_checkpoint_from_phase_1>
  ```
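Phase 2 expects a checkpoint produced by phase 1. If several checkpoints have accumulated, a small helper can pick the newest one to pass along (a sketch; the `.ckpt` extension and checkpoint directory are assumptions, not necessarily what the training scripts write):

```python
from pathlib import Path

def latest_checkpoint(ckpt_dir, pattern="*.ckpt"):
    """Return the most recently modified checkpoint in ckpt_dir, or None if empty."""
    ckpts = sorted(Path(ckpt_dir).glob(pattern), key=lambda p: p.stat().st_mtime)
    return ckpts[-1] if ckpts else None
```

Usage: `./run_training snli "$(python -c 'from find_ckpt import latest_checkpoint; print(latest_checkpoint("checkpoints"))')"`, assuming the helper is saved as a hypothetical `find_ckpt.py`.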
- Run evaluation:

  ```bash
  python evaluate_all.py -d snli -bs 256 -c <path_to_config_file> -ckpt <path_to_checkpoint_file>
  ```