OkYongChoi/distil-nlp-models-training
Train and Use Distil NLP Models

With Distil models, you can train and check very quickly whether the models are learning well.

Dataset: "emotion" dataset from HuggingFace Hub.

  1. Fine-tuning DistilBERT for text classification
  • Classify texts into emotions with the trained model
  2. Fine-tuning DistilGPT2 for text generation
  • Generate texts with the trained model
