
# weaviate-toxic-comment-classifier

The dataset used for this example is available here: https://www.kaggle.com/datasets/akashsuper2000/toxic-comment-classification
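It can help to inspect the CSV before importing it. A minimal sketch, assuming the Kaggle archive contains a `train.csv` with the standard Jigsaw columns `comment_text` and a binary `toxic` label (column names are assumptions, not verified against this repo):

```python
# Sketch: quick look at the dataset before import.
# Assumes train.csv with columns "comment_text" and "toxic" (0/1).
import pandas as pd

df = pd.read_csv("train.csv")
print(df[["comment_text", "toxic"]].head())
print(df["toxic"].value_counts())
```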

1. Run `pip install -r requirements.txt` in the project directory to install all required dependencies.
2. Make sure the Weaviate Docker container is running.
3. Run `add_data.py` to create the schema and add the data (see the sketch after this list).
4. Run `script.py`, enter text in the input field, then click Classify to label the comment as Toxic or Non toxic (see the second sketch below).
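For orientation, here is a hedged sketch of what the schema-creation and import step (step 3) might look like with the Weaviate Python client (v3 API). The class name `Comment`, the property names, the subset size, and the port are illustrative assumptions, not taken from `add_data.py`:

```python
# Hypothetical sketch of add_data.py: create a schema class and batch-import rows.
import weaviate
import pandas as pd

client = weaviate.Client("http://localhost:8080")  # assumes the Docker setup exposes this port

# Class and property names are illustrative, not confirmed from the repo.
schema_class = {
    "class": "Comment",
    "properties": [
        {"name": "text", "dataType": ["text"]},
        {"name": "toxic", "dataType": ["boolean"]},
    ],
}
client.schema.create_class(schema_class)

df = pd.read_csv("train.csv").head(1000)  # small subset for a quick demo
with client.batch as batch:
    for _, row in df.iterrows():
        batch.add_data_object(
            {"text": row["comment_text"], "toxic": bool(row["toxic"])},
            "Comment",
        )
```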
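And a hypothetical sketch of the classification behind `script.py` (step 4), using a nearest-neighbour vote over the imported examples via `nearText`. This assumes a text2vec vectorizer module is enabled in the Docker setup and the schema above; the repo's actual approach may differ:

```python
# Hypothetical sketch: label new text by majority vote of its nearest neighbours.
import weaviate

client = weaviate.Client("http://localhost:8080")

def classify(comment: str) -> str:
    result = (
        client.query.get("Comment", ["toxic"])
        .with_near_text({"concepts": [comment]})
        .with_limit(5)
        .do()
    )
    neighbours = result["data"]["Get"]["Comment"]
    votes = sum(1 for n in neighbours if n["toxic"])
    return "Toxic" if votes > len(neighbours) / 2 else "Non toxic"

print(classify("you are a wonderful person"))
```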
Demo recording: simplescreenrecorder-2022-04-13_16.52.51.mp4