
Building a model that can detect different types of toxicity, such as threats, obscenity, insults, and identity-based hate.
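The repository itself does not document its approach on this page, so the following is only a minimal sketch of what a multi-label toxicity classifier for this task can look like, assuming scikit-learn: TF-IDF features feeding one logistic-regression classifier per label. The six label names are the ones used by the Kaggle Toxic Comment Classification Challenge; the tiny training comments and the pipeline choices are illustrative assumptions, not the repository's actual model.

```python
# Hypothetical sketch (not this repo's code): one binary classifier per
# toxicity label, trained on TF-IDF features of the comment text.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline

# Label set from the Kaggle Toxic Comment Classification Challenge.
LABELS = ["toxic", "severe_toxic", "obscene", "threat", "insult", "identity_hate"]

# Invented toy comments, purely for illustration.
comments = [
    "you are a wonderful person",
    "i will hurt you, you idiot",
    "this is obscene filth, you scum",
    "go back where you came from, you vermin",
]
# One binary column per label; rows align with `comments`.
y = np.array([
    [0, 0, 0, 0, 0, 0],
    [1, 0, 0, 1, 1, 0],
    [1, 0, 1, 0, 1, 0],
    [1, 1, 0, 0, 0, 1],
])

model = make_pipeline(
    TfidfVectorizer(),
    OneVsRestClassifier(LogisticRegression(max_iter=1000)),
)
model.fit(comments, y)

# Per-label probabilities for a new comment: shape (1, 6), one score per label.
probs = model.predict_proba(["you idiot, i will hurt you"])
```

Because the labels are not mutually exclusive (a comment can be both a threat and an insult), the problem is multi-label rather than multi-class, which is why each label gets its own binary classifier here.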


bharatv007/Kaggle-Toxic-Comment

