
[FEATURE] Improve notifications by context checking

Open eddiejaoude opened this issue 2 years ago • 0 comments

Description

The toxicity model detects whether text contains toxic content such as threatening language, insults, obscenities, identity-based hate, or sexually explicit language.

https://github.com/tensorflow/tfjs-models/blob/master/toxicity/README.md
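A minimal sketch (not part of the issue) of how the linked toxicity model could be used to check a message before a notification is sent. It assumes the `@tensorflow-models/toxicity` package with a Node backend, and the `isToxic` helper name is hypothetical:

```ts
// Assumes a TensorFlow.js backend for Node (@tensorflow/tfjs-node)
// plus the @tensorflow-models/toxicity package are installed.
import '@tensorflow/tfjs-node';
import * as toxicity from '@tensorflow-models/toxicity';

// Minimum prediction confidence; below this, a label's `match` is null.
const THRESHOLD = 0.9;

// Hypothetical helper: true if any toxicity label matches the text.
// Passing [] as the second argument checks against all labels.
async function isToxic(text: string): Promise<boolean> {
  const model = await toxicity.load(THRESHOLD, []);
  const predictions = await model.classify([text]);
  return predictions.some((p) => p.results.some((r) => r.match === true));
}

// Example: only send the notification when the message is not flagged.
isToxic('you are awesome').then((toxic) => {
  if (!toxic) {
    console.log('Message looks fine, send the notification.');
  } else {
    console.log('Toxic content detected, suppress or flag the notification.');
  }
});
```

In a real bot the model would likely be loaded once at startup and reused across messages rather than reloaded per call, since loading is the expensive step.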

Screenshots

(Screenshot attached: 2022-10-27 at 23:45:35)

Additional information

No response
