Run images through an inappropriate content filter #11795
Labels
area: spamchecks
Detections or the process of testing posts. (The label has no space because of Hacktoberfest.)
status: confirmed
Confirmed as something that needs working on.
type: feature request
Shinies.
Given that we see inappropriate content (porn, CSAM, etc.) posted as images, it would be good to run those images through an inappropriate-content filter in order to detect such material. My expectation is that an external service would be used, but there may also be packages available that process the image locally. Substantial investigation into what's available, and what we might get free access to, would be needed. We should at least investigate whether there's a way for us to use the same service that SE is using.
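To make the external-service option concrete, here is a minimal sketch of what the integration point might look like. Everything specific in it is an assumption for illustration only: the endpoint URL, the response schema (`{"nsfw_score": float}`), and the threshold are all hypothetical, since no service has been chosen yet. The network call is injectable so the logic can be tested without hitting a real API.

```python
import json
import urllib.request

# Hypothetical endpoint and schema -- placeholders until a real
# moderation service is selected.
MODERATION_URL = "https://moderation.example/v1/score"
NSFW_THRESHOLD = 0.8

def is_image_inappropriate(image_bytes, fetch=None):
    """POST image bytes to an (assumed) moderation endpoint; flag high scores.

    `fetch` is injectable so the network call can be stubbed in tests.
    """
    if fetch is None:
        def fetch(url, data):
            req = urllib.request.Request(
                url,
                data=data,
                headers={"Content-Type": "application/octet-stream"},
            )
            with urllib.request.urlopen(req, timeout=10) as resp:
                return json.load(resp)
    result = fetch(MODERATION_URL, image_bytes)
    # Assumed response shape: {"nsfw_score": 0.0-1.0}
    return result.get("nsfw_score", 0.0) >= NSFW_THRESHOLD
```

Whatever service ends up being used, keeping the call behind a small wrapper like this would make it easy to swap providers (or a local package) later.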