afransen
Senior Member
A friend of mine worked at a company that made image detection software (mainly to flag images of nude people/genitalia for social media). Even with AI, a lot of human sweat goes into training these models and classifying the data used to train them. And that was just for adult content. I can only imagine what it would be like to work on detection for child sexual abuse material or other graphic abuse. I believe it when I hear that many content moderators develop PTSD. The sheer volume of content generated on social media makes it nearly impossible to moderate. We may be faced with the choice of sacrificing privacy (such as requiring a verified identity tied to any account allowed to share content) to stand a chance of dealing with this.