
TikTok moderators say they were trained with child sexual abuse content

A Forbes report raises questions about how TikTok’s moderation team handles child sexual abuse material, alleging that it granted broad, insecure access to illegal photos and videos.

Employees of a third-party moderation firm called Teleperformance, which works with TikTok among other companies, claim it asked them to review a disturbing spreadsheet dubbed DRR, or Daily Required Reading, on TikTok moderation standards. The spreadsheet allegedly contained content that violated TikTok’s guidelines, including “hundreds of images” of children who were nude or being abused. The employees say hundreds of people at TikTok and Teleperformance could access the content from both inside and outside the office, opening the door to a broader leak.

Teleperformance denied to Forbes that it showed employees sexually exploitative content, and TikTok said its training materials have “strict access controls and do not include visual examples of CSAM,” though it didn’t confirm that all third-party vendors met that standard.

The employees tell a different story, and as Forbes lays out, it’s a legally dicey one. Content moderators are routinely forced to deal with CSAM that’s posted on many social media platforms. But child abuse imagery is illegal in the US and must be handled carefully. Companies are supposed to report the content to the National Center for Missing and Exploited Children (NCMEC), then preserve it for 90 days while minimizing the number of people who see it.

The allegations here go far beyond that limit. They indicate that Teleperformance showed employees graphic photos and videos as examples of what to tag on TikTok, while playing fast and loose with access to that content. One employee says she contacted the FBI to ask whether the practice constituted criminally spreading CSAM, though it’s not clear whether a case was opened.

The full Forbes report is well worth a read, outlining a situation where moderators were unable to keep up with TikTok’s explosive growth and were told to watch crimes against children for reasons they felt didn’t add up. Even by the complicated standards of debates about child safety online, it’s a strange and, if accurate, horrifying situation.