TikTok moderators say they were trained on child sexual abuse content

A Forbes report raises questions about how TikTok's moderation team handles child sexual abuse material, alleging that it granted broad and insecure access to illegal photos and videos.

Employees of a third-party moderation company called Teleperformance, which works with TikTok among other companies, say it asked them to review a disturbing spreadsheet dubbed DRR, or Daily Required Reading, on TikTok moderation standards. The spreadsheet allegedly contained content that violated TikTok's guidelines, including "hundreds of images" of children who were nude or being abused. Employees say hundreds of people at TikTok and Teleperformance had access to the content from both inside and outside the office, opening the door to a broader leak.

Teleperformance denied to Forbes that it had shown employees sexually exploitative content, and TikTok said its training materials have "strict access controls and do not include visual examples of CSAM," although it did not confirm that all third-party vendors met that standard. "Content of this nature is abhorrent and has no place on or off our platform, and we aim to minimize moderators' exposure in line with industry best practices. TikTok's training materials have strict access controls and do not include visual examples of CSAM, and our specialized child safety team investigates and makes reports to NCMEC," TikTok spokesperson Jamie Favazza said in a statement to The Verge.

Employees tell a different story, and as Forbes lays out, it's a legally fraught one. Content moderators are routinely forced to deal with CSAM that is posted on many social media platforms. But child abuse imagery is illegal in the United States and must be handled carefully. Companies are supposed to report the content to the National Center for Missing and Exploited Children (NCMEC), then preserve it for 90 days while minimizing the number of people who see it.

The allegations here go well beyond that limit. They say Teleperformance showed employees graphic photos and videos as examples of what to tag on TikTok, while playing fast and loose with access to that content. One employee says she contacted the FBI to ask whether the practice constituted criminally spreading CSAM, although it's unclear whether an investigation was opened.

The full Forbes report is well worth reading. It describes a situation in which moderators were unable to keep up with TikTok's explosive growth and were told to watch crimes against children for reasons they felt didn't add up. Even by the complicated standards of online child safety debates, this is a strange situation, and if accurate, a horrifying one.

Updated August 6, 9:30 a.m. ET: Added statement from TikTok.
