Despite Teleperformance’s promise to exit the business after shareholder outrage last year, its employees are still reviewing TikTok’s most disturbing content, including videos of child sexual abuse.
The Paris-based contractor has about 500 people working for TikTok in Tunisia, according to people familiar with the matter, who asked not to be identified because the information is private. Some of those workers spend their days screening harmful videos posted to the social network, which the people said include depictions of violence, gore, and the sexual abuse of children and animals.
Teleperformance said in November that it would exit the “highly egregious part of the trust and safety business,” weeks after a report alleged that Colombian employees moderating TikTok content suffered occupational trauma from viewing harmful material. The report triggered a Colombian probe into the company’s labor practices and its biggest share-price drop in more than three decades.
The revelations highlight the challenges that social media companies and their content review teams face in protecting their users from extremely disturbing material. While AI tools can screen some of the content, they aren’t good enough to replace human judgment entirely. This means companies like TikTok, Meta Platforms Inc. and Alphabet Inc. still rely on teams of people, often low-paid contractors, to review and remove posts. Repeated exposure to extreme material has been linked to emotional and psychological distress.
At the time, Chief Executive Officer Daniel Julien said Teleperformance would continue to offer content moderation services, but its workers wouldn’t review the most extreme posts, such as child-abuse images. It would work with its clients to find “suitable alternatives for its current business in the field,” the company said in a statement. The announcement was praised by analysts who had grown concerned about possible ESG risks.
Teleperformance Chief Financial Officer Olivier Rigaudy said in an interview that the company’s position hadn’t changed since November and that it was honoring existing contractual commitments with clients. He declined to comment on when individual contracts end, but said they typically last two to three years.
Rigaudy said that the company was also working to define what “highly egregious” content means, depending on different cultures, laws and customers. The process is “extremely complicated” because it involves 40 clients, each with 30 or 40 contracts, he said.
TikTok did not respond to requests for comment.
Workers in Tunis are moderating content posted by users in the Middle East and North Africa under a contract that started around last summer, the people said. Tunisia has recently become a hub for TikTok moderation in the region, with the work split between Teleperformance and another subcontractor, Concentrix Corp. A representative for Concentrix declined to comment.
Some of the Tunis-based TikTok moderators review queues of videos in a restricted-access room. One of those queues can include highly egregious content, one of the people said. Videos are filtered into the different queues by an AI system, the person added, so that more highly trained staff review the most offensive material.
Employees can talk to on-site therapists, whose presence is required by TikTok, the people said. They work nine-hour shifts and earn around 900 to 1,200 dinars ($290 to $385) a month, depending on experience and bonuses for working at night.
The job includes planned breaks and is favored by some employees over talking to customers in Teleperformance’s more traditional call center business, one of the people said.