(Bloomberg) — Staff at Teleperformance are still reviewing TikTok’s most disturbing content, including child sexual abuse, despite the company’s pledge to exit the business following shareholder backlash last year.
The Paris-based contractor, which operates call centers and content moderation teams for some of the world’s largest companies, has about 500 employees working for TikTok in Tunisia, some of whom spend their days looking at harmful videos posted to the social network, according to people familiar with the matter, who asked not to be identified discussing non-public information. The videos include child sexual abuse, animal abuse, gore and violent imagery, they said.
Teleperformance said in November that it would exit the “highly egregious part of the trust and safety business,” weeks after a report alleged that Colombian employees moderating TikTok content suffered occupational trauma from viewing harmful material. The report prompted a probe into labor practices in Colombia and the biggest drop in the company’s share price in more than three decades.
The revelations highlight the challenges that social media companies and their content review teams face in protecting their users from extremely disturbing material. While AI tools can screen some of the content, they aren’t good enough to replace human judgment entirely. This means companies like TikTok, Meta Platforms Inc. and Alphabet Inc. still rely on teams of people, often low-paid contractors, to review and remove posts. Repeated exposure to extreme material has been linked to emotional and psychological distress.
At the time, Chief Executive Officer Daniel Julien said Teleperformance would continue to offer content moderation services, but its workers wouldn’t review the most extreme posts, such as child-abuse images. It would work with its clients to find “suitable alternatives for its current business in the field,” the company said in a statement. The announcement was praised by analysts who had grown concerned about possible ESG risks.
Teleperformance Chief Financial Officer Olivier Rigaudy said in an interview that the company’s position hadn’t changed since November and that it was honoring existing contractual commitments with clients. He declined to comment on when individual contracts end, but said they typically last two to three years.
Rigaudy said the company was also working to define what “highly egregious” content means, a definition that varies across cultures, laws and customers. The process is “extremely complicated” because it involves 40 clients, each with 30 or 40 contracts, he said.
TikTok did not respond to requests for comment.
Workers in Tunis are moderating content posted by users in the Middle East and North Africa under a contract that started around last summer, the people said. Tunisia has recently become a hub for TikTok moderation in the region, with the work split between Teleperformance and another subcontractor, Concentrix Corp. A representative for Concentrix declined to comment.
A portion of the Tunis-based TikTok moderators review queues of videos in a restricted-access room. One of the queues can include highly egregious content, one of the people said. Videos are filtered into the different queues by an AI system, the person added, so that more highly trained staff review the most offensive material.
Employees can talk to on-site therapists, whose presence is a requirement from TikTok, the people said. They work nine-hour shifts and earn around 900 to 1,200 dinars ($290 to $385) a month, depending on experience and bonuses for working at night, the people said.
The job includes planned breaks and is favored by some employees over talking to customers in Teleperformance’s more traditional call center business, one of the people said.
©2023 Bloomberg L.P.