
TikTok has a small army of low-paid 'mileuristas' in Barcelona reviewing its "abusive, violent and inappropriate" content

May 26, 2021

TikTok has landed in Barcelona. But it has not done so only with the small (albeit growing) team of employees attached to its London office: it has also outsourced the platform's content moderation work.

It now has a small army of moderators in Barcelona, subcontracted through 'contact center' companies. The newspaper La Información has counted, using LinkedIn, more than one hundred such jobs, although TikTok has not released that figure.

A 4,000-square-meter office rented by Majorel (one of the main companies in the contact center sector) houses the Chinese network's moderators in a Barcelona district (22@) with a large presence of technology companies.

But many of these moderators are hired part-time, for salaries averaging 12,000 euros gross per year, rising to 20,000 in the case of full-time contracts.

In total, at most about 2,000 euros a month for carrying out work with a high psychological toll, as it frequently exposes workers to content unsuitable for sensitive viewers, in many cases related to illegal activities (pedophilia, terrorism, hate speech, etc.).


The job offers published online in recent months give an idea of the worker profile sought: professionals for whom no university degree is required, but who must have a high level of English plus a second language.

But the ads also give an idea of the demands of the job: "As a moderator, you will work examining and monitoring the sites for abusive, violent and inappropriate content". And that is not all:

"Looking at images, videos, articles and audio clips may seem like a fun and easy task, but applicants are urged to note that the content, in some cases, will contain strong and abominable images and is not recommended for people who are easily offended."

Between January and September of this year alone, TikTok removed more than 104 million 'inappropriate' videos, 90% of which were deleted before they could be seen by other users... and even so it was unable to prevent the video of a suicide from going viral a few weeks ago.

Last May, we learned that Facebook had agreed to pay its 11,000 content moderators 52 million US dollars as compensation for the psychological problems developed on the job, which in some cases had led to post-traumatic stress.


(Marco Verch Professional Photographer)


Moderating social networks: a labor problem

Social networks often rely on their algorithms to apply a first, automated content filter. But artificial intelligence still doesn't work miracles (even less so in a case like TikTok, based entirely on video), and human supervision remains necessary.

That's where content moderators come in. Initially they were hired directly by the platforms themselves, until demand for them began to skyrocket and outsourcing became the preferred option.


Four months ago, New York University published a report analyzing the content moderation systems of various networks and, among other aspects, how outsourcing this function to third-party providers such as Accenture, Genpact, Cognizant and Majorel itself had affected its operation.

The report cites statements by Sarah Roberts (UCLA professor and author of 'Behind the Screen: Content Moderation in the Shadows of Social Media') arguing that the strategy of betting on outsourcing responds to a search for "plausible deniability".

Two of the problems detected in the study were the following:

  • Content moderators do not receive proper medical advice and care, despite being exposed to "toxic" online content.

  • The work environment is not conducive to content moderators making the best decisions.

To change that, the report proposes that social platforms not only stop outsourcing moderators, but also commit to making them full-time employees, raising their salaries, doubling their numbers and providing on-site medical care to assess how dealing with the most disturbing content affects them.

Note: At Genbeta we tried, without success, to contact Majorel for more information about the working conditions of TikTok's moderators.

Via | La Información & TikTok & Medianama

Image | Marco Verch Professional Photographer (via Flickr)