Imani, 25, worked as a TikTok content moderator in Morocco. She was shocked by a video of a young man brutally torturing a cat. Two years later, that horrific footage still haunts her.
Imani worked for TikTok's Middle East and North Africa operations through a third-party company, Majorel. Her job was to moderate the most gruesome content on the platform, including suicides and child abuse.
In September 2020, when Majorel offered her the job at a meager wage of 2 USD per hour, she saw it as a godsend at a difficult point in the pandemic. She did not know the work would cause such psychological damage. Although she has since quit, she still suffers from its effects.
Imani is not alone. Nine other content moderators who have worked for Majorel described to Insider the serious psychological problems the job has caused them. All say that Majorel and TikTok impose a working environment of constant surveillance and near-impossible targets.
The spread of harmful content
“The scary thing about this job is that it affects you slowly, without you even realizing it. At first you think it’s not a big deal, but it turns out it affects you a lot,” said Wisam, a former content moderator who worked for Majorel.
Before TikTok, Wisam was a veteran Facebook content moderator, employed through the same outsourcing company. He said that as TikTok becomes more popular, the amount of harmful content on the platform is likely to increase rapidly.
Although TikTok uses artificial intelligence to moderate content, the technology is notoriously ineffective in languages other than English. As a result, human workers are still needed to review harmful videos on the platform.
Hiring moderators through third parties gives giants like Meta or ByteDance (TikTok’s parent company) a convenient way to disclaim liability when workers complain about working conditions or the psychological harm the work causes.
Majorel often encourages employees to find other jobs if they cannot cope with moderating harmful content. Many choose to stay anyway, because comparable job opportunities are hard to find.
“Impossible” targets
Samira, 23, joined Majorel in July 2020 to moderate TikTok Lives. At first she was expected to review 200 videos an hour while maintaining 95% accuracy. Three months later, her manager raised the quota to the point where she had only about 10 seconds to watch each video.
“The new target was almost impossible. They don’t see us as humans, but as robots,” Samira said.
Other former employees also said their targets were difficult to achieve. When they failed to meet them, they were reprimanded and lost a $50 cash bonus. Content moderators at Majorel also have less time off than their American counterparts, and former employees complained about how unreasonably shifts were divided.
“Sometimes I worked from noon until midnight, 12 hours straight. The manager often added or changed my shifts without notice. That’s the main reason I quit,” Imani said.
Since leaving Majorel nearly a year ago, Samira has been working on her psychological recovery. After quitting, she spent six months healing and then moved west to the coastal city of Agadir, where she now teaches English to middle and high school students and earns double what she made as a content moderator for TikTok.
Serious psychological problems
After months of watching harmful videos, Samira began seeing one of the company’s health counselors once a month. But each session was with a different person than the last, so the counseling did little to help.
Several other former employees also said the company’s health counselors could not help them deal with their psychological problems. “Monthly meetings rarely make you feel better about a job that exposes you to harmful content every day,” one said.
They also fear that the counselors will report back to the human resources department without their consent, which could hurt their future careers.
For its part, a Majorel spokesperson told Insider that the company covers the cost for content moderators to meet with psychologists outside the company. However, at least four former employees said they were never offered any such service.
Source: CafeF
Source: Vietnam Insider