Ex-Facebook moderator sues over workplace trauma
A former Facebook content moderator has sued the social media giant for failing to protect workers who view disturbing content. Exposure to images of child abuse and gruesome killings caused PTSD, the former contractor said.
Internet giant Facebook is facing a potential class-action lawsuit in the United States for allegedly failing to protect moderators who have to view disturbing content including beheadings and sexual abuse.
Selena Scola, a former content moderator and contract employee, developed post-traumatic stress disorder after working at Facebook for nine months beginning in June 2017, her law firm, Burns Charest, said on Monday.
The lawsuit, filed in a California court on Friday, alleges that she and others were bombarded daily with thousands of "videos, images and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder."
"Facebook is ignoring its duty to provide a safe workplace and instead creating a revolving door of contractors who are irreparably traumatized by what they witnessed on the job," Korey Nelson, Scola's lawyer, said.
"Our client is asking Facebook to set up a medical monitoring fund to provide testing and care to content moderators with PTSD," lawyer Steve Williams added.
The firm is seeking class-action status for the lawsuit. Facebook said it was evaluating the claims and that it took supporting its employees extremely seriously.
In Germany, Facebook has outsourced content moderation to two companies: CCC (Competence Call Center) in Essen and Arvato in Berlin.
"I remember the first decapitation video — that's when I went out and cried a little," a 28-year-old employee remembered. "Now I've gotten used to it so it's not so bad anymore."
One team leader also said at the time that employees had to come forward themselves in order to receive psychological support. "As a team leader, I don't know whether someone needs care or not."
In a blog post from July, the social media giant said it had a growing team of 7,500 content moderators and employed four clinical psychologists across three regions to design "resiliency programs" for those who work with disturbing content.
It added that all content reviewers, whether full-time employees or contractors, had access to mental health resources.