Facebook Sued by Former Content Moderator Over ‘Debilitating PTSD’

Facebook likes to outsource its ugliest work, and weeding out the platform's gruesome and graphic content is no exception. On Friday, a former Facebook content moderator sued the social network, alleging that she suffers from psychological trauma as a result of the job.

Selena Scola was a content moderator contracted by Facebook from around June 2017 through March of this year. Scola filed a class-action lawsuit against the company, as first reported by Motherboard, claiming that it fails to provide the training, safety measures, and medical support necessary for contractors who view a flood of deeply troubling content every day. The complaint also states that Scola developed, and still suffers from, "debilitating PTSD" as a result of her time as a contractor for Facebook.

“Ms. Scola’s PTSD symptoms may be triggered when she touches a computer mouse, enters a cold building, watches violence on television, hears loud noises, or is startled,” the lawsuit states. “Her symptoms are also triggered when she recalls or describes graphic imagery she was exposed to as a content moderator.”

The class-action lawsuit was filed on behalf of all California residents who have worked as content moderators for Facebook in the last three years. That will likely cover a large number of people: Facebook has contracted thousands of moderators and plans to double its safety and security team, which includes contractors, to 20,000 by the end of this year.

The lawsuit states that Scola seeks a "Facebook-funded medical monitoring program" that would "include a trust fund to pay for medical monitoring and treatment" as needed for her and any contractors who join the class action, as well as injunctive relief and attorneys' fees. "Facebook needs to mitigate the harm to content moderators today and also take care of the people that have already been traumatized," Steve Williams of the Joseph Saveri Law Firm said in a press release.

This is hardly the first account of a content moderator developing severe mental health issues due to the horrendous nature of the job. As one of the largest platforms in the world, Facebook is inundated with an overwhelming amount of violating content, but many other social platforms are equally guilty of offloading the task of sifting through it to ill-supported contractors. Supporting the mental health of the people doing this dirty work is not asking a lot of companies worth billions.

We have reached out to Facebook for comment and will update this story if we receive a response.

[Motherboard]