Facebook Moderator Litigation

Facebook reportedly receives more than a million reports of potentially objectionable content each day. The social media giant relies on content moderators to view and remove a range of offensive, horrifying images and videos from public view. These independent contractors are bombarded with shocking and uncensored depictions of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder.

As a result, content moderators are increasingly being diagnosed with symptoms of post-traumatic stress disorder (PTSD), including anxiety, depression and insomnia. Facebook and other internet service providers voluntarily established industry standards for training, counseling, and supporting content moderators more than a decade ago. Unfortunately, Facebook does not follow the workplace safety guidelines the company helped create.

If you or someone you know was previously employed or currently works as a content moderator and has experienced these symptoms, we want to hear from you.

We are pursuing class-action litigation to ensure that these workers have access to effective treatment and better work environments, and we are driven to hold Facebook to the commitments it made. That includes establishing a medical monitoring fund to provide effective testing and care to content moderators with PTSD, mitigating the harm to current content moderators, and taking care of the people who have already been traumatized.

Please note that in some cases a non-disclosure agreement may affect your ability to join this litigation, and we can review those options with you.

Please keep in mind that legal actions are time-sensitive and subject to strict statutes of limitation. To explore your legal options, we invite you to schedule a complimentary, no-obligation case review with our team of experienced litigators. Please call 866-398-8842 or complete the form on this page to speak to an attorney.