Google YouTube Moderator Litigation

YouTube reportedly receives more than a million reports of potentially objectionable content each day. The giant platform relies on content moderators to view and remove a range of offensive and horrifying videos from public view. These independent contractors are bombarded with shocking and uncensored depictions of child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder.  

As a result, content moderators are increasingly being diagnosed with symptoms of post-traumatic stress disorder (PTSD), including anxiety, depression, and insomnia. Google (YouTube’s parent company) and other internet companies voluntarily established industry standards for training, counseling, and supporting content moderators more than a decade ago. Unfortunately, YouTube does not follow the workplace safety guidelines its parent company helped create.

If you or someone you know was previously employed or currently works as a content moderator and has experienced these symptoms, we want to hear from you. We are pursuing class-action litigation to ensure that these workers have access to effective treatment and safer work environments, and we are determined to hold YouTube to the commitments it made. That includes establishing a medical monitoring fund to provide effective testing and care to content moderators with PTSD, mitigating the harm to current content moderators, and caring for the people who have already been traumatized.

Please note that in some cases a non-disclosure agreement may affect your ability to join this litigation; we can review those options with you. Keep in mind that legal actions are time-sensitive and subject to strict statutes of limitation. To explore your legal options, we invite you to schedule a complimentary, no-obligation case review with our team of experienced litigators, who were instrumental in reaching a groundbreaking settlement on these issues with Facebook. Please call 866-398-8842 or e-mail us to learn more about the case.