Microsoft Online Moderators Allege Viewing Explicit Images Gave Them PTSD

Two Microsoft online moderators have filed suit against the company, alleging they were forced to view child pornography and other offensive and disturbing images and videos to the point that they began to exhibit symptoms of Post-Traumatic Stress Disorder (PTSD). According to a McClatchy DC report, they say that when they asked for help, they received negative reviews of their performance.

The two, members of the Online Safety Team, allege that they were not permitted to transfer to another division for 18 months, were not adequately trained, were not provided the level of counseling they needed, and were denied workers' compensation for medical leave they took to reduce their PTSD-related symptoms. The compensation requests were denied because OSHA determined their conditions did not constitute an occupational disease.

When contacted by McClatchy, Microsoft responded to the allegations by saying, in part, “The health and safety of our employees who do this difficult work is a top priority. Microsoft works with the input of our employees, mental health professionals and the latest research on robust wellness and resilience programs to ensure those who handle this material have the resources and support they need, including an individual wellness plan.”  The company also pointed to programs such as the Compassion Fatigue Referral Project and its mandatory support sessions for members of the Online Safety Team and the Digital Crimes Unit.

As Microsoft also says in its response to the charges, "This work is difficult, but critically important to a safer and more trusted internet." Mzinga moderators know they may encounter offensive content at any time, and they are trained to respond appropriately. Here are a few of our best practices:

— Our moderators are warned in advance about the content they may encounter and are prepared to handle it

— Our moderators are trained to rapidly escalate offensive content to the proper law enforcement authorities

— As soon as a text, picture, or video is judged to cross the line, it is removed immediately, with no need to view it any further

— If moderators feel overwhelmed, they can trade shifts with another moderator, consult the lead moderator or the director about how to handle their feelings, or move to a different project.

Removing offensive content is part of keeping the internet safe. It is Mzinga's goal to keep our moderators safe as well.