Facebook moderation and PTSD

Facebook moderators have to deal with a large amount of nasty, extremely traumatic content; working as a Facebook moderator could be one of the worst jobs in the world.

Constantly reviewing the darkest reported content on the internet can bring on heavy depression and post-traumatic stress in a moderator.

The dark content on Facebook can include violence, murder, suicide, exploitation, and extreme pornographic material.

Moderators have access to mental health support, but many employees say it is not enough, and some continue to suffer from mental illness long after they leave the job.

Many employees show symptoms of trauma after only a short time working for Facebook; monitoring the dark content can also leave them emotionally numb from the constant exposure.

The non-disclosure agreement signed when starting the job makes it hard for employees to confide in friends and family, and speaking about the job itself can be legally difficult.

Workplace conditions are strict to uphold the non-disclosure agreement: individual users' accounts and details cannot be recorded or released to the public, and such details are usually handled through the legal justice system.

The most worrying part of the job is the risk of being attacked by people seeking revenge for removed posts and banned accounts.

The work environment itself can be very dark. Employees make dark jokes to help cope with the nastiness they see on a daily basis, when a single video alone can be traumatic. With only fellow employees to confide in, trauma-based relationships form between colleagues seeking a serotonin and dopamine high to counteract the low.


References:

BBC - Written article.
The Verge - Written article.
Vice - YouTube Video.
