In general, I tend to be optimistic about the human condition, but on occasion I come across stories that shake that sense of positivity. The radio program The World had a segment on December 12th about the trauma suffered by content moderators tasked by Facebook with viewing videos on the site to see if they should be removed. Having to watch video after video of the most appalling things in rapid succession left many of them suffering psychologically, and Facebook did not seem to have sufficient resources in place to help them deal with it. Some of the moderators, who are contractors and not Facebook employees, are now suing Facebook. One of them, Chris Gray, who worked in its Dublin office, was interviewed on the program.
Gray started working as a content moderator in July 2017. He was one of the thousands hired to moderate flagged content on Facebook following a series of high-profile incidents. In April 2017, a Cleveland, Ohio man uploaded a video of himself gunning down an elderly stranger on the street. It stayed on Facebook for hours. Within days, a man in Thailand livestreamed the murder of his baby daughter on Facebook Live.
At first, Gray’s job was to keep pornography off the site. When a user or Facebook’s technology flagged a post that seemed to be in violation of Facebook’s “Community Standards,” it would go to Gray or someone else on his team, who would review the video, photo, or text and decide what to do with it — take it down, mark it with a warning, or leave it up.
“After a few months, I was moved to the high-priority queue, which is hate speech, graphic violence, bullying. Really all the nasty stuff you want to act on very quickly,” Gray said. “I really don’t like to talk in detail about [the things I was reviewing, but it included] executions. Terrorists beheading people. Ethnic cleansing in Myanmar. Bestiality. I mean, you name it. All the worst of humanity, really.”
On busy days, Gray would walk into work to find 800 of these posts waiting in his queue. On good days, it was closer to 200. He had to sift through quickly, but also carefully because Facebook was auditing the decisions he was making — and keeping a score. Gray was working 37.5 hours a week, making about $14 per hour.
There was much about this story that was depressing. That Facebook is an evil company goes without saying. That people have been committing the most atrocious acts of violence from time immemorial is also sadly familiar. What really gets to me is the conjunction of these two things: that this social media platform enables people to make these horrible videos available online, and that there seem to be enough people who want to watch them. For me, merely listening to this radio report clinically describe some of the videos was sickening enough. I cannot imagine watching them, let alone actively seeking them out. And yet, it seems that people do want to see such things and there are people willing to meet that need.
Psychologist John Suler identifies six factors of what he calls the Online Disinhibition Effect, which he says may explain why online behavior seems to be much worse than behavior in the physical world. He says that “people self-disclose or act out more frequently or intensely than they would in person” and that these online disinhibition effects are due to dissociative anonymity, invisibility, asynchronicity, solipsistic introjection, dissociative imagination, and minimization of authority.
It’s well known that people say and do things in cyberspace that they wouldn’t ordinarily say or do in the face-to-face world. They loosen up, feel more uninhibited, express themselves more openly. Researchers call this the “disinhibition effect.” It’s a double-edged sword. Sometimes people share very personal things about themselves. They reveal secret emotions, fears, wishes. Or they show unusual acts of kindness and generosity. We may call this benign disinhibition.
On the other hand, the disinhibition effect may not be so benign. Out spills rude language and harsh criticisms, anger, hatred, even threats. Or people explore the dark underworld of the internet, places of pornography and violence, places they would never visit in the real world. We might call this toxic disinhibition.
On the benign side, the disinhibition indicates an attempt to understand and explore oneself, to work through problems and find new ways of being. And sometimes, in toxic disinhibition, it is simply a blind catharsis, an acting out of unsavory needs and wishes without any personal growth at all.
In the January 2020 issue of Harper’s Magazine (p. 23-30), in an article titled Click Here to Kill, Brian Merchant suggests that this effect may play a role in the emergence of online murder markets on the dark web, where people can anonymously take out contracts on other people and pay with cryptocurrencies like bitcoin. Many of these sites are frauds that simply take the money and never carry out the killings, but their existence, and the fact that so many people actually send money to them, suggests that such a disinhibition factor is at play, since many of those people would not dream of seeking out killers for hire in the real world.
Suler says that we should not think of online behavior as revealing the ‘real’ person under the mask we wear in the everyday world; what is going on is more complicated. “Rather than thinking of disinhibition as the revealing of an underlying “true self,” we can conceptualize it as a shift to a constellation within self-structure, involving clusters of affect and cognition that differ from the in-person constellation.”
That last bit, that online behavior may not indicate people’s ‘true’ nature, gives a glimmer of hope for those of us who may despair at the behavior we see online.