Facebook has rightly been charged with allowing its platform to be used to spread hate and even to foment violent ethnic conflicts in Myanmar, Sri Lanka, and other countries, most recently Ethiopia. The company employs moderators who decide, based on internal guidelines, whether offending posts should be removed and their posters penalized, but that process has been criticized as opaque and its results as erratic. In response to criticisms that this was inadequate, the company created an additional structure to adjudicate more controversial cases involving hate speech or speech that foments violence. This body, which has been called a ‘Facebook Supreme Court’ (FSC), consists of a wide array of people from around the world who review difficult cases to see if the company’s decisions were justified.
One of the first high-profile cases referred to the FSC was that of Donald Trump, who was suspended indefinitely after the January 6th insurrection. The FSC said that the company could not leave the suspension open-ended, and in response the company said that Trump was suspended for two years starting from the original date, after which the case would be reviewed to decide whether the suspension should be lifted, renewed, or made permanent, as Twitter’s ban was.
Mr Trump said the move was “an insult” to the millions who voted for him in last year’s presidential election.
Facebook’s move comes as the social media giant is also ending a policy shielding politicians from some content moderation rules.
It said it would no longer grant politicians immunity for deceptive or abusive content on the grounds that their comments are newsworthy.
In addition to Facebook, which has over two billion monthly users, Mr Trump has also been banned from Twitter, YouTube, Snapchat, Twitch and other social media platforms over the January riot.
So who makes up the FSC and how does it work? The program Radiolab had a good episode explaining how the body is being set up and what its scope will be. During the program, the reporter examined one case in depth: the conflict raging in Ethiopia, which shows how the same people who once used Facebook as a democratizing influence to overthrow a dictatorial regime later turned it into a hate-generating machine, using the platform to spread lies and hatred against a minority community and greatly inflame passions.
The show describes how Facebook ran trials of the FSC consisting of six global workshops on different continents, where invited experts weighed in on test cases. They found that content judged merely comedic in one region was considered remove-worthy in another because of the tensions that existed there, showing how hard it is going to be to arrive at a universal set of guidelines for what should be removed from the site.
While the entire program is good, I found the detailed explanation of the roots of the Ethiopian conflict, running from 20:30 to 31:40, particularly enlightening. It is a common story of how much harm can be wrought when a majority community that holds power gets it into its collective head that it is being persecuted at the hands of a minority.