Thank you for taking the time to report something


It’s social media day on Slate; they also have a piece about how crappy Facebook is at doing anything about bullying and harassment…except that they don’t put it that harshly, and they should.

A woman in Texas is suing them for doing nothing whatsoever about a report she sent them that someone was posting fake porn pictures of her. Yeah that sounds like Facebook. I suppose they sent her that form letter that says “we saw your report, your reports help us make Facebook safe and welcoming, we’re ignoring your report, we have no reason in fact we didn’t even look at it even though we just said we did, have a nice day.”

Facebook (which I have advised in connection with my work as a member of the Anti-Cyberhate Working Group) and other content providers should heed her lawsuit’s message. Ali’s claims express dissatisfaction with the enormous, unchecked power that digital gatekeepers wield. Her suit essentially says: Hey Facebook, I thought that you had a “no nudity” and “no harassment” policy. Other people reporting abuse got results, why not me? Why would you take down photos of women breastfeeding but not doctored photos portraying me as engaged in porn without my permission? Did my complaint get lost in a black hole or was it ignored for a reason?

Facebook could have alleviated a lot of Ali’s frustration by actually responding to her when she first made contact. 

Ya, and they don’t do that. I, along with a whole bunch of other people, reported a horrible page on Facebook a few days ago that was set up for no purpose in the world other than to mock and bully and harass Melody Hensley. We all got instant replies from the algorithm, saying what I just said they said.

We reviewed your report of Getting PTSD from tumblr posts without trigger warnings

Thank you for taking the time to report something that you feel may violate our Community Standards. Reports like yours are an important part of making Facebook a safe and welcoming environment. We reviewed the Page you reported for harassment and found it doesn’t violate our Community Standards.

That’s what they say. That’s what they always say. For one thing, it’s a lie – they did not review the page; they didn’t even see my report. The reply came instantly; it’s obviously automated. Why the hell do they say they reviewed the page when they didn’t? Having done that, why do they then insult us further by saying “Reports like yours are an important part of making Facebook a safe and welcoming environment”?

With great power comes great responsibility, and Facebook needs to improve its terms-of-service enforcement process by creating an official means of review that includes notifying users about the outcome of their complaints…. Facebook can also improve the enforcement process by ensuring that reports of certain abuse—like harassment, nude images, and bullying—get priority review over others, such as spam. When users are filing complaints, they should be prompted to provide information that would better help staff identify those requiring immediate attention.

All true, and none of that is what happens now, as you know if you’ve ever reported anything to Facebook. They just throw it out and send you an insulting pack of lies 2 seconds later.

Bottom line: Facebook needs to start explaining its decisions when users file complaints, no matter the result. Ali should have been told whether or not Facebook viewed what happened to her as a violation. She should have been told whether or not it would be taking the content down, or what the next step would be. And to ensure the fairness of the process, Facebook should not only notify users of decisions but also permit them to appeal. Of course, Facebook is not our government; it does not have to grant individuals any due process under the law. But it should have an appeals procedure anyway, because when people perceive a process to be fair, they are more inclined to accept its results.

It also needs to stop telling lies about reviewing the page when it didn’t review the page.

Comments

  1. quixote says

    Back in the Stone Ages when social media first started up, I thought it sounded interesting. But then I had a brief look at the Terms of Service, which was enough to tell me I had no rights. Zero. None. “Well,” I thought, “people will never stand for that. Congress will pass something so these companies have some legal obligations to their users. I can wait till that happens.”

    Still waiting. Meanwhile the lack of rights has gone right through the looking glass into a real rip-off.

    They’re making money off the victimization of women. It brings in eyeballs! Why on earth would they want that to stop?

  2. Katie Anderson says

    There was a page specifically created to claim Miri should be murdered and it took them about a week to take it down even though people started reporting it within minutes. I got the same form letter you quote saying it didn’t violate their standards…

  3. says

    Yup, I remember that, that’s another time I got their form letters. The most staggering one was the Ugandan page inciting murder of gay men complete with graphic pictures of murdered men. They did finally take it down but it took a LOT of reporting.

  4. Katie Anderson says

    And the countless times I’ve reported people using homophobic slurs while claiming that all LGBT people are mentally ill and therefore terrible people. That somehow doesn’t violate the policy against attacking people for sexual orientation or for mental health conditions.

  5. Foxcanine says

    I’m sure it won’t make much of a difference, from what I’ve read here, but where can I lend a voice? I rarely use Facebook so I’m not familiar with where to go for complaints.

  6. Katie Anderson says

    The only option I’ve seen is to leave “site feedback,” where, right on the form you fill out, they make sure you know they probably won’t bother responding. I haven’t tried too hard to track down better methods, though.

  7. doublereed says

    The only time I’ve seen them remove comments I’ve reported is if they contained a slur.

  8. doublereed says

    And by “contained a slur” I really just mean the n-word. Other slurs seem to go through.

    It’s really just baffling.

  9. Foxcanine says

    Done, and I sent the slate article to all my friends. Not that there’s a lot of them.

  10. Blanche Quizno says

    That’s why my husband and I will never be on Facebook.

    Has *nothing* to do with there being far more people I DON’T want to find me than people I’d like to catch up with at this point in my life. Nope O_O

  11. says

    What did you expect from a system created by a bunch of arrogant college kids, for the primary purpose of indiscriminately gathering up personal information and making it available to everyone, without waiting for anyone’s actual consent?

    Also, what did you expect from a system that the NSA tapped into because it was already doing their (illegal) work for them?

  12. Matt Penfold says

    One of these days a case is going to end up in court, and FB are not going to look good when they’re forced to admit that they ignore most reports of inappropriate content.

  13. John Morales says

    Matt @12, call me cynical, but I suspect it would be more costly for Facebook—with over a billion users—to actually have humans diligently vetting all reports than to possibly cop the odd fine.

    (They’re in the business to make money, not to be ethical)

  14. johnthedrunkard says

    Back to the ‘hiring employees’ problem again. As money-making engines, the social networks generate profit because they are almost entirely automated. Screening for a word like ‘nigger’ can be arranged with no difficulty. (Though how that would work with Hip/Hop sites is a mystery).

    To deal with issues like threats and harassment, actual human beings have to be involved, doing actual work and making actual decisions. And the corporate model of Interwebz Money Mill will crash in flames to avoid that step.

  15. says

    To deal with issues like threats and harassment, actual human beings have to be involved, doing actual work and making actual decisions. And the corporate model of Interwebz Money Mill will crash in flames to avoid that step.

    Or make that step affordable by selling out to whoever wants to buy them out and use them as propaganda organs.
