The trolls feed on your contempt

We had some discussion here a few years ago about implementing some scoring method for comments — there were some proponents who thought it would be a useful way to get community input. I’ve always been dead-set against it. It turns out I have scholarly justification now.

Abi Sutherland discusses a psychology paper at Making Light, which examined the effect of up- and down-voting on large user communities at CNN, IGN, Breitbart (oops, there’s a dollop of poison in the database), and allkpop, a Korean entertainment site. Cheng, Danescu-Niculescu-Mizil, and Leskovec proposed to test a prediction of the operant conditioning model, that peer feedback would lead to a gradual improvement in the quality of posts. That’s not what they saw.

By applying our methodology to four large online news communities for which we have complete article commenting and comment voting data (about 140 million votes on 42 million comments), we discover that community feedback does not appear to drive the behavior of users in a direction that is beneficial to the community, as predicted by the operant conditioning framework. Instead, we find that community feedback is likely to perpetuate undesired behavior. In particular, punished authors actually write worse in subsequent posts, while rewarded authors do not improve significantly.

It’s a kind of backlash — downvote a commenter, and they don’t see it as a suggestion to change for the better, but instead see it as a challenge: those other assholes need to change to agree with me, so I’m going to rage even more at them. We don’t do downvoting here, but I certainly do weigh in with the landslide election of the banhammer, and you would not believe how much furious fulminating I get in email and in the spam queue over that. Well, maybe you would — you see lots of those jokers then charging out to other sites to complain about the vast injustice done to them.

From the authors’ conclusion:

In contrast to previous work, we analyze effects of feedback at the user level, and validate our results on four large, diverse comment-based news communities. We find that negative feedback leads to significant changes in the author’s behavior, which are much more salient than the effects of positive feedback. These effects are detrimental to the community: authors of negatively evaluated content are encouraged to post more, and their future posts are also of lower quality. Moreover, these punished authors are more likely to later evaluate their fellow users negatively, percolating these undesired effects through the community.

Sutherland has also experienced this phenomenon.

More than once, I’ve watched groups of people gather on particular LiveJournals, blogs, and chatrooms to spin up their energy and hone their arguments, then go back to the “main” venues to continue the discussion. These side-channels act as adjuncts to the visible conversation, where people not actively participating can research claims, suggest arguments, and feed support and affirmation to those who are.

This is not, in itself, a good thing or a bad thing; it’s just how conversations work on the internet at the moment. I’ve participated in it, both unconsciously and knowingly, trying to move the “group mind” in the directions that I find best and most ethical.

But when you apply the study conclusions to the internet as a whole, you get exactly what we’re seeing now: communities like Reddit and 4chan are criticized (negative feedback), and begin to see themselves as persecuted. Their worst sides gain strength. The volume of negative output increases, and the gleeful nastiness drives out thoughtful, balanced conversation, even within the communities themselves.

This psychology stuff sure seems kind of useful, doesn’t it?


  1. Rob R says

    This doesn’t sound too surprising. When presented with a vast amount of evidence against them, the typical response isn’t, “Oh, wow, I guess I was wrong”, it’s “Waaah, everyone’s conspiring against me!” followed by increased drive to fight back.

  2. mx89 says

    You could have a sort of downvote system where the number of downvotes doesn’t show up anywhere until it reaches a certain threshold, and the post is then hidden. It would probably save a bit of time for moderation purposes.
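    A hide-after-threshold scheme like that is only a few lines of logic. A minimal sketch (the function names and the cutoff value are invented for illustration, not from any real system):

```python
# Hypothetical hide-after-threshold moderation: downvote counts are
# never displayed anywhere; a comment simply collapses once it has
# been flagged enough times.
HIDE_THRESHOLD = 5  # invented cutoff for illustration

def render_comment(text, downvotes):
    """Return what ordinary readers see."""
    if downvotes >= HIDE_THRESHOLD:
        return "[comment hidden by community flagging]"
    return text

print(render_comment("First!", 3))  # below threshold: shown as-is
print(render_comment("First!", 7))  # over threshold: collapsed
```

The weakness, of course, is that nothing in the logic itself distinguishes honest flags from an organized spam campaign.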

    Of course, since FTB and this blog in particular is targeted by masses of drooling reactionaries, they’d probably just spam it and hide everyone’s posts, so maybe that’s not a good idea.

    Hubski has it so that you can only upvote people, and a limited number of times at that. I find that works reasonably well.

  3. Donnie says

    PZ Myers just admits [evolutionary] psychology is beneficial!

    *** ducks and runs away quickly! ***

  4. blf says

    Has any attempt been made to test the hypothesis that not providing any feedback somehow alters the situation? No feedback would mean no comment voting/rating, no replies (not even oblique ones), no(?) banning, nothing, for/about trolls’ comments.

  5. Snoof says

    mx89 @ 2

    Hubski has it so that you can only upvote people, and a limited number of times at that. I find that works reasonably well.

    Yeah, on reading through the paper, it seems they only investigated communities with both upvote and downvotes for comments. I wonder if upvote-only communities differ, especially in light of the fact that people who received no feedback are the most likely to leave.

  6. says

    Isn’t all of this true without a scoring system? Sure, a textual tongue-lashing takes more time to type, but communities often discourage positive “me, too” feedback, so it seems like the negative feedback would actually be emphasized.

  7. JScarry says

    In the tech community, Slashdot and StackOverflow have voting systems that seem to work well. Slashdot is a site that invites opinions and commentary on tech news. It has lots of trollish comments that I never see because they have been moderated away. I don’t know if the trolls get more trollish, but you very rarely see their comments. They’ve been around forever, so any attempts to game the system have probably been handled—4chan and reddit attract techy guys who would try to break the comment system just for the lulz.

    StackOverflow is more technically oriented and specifically bans opinions. On that site users are judged by the quality of their answers. The more people that like your answer, the higher your ranking. I don’t like getting negative feedback there, so I am careful about the accuracy of my answers and go back and edit the ones where I get dinged.

    I would be interested in seeing how either of these two scoring methods would work on FTB.

  8. anteprepro says

    Saying that it is just trolls oversimplifies things. If you happen to get swarm-downvoted by assholes in otherwise neutral territory, or even just somehow stumble into full-fledged asshole territory, you are probably gonna get angry and feel like you weren’t judged fairly. Your options are either to explain yourself calmly and hope to somehow get more appreciated, fume elsewhere and scuttle away, or fight back.

    A lot of the time you thrive on the War of the Thumbs. You know that you will get a massive amount of thumbs-down from one side but want to be insightful or clever enough to get even more approving thumbs from your side and some fencesitters. I’ve been that guy and I’ve seen it happen to others on a wide variety of websites that have comment ratings. Honestly, it is stressful. You essentially feel like you are trying to impress a fickle and arbitrary audience that will only involve itself by judging your comments with a random and uninformative click of a button.

    And you feel like you are somehow involving yourself in conversation by rating comments, approving of one and downvoting another, when really such judgments are nearly useless and easily taken as slacktivism. I kind of prefer not having it. If you have input, leave a comment. Don’t just click a “boo, don’t like” button.

  9. says

    mx89 #2

    Hubski has it so that you can only upvote people, and a limited number of times at that. I find that works reasonably well.

    My tuppence-worth: an upvote says “I agree, but have nothing to add.” No different really, from a QFT. It lets the commenter know their comment is appreciated even if not replied to. A downvote, on the other hand, says “I disagree but can’t be bothered to tell you on what points, or why,” so it’s basically useless and uninformative.

  10. anteprepro says


    Yeah, on reading through the paper, it seems they only investigated communities with both upvote and downvotes for comments. I wonder if upvote-only communities differ, especially in light of the fact that people who received no feedback are the most likely to leave.

    Which is weird, because one of the biggest fucking social media sites in existence uses that voting system. So it is just plain odd that they didn’t think to or didn’t bother looking into that angle.

  11. mildlymagnificent says

    My tuppence-worth: an upvote says “I agree, but have nothing to add.” No different really, from a QFT. It lets the commenter know their comment is appreciated even if not replied to.

    That’s how it works on one site I use. It’s a good way to not clog up the comments with lots of “I agree” or “That’s clever” comments.

  12. says

    And RawStory is an ‘upvote only’ community…Plenty of trolls there…but I don’t mind it as much since one only ‘counts’ positives…Rewards for approval but no demerits for lack of approval…Plus the quality of the pwning of trolls can be downright inspiring…

  13. says

    Q: Does the phenomenon of trolling even EXIST on MRA sites? Are any of ‘us’ trolling there? And what would that even look like? It’s as if trolling can only be trolling if your position is morally reprehensible…IOW, is there an analog for the reverse?

  14. Cuttlefish says

    Ouch–the paper has a major theoretical flaw (I’ve just started reading it, so it may well be methodologically sound): the authors do not understand operant conditioning. They assume that upvotes are positive reinforcers, and downvotes are positive punishers. Reinforcers and punishers, though, are defined functionally–a downvote could very easily be a positive reinforcer, and indeed it looks like that is the case. (“The trolls feed on your contempt” is pretty much a way of saying exactly that.) So the “beneficial effects predicted by this theory” are actually predicted by a misunderstanding.

    I also see a cringeworthy conclusion: “Our findings here reveal that negative feedback does not lead to a decrease of undesired user behavior, but rather attenuates it.” First, “negative feedback” appears to be the wrong phrase–“downvotes” is accurate, but may not actually be negative feedback. Second… come on. “Attenuates” is a synonym for “leads to a decrease”. I would not accept this in an undergrad paper.

  15. frog says

    I wonder how much it depends on the venue? Over at [a well known fashion blog I won’t name because I don’t want to send assholes over to it], people will downvote posts that contain body- or slut-shaming, and there are usually at least a few replies to the offending comment explaining where the poster went wrong. (Note: Body-shaming of men also gets community disapproval. Which is how it should be.)

    This seems to work exactly as people assume it should: there are few such posts. The commenting system has threaded replies that tend to shove most of the comments off the first page very quickly, so that may also curtail the problem–no point trolling people who’ll never see it. It is also helped by having active moderation.

  16. jaybee says

    I’ve been on reddit for about 18 months, mostly programming and technology subreddits. I don’t “work” to gain karma, but I do notice with bafflement what comments garner no karma and which get tons. And if the feedback I’m getting is erratic, it isn’t going to condition me one way or the other in how I reply.

  17. anbheal says

    @10Daz — I think upvotes often indicate not simply agreement, but “hey, nicely put, that was funny!” Or insightful, or based on some good quick side research into relevant stats, or that the linked video was hysterical, or that they brought up a very salient point that hadn’t been mentioned before. But often just clever phrasing. For example, here at FTB, I would love to be able to either upvote or reply-to some of the funniest commenters (modusoperandi always comes to mind), as opposed to my “you win the thread!” comment showing up 80 comments below. Ya know, just a little applause from the peanut gallery. I actually almost feel badly for modusoperandi sometimes, because some of his/her wry two-line gotchas are so perfect, just sublime in their economic wit, that it seems a shame s/he’s left there hanging.

  18. says

    Yahoo’s commenting system automatically hides comments that receive enough downvotes. You only see them if you choose to see them. Given that downvoting is done just as much by organized trolls or those who can’t handle disagreement (e.g. positive comments about news stories of Michael Sam being drafted got HUGE numbers of downvotes), it’s not really an effective system.

  19. speed0spank says

    Hmm… reminds me of every time I leave a kind/positive comment on one of Rebecca Watson’s YouTube vids. I’m sure to get a lot of feedback from assholes who are still hung up on elevators. They never change my mind because I only glance at them when my email notifies me that they responded to me. Well, that and they are really, really stupid.
    Makes me feel like a troll…a kindness troll?

  20. Al Dente says

    I used to post at the MSNBC news blog. I became used to having my liberal comments downvoted by conservatives and libertarians. Few of them would try to refute my arguments, but getting 20 or 30 downvotes would be discouraging. If you disagree with me then tell me why and how I’m wrong; don’t just click on a down-arrow, which doesn’t say anything but “I disagree.”

  21. lutzifer says

    At one of the Chaos Computer Club conferences a while back, somebody showed another way to deal with trolls. Using a heuristic for how likely a comment is a troll comment, the website changed the probability that the user would be able to solve the captcha. So, basically: if troll, raise the cost of actually posting (depending on the content, some had to try several times in a row to get a comment through). It was only semi-effective, but I really love the idea :D
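    The mechanism described is just: estimate troll-likelihood, then lower the captcha’s pass probability accordingly. A toy sketch of that idea (the heuristic and all the numbers are invented for illustration; the actual conference demo was surely cleverer):

```python
import random

def troll_score(comment):
    """Toy stand-in for the demo's heuristic: shouting and
    exclamation marks push the score toward 1.0."""
    score = 0.0
    if comment.isupper():              # ALL CAPS ranting
        score += 0.5
    score += 0.1 * comment.count("!")  # excess punctuation
    return min(score, 1.0)

def captcha_pass_probability(comment):
    """The more trollish the comment, the less likely the captcha
    'accepts' the answer, forcing extra attempts."""
    return 1.0 - 0.9 * troll_score(comment)

def attempts_to_post(comment, rng=random.random):
    """Simulate posting: each failed captcha costs another try."""
    attempts = 1
    while rng() >= captcha_pass_probability(comment):
        attempts += 1
    return attempts
```

A benign comment passes on the first try; a shouty one may need several, which is the whole point: the cost of posting scales with how trollish the post looks.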

  22. markbrown says

    From Just For Fun: The Grand List of Forum and Community Laws

    The Average Rating Sine-Curve:

    On a site with score voting, a user or work of high caliber will gradually rise in its average rating score until it rises high enough on the charts to draw attention to itself, at which point fans of other works or friends of other users will vote it back down so their preferred entity has the better score. Once it gets knocked back down, said fans will forget about it, allowing its rating to gradually climb back up until the cycle repeats.
    Corollary: Due to this effect, nothing can stay on an all-time top-rated chart permanently, as being on the chart will attract people who will vote it down to make room for their own preferred entities.
    The Up And Away Corollary: If there is only an upvote button and not a downvote button, works will attract more upvotes by simply being on the top-rated chart until they gain an insurmountable lead, even when it is blatantly obvious to an unbiased observer that the work is outdated.
    The deviantART Corollary: When a work is promoted to a spotlight position, the comments section will fill up with complaints about why this work was chosen instead of a certain other work of higher quality. If there is even a grain of truth in this statement, the conversation is likely to degrade into vicious attacks on the work until the creator deletes it or quits the site.

    Rule of Ratings:

    Any time score voting is presented on a site, most voters will only give out the maximum or minimum possible scores, depending on whether they liked or hated whatever is being voted on. Any scores between the two might as well not exist.
    Corollary: If a poster cites a composite rating from a site such as Metacritic or Rotten Tomatoes in support of his/her position, the poster in opposition will call into question the methods used by such sites.
    Corollary 2: Anyone who does give an intermediate rating may draw hostility from both sides for ‘not being able to form an opinion’, particularly if they give a rating exactly in the middle of the scale.
    Corollary 3 (The Front Page Corollary): Anything that gets featured on the front page will immediately get a ratings boost regardless of its actual quality, due to a flood of people who give the maximum rating to anything that doesn’t suck out loud (and some things that do).

  23. insertsymbolshere says

    In order for feedback to work as intended, the criticism has to be received in a constructive way. The community has to be built with, and keep, a positive tone, which is near impossible in a public forum. Negative criticism given in a constructive environment, I’d think, would be taken as intended and used to improve; people would either improve or be kicked out for bad behavior. Most of the big forums on the internet are each-for-his-own type places, though, so negative feedback just sets up a fight that everybody vies to win at any cost.

    The problem with large groups is that there’s simply no ability to monitor them well enough to end problems before they blow up the way you can with smaller groups. It’s not so much about keeping discordant people out as it is about creating a culture where people choose not to poke a touchy situation in the first place, and then will also shut down spats themselves. If your people egg on a fight, no amount of moderating will save your community.

  24. nomuse says

    What about progressive disemvowelling?

    Each strike or downvote, a vowel is removed from the troll’s posts. When they persist, on to other letters. It wouldn’t moderate their behavior any…but it would be funny.
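    For what it’s worth, the escalation is trivial to implement. A sketch, purely for amusement as nomuse says (the strike count is taken as given):

```python
VOWELS = set("aeiouAEIOU")

def progressive_disemvowel(text, strikes):
    """Remove up to `strikes` letters from a post: vowels go first,
    then the remaining letters, until nothing legible is left."""
    vowel_positions = [i for i, c in enumerate(text) if c in VOWELS]
    other_positions = [i for i, c in enumerate(text)
                       if c.isalpha() and c not in VOWELS]
    doomed = set((vowel_positions + other_positions)[:strikes])
    return "".join(c for i, c in enumerate(text) if i not in doomed)

# Four strikes eats the first four vowels of the troll's post:
print(progressive_disemvowel("You are all sheeple!", 4))
# prints "Y r all sheeple!"
```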

  25. lorn says

    Back in the early 80s there were cases where a completely unregulated forum spontaneously became a self-regulating ecosystem. The chances of this happening were higher when a large group of people posted regularly and no one person, clique, or viewpoint was allowed to dominate without people who cared about the forum making sure countervailing views were present and/or critiques of assumptions and dogma emerged.

    Nobody had any administrative power. The tools were limited but powerful: patience, careful attention to what people were trying to say, even if it was buried under a rant. An insightful question and a few kind words often had profound effects. Not a few drive-by disruptors ended up as constructive long-term contributors.

    On the other side, take-downs were also useful: carefully figuring out what a bullying poster valued and how they wanted to be seen, collecting cases where they violated their own standards, waiting for the right day, then unloading on them. Some fled forever. Most returned after a few days, chastened for a time. Generally, the bullies that stayed mellowed. They still had their point of view and pet peeves, but they were not so sensitive, not so vicious or hurtful; they developed something of a sense of humor about their own bias, and sometimes even showed grudging respect for their ideological opponents. Most people seemed to grow and learn.

    I haven’t seen that sort of community develop in a forum for a long time.

  26. Brony says

    Makes sense to me.

    Just getting an up or down vote is not very informative. It’s just a “like”/”dislike”. Without knowing why a comment was so disliked or liked a commenter can’t do very much with it. If someone got specific reasons for why their comment was not liked that might help.

    I wonder if they looked into whether anything specific prevented down-voted commenters from matching the down-votes to criticism they got in text form? People don’t always say why they don’t like something (or can’t say it very well), but I’m sure it did happen.

  27. ck says

    It might be amusing for a bit, but the trolls always look for a way around such countermeasures. It wouldn’t take long until ţĥɘ pȍșțș ƨťɑrț lȍȏǩȋɳɡ lȉƙȅ ťḧїṧ.

  28. sugarfrosted says

    Math StackExchange and MathOverflow are the only places where I’ve seen this work, but that’s not a free-for-all thing. On reddit it leads to people just patting each other on the back for agreeing with each other.

  29. mickll says

    You couldn’t see a better example of this than the so-called “Gamergate” online hate campaign against game designer Zoe Quinn.

    After literally begging media outlets to cover them, the “movement” finally got gaming websites and other pop-culture depositories like Cracked and Vice to notice them. Said websites then reported them to be exactly what they were: the hideous malformed offspring of Encyclopedia Dramatica and the 9/11 truther movement! A paranoid, whiny bunch of tossers whose stated motivation of only being in it for journalistic integrity was a laughably thin veneer for yet another “no gurlz allowdz” internet hate campaign.

    If you thought this negative feedback would slow them down you’d be dead wrong. Over at Cracked, commenters are still pouring into Zoe Quinn’s article about her experience, “5 Things I Learned as the Internet’s Most Hated Person,” to announce that it isn’t about misogyny and also that Zoe is a “cheating whore.”

    When other commenters tell them they are being ridiculous, they just come back and are ridiculous louder and harder, because now they’ve graduated from Being In On a Secret Truth to Being Oppressed by Da Man because people aren’t taking them seriously, obviously.

  30. David Marjanović says

    offspring of Encyclopedia Dramatica and the 9/11 truther movement!

    *bursts out giggling* Wow, that sounds awful!

  31. says

    …authors of negatively evaluated content are encouraged to post more, and their future posts are also of lower quality.

    Also known as The Dawkins Effect.

  32. says

    I’ve been on several fora that have utilized da buttons in different ways. It always comes down to a popularity contest, which is utterly useless if the supposed purpose of said fora is meaningful discussion.

  33. nonlinear feedback says

    In the tech community, Slashdot and StackOverflow have voting systems that seem to work well. Slashdot is a site that invites opinions and commentary on tech news. It has lots of trollish comments that I never see because they have been moderated away. I don’t know if the trolls get more trollish, but you very rarely see their comments. They’ve been around forever so any attempts to game the system have probably been handled (…)

    Oh wow, it’s amazing that you think either of these sites are examples of great moderation. They’re both cesspits which have driven off virtually all intelligent posting in precisely the way this study describes.

    This is not just because upvote/downvote systems have inherent problems, but also because neither site does anything at all to prevent setting up sockpuppet accounts. A common abuse of Slashdot’s system is to use a sock to write tons of sycophantic posts to farm “karma” for the sock account, which can then be expended on moderating down anybody the troll doesn’t like.

    Slashdot posters are aware enough of this problem that many reflexively dismiss newer accounts (Slashdot makes the account ID number visible, and has assigned them in order starting from #1, so a small number means the account’s really old). Low-UID accounts have been known to be sold off on eBay. That’s pretty fucked up, don’t you think? And I can’t count the number of times I’ve seen valid points dismissed just because they were made by a high UID, or worse yet (according to Slashdot groupthink) by an “anonymous coward” (Slashdot’s insulting term for anybody who posts anonymously, which is tremendously ironic because registered accounts are pseudonymous and so easy to create).

    You wouldn’t see very many anonymous coward posts because they start out at a score of 0, by the way.

    And sorry, you’re naive if you think Slashdot has lifted a finger to stop such gaming of the system. Same with HN. The very design of both sites reflects the naive nerdy optimism of their founders and primary audience, the conceit that surely all the world’s problems are easily solved by intelligent hackers writing a few lines of genius code. If you can assign a number to a thing and write a simple algorithm (in this case, “if score is less than X do not display this post”) you’re done, right? No need for messy human intervention in moderation! Crowd sourced wisdom!

    (See also: the entire corporate structure of Google.)

    The way to get upvoted on Slashdot is to agree with the mindset of the toxic posters who have come to dominate the site. The endgame of the Slashdot system isn’t a utopia of great, well-informed opinions and commentary, it’s a neverending spiral of fights between rival circlejerk camps which drives out the good posters. I suppose if you agree with one of the currently dominant mindsets on the type of stories you tend to click on, you might think it’s great, but honestly you need to broaden your horizons a little if that’s the case.

  34. nonlinear feedback says

    Gah. I misread StackOverflow for Hacker News in that comment. Please disregard any implication that SO is super bad, it has its problems but it’s not nearly as bad as Slashdot or HN.

  35. says

    There used to be a message board system that gave the mods/sysop the option of tagging someone as a Bad User or something like that.

    This bad user would start finding their posts were hanging, their logins were failing, pages would be taking forever to load, and so forth. This would discourage them from posting because “The forum is slow and buggy”. Of course the software was just emulating being slow and buggy. It worked fine for everyone else.

    I doubt this sort of complex code could be applied to your standard blog. But I find great joy in using Facebook’s “block user” button and would like it to be more widely available.

  36. says

    What are the Horde’s opinions on “hellbanning”– that is, having the comment system appear as if it is accepting and posting a troll’s comments but not actually displaying them to anyone else? The intent being to starve the trolls of the attention they crave while simultaneously wasting their time. It seems to me there’s a certain satisfaction to be had from a troll feverishly spewing invective that is disappearing into the bit-bucket as fast as they write it, but it doesn’t seem to be a popular option in the places that implement it (Hacker News, for example).
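    Mechanically, hellbanning is simple: the filtering happens at read time, not write time. A minimal in-memory toy (all names invented; a real forum would do this in its database layer):

```python
# Toy hellban: banned users' comments are stored normally, but
# filtered out of every view except their own.
shadow_banned = {"troll42"}
comments = []

def post(author, text):
    comments.append((author, text))

def visible_thread(viewer):
    """A hellbanned author still sees their own comments, so from
    their side nothing looks wrong; everyone else never sees them."""
    return [(a, t) for a, t in comments
            if a not in shadow_banned or a == viewer]

post("alice", "Nice article!")
post("troll42", "u all suck")
print(visible_thread("alice"))    # troll's comment absent
print(visible_thread("troll42"))  # troll sees both, suspects nothing
```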

  37. says

    This shows the uselessness of downvoting rather well. Comment spotted at, of all places, the Daily Fail:

    Cut the BS, Dorset, 17 hours ago

    How stupid can this be. Religion should be banned from ALL state funded schools, period !

    Cut the BS, Dorset, 15 hours ago

    To all your red flaggers, religion is a personal matter not a group indoctrination exercise and should only be discussed in the home or your place of worship. If you disagree, then debate it rather than just disagreeing !

  38. leftwingfox says

    This might explain why Disqus changed their voting system so that only up-votes are visible; down-votes are only used to calculate the sorting of “Best” comments.