Buzzfeed has fired its “viral politics editor”, Benny Johnson, for numerous (read: forty-one, so far) instances of plagiarism. Buzzfeed isn’t some bedroom-based conveyor belt of clickable content: it’s a major site, employing many people, producing original content and sometimes actual journalism.
Buzzfeed as a whole, however, exists in an ethical quandary over content creation.
Consider when it reproduced a photographer’s original picture without his permission; when it created an entire article based on another person’s idea, using unimpressive attribution to dismiss plagiarism concerns; when it used out-of-context Tweets, without permission, from non-racist satirists alongside racists; and when it entered murky waters (though it didn’t do too bad a job) by publishing a list of Tweeted replies about rape survivors.
Linking to some of these examples, Our Bad Media writes:
“…Buzzfeed “reporters” have either copied and pasted articles or just lifted individual tweets, photos, or other social media without paying a cent to those actually bringing in the pageviews. It doesn’t appear to be an issue of any concern at the top, either—despite recent reports of new “editorial standards,” editor-in-Chief Ben Smith recently told a tweeter who didn’t like their work being appropriated that they could always just “take it down.”
In 2012, Gawker’s Adrian Chen pointed out another editor doing a poor job of attribution (if attributing at all) for others’ content that very few would otherwise see.
“Consider the output of BuzzFeed senior editor Matt Stopera. Stopera’s one of BuzzFeed’s most popular editors; he makes regular appearances on Headline News and was the subject of a Businessweek profile, which lauded his ability to assemble massively viral lists at lightning speed. “It suggests somebody has cracked a code,” wrote Businessweek.
“A key part of that code is copying and pasting chunks of text into lists without attribution. For example Stopera’s “13 Things You Probably Didn’t Know About the Movie ‘Clueless’” is comprised almost solely of sentences copied from the IMDB trivia page for Clueless, with no sign that they are anything but his own words.”
Chen points out many examples of Stopera’s “code”.
Stopera is still at Buzzfeed; his last post was a day ago.
Back to Johnson: he was not a product of something gone wrong at Buzzfeed, merely the brightest example of what the content machine does.
Buzzfeed itself is the product of the poor ethics we’re all struggling with. Even something as simple as manual retweeting can be deserving of scorn, as Jeb Lund points out. Buzzfeed editors, after all, are not afraid of participating in digital public shaming, even though their target was a harmless nobody and their own followers number more than 100,000 (what could possibly go wrong?).
Buzzfeed is a problem, but the bigger problem is what it represents. We are, all of us, content creators to greater and lesser degrees; we’re all public figures. With social media accounts or columns on widely read websites, we should all be taking responsibility for what we’re showing, for how we’re showing it, and for how we go about responding to that same content. Most of us don’t have the benefit of editors or three layers of committees giving the go-ahead to publication; we RT and add “Ha!”, we look at reddit and laugh and put the best gifs up, we try to explain complicated political situations using Jurassic Park gifs.
Sure, Buzzfeed’s actual content, snaking around others’ work, seems pointless. But that, again, is what we all do: we want to just laugh at the pics, we all want to weep at the lovely list, we want to hate-link to idiot articles and leave it at that, we want to send our followers to bully naysayers and detractors.
But these habits might be wrong. Just because nobody is stopping us, just because we aren’t being fired because “creating content” isn’t our job, that doesn’t mean it’s right for us to continue these sorts of behaviours. Digital space isn’t a morally neutral area; social media isn’t a hub of amorality where people with no families, friends, or feelings exist. Things matter. Human interaction, wherever it happens, comes with moral responsibility for just what is being interacted with and how.
What I’m pointing to isn’t one problem, clearly, but a collection of symptoms of being unreflective about how we conduct ourselves online. (I don’t even need to talk about abusive behaviour, threats and trolls, do I? I’m addressing the average person, who wouldn’t send rape and death threats.) I think we can do better, if we start caring about the content we’re dealing with. We need not become paranoid, or so stunted by worries over whether we’re attributing correctly that we never post; it only means learning good habits, like using Donotlink.com, like trying to source images (or requesting the services of those who are bloody good at it).
These habits, however, require the very thing being discarded in the first place: time. People are quite ready to share stories or articles without even reading beyond the headline; we hit the RT button; we find pics and use them in our articles because we’ve got a deadline and article quotas to meet; and so on. Inactivity is death to websites – original content is essential to relevance, yet quality requires time. And time spent creating is time the content doesn’t yet exist, meaning relevance decreases and clicks go elsewhere.
This is not easy. This very blogpost is, in a way, part of the problem – though I’d like to defend it by adding that most of these points I’ve either written about extensively before or have been contemplating for a long time. What we should want is more reflection. We’ll fail, but at least we’ll know why; at least we’ll know what to look for. The problem isn’t the Benny Johnsons of the Internet; it’s your friend sharing and spreading nonsense; it’s us being unreflective about this new technology and new behaviour. We can all do better.