I received a text from a friend in Sri Lanka who forwarded a link to a YouTube video and asked for my ‘professional opinion’ on whether it was credible, even though I am no professional when it comes to analyzing such things. Even without looking at it, I suspected that it was not credible because, like many people, my friend is pretty credulous about things that are passed around on Facebook and other social media, and he gets easily alarmed. His last query to me, a year ago, was about a supposed miracle of fish falling from the sky, which turned out to be an obviously doctored video. (He is also very religious.)
So what was this new thing that he wanted my opinion on? It was the hoax about the coronavirus being caused by the new 5G network.
New Agers, right-wingers, and QAnon conspiracy theorists think global elites are using 5G to spread the coronavirus pandemic.
The paranoia about 5G — the industry term for the fifth generation of wireless communications infrastructure — has risen for the last few years, but as the world battles the pandemic, a baseless hoax has spread that the technology that runs cellphones could secretly be causing the outbreak.
Then the next day, I got another text asking about another YouTube video in which someone named Dr. Shiva Ayyadurai told the host of an internet show called the Next News Network that the entire vaccination program is part of an evil global conspiracy and that the ‘deep state’ is hiding the truth about the coronavirus. Ayyadurai is a notorious self-promoter who makes all manner of outlandish claims that have repeatedly been shown to be baseless. Next News Network specializes in creating and promoting all manner of bizarre conspiracy theories. When you see the two together, you can be pretty sure that it is rubbish.
This article explains how the algorithms used by YouTube quickly drive up the views of such sites.
Company insiders tell me the algorithm is the single most important engine of YouTube’s growth. In one of the few public explanations of how the formula works – an academic paper that sketches the algorithm’s deep neural networks, crunching a vast pool of data about videos and the people who watch them – YouTube engineers describe it as one of the “largest scale and most sophisticated industrial recommendation systems in existence”.
Lately, it has also become one of the most controversial. The algorithm has been found to be promoting conspiracy theories about the Las Vegas mass shooting and incentivising, through recommendations, a thriving subculture that targets children with disturbing content such as cartoons in which the British children’s character Peppa Pig eats her father or drinks bleach.
Lewd and violent videos have been algorithmically served up to toddlers watching YouTube Kids, a dedicated app for children. One YouTube creator who was banned from making advertising revenues from his strange videos – which featured his children receiving flu shots, removing earwax, and crying over dead pets – told a reporter he had only been responding to the demands of Google’s algorithm. “That’s what got us out there and popular,” he said. “We learned to fuel it and do whatever it took to please the algorithm.”
[Former YouTube software engineer Guillaume] Chaslot explains that the algorithm never stays the same. It is constantly changing the weight it gives to different signals: the viewing patterns of a user, for example, or the length of time a video is watched before someone clicks away.
The engineers he worked with were responsible for continuously experimenting with new formulas that would increase advertising revenues by extending the amount of time people watched videos. “Watch time was the priority,” he recalls. “Everything else was considered a distraction.”
He was especially worried about the distortions that might result from a simplistic focus on showing people videos they found irresistible, creating filter bubbles, for example, that only show people content that reinforces their existing view of the world. Chaslot said none of his proposed fixes were taken up by his managers. “There are many ways YouTube can change its algorithms to suppress fake news and improve the quality and diversity of videos people see,” he says. “I tried to change YouTube from the inside but it didn’t work.”
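The mechanics Chaslot describes, a weighted mix of engagement signals tuned above all for watch time, can be sketched in a few lines of Python. Everything here is an illustrative assumption: the signal names, the weights, and the sample videos are invented for the sketch and bear no relation to YouTube’s actual system.

```python
# A hypothetical watch-time-driven ranker. The signals and weights are
# assumptions for illustration only, not YouTube's real formula.

def score(video, weights):
    """Weighted sum of a candidate video's engagement signals."""
    return sum(weights[signal] * video[signal] for signal in weights)

def recommend(candidates, weights, k=2):
    """Return the k highest-scoring candidates."""
    return sorted(candidates, key=lambda v: score(v, weights), reverse=True)[:k]

candidates = [
    {"title": "calm news recap",      "expected_watch_time": 120, "click_rate": 0.02},
    {"title": "shocking conspiracy!", "expected_watch_time": 480, "click_rate": 0.09},
    {"title": "cat video",            "expected_watch_time": 200, "click_rate": 0.05},
]

# "Watch time was the priority": weight it heavily relative to clicks.
weights = {"expected_watch_time": 1.0, "click_rate": 100.0}

top = recommend(candidates, weights)
```

The point of the sketch is that nothing in the objective cares whether a video is true; a sensational video that holds attention longer simply scores higher, which is the distortion Chaslot says he tried and failed to fix.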
Even if YouTube is pushing such messages, I don’t know what to make of people who are willing to believe that there is some link between communications technology and the spread of a biological virus. Is it because we talk of computer ‘viruses’ that they think the two kinds of virus are somehow physically similar, rather than understanding that ‘virus’ is merely a metaphor for the way malware spreads?
In one sense, I am glad that some of my friends and relatives ask me for my opinion so that I can try to prevent them from passing on nonsense to others. But at some point, you would hope that they would learn to instinctively recognize when something looks suspiciously like nonsense and investigate such things for themselves, because while the internet contains a lot of rubbish, it can also provide good information, provided you know where to look.