The lunatics in Silicon Valley are in a panic about Artificial General Intelligence (AGI), but they can’t really explain why. They are just certain that the results would be so dire that they justify almost-as-dire responses to an existential threat. Just listen to Eliezer Yudkowsky…or better yet, don’t listen to him.
Consider a recent TIME magazine article by Eliezer Yudkowsky, a central figure within the TESCREAL movement who calls himself a “genius” and has built a cult-like following in the San Francisco Bay Area. Yudkowsky contends that we may be on the cusp of creating AGI, and that if we do this “under anything remotely like the current circumstances,” the “most likely result” will be “that literally everyone on Earth will die.” Since an all-out thermonuclear war probably won’t kill everyone on Earth (the science backs this up), he argues that countries should sign an international treaty sanctioning military strikes against countries that might be developing AGI, even at the risk of triggering a “full nuclear exchange.”
Well, first of all, it’s a sign of how far TIME magazine has declined that they’re giving space to a kook like Yudkowsky, who is most definitely not a genius (first clue: anyone who calls himself a genius isn’t one), and whose main claim to fame is that he’s the leader of the incestuous, babbling LessWrong cult. But look at that “logic”!
- AGI will kill everyone (Evidence not shown)
- All-out nuclear war will only kill almost everyone
- Therefore, we should trigger nuclear war to prevent AGI
Yudkowsky then doubled down on the stupidity.
Many people found these claims shocking. Three days after the article was published, someone asked Yudkowsky on social media: “How many people are allowed to die to prevent AGI?” His response was: “There should be enough survivors on Earth in close contact to form a viable reproductive population, with room to spare, and they should have a sustainable food supply. So long as that’s true, there’s still a chance of reaching the stars someday.”
That’s simply insane. He has decided that A) the primary goal of our existence is to build starships, B) AGI would prevent us from building starships, and C) the fiery extermination of billions of people and the near-total poisoning of our environment is a small price to pay to let a small breeding population survive and go to space. I’m kinda wondering how he thinks we can abruptly kill the majority of people on Earth without triggering an extinction vortex. You know, the well-documented phenomenon where a population crash sets off a feedback loop: a small population loses genetic diversity, inbreeding depression and random demographic swings drag its fitness down further, and the spiral carries it on toward extinction.
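If the feedback loop isn’t obvious, here’s a toy simulation of it (my own back-of-the-envelope sketch with completely made-up parameters and names like `inbreeding_penalty`, not a real population-genetics model): once the population falls below a threshold, reduced fitness shrinks it further, and down the drain it goes.

```python
# Toy extinction-vortex sketch: invented parameters, just to show the
# feedback loop. As the population shrinks, its growth rate falls (a
# stand-in for inbreeding depression), and demographic noise hits
# harder; so small populations tend to keep getting smaller.
import random

def simulate(n0, carrying_capacity=500, base_growth=1.1,
             inbreeding_penalty=0.5, generations=200, seed=1):
    """Return the population size after `generations` steps."""
    random.seed(seed)
    n = n0
    for _ in range(generations):
        # Growth rate degrades as n falls below carrying capacity.
        rate = base_growth - inbreeding_penalty * (1 - n / carrying_capacity)
        # Demographic stochasticity: noise grows as n shrinks.
        noise = random.gauss(0, 1 / max(n, 1) ** 0.5)
        n = max(0, min(carrying_capacity, int(n * (rate + noise))))
        if n == 0:
            break
    return n

# A population knocked down to 40 typically spirals to zero;
# one left at 500 typically just sits at carrying capacity.
print(simulate(40), simulate(500))
```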
Jesus. Someone needs to tell him that *Dr. Strangelove* was neither a documentary nor a utopian fantasy.
But, you might say, that is so incredibly nuts that no one would take Yudkowsky seriously…well, except for TIME magazine. And also…
Astonishingly, after Yudkowsky published his article and made the comments above, TED invited him to give a talk. He also appeared on major podcasts like Lex Fridman’s, and last month he was a guest on the “Hold These Truths Podcast” hosted by the Republican congressman Dan Crenshaw. The extremism that Yudkowsky represents is starting to circulate within the public and political arenas, and his prophecies about an imminent AGI apocalypse are gaining traction.
Keep in mind that this is the era of clickbait, when the key to lucrative popularity is to be extremely loud about passionate bullshit, to tap into the wallets and mind-space of paranoid delusionists. Yudkowsky is realizing that being MoreWrong pays a hell of a lot better than LessWrong.