Layers and layers of grifting


In its heyday, Silicon Valley was flush with money (OK, it still is), and was a magnet for tech talent…and also for the con artists who wanted to skim off the surplus. Among the many frauds are Eliezer Yudkowsky and his LessWrong community. Here’s a fascinating letter that boldly airs the accusations.

The Machine Intelligence Research Institute (MIRI) in Berkeley and its sister organization, the Center for Applied Rationality (CFAR), claim to be organizations dedicated to AGI safety and the art of human rationality. However, these organizations are not what they make themselves out to be, and MIRI is in fact defrauding its donors through misleading promises and an ongoing cover-up of statutory rape, blackmail, and fraud.

MIRI was founded as the Singularity Institute in 2000 by Eliezer Yudkowsky, a prominent thought leader in the Less Wrong “rationalist” community. Less Wrong is a community blog (also founded by Yudkowsky) that claims to be dedicated to the art of human rationality. Yudkowsky has written over 300 blog posts, several books, and a popular fanfiction that gained him the status of a minor celebrity, especially among fellow “rationalists.” The “rationalist” community in the Bay Area, which is fairly tight-knit, features group housing, where members live together and sometimes work together.

Yudkowsky’s ideas had members of the Less Wrong community convinced that Yudkowsky would bring about the singularity and save the world. Yudkowsky has positioned himself as one of a small class of people focusing on global catastrophic risks, while teaching his followers that the ethical thing to do is to either work directly on AGI, or get a high-paying job and donate to MIRI. Many of the people in the Less Wrong community took Yudkowsky and MIRI very seriously. Until recently, most of MIRI’s donations came from within that community, with some members donating tens of thousands of dollars to the cause. The original team at MIRI was composed partially of top contributors to the community’s blogs, like Nate Soares, Luke Muehlhauser, and Anna Salamon.

While these bloggers weren’t as popular and admired as Yudkowsky, they were widely trusted by the community. And at events like the Workshop on AI Safety Strategy (WAISS) run by CFAR, people would suggest ideas like taking out life insurance that would pay out to MIRI, and then committing suicide to further the cause.

Working at MIRI conferred a kind of special respect: it marked you as someone whom Yudkowsky and other prominent members thought worthy to save the world. To young teenagers enamored with the Less Wrong community and its stated ideals, it’s easy to see how kids could’ve been coerced into having sex with adults working at MIRI.

Yeah. They actually suggested that members of the community with mental health concerns take out life insurance policies, on the grounds that the payout would do more for the group’s high ethical causes than their continued existence would. They also recruited (dare I say “groomed”?) teenagers to join up and frolic with their middle-aged leaders.

Testimony from community members alleges that several underage teenagers were having sex with the 30- and 40-year-old researchers working at MIRI, especially between 2007 and 2014. The list of people accused includes (but is not limited to) the director and senior research fellow, and a former executive director who is now working at the Open Philanthropy Project, a major donor to MIRI.

Perhaps relatedly, this was going on around the time MIRI was accepting significant financial support to the tune of $50,000 from Jeffrey Epstein, even after his conviction for prostituting children. Yes, that Epstein.

MIRI’s entanglement in statutory rape was one of the worst-kept secrets in the Bay. Many rationalists are keen to say that the age of consent is an arbitrary number, and favor treating young people as fully-grown adults. So, it wasn’t much of a secret that older individuals in their 30s and 40s were having sex with teenagers. And perhaps that’s why when nearly a hundred people in the Bay Area Rationalist Community Safety Discussion group (BARCSD) saw insiders discussing whether the perpetrators had successfully evaded the statute of limitations for statutory rape in reference to MIRI, the members of BARCSD collectively shrugged. Additionally, one of the formerly underage teenagers was in the group, and admitted to having sex with much older adults, though this failed to interest the BARCSD.

Multiple comments were edited or deleted once members realized legal consequences were possible.

This is a group that communicated online extensively, and it’s revealing that now it’s sinking in that they’ve left a trail behind them, and are busy deleting old posts that might incriminate them.

Yudkowsky always did seem like a pretentious phony to me, and it’s good to see that some people are catching on.

Comments

  1. remyporter says

    “We’re a society dedicated to rationality and are also singulatarians.”

    “Right, no, the contradiction is obvious, but… why are you fucking children?”

  2. StevoR says

    ^ remyporter : Make that raping them by definition.

    Well, almost by definition – I guess if the teenager is over the legal age of consent there (18?) it is technically not statutory rape but it is still absolutely unethical and exploiting and wrong.

  3. wzrd1 says

    Hey, what better echo chamber than a captive community? It’s continually reinforcing.

    As for buying life insurance, then suicide for payout, most life insurance refuses to pay out for suicide. Indeed, the only life insurance I’m aware of that currently does is servicemembers group life insurance, which is required to by statute if they want to be allowed to sell to armed forces service members. That’s also felonious, as it’s quite literally conspiracy to commit fraud.

    Don’t get me started on AGI. It’s basically still the equivalent of the singularity and cold fusion. As yet again proved by the chatbot hype turning dot bomb.
    “My chatbot is 100% accurate, 100% of the time, just ask it”.
    Meanwhile, the reality is, ELIZA is just as bright.

  4. StevoR says

    Hmm… Checking out the wikipage :

    https://en.wikipedia.org/wiki/LessWrong

    Reveals that this is the mob that came up with “Roko’s Basilisk” and “..LessWrong played a significant role in the development of the effective altruism (EA) movement,[23] and the two communities are closely intertwined.”

    A good quote from there :

    David Auerbach wrote in Slate: “the combination of messianic ambitions, being convinced of your own infallibility, and a lot of cash never works out well, regardless of ideology, and I don’t expect Yudkowsky and his cohorts to be an exception. I worry less about Roko’s Basilisk than about people who believe themselves to have transcended conventional morality.”[11]

    Which given the whole having sex with teenagers thing now seems ..yeah. Pretty spot on.

  5. Allison says

    Whenever I hear people talking about being “rational” and how important “reason” is to them, I know that they’re really just trying to convince themselves (and hopefully other people) that their irrationally or non-rationally arrived-at beliefs are capital-T Truth. Like formal logic, using reason to figure things out only really works in simple, limited cases. It’s no good for the bigger things.

    For one thing, human beings aren’t very good at rational thought. For another, even if they were, rationality needs correct data, and there is almost never enough data to make a useful rational conclusion, and what data there is is usually pretty unreliable; so in their attempt to arrive at a “rational” conclusion, they mostly get the wrong answer.

    (FWIW, in mathematics, the field I’m most familiar with, nobody gets results using reason. You generally start off with intuition, then try to figure out whether your conjecture actually works, usually by a lot of mathematical experimentation, and only when you’re pretty sure that you have a correct conclusion do you start assembling a “proof”, i.e., a logical argument why other mathematicians should believe you. And even then, sometimes people find out that something they thought was “proven” turns out to be false.)

  6. says

    Yes, age of consent is an arbitrary number, because it has to be. Human beings aren’t mass produced with identical characteristics. X might be physically, mentally, and emotionally mature enough at 15 to have sex with an adult, but most 15 year olds aren’t X. And if you’re a 30 something man who wants to have sex with a 15 year old maybe you aren’t mature enough to be having sex.

  7. ardipithecus says

    It’s not just mental maturity. Childbirth mortality in teens is nearly 1/3 higher than for 20+. Under 15 it skyrockets from there. Yet so many think that the best way to protect teens is by keeping them ignorant.

  8. billseymour says

    Allison @7:

    Whenever I hear people talking about being “rational” and how important “reason” is to them, I know that they’re really just trying to convince themselves (and hopefully other people) that their irrationally or non-rationally arrived-at beliefs are capital-T Truth.

    Well, I, too, try to be rational; but I’m not out there claiming that I’m successful at it since I understand how easy it is to fool myself.

  9. says

    It’s been ten years since I left the cult. I mean, I was always more of a lurker than an active participant, but I did donate money to SIAI and MIRI, so yikes on me.

    EY’s writing about intelligence was, in retrospect, incredibly sketch. On one hand, he described intelligence as the ability to steer the future (i.e. manipulate the people and environments around you) in order to attain your goals. That’s wrong, but in a way that was subtle enough that it took me a long time to articulate why. On the other hand, he described intelligence as being able to think quickly, to the point that over and over again in his writing he chastised Einstein and other physicists for not just sitting down for an evening and solving quantum gravity from first principles. Even at the time, I knew something was wrong with that view, because I’m a trained computer scientist and know about algorithmic time complexity and about undecidability. But now I also see that, if you make lightning decisions and then never revisit or question them, you stay in the cult. He paid lip service to the idea of changing your mind about things, but suggested that you could reprocess every thought you’ve ever had in light of new evidence over a single evening of quiet contemplation.

    And, yeah, his HPMOR fanfic was kinda the final straw. The longer it went on, the more I realized that his moral and ethical views were monstrous.

  10. jenorafeuer says

    My feeling is that Yudkowsky himself was always more a true believer, high on the belief in his own superiority, than a grifter; but the entire Less Wrong mode of thought was absolutely primed for creating people who firmly believed they were smarter than everybody else, which in turn made them very easy to fool and unwilling to admit it. Which means that affinity scams and grift were always absolutely going to happen.

    I once described him as ‘Exhibit A that we need to teach engineers more philosophy so they stop trying to reinvent it badly’.

  11. vereverum says

    I thought AGI was Adjusted Gross Income
    these people should be careful messing in the IRS’s bailiwick.

  12. drsteve says

    Shout out to Dr. Elizabeth Sandifer’s Neoreaction: A Basilisk. Essential reading for anyone interested in seeing just how deeply weird the rabbit hole goes for this lot.

  13. Jemolk says

    Allison @7 — I think formal logic is a more powerful tool than you’re giving it credit for. That’s kinda the problem, actually. With accurate premises and meticulous avoidance of fallacies, it can take our understandings of each of the many academic disciplines and create a unified understanding of reality that improves our understanding and generates accurate predictions. Mess up even one of those, though, and it’s extremely easy to be led wildly off-base into lala land. It’s also extremely easy to fall into a wide variety of traps in your logic, as well as to misunderstand the science. Philosophy also lacks a formal process that you can utilize like the scientific method to separate yourself from your biases, so you have to be exceptionally meticulous in a way that people generally aren’t good at being. Still, if you’re careful and listen meticulously to the people whose data you’re drawing from, you can use formal logic to extrapolate valid and useful answers well beyond what our experiments can actually directly test.

  14. rrhain says

    @12, Chronos: Indeed. I’ve been slowly hate-reading HPatMoR. Some friends suggested it when it was first coming out, and there are some very interesting points in there (given the wizarding monetary system, why has no muggle-born ever figured out how arbitrage works?), but they were the flecks of corn in an otherwise steaming turd pile. Nobody in the sloppy narrative has any redeeming qualities, especially the protagonist. For a supposedly “rational” person, the character of Harry is blitheringly irrational in his inability to self-criticize, and his immediate willingness to go along with a clearly bad idea simply because he doesn’t like the institution (breaking Bellatrix out of Azkaban because, and this is true, Azkaban is a torture chamber) shows how much he’s a Marty Stu for Yudkowsky and how inflated Yudkowsky’s ego is.

  15. wzrd1 says

    Trump is suing Cohen for $500 million. Now, what the lawyers filed and what Trump says are two different things; if Trump does his usual, he’ll likely be looking at obstruction charges, as Cohen’s long been on the witness list.
    Trump really needs to learn, if you want to get out of the hole, first put down the shovel. He keeps digging in deeper. I can easily see him talking himself away from just the likely fines and straight into a prison cell.

  16. says

    Allison @7: The other HUGE problem with all those self-consciously-pretend-rationalists is that, in order to be totally 100% “rational,” they base a lot, if not all, of their reasoning ONLY on facts that can be quantified. Which may be okay when one is reasoning about physics, math, or the like; but when one is reasoning like that about human interactions of any scale, one inevitably ends up basing all rational arguments only on and around numerical quantities like IQ and, mostly, money; with the assumption, spoken or not, that because it’s a number it must be irrefutably real. This is how we end up with clueless (or dishonest) libertarians equating money with rationality and thinking our huge web of financial/business interactions make up a super-intelligent force that guides all people toward the most rational choices all the time (with the richest and most powerful being the arbiters of rationality); and “race realists” banging on forever about IQ like it’s the only valid measure of human mental capacity.

  17. wzrd1 says

    The Libertarians simply refuse to acknowledge history and the lessons learned from history, thinking that repeating the same things that caused problems will suddenly achieve magically different results without the regulations we put in place as part of lessons learned previously.

    As for the IQ banging crowd, the first question I hit them with is, which IQ category? I’ve yet to meet a single one that could name even one of the categories and their significance or relevance to what point that they were trying to make (there can be none within the context of what they’re blathering about).

  18. says

    wzrd1: I’m not even sure the libertarians are aware that those things had already been done before, let alone what the consequences were.

  19. jo1storm says

    @24:

    Bingo! Talk with libertarians long enough (especially anarcho-capitalists) and eight times out of ten they end up reinventing feudalism. Badly. The remaining two times they end up with armed warlord dystopia.

    The question that makes them stumble the most is “How do you deal with a-holes? What if somebody breaks the rules and decides not to fulfill their contract?” and then it turns out their non-aggression principle is not quite a rule, more of a guideline really… Either that or lots of wishful thinking.

  20. GerrardOfTitanServer says

    In my experience, most libertarians take their non-aggression principle seriously until it comes for (mandatory) taxes to support property rights, including minimal courts, police, army, etc. Then there’s the outright anarchists who are against mandatory taxes, but in my experience they’re the minority. Dunno. Personal experience IMHO.

  21. Kagehi says

    Yeah.. Kind of like a lot of things that bugged me when I “was” that age, being generally more rational (or at least thinking I was) than most of my peers. There is a bloody difference between a) knowing an age is somehow arbitrary, and b) having the ability to actually make activities, including sex, safe and non-exploitative – and even “adults” sometimes can’t avoid having this happen to them – and having an actual way, outside of “She/He is cute, so I am going to assume they are mature enough” (hint: this isn’t a method that works), to determine if someone is mature enough.

    Again, if we had some clear, non-arbitrary, functional method of making such a determination, half the ADULTS on the planet would fail the test, never mind “teens”. So, so, so ick!

    Also.. We need a new term for these clowns, that will stick. Ameritarians maybe? Talking about them using “libertarian” is a bit like lumping Evangelicals in as Christian alongside pseudo-pagan liberal churches. It renders the word itself utterly meaningless, and you have to keep adding qualifiers like, “Oh, I mean Southern Baptists, but not the ones that go around protesting at funerals!” Or, in the case of libertarians, it’s more like, “Oh, no.. I don’t mean the ones that actually like government and believe in some regulation, but the crazies that don’t. You know, the American ones.”

  22. says

    Or, in the case of libertarians, it’s more like, “Oh, no.. I don’t mean the ones that actually like government and believe in some regulation, but the crazies that don’t. You know, the American ones.”

    I’m sticking with “libertarians” because everyone I’ve ever heard taking that label is among “the crazies that don’t,” at least until you really press them hard to the wall with a dose of reality. They all support the same basic principles and ideology, and they’re all wrong for the same reasons, regardless of how “diverse” a bunch they claim to be.

    In my experience, most libertarians take their non-aggression principle seriously until…

    The “non-aggression principle” is nothing more than the bully’s logic: the bully gets to say or do whatever he wants, but the second anyone tries to stop him with any physical force (i.e., slapping his hand away from their wallets), then THEY are in the wrong, because they’d committed an act of “aggression,” and the bully gets to play the peace-loving victim. It’s how the powerful, who have all manner of means to rob and cheat the rest of us, protect themselves against any effective retaliation from the powerless, who often have nothing but their own fists.

    Bingo! Talk with libertarians long enough (especially anarcho-capitalists) and eight times out of ten they up reinventing feudalism.

    Sometimes with a dose of “divine right.” All the while claiming their ideology is descended from John Locke, whose writings very clearly oppose and disprove said ideology.

  23. GerrardOfTitanServer says

    The “non-aggression principle” is nothing more than the bully’s logic: the bully gets to say or do whatever he wants, but the second anyone tries to stop him with any physical force (i.e., slapping his hand away from their wallets), then THEY are in the wrong, because they’d committed an act of “aggression,” and the bully gets to play the peace-loving victim.

    I disagree. I don’t think anyone would agree with that. I’ve never heard a libertarian say anything so silly.

    The bigger problem is that you can’t enforce reasonable precautions against foreseeable violence, and you cannot coerce people to act for the public good. Vaccination is still my favorite go-to example. We should have mandatory vaccination. In some / most US states, we do practically have mandatory vaccination for all schoolchildren. Some US states don’t even have religious exemptions. That’s the way it should be. It’s also a violation of the non-aggression principle, because it’s forced on you, and it’s violence to do so considering that there’s a one-in-ten-million chance of adverse side effects from the vaccine. I’ll be at the front of the line willing to do that violence to others, e.g. forced vaccinations under the law, because it’s way way better for everyone if we did it.

  24. says

    I’ve never heard a libertarian say anything so silly.

    They don’t say it that way, but that’s still what their “principle” amounts to.

    We should have mandatory vaccination. … That’s the way it should be. It’s also a violation of the non-aggression principle because it’s forced on you and it’s violence to do so…

    So now you’re labeling ANY form of legal mandate or requirement “violence” and then calling it “a violation of the non-aggression principle.” That kinda proves my point about said principle being nothing more than the bully’s rationale, thinly disguised. In the example you cite, any selfish moron who doesn’t want to take a simple precaution for his own and his neighbors’ safety gets to go out in public unquestioned, and if he gets anyone else sick with a debilitating or deadly disease (how’s that for “a violation of the non-aggression principle?”), tough shit.

    …you cannot coerce people to act for the public good.

    Actually, yes, you can, and in many cases we HAVE to, because many (if not most) people won’t choose to make any sacrifices for any public good unless they see their neighbors coerced to make the same sacrifices.

  25. GerrardOfTitanServer says

    Why do you think I’m defending libertarianism? I just thoroughly attacked it.

    The thing with vaccines is the freerider problem. If everyone else takes it, then there’s no need for you to take it (more or less). That’s why we’ll probably always need some amount of government coercion to make enough people take vaccines.

    Actually, yes, you can

    Sigh, I meant “if one assumes the horrible non-aggression principle (which I don’t), then X follows”. Sorry.

  26. imback says

    Just remember there are uncountably more chances to be irrational than rational, and that applies to real people as well as real numbers. ;-)

  27. Kagehi says

    @33 Except, they think this is like, “Getting a free ride.”, when its more like, “I don’t want to wear a life vest until the boat is sinking,” then… for some reason they didn’t pack enough life vests (because X percent won’t wear them), and/or, its suddenly, “I need that life vest! I don’t care if there are now not enough of them!” I.e., getting that “free ride” by kicking someone else off the already full bus.

  28. wzrd1 says

    The most ludicrous Libertarian outside of Rand was, of all people, a National Guard Command Sergeant Major, who objected to any and all taxes or collective expenses – while he was happily suckling upon the taxpayer’s teat with his unit deployed to Djibouti, making a fair bit more money than he’d have earned in his civilian job.
    Hypocrisy is easy when one lives deep inside of doublethink.