Content note: I’m not going out of my way to spoil the game, but I’ll mention some aspects of one of the endings.
Eliza is a visual novel by Zachtronics–a game studio better known for its programming puzzle games. It’s about the titular Eliza, an AI that offers counseling services. The counseling is administered through a human proxy, a low-paid worker who is instructed to read out Eliza’s replies to the client. It’s an exploration of the value–or lack thereof–of AI technology, and the industry that produces it.
As a professional data scientist, I find media representation of AI to be a funny thing. AI is often represented as super-intelligent–smarter than any human, and able to solve many of the world’s problems. But people’s fears about AI are also represented, often through narratives of robot revolutions or surveillance states. Going by the media representation, it seems like people have bought into a lot of the hype about AI, believing it to be much more powerful than it is–and on the flipside, fearing that AI might be too powerful. Frankly, a lot of these hopes and fears are not realistic, or at least not apportioned correctly to the most likely issues.
Eliza is refreshing because it presents a more grounded vision of AI, where the problems with AI have more to do with it not being powerful enough, and with the all-too-human industry that produces it.
The technology of Eliza
Eliza resides within the subfield of AI known as natural language processing (NLP). NLP is notoriously difficult, which creates a gulf between AI that is technically impressive and AI that makes a good product. If Eliza were real, it would be a technical achievement–although not outside the reach of current technology. Personally, I was impressed with how consistently it parses clients’ words correctly, without producing total nonsense half the time (have you ever tried speaking to Siri?). On the other hand, being a good counseling service is a much higher bar, and Eliza is clearly not a very good product.
To illustrate what NLP is like, I might start with the most basic NLP algorithm: word counting. Suppose we’re trying to determine whether someone is being positive or negative (a problem known as sentiment analysis). We count the occurrences of each distinct word, and we have a dictionary that tells us how positive or negative each word is. Sum up all the positivity, subtract the negativity, and you have your result. This is effective at judging large bodies of text with statistical accuracy, but it has obvious limitations when it comes to judging any individual statement.
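To make that concrete, here’s a minimal sketch of the word-counting approach. The toy lexicon is my own invention; real systems use curated resources like VADER or SentiWordNet with thousands of entries.

```python
import re

# A toy sentiment lexicon: positive words score above zero, negative below.
# These few entries are just for illustration.
SENTIMENT_LEXICON = {
    "great": 1.5, "good": 1.0, "happy": 2.0,
    "terrible": -1.5, "bad": -1.0, "sad": -2.0,
}

def sentiment_score(text: str) -> float:
    """Sum the lexicon value of every word; the sign gives the overall sentiment."""
    words = re.findall(r"[a-z']+", text.lower())
    return sum(SENTIMENT_LEXICON.get(word, 0.0) for word in words)

print(sentiment_score("I had a good day"))  # 1.0: correctly judged positive
print(sentiment_score("Great, another bad day at my terrible job"))
# -1.0: negative overall, but the sarcastic "great" drags the score toward positive
```

Averaged over thousands of reviews or tweets, this works surprisingly well; on a single sarcastic sentence, it falls apart–which is exactly what happens in the game.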
Eliza appears to have this exact algorithm running in the background. As the client speaks, especially positive or negative words are highlighted for the proxy to see. The clients, in the course of speaking normally, perpetually foil the algorithm, inserting lots of positive words into negative statements and vice versa. One surmises that Eliza only has this sentiment analysis for show, and doesn’t actually use it for anything; otherwise, Eliza would be a lot dumber than it is.
Another algorithm that Eliza appears to use is a recommendation algorithm. At the end of each session, Eliza shifts to a rigid script where it recommends a meditation program and a drug. How good these recommendations are is difficult to say, since the proposed treatments are fictional. However, it’s easy to doubt the advice, given that Eliza recommends a drug to almost every client, and the meditation program is just a product sold by the same company that owns Eliza.
So far, I’ve only described the less impressive of Eliza’s components. Eliza also needs to convert sound data into text, combine that with biological measurements of the client, identify the main point that the client is making, and choose the best stock phrasing to reflect that point back at the client. In a way, Eliza is doing little more than repeatedly asking “And how did that make you feel?” but an impressive amount of effort was needed to make Eliza say something more like “I understand you are having problem [X]… How does that make you feel?”
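The game never shows the real system’s internals, so the last step is anyone’s guess, but here’s a toy pattern-and-template responder in the spirit of the original 1966 ELIZA chatbot. All patterns and stock phrases below are my own invention:

```python
import re

# First-person words get flipped so the reflection reads naturally.
PRONOUN_SWAPS = {"i": "you", "me": "you", "my": "your", "mine": "yours", "am": "are"}

def swap_pronouns(fragment: str) -> str:
    return " ".join(PRONOUN_SWAPS.get(word.lower(), word) for word in fragment.split())

# Each rule pairs a pattern for the client's statement with a stock reply template.
RULES = [
    (re.compile(r"i (?:am|feel) (.+)", re.I),
     "I understand you are feeling {0}. How long have you felt that way?"),
    (re.compile(r"i need (.+)", re.I),
     "Why do you need {0}?"),
]

def reflect(client_text: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(client_text)
        if match:
            return template.format(*(swap_pronouns(g) for g in match.groups()))
    return "How does that make you feel?"  # the classic fallback

print(reflect("I feel anxious about my job interview"))
# -> I understand you are feeling anxious about your job interview. How long have you felt that way?
print(reflect("I need a break from work"))
# -> Why do you need a break from work?
```

Presumably a real Eliza would rank candidate responses with learned models, folding in the sentiment and biometric signals, rather than firing on the first regex match–but the reflect-it-back structure is the same.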
The highs and lows of Eliza
Most of the disadvantages of Eliza are already quite obvious, but let’s talk about the advantages. First and foremost, Eliza is cheap. Most of the clients who come in do so because they can’t afford a human counselor. Eliza does in fact seem to help people who would otherwise fall through the cracks.
Eliza is also more accessible in other ways. I’ve heard that the whole process of bringing yourself to seek counseling, and then finding a counselor who works for you, is very difficult. Going to Eliza has a lower barrier, because it’s an AI that is programmed to be non-judgmental, and messy human relationships never enter into it. (The human proxies might judge you, but they’d be fired for judging you out loud.) And though Eliza’s counseling is mediocre, it is at least very consistent, a known quantity.
And Eliza’s ability to listen (or the illusion of it) seems to provide value to at least some people. But perhaps the credit mostly belongs to the proxies, who are actually listening? Hard to say.
In any case, listening can only do so much. It’s obvious that many of the clients need real advice, and not just a recommendation for a meditation app or drug. They need life coaching, they need a second perspective on a social situation, they need money, they need an escape from cis-heteronormativity. Some of these problems, Eliza can’t address because Eliza isn’t smart enough. Other problems couldn’t be addressed no matter how smart Eliza is.
Now there is a clear way that Eliza could be more helpful without going beyond its current capabilities: Eliza could provide referrals to other services. But if you think about it, there’s an obvious reason why Eliza doesn’t do that. Sending you to another service doesn’t benefit the corporation that produces Eliza. Instead, Eliza wants to keep you within the company’s own product ecosystem. What blocks Eliza from being a better product is not the technology, it’s the incentives. The company does not profit from making Eliza better.
Privacy
Another issue that Eliza explores is privacy. Eliza starts giving clients the option to grant access to their phone activity, supposedly in the name of improving Eliza’s counseling. There’s little reason to believe it improves the treatment: the feature is still experimental, and one suspects the company just wants to collect more training data for its general AI project. It’s also clear that not everyone who opts in actually understands the agreement, and the data includes communications with other people who were never party to it. Some of the data gets shared with other companies, and from there, who knows?
Now, I realize there’s a pretty broad range of attitudes towards privacy, from people who do their utmost to avoid data-collecting platforms, to people who just do not care. You can put me in the “does not care” category, with some caveats. Companies definitely profit from having data, so they’d better be providing some service in return. And obviously I wouldn’t want people stalking me or stealing my identity. It’s okay if you disagree with me, though. Eliza’s relatively realistic portrayal lets the player engage with the privacy issues as they are, leaving room for as many distinct reactions as people have in real life.
…Except, I have one little niggle about Eliza’s portrayal, something that doesn’t ring true. When a client gives Eliza access to their data, their proxy gets to go through their phone as well. This makes sense from a narrative standpoint, because the protagonist is a proxy, and the player needs to see the private data to get the full story. But I don’t think it makes sense from a business perspective to essentially build stalking into the normal process. This isn’t just a privacy violation, it’s a very particular flavor of privacy violation, where someone who “knows” you gets to read your private thoughts and change how they see you. Stalking can happen, sure, but I felt it was a little over-dramatized.
Shades of unrealism
I’m going to complain about another bit of unrealism in the game. One of the characters is Rainer, the CEO of the company that owns Eliza. He believes that AI technology will have finally made it when it’s capable of writing poetry. It’s a bizarre standard, and not a particularly difficult one to meet; it makes it seem like he doesn’t know what he’s talking about, although I’m not sure that’s the intended interpretation.
Anyway, if you like, you can choose an ending where you join Rainer on his quest for general AI. And I chose that ending (first), because I was thinking: hey, a job working on a product that directly and demonstrably helps people? Even if the product is not very good, that’s a lot better than most. Not everyone can get a job saving the whole world; besides, those jobs don’t have good work-life balance, and so far the world-saving app is vaporware. Rainer is clearly an unsympathetic character, so I was pretty sure something bad would happen, but I thought I’d give it a shot.
Without giving away the whole ending, it goes into some singularity shit and I didn’t like it. It’s a return to the common portrayal of AI as an all-powerful entity, a metaphorical vessel for our hopes and fears about the future. And I just don’t believe the future will play out like that. I mean, not that I know either way, it’s the future, I’ve never been there before. But for me, it broke the realism that I otherwise liked about the game.
For the most part Eliza is a game filled with moral ambiguity, giving you the space to see what AI technology is really like and form your own opinions. But I felt the ending suffered, because it had to collapse the many possibilities, offering a single outcome that didn’t quite fit with my own interpretation up to that point.
brucegee1962 says
I’ve heard it theorized that palm reading “works” simply because it is powerfully therapeutic for most people to have someone listen to them talking about their problems for half an hour while holding their hand.
Mara Jade says
brucegee1962 made a great comment about the hand-holding effect of palm reading 🙂 It’s probably true that that aspect of it is therapeutic.
Thanks for the blog post. I’ll look up more about Eliza. Sounds interesting.
Mara Jade says
I see the game is on sale for half price on Steam until Feb. 3rd. I don’t think I’ve ever purchased an online video game before. It’s under 8 bucks. I may have to buy it before the sale ends.
Siggy says
@Mara Jade,
If you buy it, hope you enjoy it! It’s not very long, but it is fully voice acted.