Siri is probably not explicitly hiding abortion information

Next query: Siri, look harder.

So, Apple’s engineers recently found themselves with egg on their faces with regard to the iPhone 4S’ Siri application. Why? Privilege, as it happens.

My best guess about the “Siri doesn’t help find abortion clinics” maneno is that the engineers responsible for the application simply forgot to vet stuff relevant to women, like abortion clinics, when testing their assistant’s ability to pull real phrases out of voice commands and then find relevant information from various sources. And they probably forgot to vet this stuff because the people testing it initially don’t particularly have that concern. They’re likely all men, or women who weren’t in a position to need reproductive services for whatever reason. Thus, those without that privilege get short shrift.

The AI technology works like this (a rough sketch in code follows the list):

  • First, you speak a phrase at Siri. Your phone sends the raw audio of your speech to a central server, where it is processed to turn it into text.
  • That text is run through a keyword search to pull out relevant words, and through a syntax scanner that tries to determine what kind of question this is. In our example flowchart, let’s say “where can I find” clues it in that you’re looking for a nearby X, whatever X happens to be.
  • If overrides are programmed in at this stage, so that a particular class of phrases triggers a search for something else entirely in order to get better results, then those “aliases” are searched for instead. Any alias-based programming overrides the original intent of the phrasing. Case in point: ask Siri where to hide a body, and it asks whether you want to search for marshes, lakes, construction sites, etc. Or consider the wide range of requests that Siri aliases into requests for escort services. It’s intended to be funny, but the same mechanism can be used to improve the service for everyone on the fly.
  • Then, since we’re assuming this is a location search, X is searched for via a location-based service like Yelp, or via some other engine if that class of phrase gets better results from a specific search engine.
  • The results are returned to your phone, and all of this usually happens in practically no time flat.
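
To make that flow concrete, here’s a minimal, purely hypothetical sketch in Python. None of the names below are Apple’s; transcribe() and location_search() are stand-ins for the real speech-to-text and Yelp-style services, and the alias table is invented for illustration.

```python
# A hypothetical sketch of the pipeline described above; not Siri's actual code.

ALIASES = {
    # Override table: a matched phrase is swapped for a different search term
    # before the normal intent handling runs.
    "where can i find a place to hide a body": "marshes, lakes, construction sites",
}

def transcribe(audio):
    """Stand-in for the server-side speech-to-text step."""
    return "where can I find a pizza place"   # pretend transcription

def classify(text):
    """Crude syntax scan: 'where can I find X' means a location search for X."""
    text = text.lower().strip()
    if text in ALIASES:                        # alias overrides the original intent
        return "location_search", ALIASES[text]
    prefix = "where can i find "
    if text.startswith(prefix):
        return "location_search", text[len(prefix):]
    return "fallback_web_search", text         # hand anything else to a general engine

def location_search(query, lat, lon):
    """Stand-in for a Yelp-style local search call."""
    return [f"Result for '{query}' near ({lat:.2f}, {lon:.2f})"]

def handle(audio, lat, lon):
    text = transcribe(audio)
    intent, query = classify(text)
    if intent == "location_search":
        return location_search(query, lat, lon)
    return [f"Web search for '{query}'"]

print(handle(b"raw audio bytes", 44.65, -63.57))
```

The thing to notice in the sketch is the ordering: the alias table gets consulted before the normal intent handling, which is exactly how a canned joke can pre-empt a sincere question.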

As a test, I tried searching for abortion clinics on Yelp and didn’t get very many actual abortion clinics — just a few regular medical centres being derisively referred to as “like a back-alley abortion clinic”, e.g. dingy and dirty. So, the problem as far as I can tell is that nobody tested “abortion clinic” to see whether the results spat back from the default location-based search engine were actually relevant. Thus Crisis Pregnancy Centres (a.k.a. anti-abortion religious outfits pushing forced pregnancy, which love to masquerade as family-planning centres) get mixed into the search results, while Planned Parenthood is missing from many of them.

What’s more curious is that normally, when Siri doesn’t understand a query, it sends it to Wolfram Alpha or Google. With “where can I find the morning after pill”, it doesn’t do this. I strongly suspect it’s the “morning after” part, since Siri is programmed with easy creation of calendar entries and time-based reminders in mind. The syntax interpreter would read the “morning after” part and think it’s part of a time-based request, but the “where can I find” flags it as a search, and the AI probably gets confused by receiving two classes of request at the same time.
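
Here’s a toy illustration of that guess, entirely speculative: the pattern table and dispatcher below are invented, not Siri’s actual code. Two intent patterns both match the same utterance, and a naive dispatcher that expects exactly one match simply gives up instead of falling back to a web search.

```python
# Pure speculation about the "two intents collide" failure mode.
import re

INTENT_PATTERNS = {
    "time_based_reminder": re.compile(r"\b(tomorrow|tonight|morning after|at \d+)\b"),
    "location_search":     re.compile(r"\bwhere can i find\b"),
}

def matched_intents(text):
    text = text.lower()
    return [name for name, pattern in INTENT_PATTERNS.items() if pattern.search(text)]

query = "where can I find the morning after pill"
hits = matched_intents(query)
if len(hits) != 1:
    # Both patterns fire, so this naive dispatcher punts instead of
    # falling back to Wolfram Alpha or Google.
    print(f"Ambiguous request ({hits}); no fallback search performed.")
else:
    print(f"Dispatching to {hits[0]}")
```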

The fix for the abortion question is to add aliases for “abortion clinic” and other such common phrases, and to direct the search process to seek out family-planning or other reproductive-resource clinics via an engine that will actually provide relevant results. Preferably, it would give you results that do not skew toward the CPCs, which actively steer patients away from abortions, but I’d settle for a fix that shows both (described appropriately, of course). That fix has not yet been implemented, and there’s no real indication from Apple as to why.
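
For what it’s worth, here is a sketch of what such an alias fix could look like, continuing the hypothetical pipeline from earlier. The phrase-to-search-term mappings and the “provider_directory” backend are assumptions of mine, not anything Apple has described.

```python
# A hypothetical alias fix: route common reproductive-health phrases to a
# search term and backend that actually return relevant providers.
REPRODUCTIVE_HEALTH_ALIASES = {
    "abortion clinic":         "abortion provider OR family planning clinic",
    "morning after pill":      "emergency contraception pharmacy",
    "emergency contraception": "emergency contraception pharmacy",
}

def resolve_query(raw_query):
    """Return (search_term, backend); a dedicated provider directory is assumed
    to exist for these queries, with everything else falling through."""
    key = raw_query.lower().strip()
    if key in REPRODUCTIVE_HEALTH_ALIASES:
        return REPRODUCTIVE_HEALTH_ALIASES[key], "provider_directory"
    return raw_query, "default_local_search"

print(resolve_query("abortion clinic"))
# -> ('abortion provider OR family planning clinic', 'provider_directory')
```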

Apple has claimed these issues are mere glitches, wholly unintentional. It strikes me as a very good thing they labeled the Siri service as beta, because they obviously alpha-tested it with a bunch of wise-asses internally, so as to get the widest range of joke answers they could manage, but only did cursory real-life testing before release. If someone had an emergency need for Plan B contraception, one would hope they wouldn’t have to depend on this technology to steer them in the right direction.

All that said, I don’t believe it is an intentional omission, and I think people screaming about some sort of conservative agenda are imagining things. With all the real-life ways conservatives are trying to fuck up everyone’s lives for their own twisted sense of morality, there are better fights to expend resources on. Apple got burned by this one, and rightly so, but they’re not villains as far as I can tell. They just got really excited about some promising bit of tech they’d cooked up, and they overhyped it as being like the Enterprise computer.

What they delivered was closer to ELIZA tied to a few search engines, with a few quips for the most obvious jokes or sexist remarks. So now they have to scrub off the raw and raggedy edges, of which there are very many, and hopefully it’ll all happen in due course. The last thing I want to see is every omission or oversight in Siri getting turned into a knock-down, drag-out fight over how embattled Apple has been in delivering on their promises. If they have to backpedal on their promises somewhat to prevent that from happening, I’d strongly advise they do so as early as possible.

In the event that they drag their feet on fixing the abortion issue, but manage to implement fixes for other issues that come up, that’s when you’re well within your rights to let slip the dogs of war.


4 thoughts on “Siri is probably not explicitly hiding abortion information”

  1.

    I haven’t actually seen any claims that the lack of women’s health-care info from Siri was intentional. The fact that nobody thought to check, or to remedy the situation until now, is the sexism that’s being complained about. That the common medical needs of half the population are completely overlooked is the sexist part.

  2.

    Well within their rights now. This was a smack to the face whether it was intended to be or not. It’s such a massive, glaring oversight. How could you fuck this up? (No, I don’t mean technical issues.) It isn’t as if this was some guy doing this in his spare time. This was released by Apple. It’s got their name on it. Wouldn’t you want to vet it a little more thoroughly?

  3.

    Alright, that was getting too long, so /del and short version….

    Assuming it’s early public beta, I’d expect these kinds of content ‘bugs’ (not finding legit abortion clinics and turning up erroneous results with the anti-abortion groups cropping up in the search). A limited user base means limited content checking. If developers were perfect and thought of everything, there would be no need for alpha/beta stages of program development.

    If it still isn’t fixed by the time the program leaves beta, then it’s time to get the pitchforks and torches. Although if 20 years of using search engines has taught me anything, there will still be erroneous results.

    Now, partner services (like Yelp, as mentioned) that don’t turn up these results properly (with their non-beta, full-release services) deserve the pitchforks and torches now.

  4.

    Of course it probably isn’t. It is utterly paranoid to assume that this is intentional.

    All software has flaws. Sometimes they’re politically incorrect. It’s embarrassing, but it happens. There was the camera that thought certain Asian people were blinking, the original Kinect software that had trouble picking up darker complexions, and now the pocket secretary *ahem* personal assistant application that doesn’t know what abortion is.
