Can a passenger stop a self-driving car?


The above question was prompted by a strange dream I had last night. I was in the front passenger seat of a car that supposedly had self-driving capabilities. The owner of the car was in the driver’s seat and at one point got out to do something or other. The car started off without him and proceeded to go somewhere unknown to me. In the dream, I was wondering how to bring the car to a halt but had no idea what to do. In my dream, I looked for the steering wheel and brake pedal and other standard control features of ordinary cars but since I have never been in a self-driving vehicle, my dream did not have that specific information.

I am not sure how likely that scenario is but this morning I decided to see what options passengers had in self-driving cars. It turns out that there is very little on the internet about this specific question.

One example of truly self-driving cars is the taxis run by Waymo in some cities. This article says that passengers are not allowed to touch the vehicle controls.

Waymo’s rules for riders are fairly straightforward. The driver’s seat and steering wheel are off-limits, you must be 18 or older to ride alone, and small children will need an appropriate booster or infant seat.

Passengers are not allowed to touch the vehicle controls. Should the taxi unexpectedly stop, Waymo support will contact the vehicle’s riders within minutes. Passengers can also contact the company through the in-car buttons or the app.

If the agent determines the vehicle cannot continue on its own, they’ll send roadside assistance to help you complete your trip. Waymo will provide details on the passenger screen, including the location of the employee, their name, and their picture. The Waymo representative will arrive wearing a safety vest and will be able to unlock the vehicle from the outside.

The specialist will either take control of the car or, in rare situations, escort you to their vehicle and drive you the rest of the way. If needed, Waymo will also work with law enforcement.

So in the case of Waymo taxis, there is supposedly a support system that can be reached for assistance in the event of some unexpected development.

But what about cars that are owned by regular people? What could a passenger do if something goes wrong and, like me in my dream, they are taken on a ride where the destination is unknown? As anyone who has done any programming at any level knows, however carefully you design and test a system’s software and hardware, you cannot be completely sure that some low-probability, unanticipated combination of factors will not send the computer off in an unexpected direction.
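To make that concrete, here is a purely illustrative Python sketch; the function, names, and numbers are invented for this post and are not taken from any real vehicle’s software. The toy speed-planner below passes every test its author thought to write, yet one rare combination of inputs sends it down the wrong branch and the car keeps going when it should be slowing down.

```python
def choose_speed(distance_m, closing_speed_mps):
    """Toy planner, invented for illustration: pick a target speed from
    the distance to an obstacle and how quickly we are closing on it."""
    if distance_m <= 0:
        return 0.0                       # at the obstacle: stop
    if closing_speed_mps <= 0:
        return 15.0                      # obstacle receding: keep cruising
    time_to_contact = distance_m / closing_speed_mps
    if time_to_contact < 2.0:
        return 0.0                       # collision imminent: stop
    return min(15.0, distance_m / 4.0)   # otherwise speed scales with distance

# Every case the author thought to test passes:
assert choose_speed(0, 5) == 0.0      # on top of the obstacle
assert choose_speed(4, 10) == 0.0     # near and closing fast
assert choose_speed(100, 5) > 0       # far away, modest closing speed

# An unanticipated combination: a sensor glitch briefly reports the
# closing speed as exactly 0 while the car is in fact still moving
# toward a stopped vehicle.  The "receding" branch fires and the car
# keeps cruising instead of braking.
print(choose_speed(8, 0.0))   # -> 15.0, when it should be slowing down
```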

Comments

  1. says

    “The Waymo representative … will be able to unlock the vehicle from the outside.”
    Wait, are passengers unable to open the car doors? Or is this an assurance that the Waymo person can open the car even if the passengers are incapacitated?

  2. sonofrojblake says

    The failure state of the car in your dream is that it goes. The failure state of the car in your anecdote from the real world is that it has stopped.

    I design chemical plants. Stuff I’ve designed has handled materials that are horribly toxic, up to and including UN-monitored chemical weapon ingredients. Stuff I’ve designed has operated at high temperatures and pressures, and handled stuff that’s highly flammable and/or explosive.

    When designing such things, one applies a few common concepts. These include:
    -- fail safe conditions. Automated valves on chemical plants are typically actuated by application of compressed air. Sometimes the air pushes a valve open, sometimes it pushes a valve closed. In any case, in the event the air supply is cut off, the valve returns to its default position by the action of a mechanical spring. A valve adding a flammable component to a reactor would typically fail shut. A valve going to a vent allowing the reactor to depressurise would typically fail open. The failure state of a self-driving car should obviously be to STOP, and indeed in the real world this appears to be the case.
    -- layers of protection. When assessing the risk of a particular hazard manifesting, you can take credit for reducing the risk level only when you have multiple independent systems that will prevent or mitigate that scenario. Each layer is typically assumed to reduce your likelihood of occurrence by a factor of ten. If the scenario is “the valve stuck”, then the procedure for dealing with an open valve is a layer of protection you can take credit for. If the scenario is “the operator left the valve open”, you CAN’T take credit for the procedure because the human element already failed. You typically need multiple layers of protection to reduce risk to an acceptable level if the severity of it manifesting is high, because likelihood can usually be summed up as “pretty likely” without your layers of protection.
    -- safety integrity level. When you look at a protective layer, you need an idea of how reliable it is. If it’s a normal valve being shut by a logic controller, well, OK, that’ll work most of the time. But if the severity of the hazard is high, you may conclude that you need a higher integrity system. That means a more reliable (read: more expensive) instrument, a higher integrity control logic solver and a more reliable final element (e.g. valve/actuator). Such control loop elements are sold with advertised reliability levels to allow the safety integrity levels of loops that include them to be calculated. “It’ll need to be a SIL 2 loop” is a groaner because that means more cost and more paperwork… but if you make it a SIL 2 loop (and document that) then you’re covered legally if it fails and blows up. You did your best. (A rough arithmetic sketch of how these layers and factors stack up follows this list.)
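    Here is a rough, self-contained Python sketch of the arithmetic being described. All of the frequencies, targets, and layer names are made-up values for illustration, not numbers from any real assessment: each independent protection layer divides the estimated frequency of the hazardous event by its credited risk-reduction factor, roughly ten for an ordinary layer and more for a SIL-rated instrumented loop (about 10-100 for SIL 1, 100-1,000 for SIL 2).

    ```python
    # Rough LOPA-style arithmetic.  All numbers are invented for
    # illustration; a real assessment uses plant-specific data.

    initiating_frequency = 0.1    # events/year, e.g. "operator leaves the valve open"
    target_frequency     = 1e-5   # tolerable frequency for a high-severity outcome

    # Independent protection layers and their credited risk-reduction factors.
    # The procedure for closing the valve is NOT credited here, because the
    # initiating event already assumes the human element failed.
    layers = {
        "relief valve":           10,
        "independent high alarm": 10,
        "SIL 2 trip loop":       100,
    }

    mitigated = initiating_frequency
    for name, rrf in layers.items():
        mitigated /= rrf
        print(f"after {name:<24}: {mitigated:.0e} per year")

    print("target met" if mitigated <= target_frequency
          else "target NOT met: add another layer or a higher-SIL loop")
    ```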

    When it comes to self-driving cars, the systems involved are so complex, so multi-layered, and so dependent on ludicrously complex software being tasked with doing something most humans can’t do that reliably -- and all of this is wrapped in a product that is NOT regularly torn down and maintained but instead just run and run and run, by definition in circumstances its designers could not possibly foresee in detail. Ten years ago I thought I’d own a self-driving car. Today I don’t believe I’ll see them available to buy in my lifetime. And if they ever are, then by definition any sensible system regulating them must absolutely require that any vehicle occupant have access to a control that would return the vehicle IMMEDIATELY to a safe state. I certainly won’t ever get into a “self-driving” car that didn’t have such an option. Would you?

    (Aside: I’ve never seen a Waymo. But you tell me “The driver’s seat and steering wheel are off-limits” -- well good fucking luck enforcing that if the car does something that makes me nervous enough to think I need to take control. We can argue about it afterwards in court, if necessary, AFTER I’ve stopped the car, but meantime “off limits” or not, I’m stopping the car. This of course assumes that it has the controls of a normal car. A properly self driving car should have a red button accessible to any passenger and nothing else. That button should bring the car to a halt and slightly open ALL the doors immediately it halts, so I can get out if it’s e.g. on fire. If it can drive itself, a steering wheel is superfluous. The fact Waymo still fit steering wheels to their vehicles gives away the fact that even they don’t trust them to work. No thanks.)

  3. Pierce R. Butler says

    A true American passenger always has the option of pulling a gun to disable some key component of the vehicle.

    Furriners and liberals can always pray.

  4. Trickster Goddess says

    A very good short story about self-driving cars, particularly as it relates to the trolley problem, is “Car Wars” by Cory Doctorow. You can read it here, or you can listen to the audiobook version here (36 minutes).

  5. Trickster Goddess says

    Sorry hit post instead of preview. Just wanted to add that the story includes the scenario from your dream.

  6. birgerjohansson says

    sonofrojblake @ 2
    Thanks.
    Pierce R. Butler @ 3
    This is scary close to how some people think in the real world.

    -BTW, AI cars bring visions of the quarreling doors, cars and vending machines in a Philip K Dick story, as demonstrated on film by the Johnnycab in Total Recall.

  7. Pierce R. Butler says

    birgerjohansson @ # 6: This is scary close to how some people think in the real world.

    Remember, I live in the southeastern United States -- those people are my real world.

  8. seachange says

    I live where there is Waymo. Customers are by invitation only. Therefore ‘not allowed to touch’ is equivalent to ‘we won’t serve you anymore if you do’. They’re fairly small inside so you’d really have to vigorously squeeze yourself through to get into the front seat to even try. You’d be ‘the human error’ if the situation was actually dangerous.

    I’m pretty sure the steering wheel exists because California regulations demand that they do.

    They stop wherever they do (not always somewhere sensible) and they open the doors when they feel like it. The riders I have seen board before the ride and alight after seem accustomed to this.

  9. EigenSprocketUK says

    …pulling a gun to disable some key component of the vehicle…

    I think I recall a scene in the film adaptation of I, Robot where Will Smith’s character is travelling the high-speed tubeways.
    He becomes convinced the remote computer driving the car is trying to put him in danger so, like any USA hero would, he seizes control of the car. The other passenger is horrified that he would do such an incredibly dangerous and thoughtless thing.

  10. seachange says

    There was an article in the Los Angeles Times stating today that the Waymo service area (not all of Los Angeles, and not LAX) is now open to all residents instead of invitation only. You do have to use their app.

    There is another autonomous cab company called Zoox in San Francisco, and it does NOT have a traditional car configuration. This means I gave bad information here about the presence of steering wheels being required by regulation. Waymo bases its cars on an existing real-life model, only fitted with their equipment on top of that. Zoox is sui generis. There are no controls in a Zoox cabin, anywhere.

    I-Pace (Waymo’s vehicle) is made in Austria, and Zoox’s are made in Foster City here in California.
