Self-driving cars are already here and even legal on some public roads. They will likely be better drivers than many humans, since they have faster reaction times and do not suffer from human impairments and distractions, such as talking on cell phones or texting. They could be a boon for older and disabled people, who would no longer be limited in their ability to go places.
The one big problem is that even the best drivers can get into accidents, and when a self-driving car is deemed at fault, the question becomes who is liable. The owner of the car? The manufacturer? The car’s occupant? If the car owner has no choice at all in how the car behaves, then liability shifts to the manufacturer, which manufacturers would like to avoid.
It is clear that programmers need to make choices about how cars react to tricky situations. But this raises some ethical issues, which David Tuffley discusses.
The question of how a self-driving vehicle should react when faced with an unavoidable accident, where every option leads to some number of deaths, was raised earlier this month.
If car makers install a “do least harm” instruction and the car kills someone, they create legal liability for themselves. The car’s AI has decided that a person shall be sacrificed for the greater good.
Had the car’s AI not intervened, it’s still possible people would have died, but it would have been you that killed them, not the car maker.
Now an idea has come along: give drivers the option of choosing settings, so that they, rather than the manufacturer, would be responsible for what happens. But this too raises other ethical issues.
The user gets to choose how ethically their vehicle will behave in an emergency.
The options are many. You could be:
- democratic and specify that everyone has equal value
- pragmatic, so certain categories of person should take precedence, as with the kids on the crossing, for example
- self-centred and specify that your life should be preserved above all
- materialistic and choose the action that involves the least property damage or legal liability.
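To make the idea concrete, the four settings above could be imagined as a user-selectable profile that determines which cost the car minimises in an emergency. The sketch below is purely illustrative: the setting names, the fields describing each crash option (`deaths`, `child_deaths`, `occupant_dies`, `property_damage`), and the `choose_action` function are all hypothetical, not drawn from any real vehicle software.

```python
from enum import Enum, auto

class EthicsSetting(Enum):
    """Hypothetical user-selectable ethics profiles (names are illustrative)."""
    DEMOCRATIC = auto()     # everyone has equal value: minimise total deaths
    PRAGMATIC = auto()      # certain categories take precedence (e.g. children)
    SELF_CENTRED = auto()   # preserve the occupant above all
    MATERIALISTIC = auto()  # minimise property damage / legal liability

def choose_action(setting, options):
    """Pick the crash option that minimises the cost this profile cares about.

    `options` is a list of dicts with hypothetical fields:
    deaths (int), child_deaths (int), occupant_dies (bool),
    property_damage (number).
    """
    if setting is EthicsSetting.DEMOCRATIC:
        key = lambda o: o["deaths"]
    elif setting is EthicsSetting.PRAGMATIC:
        # children first, then total deaths as a tie-breaker
        key = lambda o: (o["child_deaths"], o["deaths"])
    elif setting is EthicsSetting.SELF_CENTRED:
        # False sorts before True, so occupant-safe options win
        key = lambda o: (o["occupant_dies"], o["deaths"])
    else:  # EthicsSetting.MATERIALISTIC
        key = lambda o: o["property_damage"]
    return min(options, key=key)
```

The point of the sketch is that each profile is just a different sort key over the same set of outcomes, which is exactly why shifting the choice of key to the user also shifts the moral (and perhaps legal) responsibility.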
Jason Millar discusses some of the other complex ethical issues these cars present, which resemble the famous trolley problems that ethicists like to pose.
It seems likely that governments will have to step in and make these determinations, or at least set limits on liability (as exist for vaccine makers), to get around some of these complicated ethical issues.