"Who would buy a car programmed to sacrifice the owner?"

Teaching self-driving cars who they should kill

#Driving

Sun, Nov 1st, 2015 11:00 by capnasty NEWS

Although self-driving cars will be "safer, cleaner, and more fuel-efficient than their manual counterparts," they will require some ethical programming for the event of an unavoidable accident: should the car prioritize the lives of its occupants, or sacrifice them when doing so would save many more lives? And would people drive a car that could possibly kill them in order to save others?

Here is the nature of the dilemma. Imagine that in the not-too-distant future, you own a self-driving car. One day, while you are driving along, an unfortunate set of events causes the car to head toward a crowd of 10 people crossing the road. It cannot stop in time, but it can avoid killing the 10 people by steering into a wall. However, this collision would kill you, the owner and occupant. What should it do?

One way to approach this kind of problem is to act in a way that minimizes the loss of life. By this way of thinking, killing one person is better than killing 10.
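Stated as an algorithm, this first approach is almost trivially simple. Below is a minimal sketch in Python; the Maneuver type, the expected_fatalities field, and the example numbers are all hypothetical illustrations, not anything from a real autonomous-driving system:

    # Toy sketch of the "minimize loss of life" rule described above.
    # All names here are hypothetical, invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Maneuver:
        name: str
        expected_fatalities: float  # estimated deaths if this maneuver is taken

    def choose_maneuver(options: list[Maneuver]) -> Maneuver:
        # Purely utilitarian: pick whichever option kills the fewest people,
        # with no extra weight given to the occupant over bystanders.
        return min(options, key=lambda m: m.expected_fatalities)

    if __name__ == "__main__":
        options = [
            Maneuver("continue into crowd", expected_fatalities=10.0),
            Maneuver("swerve into wall", expected_fatalities=1.0),  # kills the occupant
        ]
        print(choose_maneuver(options).name)  # prints "swerve into wall"

Everything contentious is hidden in the fatality estimates and in the deliberate absence of any weighting for the occupant, which is exactly the choice the next paragraph complicates.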

But that approach may have other consequences. If fewer people buy self-driving cars because they are programmed to sacrifice their owners, then more people are likely to die because ordinary cars are involved in so many more accidents. The result is a Catch-22 situation.
