Wednesday, January 3, 2018

Trolley Problem and Murder Hospital


  • The Trolley Problem might be a philosophical construct that won't happen in the real world, but the press around it means some variant of it will end up in autonomous car simulators
  • The _ethical_ case for hitting 5 people rather than 1 is presented

Full Story:
If you've talked about (or read about) autonomous cars, then you've heard of "The Trolley Problem." If you haven't encountered this, count yourself lucky.

To briefly recap, here's how The Trolley Problem goes:
A runaway trolley is barreling down the tracks, and there's no way for you to stop it. Ahead on the tracks, there are 5 people tied down, unable to move. The trolley is headed straight for them. You're standing next to a track lever. If you pull the lever, the trolley will divert to a different track and the 5 will be safe. However, there's 1 person tied to the other track. What do you do?

Each time I've encountered it, I thought, "Who cares? In the real world, it will never come up. It's just a philosophical debate of no consequence." Chatting with a friend, the topic came up again, and after my "this doesn't matter" objections, we agreed that, if for no other reason than the press around the topic, some variation of the trolley problem will be put into the training simulators for self-driving cars. The cars will have to do something; the system will have to make a choice. What should it be?

You have two standard options:
1) Do nothing, and the trolley kills the 5 people
2) Pull the lever, diverting the trolley, where it kills 1 person

You can save five lives by sacrificing one. Would you do it?

Assuming you know nothing about the people, the utilitarian answer seems to be: pull the lever, because 5 is greater than 1. You've saved a net of 4 lives.

Let’s continue that reasoning. If you were sitting on a bridge above the tracks, and you saw the trolley heading towards 5 people, and you knew that you could shove the person next to you off the bridge, thereby derailing the trolley and saving 5, should you do it?

In this second case, most people say 'No', because that would be murder.

The results are the same in each case, 1 person dies to save 5 based on your actions; yet, these two feel very different. I propose that the response in the 2nd case (the bridge), of doing nothing, is the right one for both situations.

In both cases, the 5 are the ones that are in danger (by who knows what cause) and the 1 has not put themselves in danger (they were off the active track…). So the 5 must be the ones to suffer the consequences of their circumstances and it is wrong to force anyone else to suffer on their behalf.

Let's look at one more example, Murder Hospital, to make this point. Imagine there were a national organ registry that was analyzed periodically. If the analysis found that harvesting your organs could save 5 or more people, then you would be rounded up for harvesting.

My guess is that you would not like to live in a world with that system. Even if they told you that you would save 5 lives, that your skin would be used for grafts to help burn victims, and that your eyes would restore someone's sight. Your blood would go to help people in an ER. You would help more than a dozen people in very positive ways. Saving lives and restoring sight, you should be honored that you've been selected. And as part of the package, your family gets a lottery-sized check and will be taken care of for life. One life seems like a small price to pay to bring life and joy to so many.

So we should implement forced organ harvesting immediately, right? Of course not! You might feel sympathy for these sick people and you may donate money to their causes or volunteer time to their organizations, but sacrificing your life for people that you don't know is asking too much. In the end, the tragic situations of their lives are theirs to deal with; reasonable help and support are all that should be expected.

These life and death choices are not made by the simple utility of the outcome. They have to be based on the fairness of the situation. As Murder Hospital demonstrates, sacrificing an uninvolved bystander without their consent is wrong, even if it saves the greater number of people.

The ethical case for an autonomous car to run over five people instead of one

Applying this to autonomous cars, it means that the car doesn't swerve into the smaller crowd to avoid the larger one. If the car cannot avoid the accident (avoidance is, of course, always preferred), then it does all it can (such as braking) to mitigate the damage to what is right in front of it and then lets fate take its course.

This has several advantages:
  1. This action is "more human." In the midst of an accident, no one is going to go through the ethical debate of which way to swerve. It is far more likely that they would just hit the brakes.
  2. It is easier to program. Look for a clear path; if one cannot be found, then brake.
  3. You won’t have bystander video footage of a car flying off the road and hitting innocent pedestrians.
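The "easier to program" point can be sketched in a few lines. This is a toy illustration only, with made-up names (`choose_action`, `paths_clear`); a real planner evaluates continuous trajectories and sensor data, not a handful of booleans:

```python
def choose_action(paths_clear: list) -> str:
    """Toy decision rule: take a clear path if one exists; otherwise brake.

    paths_clear[i] is True if candidate path i is free of obstacles.
    (Hypothetical interface -- real autonomy stacks are far more complex.)
    """
    for i, clear in enumerate(paths_clear):
        if clear:
            return "steer path %d" % i
    # No clear path: stay the course and shed as much speed as possible.
    return "brake"

print(choose_action([False, True, False]))   # a clear path exists -> take it
print(choose_action([False, False, False]))  # no clear path -> brake
```

Note that the rule never weighs one group of people against another; when every path is blocked, it simply brakes, which is exactly the "stay the course" behavior argued for above.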
Advantages aside, self-driving cars should act this way because it is the right thing to do. I assert that self-driving cars should act to minimize involvement (rather than to minimize simple utilitarian harm). When there's an imminent accident, the people, be they pedestrians crossing the street or occupants of an oncoming car, are already involved. They voluntarily entered the arena where cars traverse. In doing so, they took on some measure of risk and responsibility.

So my answer to The Trolley Problem is to stay on the straight tracks. In a car, however, there are no tracks. Cars will have many more options: they can dodge, skid, brake, drift, and more. Autonomous cars will, at some point, have thousands of years of driving experience and skills beyond any human. These cars will be controlled by powerful AIs that do this one thing (driving) really, really well. With these skills, they will likely find a way to avoid hitting anyone. Which all brings me back to the start: The Trolley Problem won't be a problem.