
Wednesday, January 3, 2018

Trolley Problem and Murder Hospital


  • The Trolley Problem is just a philosophical construct, but some variant of the no-win situation will be examined in autonomous cars (at least in simulation) 
  • This post presents the ethical case for the trolley (or an autonomous car) to hit the group of 5 people, rather than the single person

Full Story:
If you've read much about autonomous cars, then you've heard of "The Trolley Problem." If you haven't encountered this, count yourself lucky.

To briefly recap, here's how The Trolley Problem goes:
A runaway trolley car is barreling down the tracks. There's no way for you to stop it. Ahead, on the tracks, there are 5 people tied down, unable to move. The trolley is headed straight for them. You're standing next to a track lever. If you pull the lever, the trolley will divert to a different track and the 5 will be safe. However, there's 1 person tied to the other track. What do you do?

Each time I've encountered it, I thought, "Who cares? In the real world, it will never come up. It is just a philosophical debate of no consequence." Chatting with a friend, the topic came up again, and after my "This doesn't matter" objections, we agreed that, if for no other reason than the press around the topic, some variation of the trolley problem will be put into the training simulators for self-driving cars. The cars will have to do something; the system will have to make a choice. What choice should an autonomous car make in a no-win situation?

You have two standard options:
1) Do nothing, and the trolley kills the 5 people
2) Pull the lever, diverting the trolley, where it kills 1 person

You can save five lives by sacrificing one. Would you do it?

Assuming you know nothing about the people, the utilitarian answer seems to be: Pull the lever because 5 is greater than 1. You've saved a net-sum of 4 lives.

Let’s continue that reasoning. If you were sitting on a bridge above the tracks, and you saw the trolley heading towards 5 people, and you knew that you could shove the person next to you off the bridge, thereby derailing the trolley and saving 5, should you do it?

In this second case, most people now say 'No' because murder.

The net results are the same in each case: based on your actions, 1 person dies to save 5. Yet these two feel very different. I propose that the do-nothing response in the 2nd case (the bridge) is the right one for both situations.

In both cases, the 5 people are in danger (by who knows what cause) and the 1 person has not put themselves in danger (they were off the active track…). So the 5 must be the ones to suffer the consequences of their circumstances and it is wrong to force anyone else to suffer on their behalf.

To clarify this, let's look at one more example, "Murder Hospital." Let's assume there's a national organ registry. This registry is periodically analyzed. If the analysis finds that your organs could save 5 (or more) people, then you would be rounded up for harvesting.

My guess is that you would not like to live in a world with such a system. Even if they told you that you would save 5 lives, plus your skin would be used for grafts to help burn victims and your eyes would restore sight for someone. Your blood would go to help people in an ER. You would save at least 5 lives and help more than a dozen people in total in very positive ways. Saving lives and restoring sight, you should be honored that you've been selected, they'd say. And as part of the package, your family gets a lottery-sized check and will be taken care of for life. One life seems like a small price to pay to bring life and joy to so many.

So we should implement forced organ harvesting immediately, right? Of course not! You might feel sympathy for these sick people and you may donate money to their causes or volunteer time to their organizations, but sacrificing your life for people that you don't know is asking too much. In the end, the tragic situations of their lives are theirs to deal with; reasonable help and support are all that should be expected.

These life and death choices are not made by the simple utility of the outcome. They have to be based on the fairness of the situation. As Murder Hospital demonstrates, sacrificing an uninvolved bystander without their consent is wrong, even if it saves a greater number of people.

The ethical case for the trolley (or an autonomous car) to run over five people directly in front of it, rather than one bystander

Applying this to autonomous cars, it means that the car doesn’t swerve into the smaller crowd, to avoid the larger one. If the car cannot avoid the accident (avoidance is, of course, always preferred), then it does all that it can (such as braking) to mitigate the damage to the people or vehicle that is right in front of it and then it just lets fate take its course.

This has several advantages:
  1. This action is "more human-like." In the midst of an accident, no one is going to go through the ethical debate of which way to swerve. It is far more likely that they would just hit the brakes.
  2. It is easier to program. If a collision is imminent, look for a clear path; if one cannot be found, then brake.
  3. You won’t have video footage of a car flying off the road (e.g., avoiding a bus) and hitting innocent pedestrians.
Advantages aside, self-driving cars should act this way because it is the right thing to do. I assert that self-driving cars should act to minimize involvement (rather than simply reduce total utilitarian harm). When there's an imminent accident, the people, be they pedestrians crossing the street or in an oncoming car, are already involved. They voluntarily entered the arena where cars traverse. In doing so, they took on some measure of risk and responsibility. I'm not saying this means they should be hit; rather, uninvolved others that are on the sidewalk or other safe areas should not be sacrificed to avoid hitting those that are already involved in (or the cause of) the incident.
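The "easier to program" claim above can be illustrated with a few lines of Python. This is purely a sketch of the decision rule described in this post (imminent collision: steer to a clear path if one exists, otherwise brake in place); the `Path` type and function names are hypothetical, not from any real autonomous-driving system:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Path:
    """A candidate trajectory; `clear` means no person or obstacle on it."""
    name: str
    clear: bool

def collision_response(imminent: bool, candidate_paths: List[Path]) -> str:
    """Minimize-involvement rule: if a collision is imminent, take a clear
    path when one exists; otherwise brake in the current lane rather than
    swerving into uninvolved bystanders."""
    if not imminent:
        return "continue"
    for path in candidate_paths:
        if path.clear:
            return f"steer:{path.name}"
    # No clear path: stay the course, brake, and mitigate.
    return "brake"
```

Note that the rule never weighs how many people are on each occupied path; once no clear path exists, it simply brakes, which is exactly the non-utilitarian choice argued for above.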

So my answer to The Trolley Problem is to stay on the straight tracks. In a car, however, there are no tracks. Cars will have many more options. They can dodge, skid, brake, drift, and more. Autonomous cars will, at some point, have thousands of years of driving experience and skills beyond any human. These cars will be controlled by powerful AI systems that do this one thing (driving) really, really well. With these skills, they will likely find a way to avoid hitting anyone. This brings it all back to the start: For autonomous vehicles, The Trolley Problem won't be a problem.

