The Trolley Problem

The Trolley Problem is a false dilemma once it is framed correctly. Most people take a mistaken utilitarian view when facing the first part of the test, while a majority makes the correct decision when the problem is re-framed in a follow-up scenario.

Description: You are the driver of a runaway tram which you can only steer from one narrow track onto another; five men are working on one track and one man on the other. A utilitarian view asserts that it is obligatory to steer to the track with one man on it, and most people select this option when faced with the problem. A follow-up problem is: you are on a bridge under which the trolley will pass, and you can stop it by putting something very heavy in front of it. As it happens, there is a very fat man next to you, so your only way to stop the trolley is to push him over the bridge and onto the track, killing him to save five. Should you proceed? In this case, most people choose not to push the fat man to his death.

The dilemma is false for several reasons:

First, a clear moral law shared by all mankind is “do not kill your fellow human being”. By willingly directing the trolley one way or the other, you violate this law. And if you do, you will rightfully rot in prison for killing an innocent person.

Second, to make this violation palatable, people are told that they “save” five lives at the expense of only one. This is wrong: we all die sooner or later, so “saving” someone really means “delaying their death” by an unknown duration. “Delaying death” is not as appealing as “saving”, but it is the reality. In addition, “utility” can be measured in many different ways, and its value will differ every time. For instance, the five people may be doomed for other reasons, while the single man may be essential to the survival of the community. The decision could also be seen as self-serving if one of those at risk has a special relationship to the driver. It is neither our duty nor our competence to decide between lives by killing someone.

Third, it is possible that the two options presented are not what they seem and that the trolley is harmless if left alone. Maybe the trolley is set to derail, due to speed and terrain, before reaching the five people, or maybe the five people are protected by an unseen safety device.

The Trolley scenario is expected to become a hot topic as Artificial Intelligence expands and eventually has to handle these kinds of situations. However, AI is just a tool designed by its creators, who remain fully responsible for its actions. If the machine is programmed to sacrifice one person or the other, its creators will be liable, just as current equipment makers are liable for any harm caused by their devices.

In conclusion, we have a moral duty to help others, but we should refuse to participate in life-tradeoff experiments. Those who do participate anyway are liable for their actions.
