Self-Driving Cars Could Be Programmed To Kill You In Order To Save Other Lives

Google’s Lexus RX 450h Self-Driving Car. Image Credit: Wikimedia

For the past few years, Google’s self-driving cars have been traveling around the country to test how well they work and how safe they are. The cars have been involved in a total of 13 accidents; however, the self-driving vehicles were not responsible for any of them.

Most of the accidents occurred when the cars were rear-ended by other drivers who were not paying attention. The cars have logged 1,011,338 miles in self-driving mode since 2009, and nearly another million with a driver at the wheel.

The cars are maneuvered by complex algorithms that account for everything from weather conditions to red lights to evasive moves in accident situations.

As these algorithms are refined, some very strange ethical questions have begun to arise. For example, is it ethical for a car to be programmed to kill you if doing so would save the lives of many other people?

That is a question researchers at the University of Alabama at Birmingham are currently considering.

UAB researcher Ameen Barghi said:

“Imagine you are in charge of the switch on a trolley track. The express is due any minute; but as you glance down the line you see a school bus, filled with children, stalled at the level crossing. No problem; that’s why you have this switch. But on the alternate track there’s more trouble: Your child, who has come to work with you, has fallen down on the rails and can’t get up. That switch can save your child or a bus-full of others, but not both. What do you do?”

Barghi continued:

“Utilitarianism tells us that we should always do what will produce the greatest happiness for the greatest number of people.” In other words, if it comes down to a choice between sending you into a concrete wall or swerving into the path of an oncoming bus, your car should be programmed to do the former. Deontology, on the other hand, argues that some values are simply categorically always true. For example, murder is always wrong, and we should never do it. Even if shifting the trolley will save five lives, we shouldn’t do it because we would be actively killing one.
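
To make the contrast concrete, here is a minimal, purely illustrative sketch of how the two decision rules could be encoded. This is not code from Google or UAB; the maneuver names, the casualty estimates, and the idea of an `actively_kills` flag are all hypothetical assumptions invented for illustration.

```python
# Hypothetical sketch: a utilitarian rule versus a deontological rule
# for choosing an emergency maneuver. None of these names or numbers
# come from any real self-driving system.

from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    expected_deaths: int   # projected casualties if this maneuver is taken
    actively_kills: bool   # True if the car itself directly causes a death

def utilitarian_choice(options):
    """Pick whichever maneuver minimizes total projected deaths."""
    return min(options, key=lambda m: m.expected_deaths)

def deontological_choice(options):
    """Rule out any maneuver in which the car actively kills someone;
    among the remaining options, still prefer fewer projected deaths."""
    permissible = [m for m in options if not m.actively_kills]
    return min(permissible or options, key=lambda m: m.expected_deaths)

options = [
    Maneuver("swerve into wall", expected_deaths=1, actively_kills=True),   # kills the passenger
    Maneuver("stay on course",   expected_deaths=5, actively_kills=False),  # hits the oncoming bus
]

print(utilitarian_choice(options).name)    # -> swerve into wall
print(deontological_choice(options).name)  # -> stay on course
```

The point of the sketch is simply that the two ethical frameworks produce opposite answers from the same inputs, which is exactly the design decision the researchers are weighing.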

Would you drive a car that was programmed to kill you in certain situations, even if it meant saving the lives of many other people?

John Vibes writes for True Activist and is an author, researcher and investigative journalist who takes a special interest in the counterculture and the drug war.
