If you had to code an accident, whose life would you choose to save?

By Shira Jeczmien

Published Nov 1, 2018 at 11:09 AM

Reading time: 2 minutes

Over the past seven minutes I’ve been asked to choose whether I would rather kill five young passengers by crashing their self-driving car into a concrete barrier, or have the car swerve slightly right, save their little souls and kill five elderly people crossing the road instead. This was during MIT’s illustrated ‘Moral Machine’ survey, which presented me, along with respondents from 223 other countries, with thirteen scenarios in which a self-driving car would have to be specifically coded to make a moral decision in the event of an imminent and unavoidable fatal accident.

In case you were wondering, I rather reluctantly chose to save the five young people, and before I knew it I was saving an adult, a toddler and a cat over three adults and a toddler; a doctor, a dog and a toddler over three slightly overweight illustrated figures; two joggers, a doctor, a cat and a woman over five adult passengers with a thief amongst them. It’s fascinating how, once the initial hesitation to make such painfully moral choices faded, my subconscious began to dictate choices I did not even believe would enter my ethics-driven decision-making process.

See, this is the thing: as Uber, Tesla and Google’s Waymo driverless cars begin to take root in our world (which they will, and already are), there are some pretty serious decisions that need to be coded into their AI. Who to prioritise in the case of an approaching accident is high on the agenda. Sure, the whole point of having AI systems drive millions of accumulated hours on simulated city roads is to almost entirely eliminate the possibility of a crash by teaching the AI to drive thousands if not millions of times better than a human. But we all know that mistakes happen. In fact, this year alone two lives were lost in the U.S. as a result of machine failure: one involving a self-driving Uber that did have a safety driver, who was not paying attention, and the other involving a Tesla driving itself. In Uber’s case the victim was a woman pedestrian whom the car simply did not register, even though camera footage clearly shows her emerging from a dark stretch of road; in Tesla’s case, it was the man behind the wheel.

It turns out that on the scale of ‘do I prefer to kill fewer or more people’ I scored quite heavily towards the “does not matter” end of the spectrum (in opposition to the average). But hey, at least I preferred humans over pets and women over men (sorry, I’m in a female-comradeship kind of mood). On a more serious note, despite the fascinating data this MIT survey has collected over the past four years on how different cultures value life, weighing fitness, age, and animal against human, the makers of these cars of the future now face real decisions that will guide AI to prioritise which kind of life is more valuable; a decision that I believe no one has the right to make.

I do admit, however, that projecting my moral ethics from behind a screen, far away from the developers and coders of such systems, is easy. And whether we agree or disagree, these decisions will need to be made. The real questions are how such decisions will be made, who makes them, and in what capacity. Within just thirteen slides, the survey was able to present back to me my own bias, as a twenty-something woman, towards young people and women. As the technology around us becomes more automated, more powerful and with a stronger grip on everything around us, these types of moral questions, and moral surveys, are only set to increase. So if instilling morals into the machines that will serve us is to be the course of the next few years, we had better make sure a fair representation of demographics is coding these decisions. Otherwise we could accidentally end up with an army of cars that prefer cats over toddlers. Just saying.
