If you had to code an accident, whose life would you choose to save?

By Shira Jeczmien

Published Nov 1, 2018 at 11:09 AM

Reading time: 2 minutes


Over the past seven minutes I’ve been asked to choose whether I would rather kill five young passengers by crashing their self-driving car into a concrete barrier, or whether the car should swerve slightly right, save their little souls and kill five elderly people crossing the road instead. This was during an illustrated MIT ‘Moral Survey’ slideshow that presented me, along with respondents from 223 other countries, with thirteen different scenarios in which a self-driving car would have to be specifically coded to make a moral decision in the event of an imminent and unavoidable fatal accident.

In case you were wondering, I rather reluctantly chose to save the five young people, and before I knew it I was saving an adult, a toddler and a cat over three adults and a toddler; a doctor, a dog and a toddler over three slightly overweight illustrated figures; two joggers, a doctor, a cat and a woman over five adult passengers with a thief amongst them. It’s fascinating how, once the initial hesitation to make such a painfully moral choice had passed, my subconscious began to dictate choices I did not even believe would enter my ethics-driven decision-making process.

See, this is the thing: as Uber, Tesla and Google’s Waymo driverless cars begin to take root in our world (which they will, and already are), there are some pretty serious decisions that need to be coded into their AI. Who to prioritise in the case of an approaching accident is high on that agenda. Sure, the whole point of having AI systems drive on simulated city roads for millions of accumulated hours is to almost entirely eliminate the possibility of a crash, by teaching the AI to drive thousands if not millions of times better than a human. But we all know that mistakes happen. In fact, earlier this year two lives were lost in the U.S. as a result of machine failure: one involving a self-driving Uber that did have a safety driver behind the wheel, albeit one who was not paying attention, and the other a Tesla running on its self-driving system. In Uber’s case the victim was a woman pedestrian whom the car simply did not register, even though the camera footage clearly shows her emerging from the dark as she crossed the road; in Tesla’s case, it was the man driving the car.

Turns out that on the scale of ‘do I prefer to kill fewer or more people’ I scored quite heavily towards the “does not matter” end of the spectrum (in contrast to the average). But hey, at least I prioritised humans over pets and women over men (sorry, I’m in a female-comradeship kind of mood). On a more serious note, beyond the fascinating scores this MIT survey has collected over the past four years on how different cultures value life, and their preferences around fitness, age, animal or human, what the makers of these cars of the future now face is making real decisions that will guide AI to prioritise which kind of life is more valuable; a decision that I believe no one has the right to make.
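Part of what makes this so unsettling is how banal the mechanics could be. The sketch below is entirely hypothetical, my own invention for illustration and not anything from the survey or from any real vehicle’s code; every class, function name and weight in it is an assumption. It simply shows how a survey-style dilemma could, in principle, be reduced to a weighted score.

```python
# Hypothetical sketch: a toy rule-based "outcome scorer" for a
# trolley-style dilemma. All names and weights are invented for
# illustration; no real self-driving system is known to work this way.

from dataclasses import dataclass


@dataclass
class Party:
    humans: int       # number of human lives lost in this outcome
    pets: int         # number of animals lost
    avg_age: float    # average age of the humans involved


def outcome_penalty(party: Party,
                    human_weight: float = 1.0,
                    pet_weight: float = 0.1,
                    youth_bonus: float = 0.01) -> float:
    """Lower is 'preferred' by this toy rule set: it penalises harming
    more humans, weights pets far less, and adds a small extra penalty
    for harming younger people (the bias the survey surfaced in me)."""
    penalty = party.humans * human_weight + party.pets * pet_weight
    penalty += party.humans * youth_bonus * max(0.0, 80.0 - party.avg_age)
    return penalty


# The survey's 'stay or swerve' dilemma reduces to comparing penalties:
stay = Party(humans=5, pets=0, avg_age=22.0)    # five young passengers die
swerve = Party(humans=5, pets=0, avg_age=75.0)  # five elderly pedestrians die
choice = "stay" if outcome_penalty(stay) < outcome_penalty(swerve) else "swerve"
```

Under these made-up weights the car “chooses” to swerve and spare the young passengers, which is exactly the kind of value judgment, buried in a handful of numeric parameters, that the article argues no one has the right to hard-code.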

I do admit, however, that projecting my moral ethics from behind a screen, far away from the developers and coders of such systems, is easy. And whether we agree or disagree, these decisions will need to be made. The real question is how such decisions will be made, who makes them and in what capacity. Within just thirteen slides, the survey was able to present back to me my own bias, as a twenty-something-year-old woman, to favour young people and women. As the technology around us becomes more automated, more powerful and with a stronger grip on everything around us, these types of moral questions—or moral surveys—are only set to increase. So if instilling morals into the machines that will serve us is set to be the course of the next few years, we had better make sure there is a fair representation of demographics coding these decisions. Otherwise we could accidentally end up with an army of cars that prefer cats over toddlers. Just saying.
