If you had to code an accident, whose life would you choose to save?

By Shira Jeczmien

Published Nov 1, 2018 at 11:09 AM

Reading time: 2 minutes

Over the past seven minutes I’ve been asked to choose whether I would rather kill five young passengers by crashing their self-driving car into a concrete barrier, or whether the car should swerve slightly right, save their little souls and kill five elderly people crossing the road instead. This was during an illustrated slideshow from MIT’s ‘Moral Machine’ survey, which presented me, and others like me from 233 countries and territories, with thirteen different scenarios in which a self-driving car would have to be specifically coded to make a moral decision in the event of an imminent and unavoidable fatal accident.

In case you were wondering, I rather reluctantly chose to save the five young people, and before I knew it I was saving an adult, a toddler and a cat over three adults and a toddler; a doctor, a dog and a toddler over three slightly overweight illustrated figures; two joggers, a doctor, a cat and a woman over five adult passengers with a thief amongst them. It’s fascinating how, once the initial hesitation to make such a painfully moral choice wore off, my subconscious began to dictate choices I did not even believe would come into my ethics-driven decision-making process.

See, this is the thing: as Uber, Tesla and Google’s Waymo driverless cars begin to take root in our world (which they will, and already are), there are some pretty serious decisions that need to be coded into their AI. Who to prioritise in the case of an approaching accident is high on that agenda. Sure, the whole point of having AI systems drive millions of accumulated hours on simulated city roads is to almost entirely eliminate the possibility of a crash by teaching the AI to drive thousands if not millions of times better than a human. But we all know that mistakes happen. In fact, earlier this year two lives were lost in the U.S. as a result of machine failure: one in a crash involving a self-driving Uber that did have a safety driver behind the wheel, though one who was not paying attention, and the other involving a Tesla driving itself. In Uber’s case the victim was a woman pedestrian whom the car simply did not register, even though the camera footage clearly shows her emerging from the dark as she crossed the road; in Tesla’s case, it was the man behind the wheel.
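If “coding a moral decision” sounds abstract, here is a minimal, purely hypothetical Python sketch of what a hard-coded outcome-ranking rule could look like. The Outcome fields, the weights and the choose function are all inventions for illustration; nothing here reflects how Uber, Tesla or Waymo actually rank collision outcomes.

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    """One possible result of an unavoidable crash (hypothetical model)."""
    label: str
    pedestrians_harmed: int
    passengers_harmed: int
    pets_harmed: int

# Hypothetical weights. Deciding who "counts" for how much is exactly the
# moral choice the survey forces on you, here reduced to three constants.
PEDESTRIAN_WEIGHT = 1.0
PASSENGER_WEIGHT = 1.0
PET_WEIGHT = 0.1

def score(o: Outcome) -> float:
    """Lower is 'better' under this toy policy."""
    return (o.pedestrians_harmed * PEDESTRIAN_WEIGHT
            + o.passengers_harmed * PASSENGER_WEIGHT
            + o.pets_harmed * PET_WEIGHT)

def choose(outcomes: list[Outcome]) -> Outcome:
    """Pick the outcome with the lowest weighted harm score."""
    return min(outcomes, key=score)

# The survey's opening dilemma, roughly: hit the barrier or swerve.
stay = Outcome("hit the barrier", pedestrians_harmed=0, passengers_harmed=5, pets_harmed=0)
swerve = Outcome("swerve right", pedestrians_harmed=5, passengers_harmed=0, pets_harmed=0)

# Both outcomes score 5.0, so min() just falls back to list order.
# Breaking that tie deliberately is precisely the decision the article
# argues no single group of engineers should make on everyone's behalf.
print(choose([stay, swerve]).label)
```

The point of the sketch is that the entire moral argument of this article can end up living in a few arbitrary constants that somebody, somewhere, has to type in.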

Turns out that on the scale of ‘do I prefer to kill fewer or more people’ I scored quite heavily towards the “does not matter” end of the spectrum (in opposition to the average). But hey, at least I preferred humans over pets and women over men (sorry, I’m in a female-comradeship kind of mood). On a more serious note, beyond the fascinating scores this MIT survey has collected over the past four years on how different cultures value life, weighing fitness, age, and animal against human, the makers of these cars of the future must now take real decisions that will guide AI to prioritise which kind of life is more valuable; a decision that I believe no one has the right to make.

I do admit, however, that projecting my moral judgements from behind a screen, far away from the developers and coders of such systems, is easy. And whether we agree or disagree, these decisions will need to be made. The real questions are how such decisions will be made, who makes them, and in what capacity. Within just thirteen slides, the survey was able to reflect back to me my own bias, as a twenty-something-year-old woman, towards young people and women. As the technology around us becomes more automated, more powerful and tighter in its grip on everything around us, these types of moral questions (or moral surveys) are only set to increase. So if instilling morals into the machines that will serve us is to be the course of the next few years, we had better make sure a fair representation of demographics is coding these decisions. Otherwise we could accidentally end up with an army of cars that prefer cats over toddlers. Just saying.
