Opinion

Big Data vs The Sesh: The bigger problem with Uber knowing when you’re too drunk

By Jack Palfrey

Updated May 16, 2020 at 10:07 AM

Reading time: 3 minutes


Seemingly oblivious (or perhaps not at all) to its key demographic, Uber was recently revealed to be developing a new AI system that can tell if users are drunk, allowing drivers to choose whether to accept a ride based on a variety of metrics such as your walking speed, frequent typos, and whether you're swaying around or holding your phone at a weird angle. As information technology colonises all aspects of day-to-day life, what happens when our drunken behaviour (aka our worst selves) falls into the profit-focussed world of Big Data too?
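To make this more concrete: Uber hasn't published how its system works, but a purely illustrative sketch of how the reported signals (walking speed, typo frequency, swaying or phone angle, time of night) might be combined into a single 'drunkenness' score could look something like the toy heuristic below. Every threshold, weight and name here is hypothetical, invented for the example rather than taken from Uber.

```python
from dataclasses import dataclass

@dataclass
class RideRequestSignals:
    """Hypothetical behavioural signals a phone could capture before a booking."""
    walking_speed_mps: float  # average walking speed from GPS/accelerometer
    typo_rate: float          # corrected keystrokes per character typed
    sway_variance: float      # variance of device tilt while walking
    local_hour: int           # hour of day, 0-23

def intoxication_score(s: RideRequestSignals) -> float:
    """Toy heuristic: combine weighted, normalised signals into a 0-1 score.

    A real system would be a trained model, not hand-tuned rules; the
    weights and cut-offs below are invented purely for illustration.
    """
    score = 0.0
    if s.walking_speed_mps < 0.9:                   # unusually slow gait
        score += 0.3
    score += min(s.typo_rate / 0.25, 1.0) * 0.3     # lots of corrected typos
    score += min(s.sway_variance / 0.5, 1.0) * 0.3  # unsteady, oddly angled phone
    if s.local_hour >= 23 or s.local_hour <= 4:     # late-night request
        score += 0.1
    return min(score, 1.0)

# Example: a slow, swaying, typo-heavy request at 1am scores high
late_night = RideRequestSignals(walking_speed_mps=0.7, typo_rate=0.2,
                                sway_variance=0.6, local_hour=1)
print(f"estimated intoxication score: {intoxication_score(late_night):.2f}")
```

Even in this toy form, the problem is obvious: slow walking, typos and a wobbly phone are proxies, not a measurement of how much someone has actually drunk, and that gap is exactly where the risks discussed below come in.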

Even before speculating on how this data could be used in more sinister ways, there's a more obvious way it poses a risk to users. When you consider Uber's pretty terrible track record of reporting sexual assaults committed by its drivers, the idea that drivers would be able to spot drunk and vulnerable people when choosing whether to take a job is obviously dangerous and could easily be abused. There's also the issue that for young women in particular, if it's late at night and you're drunk and alone, Uber can be a safer and quicker alternative to public transport. If these users are unable to book a lift home because they appear to be too intoxicated (bear in mind this is superficial digital data being used to measure a chemical state), then it could put them at even greater risk.

Of course, there's also the chance that this won't extend beyond the development phase. After all, that's one perk of being a multi-billion-dollar tech company: you can pump money and resources into developing ridiculous ideas and, if they don't work out, simply move on to the next. Still, I think it raises some interesting questions about the dangers posed by the accumulation of this kind of data and, in particular, how it could be used against us, whether by Uber or any other private company. It's virtually impossible, after all, for any kind of AI or automation to be totally free from personal, political or corporate bias, instilled consciously or unknowingly at some stage of its development and deployment.

Uber has presented this idea as a way of keeping its drivers safe; however, I think it would be pretty naïve to presume that this is the only motive at play. That's just how the tech industry works: data is capital, and we volunteer to give it all away for the taking. One way Uber could use this data would be to apply surge pricing, ramping up fares for those who appear drunk, knowing they're more likely to accept the additional charge because of their booze-tainted decision-making or, as mentioned earlier, to avoid travelling home alone late at night. For the same reason, the ability to target us when we're drunk would inevitably offer huge opportunities to marketers too.

It's when we start looking at how this technology could be misused in a wider sense that more sinister scenarios arise, such as the feature taking on a more disciplinary role. It almost resembles a form of digital breathalyser, only those doing the policing are the same tech companies whose business models rely on the vast mining of behavioural data for capitalist gain.

Since 2015, a handful of U.S. health insurance companies have been experimenting with how to use wearable technologies to their advantage. Luring customers in with reduced rates, discounts and even cash prizes, some companies have begun getting them to opt in to sharing the health data from their Apple Watches and Fitbits. It's not hard to see how continual access to your biometric information would be of value to insurance companies. In a similar way, if alcoholism falls into this territory, then so-called signs of it in our digital footprint could be used to bar us from a variety of services, be it just a taxi, health insurance, or even access to certain places and areas if the technology were deployed at a more municipal level within a 'Smart City' that uses real-life data to inform its infrastructure and services.

Regardless of whether it does indeed go down this route, it's clear that certain parties have a lot to gain from our drunken behavioural traits being added to the swathes of data we already pour out, posing serious threats in terms of privacy, surveillance, discipline and user safety as a result. It's a pessimistic vision, but it feels like an inevitable step in the profit-driven quest for Big Data to colonise every corner of human social experience, carving out a whole new data set for any interested party to play with as they please.
