If you think Uber is just taking the mick with its prices and you’ve become an avid Bolt user like me instead, you’ll probably have noticed a little red helicopter pop up as a new ride option. So, what the hell is it? After seeing a TikTok video with nearly 3 million views of a UK user attempting to book a ride, followed by the sound of a helicopter circling over the content creator, I just had to see for myself. At just a few pounds more than a normal car ride, it seemed too good to be true.
Bolt’s new addition had the internet stumped, with one Twitter user writing, “me and my mate just checked taxis we could get to this rave in Brixton and one of the options on Bolt is a helicopter and it’s only £1 more than a car? Hang on, what?” Another wrote, “Can anyone tell me why Bolt is saying I can get a helicopter to East Croydon for £5?” Another simply asked, “Has anyone [even] tried the helicopter on Bolt?” And so, like the gullible fool I am, I attempted to book a helicopter… Fact check, guys: it’s not real—obviously. Consider this a gentle reminder not to believe everything you see on TikTok.
What is this new feature then? While you can’t catch a helicopter to lunch just yet, the reason behind the new addition is actually a charitable one. The new ride option is part of a collaboration between Bolt and London’s Air Ambulance Charity (LAA) to honour its dedicated and heroic trauma care service for London civilians, which has spanned over 32 years. The charity is heavily reliant on donations from the public to cover its running costs, and so Bolt’s initiative—which became available on 23 August 2021—aims to help London’s Air Ambulance raise money through the ride-hailing app.
Sharing on the company’s blog, Bolt stated that “London’s Air Ambulance via Bolt is a special ride-type in our app which works just like a normal Bolt ride.” It’s still in a car, “the only difference is that our rides will include an extra 50p cost which is directly given to London’s Air Ambulance Charity as a donation… Along the way, we’ll [also] be sharing more information about the vital work that London’s Air Ambulance Charity does.”
The special in-app option will run until the end of Air Ambulance Week, which falls between 6 and 12 September 2021, but it won’t be the last charitable initiative taken by the ride-hailing company—with Bolt pledging a year of fundraising events. In a first-time partnership for both parties, the Bolt logo will “also appear on the iconic red helicopter, designating Bolt as an official corporate partner.”
CEO of LAA, Jonathan Jenkins, stated, “We are extremely grateful to Bolt for their generous support of the charity, especially following the pandemic. This contribution will help us save more lives in London and will also help to raise awareness of London’s Air Ambulance Charity throughout the capital. Thank you to all of the team at Bolt, we are delighted to be your first London charity partner.”
The partnership is not only a dedication to and appreciation of LAA but also Bolt’s way of saying thank you to Londoners themselves. Bolt’s UK regional manager, Sam Raciti, said, “London has been such a welcoming home to Bolt for two years that our team wanted to say a proper thanks.” He continued, “Like other charities, LAA has faced a difficult 18 months—but the amazing paramedics and doctors have continued to serve London and save lives throughout, and we’re delighted to play our part in helping to keep the red helicopters flying.”
So while you might not get to fly, next time you need a ride, select Bolt’s helicopter option to help somebody else too—you’ll feel better about the fact that you’ve taken yet another five-minute ride instead of a walk, trust me.
Seemingly oblivious (or, more likely, not at all) to its key demographic, Uber was recently revealed to be developing a new AI system that can tell if users are drunk, allowing the driver to choose whether to accept a ride based on a variety of metrics such as your walking speed, frequent typos and whether you’re swaying around or holding the phone at a weird angle. As information technology colonises all aspects of day-to-day life, what happens when our drunken behaviour—aka our worst selves—falls into the profit-focussed world of Big Data too?
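To picture how signals like these could be stitched into a single judgement, here’s a minimal, entirely hypothetical sketch. Uber hasn’t published how its system actually works, so the function name, thresholds and weights below are all invented for illustration only:

```python
# Hypothetical sketch: combining reported signals (walking speed, typo
# frequency, phone tilt) into a crude "impairment" score between 0 and 1.
# All thresholds and weights are made up for illustration; this is not
# Uber's method, which has never been made public.

def impairment_score(walking_speed_mps, typo_rate, phone_tilt_deg):
    """Return a 0-1 score; higher suggests more erratic behaviour."""
    score = 0.0
    if walking_speed_mps < 0.8:   # unusually slow, unsteady walking
        score += 0.4
    if typo_rate > 0.25:          # more than 1 in 4 keystrokes corrected
        score += 0.3
    if abs(phone_tilt_deg) > 45:  # phone held at an odd angle
        score += 0.3
    return round(score, 2)

# A rider walking slowly, mistyping often, with a tilted phone:
print(impairment_score(0.5, 0.3, 60))   # 1.0
# A steady walker typing cleanly:
print(impairment_score(1.4, 0.05, 10))  # 0.0
```

Even a toy version like this makes the core worry concrete: superficial sensor readings get flattened into a single number, and a few arbitrary thresholds decide whether you’re labelled drunk.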
Even before speculating on how this data could be used in sinister ways, there’s a more obvious reason it poses a risk for users. When you consider Uber’s pretty terrible track record at reporting sexual assaults committed by its drivers, the idea that drivers would be able to spot drunk and vulnerable people when choosing whether to take the job is obviously dangerous and could easily be abused. There’s also the issue that for young women in particular, if it’s late at night and you’re drunk and alone, Uber can be a safer and quicker alternative to public transport. If those users are unable to book a lift home because they appear too intoxicated—bear in mind this is using superficial digital data to measure a chemical imbalance—then it could be putting them at even greater risk.
Of course, there’s also the chance that this won’t extend beyond the development phase. After all, that’s one perk of being a multi-billion dollar tech company: you can pump a bunch of money and resources into developing ridiculous ideas and, if they don’t work, just move on to the next. Still, I think it raises some interesting questions about the dangers posed by the accumulation of this kind of data and, in particular, how it could be used against us—by Uber or any other private company. After all, it’s virtually impossible, by its very nature, for any kind of AI or automation to be totally free from personal, political or corporate bias, instilled consciously or unknowingly at some stage in its development and deployment.
Uber has presented this idea as a way of keeping its drivers safe; however, I think it would be pretty naïve to presume that this is the only motive at play. That’s just how the tech industry works: data is capital, and we volunteer to give it all away for the taking. One way Uber could use this data would be to apply surge pricing, ramping up the price for those who appear drunk—knowing they’re more likely to accept the additional charges because of their booze-tainted decision-making or, as mentioned earlier, to avoid having to travel home alone late at night. For the same reason, the ability to target us when we’re drunk would inevitably offer huge opportunities to marketers too.
It’s when we start looking at how this technology could be misused in a wider sense that more sinister scenarios arise, such as the feature taking on a more disciplinary usage. It almost resembles a form of digital breathalyser, only those doing the policing are the same tech companies whose business models rely on the vast mining of behavioural data for capitalistic gain.
As far back as 2015, a handful of US health insurance companies started experimenting with how they could use wearable technologies to their advantage. Luring customers in with reduced rates, discounts and even cash prizes, some companies have begun getting them to opt in to sharing the medical data from their Apple Watches and Fitbits. It’s not hard to see how continual access to your biometric information would be of value to insurers. In a similar way, if alcoholism falls into this kind of territory, then so-called signs of it in our digital footprint could be used to exclude us from a variety of services—be it just a taxi, health insurance, or even access to certain places and areas if deployed at a more municipal level within a ‘Smart City’ that uses real-life data to inform its infrastructure and services.
Regardless of whether it does indeed go down that route, it’s clear that there’s a lot to be gained for certain parties from our drunken behavioural traits being added to the swarms of data we already pour out—posing serious threats in terms of privacy, surveillance, discipline and user safety as a result. It’s a pessimistic vision, but it feels like an inevitable step in the profit-driven quest for Big Data to colonise all corners of human social experience, carving out a whole new data set for any interested party to play with as they please.