
YouTuber tests whether a Tesla autopilot would run over a cat, a kangaroo and Elon Musk

Apparently, having the Massachusetts Institute of Technology (MIT) confirm back in September 2021 that Tesla’s previously praised Autopilot feature is unsafe wasn’t enough proof for Mat Watson, the car enthusiast behind the YouTube channel carwow. Wanting to test the infamous electric vehicle’s limits further, the YouTuber posted a video titled Will a Tesla KILL a cat? on 10 March of this year. In it, as you’ve probably guessed by now, he tried to figure out whether a Tesla left on Autopilot would notice a defenceless cat in the middle of a road and slam on the brakes early enough not to kill it. Lovely.

With just under half a million views to date, the 12-minute video showcases two different vehicles—a Tesla Model 3 and a Volvo V90. Both include (and are known for) automatic safety features that are said to brake for obstacles when the driver isn’t paying attention. That’s why host Watson tested them both against another car, a pedestrian and even animals of several sizes.

“What I’m gonna do is test out the automated braking system on this Tesla to see exactly what it’s designed to recognise,” he said while standing on a professional closed course designed to simulate an urban environment. “Will the Tesla kill Elon? We’re gonna find out.”

When the YouTuber drove both the Tesla and the Volvo at another Model 3 made out of foam, both braked in time. The same happened when he tried to drive straight into a cardboard cutout of Elon Musk. Who would’ve thought, right? Although both cars passed those tests with flying colours, Watson noted that the Tesla braked sooner yet left less distance between itself and the obstacle. Furthermore, the Volvo tightened his seat belt while braking, keeping Watson’s body relatively still and preventing him from hitting his head on the seat, while the Tesla didn’t.

After that, the car enthusiast went on to experiment with a rather large kangaroo stuffed animal, which the Tesla identified and successfully avoided. But when Watson placed smaller animals, like a stuffed dog and a taxidermy cat, in the middle of the road, both cars failed to brake for them. In other words, yes, your Tesla will probably run over a cat if you aren’t paying attention to the road and place too much trust in its Autopilot.

And if it does auto-brake, pressing the accelerator down will override the system, as Watson demonstrated by zooming over a cardboard cutout of Jeff Bezos. It’s safe to say that if you’re looking for a perfect self-driving car, we’re not there yet—so keep your eyes on the road and your hands upon the wheel, please.


New MIT study confirms Tesla’s autopilot is indeed unsafe

A month ago, towards the end of August 2021, the National Highway Traffic Safety Administration (NHTSA) launched an investigation into Tesla’s Autopilot system after it was involved in 11 accidents, resulting in 17 injuries and one death. Now a new study, conducted by the Massachusetts Institute of Technology (MIT), has confirmed how unsafe Elon Musk’s infamous Autopilot feature actually is.

Titled A model for naturalistic glance behavior around Tesla Autopilot disengagements, the study backs up the idea that the electric vehicle company’s “Full Self-Driving” (FSD) system is in fact—surprise, surprise—not as safe as it claims. After following Tesla Model S and X owners during their daily routine for periods of a year or more throughout the greater Boston area, MIT researchers found that, more often than not, drivers become inattentive when using partially automated driving systems. Note here that I went from calling Autopilot a Full Self-Driving system—which is the term Tesla uses to describe it and which implies it is fully autonomous—to then qualifying it as a partially automated driving system, also known as an advanced driver-assistance system (ADAS), which is what it truly is.

“Visual behavior patterns change before and after [Autopilot] disengagement,” the study reads. “Before disengagement, drivers looked less on road and focused more on non-driving related areas compared to after the transition to manual driving. The higher proportion of off-road glances before disengagement to manual driving were not compensated by longer glances ahead.” To be completely fair, it does make sense that drivers would feel less inclined to be attentive when they think their car’s autopilot is fully in control. Only thing is, it isn’t.
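For a concrete sense of the metric behind that quote, here is a minimal Python sketch of the kind of before-versus-after glance-proportion comparison the researchers describe. The ten-second window, the sample format and the labels are assumptions made for illustration, not the study’s actual pipeline.

```python
# Hypothetical sketch of the glance-proportion comparison described above.
# Assumes annotated glance samples of the form (timestamp_seconds, label),
# where label is either "on_road" or "off_road".

WINDOW = 10.0  # seconds before/after disengagement to compare (assumed value)

def off_road_proportion(glances, start, end):
    """Fraction of samples in [start, end) annotated as off-road."""
    labels = [label for t, label in glances if start <= t < end]
    if not labels:
        return 0.0
    return sum(label == "off_road" for label in labels) / len(labels)

def compare_around_disengagement(glances, t_disengage):
    """Off-road glance proportion before vs. after an Autopilot disengagement."""
    before = off_road_proportion(glances, t_disengage - WINDOW, t_disengage)
    after = off_road_proportion(glances, t_disengage, t_disengage + WINDOW)
    return before, after

# Toy example: one glance sample per second, disengagement at t = 10.
samples = [(t, "off_road" if t in (4, 6, 7, 9) else "on_road") for t in range(20)]
before, after = compare_around_disengagement(samples, 10.0)
print(f"off-road glances: {before:.0%} before vs. {after:.0%} after")
```

On this toy data the pattern matches the study’s finding: a higher proportion of off-road glances before the disengagement than after it.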

Meanwhile, by the end of this week, Tesla will roll out the newest version of its Autopilot beta software, version 10.0.1, on public roads, completely ignoring the ongoing federal investigation into the safety of its system. Billionaire tings, go figure.

Musk has also clarified that not everyone who has paid for the FSD software will be able to access the beta version, which promises more automated driving functions. First things first, Tesla will use telemetry data to capture personal driving metrics over a 7-day period in order to ensure drivers are remaining attentive enough. “The data might also be used to implement a new safety rating page that tracks the owner’s vehicle, which is linked to their insurance,” added TechCrunch.
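Tesla hasn’t published how those telemetry metrics gate beta access, but to make the idea concrete, here is a hedged Python sketch of a rolling seven-day check. The event names, penalty weights and threshold below are invented for the example; they are not Tesla’s actual formula.

```python
# Hypothetical sketch of a rolling 7-day attentiveness gate for beta access.
# Event names, weights and threshold are invented; Tesla's formula is unpublished.
from datetime import datetime, timedelta

PENALTIES = {
    "hard_braking": 2.0,
    "forward_collision_warning": 3.0,
    "hands_off_alert": 1.0,
}
THRESHOLD = 10.0  # maximum penalty score tolerated over the trailing week (assumed)

def eligible_for_beta(events, now):
    """events: (timestamp, event_name) telemetry records from the vehicle."""
    week_ago = now - timedelta(days=7)
    score = sum(PENALTIES.get(name, 0.0) for ts, name in events if ts >= week_ago)
    return score <= THRESHOLD

now = datetime(2021, 9, 24)
log = [
    (now - timedelta(days=1), "hard_braking"),
    (now - timedelta(days=3), "hands_off_alert"),
    (now - timedelta(days=10), "forward_collision_warning"),  # outside the window
]
print(eligible_for_beta(log, now))  # True: only 3.0 penalty points this week
```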

In other words, Musk is aware of the risk the current Autopilot system represents, and he’s working hard on improving it, or at least on making sure he’s not going to be the one to blame if more Tesla-related accidents happen. How do you say your autopilot is not an autopilot without clearly saying it, and therefore risking hurting your brand? You release a newer version of it that can easily blame drivers for their carelessness, duh.

“The researchers found this type of behavior may be the result of misunderstanding what the [autopilot] feature can do and what its limitations are, which is reinforced when it performs well. Drivers whose tasks are automated for them may naturally become bored after attempting to sustain visual and physical alertness, which researchers say only creates further inattentiveness,” continued TechCrunch.

My opinion on Musk and Tesla aside, the point of the MIT study is not to shame Tesla, but rather to advocate for driver attention management systems that can give drivers feedback in real time or adapt automation functionality to suit their level of attention. Currently, Tesla’s Autopilot system doesn’t monitor driver attention via eye- or head-tracking—two things the researchers deem necessary.
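To illustrate what such an attention management system could look like in principle, here is a small Python sketch that escalates its response as estimated off-road gaze time grows. The states and thresholds are assumptions for illustration, not any manufacturer’s implementation.

```python
# Hypothetical sketch of a driver attention manager that adapts the level of
# automation to eye/head-tracking input. Thresholds are illustrative only.

OFF_ROAD_WARN = 2.0   # seconds of continuous off-road gaze before a warning (assumed)
OFF_ROAD_LIMIT = 5.0  # seconds before automation is restricted (assumed)

def attention_response(off_road_seconds: float) -> str:
    """Map continuous off-road gaze time to an escalating system response."""
    if off_road_seconds < OFF_ROAD_WARN:
        return "full_assist"           # driver attentive: keep automation engaged
    if off_road_seconds < OFF_ROAD_LIMIT:
        return "visual_audio_warning"  # real-time feedback nudging eyes back
    return "restrict_automation"       # sustained inattention: hand control back

for t in (0.5, 3.0, 6.0):
    print(f"{t}s off-road -> {attention_response(t)}")
```

In a real system the gaze estimate would come from an in-cabin camera and a head-pose model rather than a single number, but the escalation logic is the part the researchers argue Autopilot currently lacks.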

The technology in question—a model for glance behaviour—already exists, with automobile manufacturers like Mercedes-Benz and Ford reportedly already working on implementing it. Will Tesla follow suit, or will Musk’s ‘only child’ energy rub off on the company?