In a plot twist that surely some of us saw coming, Tesla is recalling a whopping 2 million-plus vehicles across the U.S. Why? To fix a quirky problem with a system that's supposed to make sure drivers aren't daydreaming when cruising on Autopilot. Spoiler alert: It's not as foolproof as they thought.
Tesla's Safety Dance
The EV maker’s recall comes after a two-year dance with the National Highway Traffic Safety Administration (NHTSA). Turns out, Autopilot's method of babysitting drivers can be a bit, well, inadequate. A series of crashes later, some of them fatal, and here we are – a recall that lets Tesla jazz up warnings and alerts, and tweak where Autopilot can do its thing.
Adding Controls and Warnings
Now, while Tesla's attempting to sprinkle some magic recall dust, safety experts aren't exactly doing the happy dance. Why? Because, dear reader, the recall doesn't fix the underlying issue. It just adds more bells and whistles; the driver is still the one responsible for spotting and stopping for obstacles. Which, if you'll pardon a Luddite, is a good thing, surely?
But the underlying issue is that, according to the NHTSA, the system's method of making sure that drivers are paying attention can be inadequate and can lead to "foreseeable misuse of the system." So… if the driver isn't paying attention, yet the driver is the one responsible for spotting obstacles… It's just too complicated. It almost makes you think that self-driving cars just aren't there yet.
Autosteer, Traffic-Aware Cruise Control, and More
Let's dive into the Tesla self-driving alphabet soup: Autosteer, Traffic-Aware Cruise Control, and the star of the show, Autopilot. The update does play around with where Autosteer can strut its stuff. If conditions aren't right, it'll refuse to engage. But the big question remains: Why can't Tesla's automated systems spot and stop for obstacles? Sort of huge, if you think about it.
"While there are many articles that do not accurately convey the nature of our safety systems, the recent Washington Post article is particularly egregious in its misstatements and lack of relevant context. We at Tesla believe that we have a moral obligation to continue…"
— Tesla (@Tesla) December 12, 2023
The Unhappy Critics of Autopilot
Safety advocates have been banging the drum for stronger regulations on driver monitoring systems. They've been shouting for cameras to keep an eye on the driver, just like other automakers with similar systems. But the Autopilot drama continues, with critics saying this recall doesn't tackle yet another problem – Teslas crashing into emergency vehicles. Wait, that’s a thing?
NHTSA's Autopilot Interrogation
NHTSA, the safety watchdog, isn't done yet. It has been investigating 35 Tesla crashes since 2016 in which Autopilot may have been pulling the strings. At least 17 lives have been lost, and the investigators are on a mission to make Tesla's Autopilot safer.
We’re left with the impression that, for now, you’re probably best, you know, driving your Tesla yourself. In any case, buckle up, it's going to be a bumpy ride.
It's been a challenging time for Elon Musk over the past few months, with questions asked about safety at SpaceX, money flitting back and forth between his companies, and the news that advertisers are fleeing X (formerly Twitter) in huge numbers. And of course, there's all sorts of fun and games going on with Cybertruck. Perhaps a break over the New Year is in order.