Tesla Autopilot veers into two federal investigations following fire truck crash


First the bad news…a Tesla Model S reportedly traveling at 65 mph on Autopilot this week slammed into the back of a fire truck on a California freeway. Amazingly, no one was seriously injured; however, the brand now finds itself under a new wave of scrutiny, including two federal investigations.

Those concerned with early launches of autonomous tech will probably appreciate the irony: as Tesla puts out fires over lackluster production of its new Model 3 (CNBC said today to expect even more delays) and the viability of its autonomous features, a man reportedly using Autopilot on a Model S slammed into the back of a fire truck stopped on Interstate 405 in Culver City, near Los Angeles.

Here’s where dedicated early adopters shake their heads, chastise the driver for not being attentive enough and snap their fingers for the car’s data recorder to embrace another teachable moment. A wireless update, maybe a tweak here and there, remove a dead bug (as was the case for a Model S in 2016, when a large moth plastered across a grille-mounted radar sensor rendered it useless) and we’re back on the road. Just some potholes on the road to progress.

Really, at this point, we don’t have any hard data supporting the driver’s claim that Autopilot was engaged at the time of the crash. However, Wired reported today that the car’s owner’s manual warns that Autopilot may not detect stationary objects, a limitation of the radar and camera components the system relies on. All of this recalls criticism leveled against Tesla CEO Elon Musk late last year following his claim that his vehicles have all the hardware necessary for fully autonomous, or Level 5, driving.

“I think he’s full of crap,” Scott Miller, GM’s director of autonomous vehicle integration told The Sydney Morning Herald. “If you think you can see everything you need for a Level Five autonomous (car) with cameras and radar, I don’t know how you do that.”

I’m guessing the driver in Monday’s crash may be feeling the same way right about now. Vehicles with collision avoidance technology, including the gold standard, automatic emergency braking (AEB), take one of three escalating actions, depending on their design, to mitigate a collision: at minimum, they warn the driver; more advanced versions also prime the brakes; and the latest AEB systems automatically engage the brakes if the driver takes no action. The Model S is equipped with AEB, which makes this crash even more puzzling and troubling. The Model S is also promoted with this message on its web page:
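That three-tier escalation can be sketched as a simple decision rule. To be clear, this is a hypothetical illustration of the general concept, not Tesla’s actual implementation; the function name, thresholds and tier labels are all invented, and real systems fuse radar and camera data with far more sophisticated logic:

```python
# Hypothetical sketch of three-tier AEB escalation. All thresholds
# (time-to-collision values in seconds) are invented for illustration.

def aeb_response(ttc_s: float, driver_is_braking: bool) -> str:
    """Return the escalation step for a forward-collision system."""
    if ttc_s > 2.5:
        return "monitor"        # no imminent threat detected
    if ttc_s > 1.5:
        return "warn"           # minimum response: alert the driver
    if ttc_s > 0.8:
        return "prime_brakes"   # pre-charge brakes for a faster stop
    # Collision imminent: brake automatically only if the driver hasn't acted.
    return "assist_driver" if driver_is_braking else "auto_brake"
```

The point of the sketch is the ordering: a stationary fire truck that the radar/camera stack fails to classify as a threat never trips even the first tier, which is consistent with the owner’s-manual caveat about stationary objects.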

“All Tesla vehicles produced in our factory, including Model 3, have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver.”

So, if it has the hardware for ‘full self-driving capability,’ yet it’s not really a self-driving car, as Tesla supporters have been tweeting all week, then why take the risk of using promotional language like that? Supporters may argue that you have to read the fine print and appreciate what the technology can actually do. Whatever…an attractive term like ‘self-driving’ should not be used loosely. Terms like ‘steering assistance’ and ‘automated braking’ are much more sensible and less litigious, but I know…they don’t have the same draw as ‘self-driving.’

What Tesla has drawn, however, is the attention, once again, of the National Transportation Safety Board and the National Highway Traffic Safety Administration, both of which have begun investigating Monday’s crash. Their first Autopilot crash investigation began in 2016 and ended more than a year later, last September. Their findings? Autopilot’s limitations as an autonomous driving platform played a role in the 2016 accident that claimed the life of dedicated Tesla devotee and former U.S. Navy SEAL Joshua Brown, who crashed into a tractor-trailer that had failed to yield the right-of-way. Brown was also found at fault for relying on Autopilot to be a bona fide autopilot. Tesla has warned against that, yet stories keep popping up of overzealous drivers treating their Teslas like actual self-driving cars. Hmmm…how’d they ever get that idea?!

While reminiscent of Brown’s tragic accident almost two years ago, this week’s crash was not nearly as horrific and serves as a reminder of Tesla’s stellar crash test ratings. I mean…come on: the driver walked away after slamming into the back of a truck at 65 mph! On that note, EVs with frunks in place of traditional engine bays may enjoy a safety advantage here, since the empty front compartment can double as an extra crash-absorbing crumple zone.

Now the weird news…police in San Francisco reported finding a man passed out drunk behind the wheel of his Tesla on the Bay Bridge last weekend. Not that finding passed-out drunks is anything new. The twist on this tale is that the Model S containing the allegedly drunk driver had come to a dead stop on the bridge, where commuters were finding it a little less than traffic-friendly. When police woke the man from his stupor, he said he had put the car on Autopilot, apparently after doing some serious drinking. Police said his blood alcohol level was twice the legal limit.

Sorry, but I can’t help thinking of Barney Gumble from Fox’s The Simpsons. Barney is Homer Simpson’s perpetually drunk, bumbling and good-natured friend whose belches threaten to tear the space-time continuum. This is something he would do.

Otherwise, who does this?! Are there folks out there getting smashed and relying on Tesla’s Autopilot to get them home? And why, if Autopilot had indeed been activated, did the car stop on a bridge? Retribution a la Space Odyssey? (Sorry, HAL fans…couldn’t resist.) Did the car somehow sense an inactive driver and stop as a precaution? Did it simply run out of juice? Or did the guy even have Autopilot on at all?

Assuming Autopilot was active, it may have kept the driver from slamming into the bridge or another vehicle. Whatever the case, bring on the data recorder. There’s more work to be done.