Tragic lesson in autonomous tech

Updated Aug 12, 2019

Daimler Trucks North America CEO Martin Daum said recently that he didn’t expect to see driverless trucks on the road during his professional lifetime.

Daum’s comments, couched in practical concerns, came following a 25-mile self-driving test in which Daum himself participated.

Appreciative of driverless tech, but mindful of its limitations, Daum compared the development of autonomous driving to a NASA lunar mission.

The comparison is warranted and may not actually go far enough, given the widespread implications of a driverless trucking model. A trip to the moon, though important on various fronts, would not have carried the same consequences for commerce had the technology failed.

One rocket failure, as devastating as that can be, does not compare to perhaps dozens of autonomous 80,000-lb. trucks falling victim to shrewd hackers or to system failures, natural or otherwise.

Though sensors have been refined through the years to account for various driving conditions, they have not been perfected to the degree that intense daily commerce, coupled with unpredictable factors, demands.

For instance, Daum notes that sensors need to be developed to the point that they can make vital, life-saving distinctions. Driverless systems need to know the difference between a large tree branch along the shoulder of the road and a human being.

A recent, deadly crash attributed to a faulty guidance system in a Tesla Model S did not involve a tree branch, but rather intense Florida sunlight.

The car’s sensor system, unable to make out the truck’s white trailer against the brightly lit sky, could not account for the tractor-trailer crossing its path. Tragically, the car neither braked nor attempted an evasive maneuver. Instead, it crashed into the truck, claiming the life of its sole occupant, a former Navy SEAL and strong Tesla proponent.

A July 4 editorial in The Washington Post states that the accident has been wrongly categorized: the so-called Autopilot system available on the Tesla Model S was never intended for autonomous driving, but rather for semi-autonomous control.

The fault rests with the driver, not the car, the Post states, because the Autopilot system in the Model S still requires that the driver remain vigilant and take full control of the vehicle if necessary. Though the investigation is far from complete, various reports state that the driver, 40-year-old Joshua Brown, may have been watching a movie in his car at the time of the accident.

“The car’s semi-autonomous systems, which use onboard sensors to guide it away from hazards, were not advanced enough to steer and brake the car without the driver paying continuous attention and correcting when necessary,” the Post’s Editorial Board reports. “In fact, none of the semi-autonomous cars on the market are trustworthy enough to allow drivers to sit back and zone out.”

The Post asks that consumers not hastily dismiss autonomous driving technology because of the accident.

“When real self-driving technology is ready for prime time, it will change all sorts of things about the way people get around — for the better.”

Brown, a veteran of the Iraq War, died after his Tesla drove under the trailer of a semi that was attempting to make a left turn across U.S. Highway 27 in Williston, Fla., according to theguardian.com.

As the Tesla drove under the truck’s trailer, the top half of the vehicle was ripped away. The car, still moving at high speed after the collision, veered off the road, crashed through one fence, crossed a large yard and broke through a second fence before striking a telephone pole and snapping it. The car came to rest about 900 feet after hitting the truck.

The Autopilot feature on the Model S, according to The New York Times, had been in beta testing, and Brown had been enthusiastically reporting on the car’s performance in YouTube videos that racked up as many as two million views.

At one point, Brown’s glowing reports caught the attention of Tesla CEO Elon Musk, who endorsed Brown’s citizen journalism. Brown considered it a highlight of his life.

Doubtless, there are plenty of Tesla and autonomous driving fans the world over. Musk has played to this fan base well and attracted even more supporters by permitting the public to use the company’s experimental Autopilot.

That decision has now led to a National Highway Traffic Safety Administration investigation that could take months to complete.

Meanwhile, Brown’s family has lawyered up and, of late, is not granting interviews.

The tragic lesson here is that experimental testing of this magnitude should not be made available to the public. Yes, it generates plenty of buzz on social media and provides more exposure, but there’s just too much at stake.

Should something go wrong, as it did in Brown’s case, it leaves behind headlines that hinder a fledgling industry that’s trying to earn a seat at the table.