Tesla’s Autopilot may have a Blind Spot that can kill you…
by Wallace Wyss –
When I first saw the headline referring to a Tesla hitting a semi tractor-trailer, I thought “Oh, that’s old news—didn’t that happen like three years ago?”
It did, in May 2016.
But now it’s happened again—this time it was a collision between a Tesla Model 3 and a semi truck in Delray Beach, Florida, an accident that happened on March 1, 2019. According to the police report, the truck was turning left to enter a main thoroughfare when the Model 3 crashed into the semi’s trailer, shearing off the car’s roof and killing the driver.
Now, in the original Tesla-vs-truck accident in May 2016, it was determined that the Model S owner might have been watching a movie while driving and was otherwise distracted. That same owner was infamous for filming videos of himself letting his Tesla do the steering while he sat hands off. He was decapitated because he counted too much on Autopilot’s ability to spot hazards ahead (the trailer was devoid of markings, so the Tesla’s system might have seen nothing but white against a white sky).
Supposedly Tesla’s response was to tighten up the system, making it impossible for the driver to keep their hands off the steering wheel for long without triggering a warning.
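In rough terms, that kind of safeguard is just a timer. Here is a purely hypothetical sketch, not Tesla’s actual code, of how a hands-off monitor might work; the thresholds and the torque sensor are invented for illustration:

```python
# Hypothetical sketch (NOT Tesla's real code): nag, then disengage,
# when the steering wheel reports no hand torque for too long.
# Thresholds below are assumptions, not published Tesla values.

WARN_AFTER_SEC = 30        # assumed: show "hold the wheel" warning
DISENGAGE_AFTER_SEC = 60   # assumed: drop out of driver assistance

def monitor_hands(readings):
    """readings: iterable of (time_sec, wheel_torque_nm) samples."""
    last_hands_on = None
    for t, torque in readings:
        if last_hands_on is None or abs(torque) > 0.1:
            last_hands_on = t        # any detectable grip resets the clock
            continue
        hands_off = t - last_hands_on
        if hands_off >= DISENGAGE_AFTER_SEC:
            print(f"{t}s: disengaging, driver must take over")
        elif hands_off >= WARN_AFTER_SEC:
            print(f"{t}s: warning, apply slight force to the wheel")

# a driver who grips the wheel at t=0 and then lets go
samples = [(0, 0.5)] + [(t, 0.0) for t in range(10, 90, 10)]
monitor_hands(samples)
```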
In this new accident, police have not yet determined whether Autopilot was active. But the similarities include the trucks: in a TV news clip of the Florida accident, I saw that the tractor-trailer is about 95% white on the side, with only a small sign. Eerily, the Model 3 kept motoring on for more than 500 yards after the impact before coming to a stop, unaware, so to speak, that “Oops, I killed the driver.”
This is one of the things I have against autonomous cars—they just do what they are programmed to do. But is the programming comprehensive enough?
The NTSB could take a year to deliver the results of the new investigation; Tesla is cooperating. In the 2016 crash, the NTSB determined the vehicle had Autopilot engaged, and that car also kept going a significant distance after the crash. But that time Tesla was cleared of having a system with blind spots, and the NTSB found evidence that the driver had a history of showing off the car’s autonomous driving and taking chances.
It seems to me that we have a pattern here: the same kind of accident, two different Tesla models, two different tractor-trailers. But will it take a third Tesla-vs-tractor-trailer death to spark a Congressional investigation?
Let us know what you think in the Comments.
THE AUTHOR: Wallace Wyss is the author of 18 automotive histories. He has been a consultant to several automakers including Ford, Honda and Toyota.
Wally, you sound like what’s his name, what else scared you that you saw on TV?
It appears that the Tesla’s, or any self-driving technology’s, sensors only focus on objects they can detect up to a certain height (like the height of a car). The Tesla sees a clear path under these trucks; it thinks it’s going under a bridge.
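To make that intuition concrete, here is a purely illustrative sketch (not any automaker’s real perception code) of why a forward sensor that ignores returns above a height cutoff could classify the open space under a raised trailer box as a drivable gap; the cutoff value and obstacle spans are assumptions:

```python
# Illustrative only: a naive free-space check with a height ceiling.
SENSOR_CEILING_M = 1.4   # assumed: returns above this height are ignored

def path_is_clear(obstacles):
    """obstacles: list of (bottom_m, top_m) height spans blocking the lane."""
    for bottom, top in obstacles:
        # only obstacles that intrude below the ceiling are "seen"
        if bottom < SENSOR_CEILING_M:
            return False
    return True

car_ahead   = [(0.2, 1.5)]    # bumper to roof: intrudes below the cutoff
trailer_bed = [(1.45, 4.0)]   # raised semi trailer box: starts above the cutoff

print(path_is_clear(car_ahead))    # False: obstacle detected, brake
print(path_is_clear(trailer_bed))  # True: reads like a bridge or underpass
```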
It’s always this way when it comes to new technology. Here’s some information on the turn of the last century, when horses and cars were vying for the dominant position in transportation.
Like any who are left behind during times of technological wizardry and advancement, horses began to lose their jobs as motored vehicles took to the roads in the late 19th century. Of course, horseless carriages were a brave new world to be explored by the curious and adventurous, those who sought greater enlightenment, speed, and status as an early adopter. Such courageous souls drove these amazing but “infernal machines” down the streets, startling every horse and pedestrian along the way. Frightened horses became such a problem that some owners threatened to shoot drivers on sight.
No one should be using self-driving modes in ANY car right now without being on constant alert. No vehicle is ready for it. BUT, we have to give it all time because the genie is out of the bottle and people are going to move forward regardless of the deaths. Same was true with aircraft, rockets, microwave ovens, etc. Some things move forward without careful review of the health consequences (cigarettes, air conditioning and refrigerant chemistry, thousands of medical drugs). But regulated progress with proper constraints and appropriate use will eventually yield beneficial results. But along the way, people have to abuse the limits – that seems to be the way our species works.
In Silicon Valley, if you drive the commute on San Tomas or Lawrence Expressway, you can witness thousands of self-driving cars commuting to work every day with their owners doing almost everything but driving. Tech people relying on tech, trusting their lives to it.
Someone will bring up the first jet airliners crashing because the shape of the windows was wrong and the fuselage split along the window line, but that’s all forgotten now, so I think engineers are too accepting of a certain percentage of any technology going wrong. But I thought Tesla said the first time that they were going to correct that flaw. One idea I had since is fitting all tractor-trailers with perimeter sensors around the sides of the box that the Tesla’s sensors can read. But there are millions of tractor-trailers… who would pay for that?
What makes parking lots in particular difficult for self-driving cars?