My Car Quest

December 4, 2022

Editorial: Self Driving – Tesla and Elon Musk in the Crosshairs

by Wallace Wyss –

We all have that stubborn uncle, dad, or brother who has some obsession, and no matter how much you try to steer him back on course, he sticks to his original idea.

So it is with Elon Musk. What makes it different this time is that he may be held criminally liable for Tesla accidents involving an option called Full Self Driving. Specifically, all new Teslas now sold in the US come with Autopilot as standard, an eight-camera system. Tesla does not claim Autopilot has the potential to someday operate the cars fully autonomously; what it does claim (quoting its own website) is that Autopilot is “an advanced driver assistance system that enhances safety and convenience behind the wheel. When used properly, Autopilot reduces your overall workload as a driver.”

But Musk is not in trouble over Autopilot (though one could argue that name, too, is misleading, since it is not piloting automatically 100% of the time). Where Musk is in trouble is a separate option, which arrived on some models later than others. It’s called FSD, for Full Self-Driving Capability.

Of course many of his competitors have similar systems under development, but what all the automakers know is that there is an iceberg in the way of its full implementation, an iceberg bigger than the one that sank the Titanic. That iceberg is The Decision.

I’m talking about a court decision in a case that hasn’t happened yet, about an accident that hasn’t happened yet. It will be a case with a scenario like this: said vehicle is proceeding along a two-lane road behind a garbage truck, the kind of dump truck that hangs a fully loaded dumpster on chains in the air behind the truck.

Tesla Image

Then the chain holding the dumpster snaps, and the car, at that instant on Full Self Driving, has to decide. It knows it is only 50 ft. behind the dump truck, doing 70 mph. It knows that, even with the brakes full on, it needs 200 ft. to avoid impact. Its cameras look to the left and see an old lady, 70-ish, who has no chance in hell of scampering out of the way should the car swerve there to avoid hitting the dumpster. The car’s cameras then look to the right: there are five preschoolers in line behind their minder. Their little legs can’t get them out of the way if the car goes right to avoid hitting the dumpster.
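As a rough sanity check on the numbers in that scenario, here is a minimal sketch of the braking physics, using the standard formula d = v²/2a. The deceleration figure (about 0.9 g, roughly what hard braking on dry pavement achieves) is my assumption, not the article’s; it confirms that a car at 70 mph needs on the order of 200 ft. to stop, far more than the 50 ft. gap.

```python
# Sanity check of the scenario's numbers: stopping distance from 70 mph.
# Assumes hard braking at about 0.9 g (a typical dry-pavement figure;
# this deceleration value is an assumption, not from the article).

G = 32.2                    # gravitational acceleration, ft/s^2
MPH_TO_FPS = 5280 / 3600    # 1 mph = 1.4667 ft/s

def stopping_distance_ft(speed_mph: float, decel_g: float = 0.9) -> float:
    """Distance needed to brake to a full stop: d = v^2 / (2 * a)."""
    v = speed_mph * MPH_TO_FPS   # speed in ft/s
    a = decel_g * G              # deceleration in ft/s^2
    return v * v / (2 * a)

print(round(stopping_distance_ft(70)))  # about 182 ft -- far more than the 50 ft gap
```

Add reaction time (even a computer’s sensing-and-decision latency) and the article’s 200 ft. figure is in the right ballpark.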

So the AI system in the car makes a decision and hits a person or persons, fulfilling its mission of preserving the car. One death could spark a national debate, but if there are multiple deaths, it stands more of a chance of becoming a test case that goes all the way to the Supreme Court.

One issue will be whom to blame. If the car still had a driver, who had elected full self driving, most of the blame will fall on them for not recognizing an emergency and reclaiming control of the wheel. But if the driver is in the back seat sleeping, or the back-seat passenger is an Uber/Lyft-type customer with no intention of taking the wheel, then whom do you blame?

Or what if the full self driving car is at that moment driverless, operating fully remotely, summoned by cell phone by a customer who wants to be taxied somewhere? Do you blame this customer for the accident merely because they ordered the car, a car they have never laid eyes on?

I predict that, depending on the number and ages of the deceased, the whole nation will be waiting for a decision when this trial takes place. Because if the financial liabilities fall upon the automaker, then Tesla will first of all have to retract the use of the term Full Self Driving and refund every FSD option package sold.

But that’s not part of the Tesla plan. Their goal is to see fully autonomous cars allowed in the US, making it possible for someone who wants a taxi-like service to summon a fully robotic car without a driver. If rentals without a human driver become available, the Uber/Lyft-type companies will make much more money than they do now paying drivers.

Tesla has described how full self driving owner rentals would work. They plan to keep track of the driving habits of everyone who owns a Tesla. They might decide that, if you have a squeaky-clean record, you qualify for a program where you can offer your Tesla to ride-share customers when you’re not using the car. I don’t know how much money you would make, but let’s say your car is driven 500 miles by paying customers the very same day you are toiling at the office, 9 to 5. You come out of work and there it is, waiting for you, proud it just made enough to pay a good chunk of the next car payment.

It’s not such a foreign concept; it’s how Airbnb works. You own a vacation house. Vacationers book that house online, stay a few days, and their rental fee allows you to make your next house payment. Maybe three days a month pays your whole payment.

Now Tesla and other automakers are champing at the bit and see all these worries as negligible. If full self driving becomes a common option throughout the industry, it will allow many more people to buy cars, since the car will pay for itself by working just a few days a month.

Sounds great, right? But you forgot: that court decision hasn’t been made yet. The horrific accident hasn’t happened yet. After it does happen, if the court rules the car owner legally responsible, even when he or she isn’t in the car, it will kill off the idea of owners offering their fully autonomous cars as rentals. If the court rules the rental firm coordinating the daily rentals responsible, those firms will go under. If the court rules the automaker responsible, automakers will drop back to Level 3, which still requires a driver behind the wheel, and lose the share of the market that wants to sit back and let the robot drive.

I think Level 5 fully autonomous cars won’t be on the option list anytime soon, because the court decision hasn’t been made yet, because of an accident that hasn’t happened yet. When it does happen, we will need a weighty decision on who takes the blame when it all goes wrong.

Now there’s another case shaping up, based on accidents already recorded, and this talk of criminal charges against Musk himself is bringing it all to a head. You can bet the world’s automakers are all watching the progress of the case attentively, because if FSD is vindicated, they too will reap millions of additional sales when autonomous cars are allowed to be sold to a new customer base that would rather let the machine do the driving.

In this Federal case now shaping up, Tesla will argue in court that it is very clear in warning customers and would-be buyers that purchasing its advanced driver-assist system does not make its cars autonomous.

But the prosecution will try to prove that Tesla gives the impression that its electric vehicles are able to drive themselves.

Tesla Image

The new focus on FSD came when Reuters quoted several people saying the Department of Justice had launched the previously undisclosed probe last year after becoming concerned over a dozen crashes that involved Teslas, some fatal.

Musk has frequently been quoted saying Teslas would eventually be able to drive themselves without a human behind the wheel, but for now Tesla gives many warnings on how to use the system, which means it could be tough to prosecute Musk when it’s the customers misusing the option. Perhaps it’s not the text in the sales contract that gives the wrong impression; it’s Musk in interviews, bragging about FSD’s capabilities.

Elon Musk told Automotive News in 2020 that Autopilot problems are the result of customers using the system incorrectly. The fan site Teslarati even added that there are aftermarket companies promoting defeat devices designed to “trick” Autopilot into thinking drivers are paying attention to the road even when they’re not.

I compare those devices to the “bump stock” that some mass murderers attached to their semi-automatic rifles. The rifle maker never offered it; it was an aftermarket attachment, but it brought the weapon’s rate of fire closer to fully automatic.

But I doubt that the fatal accidents involving Teslas will be found to have involved any aftermarket devices at the time of the crash. It all goes back to Musk’s stubbornness in insisting on the moniker Full Self Driving. If Elon Musk weren’t so stubborn, he could change the name so drivers wouldn’t assume it has more capabilities than it does.

Like why not “Driver Assist”? I think changing the name, refunding all the Tesla buyers the fee they paid for the FSD option, and being able to prove that the drivers, some of them deceased, misused the system might save him.

Emphasize “might.”

Let us know what you think in the Comments.

Wallace Wyss art

THE AUTHOR: Wallace Wyss is a commentator on new and collectible cars on KUCT FM’s Autotalk show. As a fine artist, his portraits of classic cars can be seen at the Mecum auctions.



Comments

  1. So,…. does anyone other than Teslarati WANT to save Elon?

  2. The other issue with Musk is that he uses Tesla resources (people) to work on Twitter stuff. I suspect some Tesla shareholders do not like that. (Note: I own shares of Tesla.)

    If this continues I suspect some Tesla shareholders may make a fuss.
