My Car Quest

December 19, 2024

Opinion: Are Self-Driving Cars Moving Too Fast?

by Mike Gulett –

Are we moving too fast with this new technology for self-driving cars? Should the self-driving car companies be allowed to experiment with the lives and safety of uninvolved people?

The California DMV allowed a few car companies, including Cruise (a GM subsidiary) and Waymo (a Google subsidiary), to operate taxi services in San Francisco both during the day and at night. Imagine getting into a taxi with NO DRIVER – just like Johnny Cab in the sci-fi movie Total Recall, except it is for real. Other cities, like Phoenix, also allowed this wholesale experiment with the lives of unknowing people.

In October 2023, after the California Department of Motor Vehicles suspended Cruise’s self-driving permits following an accident involving a pedestrian, Cruise said it would pause all driverless operations in the US to examine its processes and earn back public trust. Maybe they should have earned the public’s trust before putting lives at risk? The CEO of Cruise has since resigned.

From the Washington Post,

Ed Walters, who teaches autonomous vehicle law at Georgetown University, said that driverless technology is critical for a future with fewer road fatalities because robots don’t drive drunk or get distracted. But, he said, this accident shows that Cruise was not “quite ready for testing” in such a dense urban area.

“In hindsight you would have to say it was too early to roll these cars out in that environment,” he said. “This is a cautionary tale that we should be incremental. That we should do this step by step and do as much testing as we can with people in the cars to see when they are safe and whether they are safe.”

Yep, this seems to be correct – why did the government regulators not see this risk before granting their approval? Why were they in such a hurry to approve this new and risky technology?

Tesla [1], the leader in new car technology, is also having problems with self-driving cars. Tesla markets a feature called “full self-driving,” but in reality it is not fully self-driving. There have been many accidents in the US involving Teslas in “full self-driving” mode, and as a result Tesla is facing several lawsuits.

Self-driving capability is motivated by the perception that it is safer than human drivers. But it is certainly also motivated by the enormous profit potential for the companies that crack the code.

Could it also be motivated by governments (like California’s) that collect a lot of tax revenue from profitable companies in the state?

As I wrote in June 2015 in The Self-Driving Car Is Almost Here:

This new technology is exceedingly complex and may take longer to become commonplace than anticipated by the companies working on self-driven cars, like Google, which has promised that its self-driven car will be available by 2020.

Self-driving car companies have over-promised and under-delivered, and the government agencies responsible for regulating this new technology have rushed forward recklessly.

I say slow down and let the technology catch up with the reality of the problem.

Until then I for one do not want a ride in a driverless taxi.

Let us know what you think in the Comments.

Concept of a self-driven car from the 1950s – Source: unknown

[1] The author owns shares of Tesla.

Comments

  1. WALLACE WYSS says

    Maybe this will get Elon’s attention. From TIME magazine:

    Tesla Recalls Nearly All Vehicles Sold in U.S. to Fix System That Monitors Drivers Using Autopilot
    By Tom Krisher / AP, December 13, 2023, 8:02 AM EST
    (DETROIT) — Tesla is recalling nearly all of the vehicles it sold in the U.S., more than 2 million across its model lineup, to fix a defective system that’s supposed to ensure drivers are paying attention when they use Autopilot.

    Documents posted Wednesday by U.S. safety regulators say the company will send out a software update to fix the problems.

    The recall comes after a two-year investigation by the National Highway Traffic Safety Administration into a series of crashes that happened while the Autopilot partially automated driving system was in use. Some were deadly.

    The agency says its investigation found Autopilot’s method of ensuring that drivers are paying attention can be inadequate and can lead to foreseeable misuse of the system.

    The recall covers models Y, S, 3 and X produced between Oct. 5, 2012, and Dec. 7 of this year.

    The software update includes additional controls and alerts “to further encourage the driver to adhere to their continuous driving responsibility,” the documents said.

    The update was to be sent to certain affected vehicles on Tuesday, with the rest getting it at a later date, the documents said.

    Autopilot includes features called Autosteer and Traffic Aware Cruise Control, with Autosteer intended for use on limited access freeways when it’s not operating with a more sophisticated feature called Autosteer on City Streets.

    The software update apparently will limit where Autosteer can be used.

    “If the driver attempts to engage Autosteer when conditions are not met for engagement, the feature will alert the driver it is unavailable through visual and audible alerts, and Autosteer will not engage,” the recall documents said.

    Depending on a Tesla’s hardware, the added controls include “increasing prominence” of visual alerts, simplifying how Autosteer is turned on and off, additional checks on whether Autosteer is being used outside of controlled access roads and when approaching traffic control devices, “and eventual suspension from Autosteer use if the driver repeatedly fails to demonstrate continuous and sustained driving responsibility,” the documents say.

    Recall documents say that agency investigators met with Tesla starting in October to explain “tentative conclusions” about fixing the monitoring system. Tesla, it said, did not concur with the agency’s analysis but agreed to the recall on Dec. 5 in an effort to resolve the investigation.

    Auto safety advocates for years have been calling for stronger regulation of the driver monitoring system, which mainly detects whether a driver’s hands are on the steering wheel. They have called for cameras to make sure a driver is paying attention, which are used by other automakers with similar systems.

    Autopilot can steer, accelerate and brake automatically in its lane, but is a driver-assist system and cannot drive itself despite its name. Independent tests have found that the monitoring system is easy to fool, so much so that drivers have been caught while driving drunk or even sitting in the back seat.

    In its defect report filed with the safety agency, Tesla said Autopilot’s controls “may not be sufficient to prevent driver misuse.”

    A message was left early Wednesday seeking further comment from the Austin, Texas, company.

    Tesla says on its website that Autopilot and a more sophisticated Full Self Driving system cannot drive autonomously and are meant to help drivers who have to be ready to intervene at all times. Full Self Driving is being tested by Tesla owners on public roads.

    In a statement posted Monday on X, formerly Twitter, Tesla said safety is stronger when Autopilot is engaged.

    NHTSA has dispatched investigators to 35 Tesla crashes since 2016 in which the agency suspects the vehicles were running on an automated system. At least 17 people have been killed.

    The investigations are part of a larger probe by the NHTSA into multiple instances of Teslas using Autopilot crashing into parked emergency vehicles that are tending to other crashes. NHTSA has become more aggressive in pursuing safety problems with Teslas in the past year, announcing multiple recalls and investigations, including a recall of Full Self Driving software.

    In May, Transportation Secretary Pete Buttigieg, whose department includes NHTSA, said Tesla shouldn’t be calling the system Autopilot because it can’t drive itself.

    In its statement Wednesday, NHTSA said the Tesla investigation remains open “as we monitor the efficacy of Tesla’s remedies and continue to work with the automaker to ensure the highest level of safety.”

  2. Tesla recalls nearly all 2 million of its vehicles on US roads

    https://www.cnn.com/2023/12/13/tech/tesla-recall-autopilot/index.html
