Tesla Autopilot and other driver-assistance systems get new scrutiny.

Federal safety regulators told automakers to provide more information about accidents involving cars and trucks with automation technology.


There has been growing concern about the safety of driver-assistance systems, in particular Tesla’s Autopilot. Credit: KCBS-TV, via Associated Press

June 29, 2021, 11:55 a.m. ET

A federal safety agency told automakers on Tuesday to begin reporting and tracking crashes involving cars and trucks that use advanced driver-assistance technology such as Tesla’s Autopilot and General Motors’ Super Cruise, a sign that regulators are taking the safety implications of such systems more seriously.

Automakers must report serious crashes within one day of learning about them, the agency, the National Highway Traffic Safety Administration, said. Serious accidents include those in which a person is killed or taken to a hospital, a vehicle has to be towed away, or airbags are deployed.

“By mandating crash reporting, the agency will have access to critical data that will help quickly identify safety issues that could emerge in these automated systems,” said Steven Cliff, the agency’s acting administrator. “Gathering data will help instill public confidence that the federal government is closely overseeing the safety of automated vehicles.”

The order comes amid growing concern about the safety of such systems, in particular Autopilot, which uses radar and cameras to detect lane markings, other vehicles and objects in the road. It can steer, brake and accelerate automatically with little input from the driver, but it can sometimes become confused.

At least three Tesla drivers have died since 2016 while driving with Autopilot engaged. In two cases, the system and the drivers failed to stop for tractor-trailers crossing roadways, and in a third the system and the driver failed to avoid a concrete barrier on a highway. Tesla has acknowledged that Autopilot can have trouble recognizing stopped emergency vehicles, although the company and its chief executive, Elon Musk, maintain that the system makes its cars safer than those of other manufacturers.

The agency, which some auto safety experts have criticized for going easy on automakers, has begun investigations into about three dozen crashes of vehicles with advanced driver-assistance systems. All but six of those accidents, the first of which took place in June 2016, involved Teslas. Ten people were killed in eight of the Tesla crashes, and one pedestrian was killed by a Volvo that was being used as a test vehicle by Uber.

Critics of Autopilot say Mr. Musk has overstated the technology’s abilities, and the Autopilot name has caused some drivers to believe that they can turn their attention away from the road while the system is turned on. A few people have recorded videos of themselves leaving the driver’s seat while the car was in motion. Mr. Musk also frequently promotes a more advanced technology in development called Full Self-Driving, which Tesla has allowed some customers to use even though the company has acknowledged to regulators that the system cannot drive on its own in all circumstances.

Under the agency’s order on Tuesday, automakers must provide more complete information on serious crashes involving advanced driver-assistance systems within 10 days. And companies must submit a report on all crashes involving such systems every month.

The agency has also asked drivers to contact it if they own a vehicle with a driver-assistance system and believe it has a safety defect.
