Tesla Autopilot and Other Driver-Assist Systems Linked to Hundreds of Crashes

Nearly 400 crashes in the United States in 10 months involved cars using advanced driver-assistance technologies, the federal government’s top auto-safety regulator disclosed Wednesday.

The findings are part of a sweeping effort by the National Highway Traffic Safety Administration to determine the safety of advanced driving systems as they become increasingly commonplace.

In 392 incidents cataloged by the agency from July 1 of last year through May 15, six people died and five were seriously injured. Teslas operating with Autopilot, the more ambitious Full Self-Driving mode or any of their associated component features were involved in 273 crashes. Five of those Tesla crashes were fatal.


The data was collected under a NHTSA order issued last year requiring automakers to report crashes involving cars with advanced driver-assistance systems. Scores of manufacturers have rolled out such systems in recent years, including features that let drivers take their hands off the steering wheel under certain conditions and that help them parallel park.

NHTSA’s order was an unusually bold step for the regulator, which has come under fire in recent years for not being more assertive with automakers.

“Until last year, NHTSA’s response to autonomous vehicles and driver assistance has been, frankly, passive,” said Matthew Wansley, a professor at the Cardozo School of Law in New York who specializes in emerging automotive technologies. “This is the first time the federal government has directly collected crash data on these technologies.”

Speaking with reporters ahead of Wednesday’s release, Steven Cliff, the NHTSA administrator, said the data — which the agency will continue to collect — “will help our investigators quickly identify potential defect trends that emerge.”

Dr. Cliff said NHTSA would use the data as a guide in making any rules or requirements for the design and use of such systems. “These technologies hold great promise to improve safety, but we need to understand how these vehicles are performing in real-world situations,” he said.


But he cautioned against drawing conclusions from the data collected so far, noting that it does not take into account factors like the number of cars from each manufacturer that are on the road and equipped with these types of technologies.

An advanced driver-assistance system can steer, brake and accelerate vehicles on its own, though drivers must stay alert and ready to take control of the vehicle at any time.

Safety experts are concerned because these systems allow drivers to relinquish active control of the car and could lull them into thinking their cars are driving themselves. When the technology malfunctions or cannot handle a particular situation, drivers may be unprepared to take control quickly.

About 830,000 Tesla cars in the United States are equipped with Autopilot or the company’s other driver-assistance technologies, which helps explain why Tesla vehicles accounted for nearly 70 percent of the reported crashes in the data released Wednesday.

Ford Motor, General Motors, BMW and others have similar advanced systems that allow hands-free driving under certain conditions on highways, but far fewer of those models have been sold. These companies, however, have sold millions of cars over the last two decades that are equipped with individual components of driver-assistance systems. The components include so-called lane keeping, which helps drivers stay in their lanes, and adaptive cruise control, which adjusts a car’s speed and brakes automatically when traffic ahead slows.

In Wednesday’s release, NHTSA disclosed that Honda vehicles were involved in 90 incidents and Subarus in 10. Ford, G.M., BMW, Volkswagen, Toyota, Hyundai and Porsche each reported five or fewer.


The data covers two categories: automated driving systems, which are designed to operate with little or no intervention from the driver, and driver-assistance systems, which can simultaneously steer and control the car’s speed but require constant attention from the driver.

The automated vehicles — which are still in development for the most part but are being tested on public roads — were involved in 130 incidents, NHTSA found. One resulted in a serious injury, 15 in minor or moderate injuries and 108 in no injuries. Many of the crashes were fender benders or bumper taps because the vehicles are operated mainly at low speeds and in city driving.

In more than a third of the 130 accidents involving the automated systems, the car was stopped and hit by another vehicle. In 11 crashes, a car enabled with such technology was going straight and collided with another vehicle that was changing lanes, the data showed.

Most of the incidents involving automated systems occurred in San Francisco or the Bay Area, where companies like Waymo, Argo AI and Cruise are testing and refining the technology.

Waymo, which is owned by Google’s parent company and is running a fleet of driverless taxis in Arizona, was part of 62 incidents. Cruise, a division of G.M., was involved in 23. Cruise just started offering driverless taxi rides in San Francisco, and this month it received permission from the California authorities to begin charging passengers.

None of the cars using the automated systems were involved in fatal accidents, and only one crash led to a serious injury. In March, a cyclist rear-ended a vehicle operated by Cruise while both were traveling downhill on a street in San Francisco.

NHTSA’s order for automakers to submit the data was prompted partly by crashes and fatalities over the last six years that involved Teslas operating with Autopilot. Last week NHTSA widened an investigation into whether Autopilot has technological and design flaws that pose safety risks.


The agency has been looking into 35 crashes that occurred while Autopilot was activated, including nine that resulted in 14 deaths since 2014. It had also opened a preliminary investigation into 16 incidents in which Teslas under Autopilot control crashed into emergency vehicles that had stopped and had their lights flashing.

In November, Tesla recalled nearly 12,000 vehicles that were part of the beta test of Full Self-Driving — a version of Autopilot designed for use on city streets — after deploying a software update that the company said might cause crashes because of unexpected activation of the cars’ emergency braking system.

NHTSA’s order required companies to provide data on crashes when advanced driver-assistance systems and automated technologies were in use within 30 seconds of impact. Though this data provides a broader picture of the behavior of these systems than ever before, it is still difficult to determine whether they reduce crashes or otherwise improve safety.

The agency has not collected data that would allow researchers to easily determine whether using these systems is safer than turning them off in the same situations. Automakers were allowed to redact descriptions of what happened during the accidents, an option that Tesla, Ford and others used routinely, making it harder to interpret the data.

Some independent studies have explored these technologies, but have not yet shown whether they reduce crashes or otherwise improve safety.

J. Christian Gerdes, a professor of mechanical engineering and a director of Stanford University’s Center for Automotive Research, said the data released Wednesday was helpful, up to a point. “Can we learn more from this data? Yes,” he said. “Is it an absolute gold mine for researchers? I don’t see that.”

Because of the redactions, he said, it was hard to gauge the ultimate utility of the findings. “NHTSA has a lot better understanding of this data than the general public can get just looking through what was released,” he said.


Dr. Cliff, the NHTSA administrator, was guarded about acting on the results. “The data may raise more questions than they answer,” he said.

But some experts said the newly available information should prompt regulators to be more assertive.

“NHTSA can and should use its various powers to do more — rule makings, star ratings, investigations, further inquiries and soft influence,” said Bryant Walker Smith, an associate professor in the University of South Carolina’s law and engineering schools who specializes in emerging transportation technologies.

“These data could also prompt further voluntary and involuntary disclosures,” he added. “Some companies might willingly provide more context, especially about miles traveled, crashes ‘prevented’ and other indicators of good performance. Trial attorneys will be looking for patterns and even cases in these data.”

All in all, he said, “this is a good start.”

Jason Kao, Asmaa Elkeurti and Vivian Li contributed research and reporting.

Neal E. Boudette is based in Michigan and has been covering the auto industry for two decades. He joined The New York Times in 2016 after more than 15 years at The Wall Street Journal. @nealboudette

Cade Metz is a technology correspondent, covering artificial intelligence, driverless cars, robotics, virtual reality and other emerging areas. He previously wrote for Wired magazine. @cademetz

Jack Ewing writes about business from New York, focusing on the auto industry and the transition to electric cars. He spent much of his career in Europe and is the author of “Faster, Higher, Farther,” about the Volkswagen emissions scandal. @JackEwingNYT • Facebook

A version of this article appears in print on June 16, 2022, Section B, Page 1 of the New York edition with the headline: Driver-Assist Toll Tallied: 392 Crashes In 10 Months.