
Tesla Tops U.S. Agency List of Crashes Suspected to Involve Driver-Assistance Technology

The auto industry’s top safety regulator said Wednesday that it had received reports of nearly 400 recent crashes in which advanced driver-assistance features were engaged during or immediately before the incident. More than two-thirds of those crashes happened in a Tesla vehicle, it said.

The National Highway Traffic Safety Administration cautioned against using the data to draw conclusions about the safety of any company’s technology given a series of limitations. Those include that the data, which is company-reported, doesn’t take into account the number of vehicles outfitted with driver-assistance technology that any given manufacturer has sold, or the number of miles those cars have traveled.

Certain auto makers, such as Tesla, are also able to access more information about their vehicles remotely than others, potentially making it easier to learn about crashes that are subject to the federal reporting requirements.

Tesla didn’t immediately respond to a request for comment.

The Texas-based company has said that driving with its advanced driver-assistance system, called Autopilot, engaged is safer than doing so without it. The company points to internal data showing that crashes were less common when drivers were using Autopilot. Some researchers have criticized Tesla’s methodology.

Tesla has said its cars have driven more than 1 billion miles on Autopilot.

General Motors Co. said its vehicles using a similar system the Detroit manufacturer calls Super Cruise had logged just over 10 million miles as of last summer.

The auto-safety agency ordered companies last summer to report serious crashes involving advanced driver-assistance features that assume some control over steering, braking and acceleration. Such technologies have become more common on U.S. roadways in recent years, particularly as more car companies sell vehicles with features that automate more of the driving task.

The crash data, NHTSA said, is intended to provide the public with a more detailed look at the frequency and severity of vehicle crashes involving driver-assistance or automated-driving systems. The agency also said it hopes to use the data to determine which technologies might pose a safety risk and should be investigated further.

The companies were required to report crash incidents within one day of learning about them if a fatality or injury occurred. NHTSA said it received reports of 11 such crashes involving driver-assistance systems, six of which were fatal.

A Tesla vehicle’s crash site on a highway in California. Photo: Associated Press

The technology under scrutiny by NHTSA includes lane-keep assistance and cruise-control systems that maintain a fixed distance behind a leading car. The data also captured higher-end systems, such as technology offered by Tesla that can guide a car along highways with minimal driver input.

Such systems aren’t tightly regulated in the U.S., and how they operate varies widely depending on the manufacturer, sometimes causing confusion among drivers. Lawmakers have been calling on regulators to scrutinize such features.

More than 100 companies were subject to the auto-safety agency’s order, including auto makers and tech companies involved in building such technology. Most auto makers cited in the agency’s data reported 10 or fewer crashes. The regulator said it received reports of 90 advanced driver-assistance-related crashes involving cars made by Honda Motor Co., second only to the more than 270 reported by Tesla.

Honda said Wednesday that auto makers may have differing interpretations of their reporting obligations and that an apples-to-apples comparison among manufacturers may not be possible at this time.

To comply with NHTSA’s 24-hour reporting deadline, the auto maker said, its reports are based on unverified complaints about whether driver-assistance features were engaged at the time of a reported crash. Honda said it has deployed its driver-assistance technology in approximately six million vehicles.

Auto makers have heavily promoted these driver-assistance systems in recent years as a way to improve safety, saying they are designed to help prevent crashes by relying on sensors, cameras and radars to detect potential roadway dangers. Some systems, such as those offered by GM and Ford Motor Co., allow motorists to go hands-free behind the wheel in certain situations, a feature that is marketed as helping to alleviate driver fatigue, particularly on long road trips.

NHTSA also released data on crashes involving more automated systems, such as those deployed in vehicles operated by Alphabet Inc.’s Waymo LLC and GM’s Cruise LLC. Those vehicles are equipped with technology that fully automates driving in certain circumstances.

Cruise, a self-driving car firm acquired by GM in 2016, recently launched commercial service in San Francisco, where it is now offering rides to the public in vehicles without a human driver. Waymo operates a robotaxi service in the Phoenix area. NHTSA said it has received reports of 130 crashes involving these systems, mostly from Cruise and Waymo. Those companies reported 23 and 62 crashes, respectively. NHTSA received no reports of fatalities in autonomous vehicles.

Cruise and Waymo didn’t immediately respond to a request for comment.

Industry trade groups have been critical of NHTSA’s intention to release the data.

The Alliance for Automotive Innovation, a trade association representing most major car companies, said last week that the data collected by NHTSA was insufficient to evaluate the relative safety of driver-assistance features or automated-driving systems.

The Alliance said it would be a mistake to make generalizations about the technology without more context, particularly when comparing the reports with those involving manually operated cars.

As it released the data, NHTSA said it expects to update the figures monthly. The agency said the data would be useful for efforts to identify potential safety risks, although key factors in many crashes were unclear. In 75% of the reports, the severity of injury associated with incidents involving driver-assistance features was listed as unknown. Some reports didn’t include the date that an incident occurred.


Tesla’s Autopilot is among the best-known driver-assistance systems, helping motorists steer within a lane on the highway and match the speed of surrounding traffic. The system is standard on all new Teslas and is available on models built in late 2014 or after. Tesla also sells an upgraded suite of assistance technologies, known as Full Self-Driving, for $12,000. That package includes features that recognize and slow down for traffic lights and stop signs, help drivers with changing lanes on the highway and, in some cases, with navigating through cities.

NHTSA over the years has stepped up its scrutiny of Autopilot, following reports of fatal crashes that were suspected to have occurred after drivers had engaged the system. Last summer, it opened an investigation into collisions involving Tesla vehicles using Autopilot and first-responder vehicles stopped at emergency scenes.

The auto-safety agency said last week it had escalated that probe to what is called an engineering analysis and expanded its inquiry to a range of crashes. The move is a critical step in determining whether the technology poses a safety risk and should be recalled.

Tesla hasn’t commented on the probe or the incidents that NHTSA says prompted it.

Separately, NHTSA has opened a series of investigations into crashes suspected of having involved advanced driver-assistance technology. Of the 43 crashes it is looking into as part of this effort, 35 involved Tesla vehicles, including 10 that have resulted in fatalities, according to the agency.

Tesla’s driver-assistance system has received wider attention in part because some drivers take liberties with the technology. Some users have posted videos online in which they override safety functions to operate their Teslas without their hands on the wheel even though the company warns drivers against doing so. Some critics also say the terms Autopilot and Full Self-Driving risk inflating drivers’ perceptions of the car’s capabilities. Neither set of features makes Teslas autonomous. Tesla hasn’t responded to requests for comment about the criticism.

Write to Rebecca Elliott at [email protected]

Copyright ©2022 Dow Jones & Company, Inc. All Rights Reserved.
