NHTSA Upgrades Tesla Autopilot Probe Into Emergency-Scene Crashes

U.S. auto-safety regulators have escalated an investigation into Tesla Inc.’s Autopilot, after identifying new crashes that occurred with first-responder vehicles at emergency scenes.

The National Highway Traffic Safety Administration said in a notice published Thursday that it was expanding a probe begun last August into a series of crashes in which Tesla vehicles struck stopped emergency vehicles.

The agency said it was upgrading its earlier investigation to an engineering analysis, a step the NHTSA takes in determining whether to order a safety recall.

NHTSA also said it has expanded its examination of Autopilot to include a wider range of crashes, not only those at emergency scenes. The agency said it would further assess how drivers interact with Autopilot and the degree to which it might reduce motorists’ attentiveness.

Forensic data available for 11 of the crashes showed that drivers failed to take evasive action in the two to five seconds before the collision, the agency said.

The investigation covers an estimated 830,000 Tesla vehicles made from 2014 to 2021, including the Model 3, Model S, Model X and Model Y.

The NHTSA said in its filing that it has identified 15 injuries and one fatality related to the crashes.

Tesla didn’t immediately respond to a request for comment. The electric-car maker’s stock was up 2.5% in midday trading Thursday, following news of a strong bounceback in production at its plant in China.

Autopilot, Tesla’s name for the advanced driver-assistance technology used in its vehicles, is designed to help drivers with tasks such as steering and keeping a safe distance from other vehicles. Tesla instructs drivers using the system to pay attention to the road and keep their hands on the wheel.

The electric-car maker has long maintained that driving with Autopilot engaged is safer than doing so without it. Tesla points to internal data showing that crashes were less common when drivers were using Autopilot. Some researchers have criticized Tesla’s methodology.

In opening its initial probe last year, the NHTSA said that it had identified 11 crashes since early 2018 in which a Tesla vehicle using Autopilot struck one or more vehicles involved in an emergency-response situation. In its latest filing, the agency said it discovered six additional crashes involving Teslas and first-responder vehicles where Autopilot was in use.

The expanded probe of Autopilot is the latest sign that U.S. auto-safety regulators are getting more aggressive in scrutinizing advanced vehicle technologies that automate some or all of the driving tasks.

The NHTSA is getting ready to release new crash data this month that will give the public its first detailed look at the frequency and severity of incidents involving what are known as automated driving or advanced driver-assistance features, The Wall Street Journal has reported.

More than 100 companies are subject to an agency order requiring them to report crashes in which such systems were in use. Among those included are operators of autonomous-car fleets, like Alphabet Inc.’s Waymo and General Motors Co.’s Cruise LLC.

The technology under scrutiny includes lane-keeping assistance and cruise-control systems that keep a fixed distance behind a leading car, as well as higher-tech systems such as features that can guide a car along highways with minimal driver input.

Autopilot has become a particular focus for U.S. regulators in recent years, prompted by incidents in which drivers have misused the technology, for example by overriding safety functions to operate a vehicle without their hands on the wheel. Some critics have also said the term Autopilot risks giving drivers an inflated sense of the system’s capabilities.

The NHTSA said in its latest filing that driver use or misuse of Autopilot doesn’t necessarily preclude the agency from determining whether the technology is defective.

“This is particularly the case if the driver behavior in question is foreseeable in light of the system’s design or operation,” the NHTSA said. Auto makers are legally required to initiate a recall if a safety defect is discovered in their vehicles.

Separately, the NHTSA has opened a broader investigation into several dozen crashes where advanced driver-assistance features are suspected to have played a role. While the probe covers vehicles made by any car company, incidents involving Teslas represent most of the cases under examination, including several with fatalities.

Copyright ©2022 Dow Jones & Company, Inc. All Rights Reserved.
