Drivers of self-driving cars can rely too much on autopilot, and that’s a recipe for disaster


We were promised a very near future where autonomous machines would be serving our needs and vehicle ownership would be rendered unnecessary: robots would quickly and efficiently deliver our orders and we could squeeze in a few more hours of work or sleep while being chauffeured around in self-driving cars.

Progress has been made, at least, on some of this. University campuses and cities across North America have indeed witnessed the growing presence of small food-delivery robots. Likewise, new partnerships have recently been announced to develop and test the safety of self-driving trucks.

The journey toward autonomous or self-driving consumer cars, on the other hand, has arguably come to a screeching halt. In 2021, top industry experts acknowledged that developing safe autonomous driving systems was not as simple as anticipated. Among them, Elon Musk conceded that developing the technology required to deliver safe self-driving cars has proved harder than he thought.

Automation paradox

More bad news came this week when the U.S. National Highway Traffic Safety Administration (NHTSA) released figures showing that Tesla vehicles were responsible for nearly 70% of reported crashes involving so-called SAE Level 2 cars.

Some cars, by contrast, are fully autonomous and capable of driving without any input from a human driver. For example, Waymo One, in Phoenix, Ariz., is a ride-hailing service that currently deploys autonomous cars on a test route.

SAE Level 2 autonomous systems, like Tesla Autopilot, require human drivers to stay alert at all times, even when the system temporarily takes control of steering and acceleration. As soon as traffic or road conditions aren't adequate for the system to operate, control reverts to the driver, who must take over manually.

Inside Edition looks at people's behaviours in autonomous cars.

Human factors engineering is a cross-disciplinary research field investigating how humans interact with vehicle technology. Its researchers have, for years, highlighted the safety risks of automated driving—especially when the system requires the driver to make up for technological shortcomings to operate safely.

This is the case in what is known as the automation paradox, wherein the more automated the vehicle, the harder it is for humans to operate it properly.

Overestimating vehicle capability

Among the most prominent risks of operating SAE Level 2 cars is drivers misunderstanding the capabilities of the automated system. This misunderstanding often leads to unsafe behaviors like reading a book or taking a nap while the vehicle is in motion.

In 2021, there were so many reports of unsafe behaviors at the wheel of Level 2 cars that the NHTSA required manufacturers to start reporting crashes that occurred when these systems were engaged.

The initial findings, released in June 2022, showed that since 2021, Tesla and Honda vehicles were, respectively, involved in 273 and 90 reported crashes when these systems were engaged. Most crashes occurred in Texas and California.

While these data paint a dismal picture of the safety of these systems, they pale in comparison to the over 40,000 reported fatal crashes that occurred in the United States in 2021 alone.

As part of the same report, NHTSA itself highlights some of the methodological limitations of the study: from the incompleteness of some source data to the failure to account for individual manufacturers' total vehicle volumes or distances traveled.

For the skeptics, this does not spell the end of autonomous cars. It does, however, confirm that widespread deployment of safe self-driving cars is not years, but decades, in the making.




Provided by
The Conversation


This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation:
Drivers of self-driving cars can rely too much on autopilot, and that’s a recipe for disaster (2022, June 17)
retrieved 17 June 2022
from https://techxplore.com/news/2022-06-drivers-self-driving-cars-autopilot-recipe.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
