Self-driving cars may be safer than human drivers in everyday situations, but the technology struggles more than humans in low-light conditions and when making turns, according to the largest accident research study to date.
The findings come at a time when self-driving cars are already on the roads in some US cities. GM-owned Cruise is set to resume testing of driverless cars after a pedestrian-dragging incident in March prompted California to suspend its licence, while Google spinoff Waymo has been gradually expanding its robotaxi operations in Austin, Los Angeles, Phoenix and San Francisco.
“It is important to improve the safety of self-driving cars at dawn, dusk or when turning,” said Shengxuan Ding at the University of Central Florida. “Key strategies include strengthening weather and lighting sensors and effectively integrating sensor data.”
Ding and his colleague Mohamed Abdel-Aty, also at the University of Central Florida, collected data from California and the National Highway Traffic Safety Administration (NHTSA) on 2,100 crashes involving vehicles equipped with some degree of autonomous or driver-assistance technology, as well as more than 35,000 crashes involving unassisted human drivers.
The researchers then used statistical matching techniques to find pairs of accidents that occurred under similar circumstances, with common factors such as road conditions, weather, time of day and whether the accident happened at an intersection or on a straight road. They focused their matching analysis on 548 autonomous vehicle accidents reported in California, excluding less automated vehicles equipped only with driver assistance systems.
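The matched-pairs idea can be sketched as exact matching on shared factors. This is a simplified illustration, not the study's actual method: the record fields, factor values and pairing rule below are invented for demonstration, and real matching studies typically use more sophisticated techniques such as propensity scores.

```python
from collections import defaultdict

# Hypothetical crash records described by the kinds of factors the study
# matched on: road condition, weather, time of day and location type.
av_crashes = [
    {"id": "AV1", "road": "wet", "weather": "rain", "time": "dusk", "location": "intersection"},
    {"id": "AV2", "road": "dry", "weather": "clear", "time": "day", "location": "straight"},
]
human_crashes = [
    {"id": "H1", "road": "wet", "weather": "rain", "time": "dusk", "location": "intersection"},
    {"id": "H2", "road": "dry", "weather": "clear", "time": "day", "location": "straight"},
    {"id": "H3", "road": "dry", "weather": "clear", "time": "night", "location": "straight"},
]

FACTORS = ("road", "weather", "time", "location")

def match_pairs(cases, controls):
    """Exact matching: pair each case with an unused control crash that
    shares all matching factors. Returns (case_id, control_id) pairs."""
    pool = defaultdict(list)
    for c in controls:
        pool[tuple(c[f] for f in FACTORS)].append(c["id"])
    pairs = []
    for case in cases:
        key = tuple(case[f] for f in FACTORS)
        if pool[key]:                      # an unused comparable crash exists
            pairs.append((case["id"], pool[key].pop()))
    return pairs

print(match_pairs(av_crashes, human_crashes))
# [('AV1', 'H1'), ('AV2', 'H2')]
```

Pairing like-with-like in this way lets crash rates be compared within similar circumstances, so differences between the two groups are not driven simply by the conditions in which each fleet happened to drive.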
Abdel Aty said the overall results suggest that self-driving cars are “generally safer in most situations.” But the analysis also found that self-driving cars are five times more likely to crash when driving at dawn and dusk than human drivers, and are almost twice as likely to crash when making turns.
One obstacle to research is that “the database of autonomous vehicle accidents is still small and limited,” Abdel-Aty said. He and Ding cited the need for “enhanced autonomous vehicle accident reporting,” a major caveat that independent experts agree with.
“I think this is an interesting, but very early, step in measuring the safety of self-driving cars,” said Missy Cummings at George Mason University in Virginia. She said the number of self-driving car accidents is “too small to make blanket conclusions about the safety of these technologies,” and warned about biased reporting by self-driving car makers. During her time at NHTSA, Cummings said, video footage of accidents didn’t always match the manufacturers’ explanations, which tended to place the blame on the human driver. “When you looked at the actual videos, they told a completely different story,” she said.
Some minor collisions may not be reported to police, so that factor needs to be taken into account when comparing accidents involving self-driving cars with those involving human drivers, said Eric Teoh at the Insurance Institute for Highway Safety in Virginia. His 2017 study of early testing of Google’s self-driving cars found that only three out of 10 accidents made it into police reports.
“Neither California nor NHTSA require comprehensive data reporting on the testing and deployment of autonomous vehicles,” said Junfeng Zhao at Arizona State University. “Autonomous vehicles, and especially robotaxis, often operate in specific regions and environments, making it difficult to generalize research findings.”
Topics:
- Artificial intelligence
- Driverless cars

Source: www.newscientist.com