Study finds self-driving cars are safer than human-driven vehicles
Research shows that autonomous vehicles are less prone to accidents, except in low-visibility conditions and when turning
Self-driving cars haven’t taken off yet. Although they can be seen in half a dozen cities in the United States and China, they are being rolled out slowly, and any incident stalls their development. Autonomous vehicles do have advantages, however. Ninety percent of accidents are caused by human error, and any technology that corrects this deficiency can reduce traffic accidents, which kill 1.19 million people worldwide every year, according to the World Health Organization. A study published on Tuesday in Nature Communications shows that autonomous vehicles are safer and less likely to be involved in accidents than human-driven cars, except in two circumstances: when turning and in low-visibility conditions.
Mohamed Abdel-Aty and Shengxuan Ding, researchers in Transportation, Electrical and Automotive Engineering and Computer Science at the University of Central Florida, reached this conclusion after analyzing data from 2,100 autonomous vehicles and 35,133 human-driven cars over a six-year period. The most advanced driving systems, according to the study, reduce the likelihood of rear-end, head-on and lateral collisions, as well as of running off the road, by between 20% and 50%.
In all these situations, autonomous vehicles proved to be more effective than humans. “This is because they are equipped with advanced sensors and software that can quickly analyze the surrounding environment and make decisions based on the data received,” states the study, adding: “There are many potential benefits of AVs on traffic safety, such as a reduction in human error, reduced fatigue, and distraction.”
However, in low-visibility conditions, such as at dawn or dusk, and when turning, human-driven cars are two to five times safer. “These are the areas where autonomous driving technology may need further refinement to match or exceed human driving capabilities,” the researchers explain.
Thus, according to the study, the technology does not outperform human drivers in all circumstances: autonomous cars must still improve their ability to perceive and detect hazards, and to develop better decision-making programs and fail-safe mechanisms. Shortcomings in the latter two areas still account for 56% of the problems of autonomous driving.
“Improving automated vehicle safety involves advanced detectors, robust algorithms and smart design considerations. Key strategies include enhancing weather and lighting sensors, implementing redundancy measures, and integrating sensor data effectively,” the authors explain in a joint email.
The researchers point to technological solutions, such as the combined use of cameras and LiDAR (laser), GNSS (satellite navigation) and radar sensors, which enhance autonomous capabilities in cloudy, snowy, rainy and low-visibility scenarios, when a delay in detecting potential hazards and reacting appropriately can be deadly.
“Sensor fusion,” the researchers add, “allows cross-verification of information, which reduces errors. However, processing this data in real time is challenging and requires advanced computing power, which increases the cost and complexity of these systems.”
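The cross-verification idea the researchers describe can be sketched in a few lines. This is a minimal, hypothetical illustration, not the study’s actual method: three independent distance estimates (say, from a camera, a lidar and a radar) are combined by inverse-variance weighting, and any sensor that disagrees strongly with the others is flagged. All the numbers are made up for the example.

```python
def fuse_estimates(readings):
    """Combine (distance_m, variance) pairs by inverse-variance weighting."""
    weights = [1.0 / var for _, var in readings]
    fused = sum(d * w for (d, _), w in zip(readings, weights)) / sum(weights)
    return fused

def cross_verify(readings, tolerance_m=2.0):
    """Flag sensors whose reading deviates from the median by more than the tolerance."""
    distances = sorted(d for d, _ in readings)
    median = distances[len(distances) // 2]
    return [i for i, (d, _) in enumerate(readings) if abs(d - median) > tolerance_m]

# Hypothetical fog scenario: the camera (index 0) is noisy and reads short,
# while lidar and radar agree; fusion leans on the more reliable sensors
# and cross-verification flags the outlier.
readings = [(18.0, 4.0), (21.2, 0.5), (21.0, 1.0)]  # (metres, variance)
fused = fuse_estimates(readings)      # close to the lidar/radar consensus
suspect = cross_verify(readings)      # camera flagged for disagreement
```

The weighting step shows why real-time fusion is computationally demanding in practice: production systems repeat this kind of reconciliation across thousands of detections per second, not three numbers.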
“It is a significant challenge to generate sufficient information and achieve comprehensive detection of the surrounding environment from a single independent source due to limited sensor ranges and limited coverage of the environment by sensors in autonomous vehicles. Additionally, some autonomous vehicles are programmed to follow predefined rules and scenarios, which may not encompass every possible driving situation,” the study warns.
In this sense, the researchers highlight that human drivers can “predict pedestrian movements and exercise caution based on their driving experience, whereas autonomous vehicles may struggle with recognizing pedestrians’ intentions, potentially leading to emergency braking or accidents due to a lack of understanding of social cues and psychological reasoning.”
To address this problem, Mohamed Abdel-Aty and Shengxuan Ding propose “advanced sensing and perception systems, predictive algorithms, and Vehicle-to-Everything communication.” This last concept is known as V2X and refers to a system in which devices not only detect a potential danger and initiate an evasive maneuver, but also share that information with other cars and road safety systems so they can anticipate the hazard. “Both automated vehicles and human drivers face challenges with limited visibility, but the former can use advanced technologies and data for better safety assistance,” the authors explain.
Research is advancing to provide vehicles with human-like senses and improve latency (response time). Nature recently published two papers on the development of a processor to respond quickly to an event with minimal information and on a new system (algorithm) to improve the accuracy of mechanical vision with lower latency.
“Current driver assistance systems, such as those from Mobileye — which are integrated into more than 140 million cars worldwide — work with standard cameras that take 30 frames per second, that is, one image every 33 milliseconds. Additionally, they require a minimum of three frames to reliably detect a pedestrian or car. This brings the total time to initiate the braking maneuver to 100 milliseconds. Our system allows us to reduce this time to below a millisecond without the need to use a high-speed camera, which would entail an enormous computational cost,” explains Davide Scaramuzza, professor of robotics at the University of Zurich (Switzerland), who has published a paper on event cameras.
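Scaramuzza’s figures follow from simple arithmetic, sketched below. The highway speed used to translate the latency into distance is an illustrative assumption, not from his quote.

```python
# A 30 fps camera delivers one frame every ~33 ms; needing three frames
# to confirm a detection puts the time-to-brake near 100 ms.
FPS = 30
frame_interval_ms = 1000 / FPS                 # ~33.3 ms between frames
frames_to_confirm = 3
detection_latency_ms = frames_to_confirm * frame_interval_ms  # ~100 ms

# What that latency costs at an assumed highway speed of 120 km/h:
# distance travelled before braking even begins.
speed_kmh = 120
speed_ms = speed_kmh / 3.6                     # metres per second
distance_m = speed_ms * detection_latency_ms / 1000  # roughly 3.3 m
```

Cutting the latency below a millisecond, as event cameras aim to do, shrinks that pre-braking distance from metres to centimetres.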
Bernabé Linares, a research professor at Spain’s Institute of Microelectronics (IMSE), is developing vision systems that resemble human perception and are fundamental for autonomous vehicles. “The biological retina does not take images. All information goes through the optic nerve and the brain processes it. In the conventional camera, each pixel is autonomous and, at most, is made to interact with its neighbors to adjust brightness. But a digital image at the exit of a tunnel can be all white or black while we, except in very extreme conditions, can see what is inside and outside of it,” he explains.
Another solution is to apply artificial intelligence to locate the most dangerous places and include that information in autonomous driving systems to condition their maneuvers. This is the line of research being pursued by Quynh Nguyen, an epidemiologist and statistician at the University of Maryland School of Public Health, who has published a study in Injury Prevention, a BMJ journal. “It is crucial to understand how the physical environment can increase or decrease fatal collisions and which communities are most affected by this,” Nguyen argues.
The American Chemical Society (ACS) has proposed using paints that make objects more visible to autonomous vehicles. To this end, the ACS has developed a “highly reflective black tint that could help autonomous cars see dark objects and make automated driving safer.”
Meanwhile, researchers at the University of Iowa have investigated the possibility of equipping autonomous vehicles with an exterior light signal that tells pedestrians when it is safe to cross in front of the vehicle, because it has identified them and is preparing to stop.
All developments are moving towards full autonomy (Level 5), in which no human intervention will be required. According to researchers at the University of Central Florida, “it may become possible, although it is many years away due to significant challenges. These include the development of advanced algorithms and sensors, and the necessary infrastructure upgrades to effectively support automated vehicle technology.”