Engineering researchers work to advance driver assistance warning functions
The research responds to negative consumer opinion about current ADAS systems and their accuracy.
Oct. 10, 2019
As humans, we like to think we are smarter than technology. That can ring true with the sometimes faulty Advanced Driver Assistance Systems (ADAS) common in newer cars. These systems can be likened to the boy who cried wolf, according to Jung Hyup Kim, assistant professor of industrial and manufacturing systems engineering.
Just as the villagers began to ignore the boy in the common fable, people have a tendency to ignore the alert systems on their cars when they feel that they are receiving false warnings.
“When they are really facing the danger, then they might ignore the warning and that might lead to critical death,” Kim said. “And sometimes they just turn off the function.”
ADAS are becoming increasingly common in modern cars, but these systems are still far from perfect. Drivers often receive alerts claiming they are illegally changing lanes, not keeping their hands on the steering wheel or getting too close to another car, even when they are driving in what they believe to be a safe and normal fashion.
“This is one-way communication where the machine gives the warning, saying ‘this is dangerous, this is dangerous, I don't care if you accept it or not, I will just keep sending this message to you and [I] don't care about your response,’” Kim said. “It will just do its job and keep sending it like spam mail.”
As a result, many drivers will decrease the sensitivity of their car’s system or turn notifications off altogether, thereby defeating the purpose of the technology and potentially putting themselves and others at risk in the event of an actual accident.
To address this issue, Kim and industrial engineering Ph.D. candidate Xiao Nan Yang designed a driving simulator through the College of Engineering that allowed researchers to track participants’ pupillary movements and electromyography (EMG) responses in real time as they faced lifelike driving scenarios.
Participants were asked to wear eye-tracking goggles as well as an EMG device on their arm.
“The goggles will directly track your pupil dilation, like how much your pupil depth and diameter changed,” Yang said. “We also include the muscle signals, because we think that most of the time, the pupil doesn’t tell the whole story for the driver’s reaction. Once we add another channel for the physiological data, [we can] reveal a clearer picture for how people respond to the different warnings.”
Kim and Yang found that drivers’ responses changed over time.
“The first time we tested the warning, everybody responded well to the warning,” Kim said. “But after they have an understanding of how much it was accurate, then if [they] experience a false warning, then they do not respond at all.”
Kim and Yang want to create a two-way communication system in which the driver and the vehicle interact, allowing the vehicle’s artificial intelligence to learn from the driver and adapt to how people react.
“We want to let the warning system know how the driver is going to respond,” Yang said. “If the warning system detects that you’re going to respond, it will directly mute itself [in order] to avoid providing an unnecessary warning.”
However, drivers won’t be wearing eye-tracking goggles or EMG electrodes on the road. Kim hopes to streamline the process, allowing people’s reactions to be tracked constantly via tiny electronic chips in everyday objects.
“If the technology is advanced enough, then we can put some sort of stationary eye-tracking device in the dashboard so that we can capture the pupil,” Kim said. “And nowadays, people use their Apple Watches to measure their heartbeat, so maybe we are able to collect some of this muscle activity data through a smartwatch.”
Edited by Laura Evans | firstname.lastname@example.org