The role of Human Factors science in DMS technology
Our world-leading Driver Monitoring System (DMS) technology is underpinned by a deep understanding of human behaviour. Kyle Wilson PhD explains the important role of Human Factors science in developing and enhancing our life-saving technology.
Dr. Kyle Wilson is part of the Human Factors team at Seeing Machines. He explains the role of Human Factors science in developing an effective Driver Monitoring System (DMS) solution, delves into examples of how fatigue and distraction can present in drivers, and distinguishes between the two different types of DMS.
Transcript
Why is Human Factors important to DMS?
00:15
Human Factors concerns how people interact with systems, technology and their environment.
00:22
Driver Monitoring Systems are required to detect very complex human states and behaviours, and so for that reason, it’s crucial that they’re built using a deep understanding of human behaviour.
00:35
In order to get that deep understanding of human behaviour, you need Human Factors research and data.
00:42
So that means collecting data with drivers in simulators, on test tracks, on-road and naturalistic driving data, such as the nine billion kilometres and counting of naturalistic driving data that Seeing Machines has.
Fatigue can present in many ways
01:04
Fatigue can vary a lot from person to person. So, two people might each have a microsleep, a temporary episode of unconsciousness. Person A's microsleep might involve a long eye closure and the head nodding forward, whereas Person B, with the same degree of unconsciousness, the same degree of impairment, might keep their head totally still while their eyes only partially close.
Examples of distracted driving behaviour
01:40
The simplest and best-understood type of distraction is a single long glance away from the forward roadway. Many distracted behaviours don't actually meet this simple definition, though.
01:52
For example, visual attention time sharing is where the driver glances back and forth between the forward road and a secondary task, such as checking a cell phone or fiddling with the radio. This might go on for 20 to 30 seconds, and during this time there might not be a single glance that is itself longer than one or two seconds. Yet the driver can be just as dangerously distracted as a driver making a single long glance away.
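This time-sharing pattern is why a single-glance threshold is not enough: the risk builds up across many short glances. A minimal illustrative sketch of a cumulative metric over a sliding window (this is not Seeing Machines' actual algorithm; the 20-second window, 10 Hz frame rate, and the idea of a simple ratio are all assumptions for illustration):

```python
from collections import deque

def eyes_off_road_ratio(samples, window_s=20.0, hz=10):
    """For each frame, return the fraction of the trailing `window_s`
    seconds in which gaze was off the forward road.

    `samples` is a sequence of per-frame booleans (True = off-road)."""
    window = deque(maxlen=int(window_s * hz))
    ratios = []
    for off_road in samples:
        window.append(off_road)
        ratios.append(sum(window) / len(window))
    return ratios

# Alternating 1-second glances at 10 Hz: road, phone, road, phone...
frames = ([False] * 10 + [True] * 10) * 10
ratios = eyes_off_road_ratio(frames)

# No single glance exceeds 1 second, yet the cumulative metric shows
# half of recent time spent off-road.
print(max(ratios))
```

A per-glance rule sees only harmless 1-second glances here; the windowed ratio exposes the sustained time-sharing pattern.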
02:25
Distraction can also manifest as owl or lizard behaviour. Owl behaviours describe when a driver’s head and eyes move together and this is often to areas that are further from the forward roadway, such as a passenger side mirror. Lizard glances describe when the eyes shift but the head remains still and these are often to areas that are closer to the forward road. An example might be a driver checking a cell phone held near the steering wheel.
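The owl/lizard distinction comes down to whether the head moves with the eyes. A toy classifier along those lines (the threshold values and function are purely illustrative assumptions, not production logic):

```python
def classify_glance(head_yaw_deg, gaze_yaw_deg):
    """Label a glance as 'owl' (head and eyes move together) or
    'lizard' (eyes shift while the head stays still).

    Thresholds are illustrative only, not real calibration values."""
    HEAD_MOVED = 10.0   # degrees of head rotation counted as movement
    GAZE_MOVED = 15.0   # degrees of gaze rotation counted as off-road
    if abs(gaze_yaw_deg) < GAZE_MOVED:
        return "on-road"
    return "owl" if abs(head_yaw_deg) >= HEAD_MOVED else "lizard"

print(classify_glance(head_yaw_deg=35.0, gaze_yaw_deg=40.0))  # owl: side mirror check
print(classify_glance(head_yaw_deg=2.0, gaze_yaw_deg=25.0))   # lizard: phone near the wheel
```

Note that a system sensing only `head_yaw_deg` would label the second case "on-road", which is exactly the failure mode of head-pose-only monitoring described next.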
02:58
Actually detecting these behaviours, given the wide variation across people, requires, for one, a system that tracks not only head movement but eye movement as well. If your Driver Monitoring System relies on head movement alone to infer whether a driver is distracted, you'll miss a lot of the distracted behaviours that fall into the lizard-glance category.
03:26
It’s also crucial that your Driver Monitoring System works in real-world conditions. That means drivers wearing hats or sunglasses, and lighting that ranges from night-time darkness to bright sunlight.
Distinguishing between Driver Monitoring Systems
03:44
There are direct and indirect methods of driver monitoring. Camera-based systems are direct methods; hands-on-wheel sensors and sensors that detect whether the vehicle drifts within its lane are indirect methods.
04:04
The best approach is a camera-based system, because the best information for detecting fatigue and distraction comes from the face and the eyes.
04:14
As the saying goes, the eyes are the window to the soul and driving is a highly visual task.