Information Fusion for Driver Distraction Studies Using Eye Tracking Glasses

Publication from DIGITAL

Lucas Paletta, Michael Schwarz, Caroline Wollendorfer and Roland Perko

International Conference on Applied Human Factors and Ergonomics, July 19-24, 2014, Krakow, Poland, 1/2014


Eye tracking research on driver distraction in real-world driving tasks has so far demanded massive manual intervention for the annotation of hundreds of hours of head-camera video. We present a novel methodology that automatically integrates arbitrary gaze localizations onto a visual object and its local surroundings in order to draw heat maps directly onto the environment. Gaze locations are tracked across video frames of the eye tracking glasses' head camera, within regions of the driver's environment, using optical flow. The high robustness and accuracy of the optical-flow-based tracking - measured at a residual mean error of ca. 0.3 pixels on sequences captured and verified in 576 individual trials - enables a fully automated estimation of the driver's attention processes, for example in the context of roadside objects. We present results from a typical driver distraction study and visualize the performance of fully aggregated human attention behavior.
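The core idea of tracking a gaze point through consecutive head-camera frames with optical flow can be sketched in a few lines. The following is a minimal, self-contained illustration (not the authors' implementation): a single Lucas-Kanade least-squares step estimates the subpixel displacement of a smooth synthetic image patch between two frames; the synthetic blob, the window size, and the shift values are assumptions chosen only for the demonstration.

```python
import numpy as np

def lucas_kanade_step(frame1, frame2):
    """Estimate the (u, v) pixel displacement of the image content
    between two frames with one Lucas-Kanade least-squares step.

    Solves  A @ [u, v] = b  with  A built from spatial gradients and
    b from the temporal gradient (standard brightness-constancy model).
    """
    Iy, Ix = np.gradient(frame1)       # spatial gradients (rows = y, cols = x)
    It = frame2 - frame1               # temporal gradient
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    return np.linalg.solve(A, b)       # estimated (u, v) in pixels

# Synthetic stand-in for a head-camera patch: a smooth blob shifted by a
# subpixel amount between frames (values are illustrative assumptions).
y, x = np.mgrid[0:41, 0:41].astype(float)
def blob(cx, cy, sigma=4.0):
    return np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))

frame1 = blob(20.0, 20.0)
frame2 = blob(20.3, 20.2)              # true motion: u = 0.3, v = 0.2 px

u, v = lucas_kanade_step(frame1, frame2)
print(round(u, 2), round(v, 2))
```

In practice a pyramidal, iterative variant of this step (e.g. OpenCV's `calcOpticalFlowPyrLK`) is used on real video, which handles larger motions; the single-step version above only shows why subpixel residual errors on the order of the reported ~0.3 pixels are attainable with this class of method.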