Aggregated mapping of driver attention from matched optical flow.


Roland Perko, Michael Schwarz, and Lucas Paletta

Proceedings of the 21st IEEE International Conference on Image Processing (ICIP), pages 214-218, Paris, France, 8/2014


Eye tracking research on driver distraction, applied to real-world driving tasks, has so far demanded massive manual intervention for the annotation of hundreds of hours of head camera video. We present a novel methodology that automatically integrates arbitrary gaze localizations onto a visual object and its local surroundings in order to draw heat maps directly onto the environment. Gaze locations are tracked across video frames of the eye tracking glasses' head camera, within regions of the driver's environment, using optical flow. The high robustness and accuracy of the optical flow based tracking - a residual mean error of ≈0.3 pixels on sequences captured and verified in 576 individual trials - enable a fully automated estimation of the driver's attention processes, for example in the context of roadside objects. We present results from a typical driver distraction study and visualize fully aggregated human attention behavior.
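The two stages summarized in the abstract - propagating gaze locations between head-camera frames with optical flow, then accumulating the tracked points into an environment heat map - can be sketched in pure NumPy. This is an illustrative sketch only, not the authors' implementation: the single-step Lucas-Kanade tracker, the Gaussian heat-map accumulation, the function names, and the synthetic blob scene are all assumptions made for the example.

```python
import numpy as np

def lucas_kanade_step(prev_img, next_img, x, y, win=9):
    """Estimate the displacement of point (x, y) between two frames with a
    single Lucas-Kanade least-squares step over a (win x win) window.
    Illustrative stand-in for the paper's optical-flow tracking."""
    Ix = np.gradient(prev_img, axis=1)         # spatial gradient dI/dx
    Iy = np.gradient(prev_img, axis=0)         # spatial gradient dI/dy
    It = next_img - prev_img                   # temporal derivative
    h = win // 2
    rows = slice(int(y) - h, int(y) + h + 1)
    cols = slice(int(x) - h, int(x) + h + 1)
    A = np.stack([Ix[rows, cols].ravel(), Iy[rows, cols].ravel()], axis=1)
    b = -It[rows, cols].ravel()
    d, *_ = np.linalg.lstsq(A, b, rcond=None)  # flow vector (dx, dy)
    return x + d[0], y + d[1]

def aggregate_heatmap(points, shape, sigma=3.0):
    """Sum an isotropic Gaussian per tracked gaze point to obtain an
    attention heat map over the reference image, normalized to [0, 1]."""
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    heat = np.zeros(shape)
    for x, y in points:
        heat += np.exp(-((xs - x) ** 2 + (ys - y) ** 2) / (2 * sigma ** 2))
    return heat / heat.max() if heat.max() > 0 else heat

# Synthetic example: a smooth blob (a stand-in "roadside object") shifts
# one pixel to the right between two consecutive head-camera frames.
yy, xx = np.mgrid[0:60, 0:80]
frame0 = np.exp(-((xx - 40) ** 2 + (yy - 30) ** 2) / (2 * 9.0))
frame1 = np.exp(-((xx - 41) ** 2 + (yy - 30) ** 2) / (2 * 9.0))

gx, gy = lucas_kanade_step(frame0, frame1, 40.0, 30.0)  # tracked gaze point
heat = aggregate_heatmap([(40.0, 30.0), (gx, gy)], (60, 80))
```

In the real setting the per-frame gaze samples come from the eye tracking glasses, the tracking runs over long sequences (with pyramidal, iterative flow rather than a single step), and the heat map is drawn in a common reference view of the environment; the sketch only shows the core accumulation idea.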