Aggregated mapping of Driver Attention from matched optical flow
Publication from Digital
Perko, R., Schwarz, M., and Paletta, L.
IEEE International Conference on Image Processing, 1/2014
Eye tracking research on driver distraction, applied to real-world driving tasks, has so far demanded a massive amount of manual intervention for the annotation of hundreds of hours of head camera video. We present a novel methodology that automatically integrates arbitrary gaze localizations onto a visual object and its local surroundings, in order to draw heat maps directly onto the environment. Gaze locations are tracked in video frames of the eye tracking glasses' head camera, within regions of the driver's environment, using optical flow. The high robustness and accuracy of the optical flow based tracking, measured as a residual mean error of 0.3 pixels on sequences captured and verified in 576 individual trials, enables a fully automated estimation of the driver's attention processes, for example in the context of roadside objects. We present results from a typical driver distraction study and visualize the performance of fully aggregated human attention behavior.
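The aggregation step the abstract describes, turning many tracked gaze locations into a heat map over the scene, can be sketched roughly as follows. This is a minimal illustration and not the authors' implementation; the function name, the Gaussian splatting, and the `sigma` parameter are assumptions for the sketch, and the input points stand in for gaze coordinates already propagated across frames by optical flow.

```python
import numpy as np

def aggregate_gaze_heatmap(gaze_points, frame_shape, sigma=15.0):
    """Accumulate tracked gaze fixations into a normalized heat map.

    gaze_points: iterable of (x, y) pixel coordinates (hypothetical input,
        e.g. gaze locations tracked across frames by optical flow).
    frame_shape: (height, width) of the reference frame.
    sigma: spatial spread in pixels of each fixation's contribution (assumed).
    """
    h, w = frame_shape
    heat = np.zeros((h, w), dtype=np.float64)
    # Pixel coordinate grids for evaluating a Gaussian splat per fixation.
    ys, xs = np.mgrid[0:h, 0:w]
    for gx, gy in gaze_points:
        heat += np.exp(-((xs - gx) ** 2 + (ys - gy) ** 2) / (2.0 * sigma ** 2))
    if heat.max() > 0:
        heat /= heat.max()  # normalize to [0, 1] for visualization
    return heat

# Toy usage: three fixations on a 100x120 frame.
heatmap = aggregate_gaze_heatmap([(40, 30), (42, 31), (80, 60)],
                                 (100, 120), sigma=5.0)
```

In a real pipeline the fixation list would come from the optical flow tracker, and the resulting map would be overlaid on the reference image of the environment, e.g. a roadside object.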