Human Factor Modeling from Wearable Sensed Data for Evacuation based Simulation Scenarios


Paletta, L., Wagner, V., Kallus, W., Schrom-Feiertag, H., Schwarz, M., Pszeida, M., Ladstaetter, S. and Matyus, T.

Proc. 5th International Conference on Applied Human Factors and Ergonomics (AHFE 2014), 3rd International Conference on Digital Human Modeling & Human Factors, July 19-23, 2014, Krakow, Poland, 1/2014


The design and evaluation of evacuation systems are crucial to guarantee successful responses after an incident. We present recent results that aim to significantly improve evacuation simulation by parameterizing the behavior of human agents with human-factors information on stress, perception and decision making. In particular, each person's behavior in its specific situational context is investigated within the frame of embodied decision making. For this purpose, users were equipped with wearable sensors that capture information about the environment, the psychophysiological status of the user, and the user's viewing behavior (eye tracking glasses) and motion behavior. The studies take place during regularly performed evacuation exercises in large business buildings. From the correlation between the multisensory perceptual and psychophysiological data on the one hand, and the automatically sensed and interpreted situational context on the other, we extract a rule base consisting of logical precondition-action pairs that parameterizes the crowd simulation model.
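The rule base of precondition-action pairs described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the context fields (`stress`, `smoke_visible`, `exit_in_view`), the rule set and the action strings are all hypothetical stand-ins for the sensed and interpreted quantities named in the abstract.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class AgentContext:
    """Hypothetical per-agent context derived from wearable-sensor data."""
    stress: float        # normalized psychophysiological stress level, 0..1
    smoke_visible: bool  # interpreted situational context
    exit_in_view: bool   # interpreted from eye-tracking / gaze data

@dataclass
class Rule:
    """A logical precondition paired with a simulation parameter change."""
    precondition: Callable[[AgentContext], bool]
    action: str

# Illustrative rule base; real rules would be learned from the
# correlation analysis described in the abstract.
RULES: List[Rule] = [
    Rule(lambda c: c.stress > 0.7 and not c.exit_in_view,
         "search: widen visual scan angle"),
    Rule(lambda c: c.smoke_visible,
         "route: avoid smoke-filled corridor"),
    Rule(lambda c: c.stress > 0.7,
         "speed: increase desired walking speed"),
]

def applicable_actions(ctx: AgentContext) -> List[str]:
    """Return the actions of all rules whose precondition holds."""
    return [r.action for r in RULES if r.precondition(ctx)]

# Example: a highly stressed agent with no exit in view and no smoke.
print(applicable_actions(
    AgentContext(stress=0.8, smoke_visible=False, exit_in_view=False)))
# → ['search: widen visual scan angle', 'speed: increase desired walking speed']
```

Each simulation step, a crowd model could evaluate the rule base per agent and apply the matching parameter changes, which is one plausible reading of how such precondition-action pairs would drive agent behavior.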