Human Factors Lab

The Human Factors Lab combines state-of-the-art measurement technologies with innovative analytical software to conduct human-centred studies in applied research in the field of Human Factors and Ergonomics.


Human Factors focuses on people, their behaviour, stress and emotions, in order to adapt technical systems optimally to human needs.

In the Human Factors Lab, innovative measurements can be carried out to analyse human attention, stress, emotion and behaviour.

Infrastructure and Measuring Methods

In the Human Factors Lab, the following measurement instruments and platforms are used to develop new measurement technologies, to simulate human-machine interactions, and to build intelligent, multi-modal interfaces for human-centred assistance systems:

Eye-tracking glasses

Eye-tracking glasses are used to conduct attention studies in the application areas of marketing (attention in business areas), usability (mobile devices, guidance systems), human-machine interfaces (control centre, flight simulator), mobility (drivers, driver assistance, driver safety, etc.) and performance assessment (emergency services, sports).

Gaze Analytics

Stationary eye-tracking measurement techniques can be used to conduct precise on-screen studies in the areas of market research (product and shelf research), evaluation of guidance systems (signage), and video analysis.
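To illustrate the kind of processing behind such gaze studies, the sketch below implements a simple dispersion-threshold (I-DT) pass that segments raw gaze samples into fixations. The synthetic 60 Hz trace and the dispersion and duration thresholds are illustrative assumptions, not the lab's actual parameters or tooling.

```python
# Minimal dispersion-threshold (I-DT) fixation detection on gaze samples.
# All thresholds and the synthetic data below are assumptions for
# demonstration only.
import random


def dispersion(window):
    """Dispersion of a gaze window: x-range plus y-range."""
    xs = [s[1] for s in window]
    ys = [s[2] for s in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))


def detect_fixations(samples, max_dispersion=1.0, min_duration=0.1):
    """samples: sorted (t, x, y) tuples -> list of (t_start, t_end, cx, cy)."""
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        # Open a window spanning at least min_duration seconds.
        j = i
        while j < n and samples[j][0] - samples[i][0] < min_duration:
            j += 1
        if j >= n:
            break
        if dispersion(samples[i:j + 1]) <= max_dispersion:
            # Grow the window while the gaze stays within the threshold.
            while j + 1 < n and dispersion(samples[i:j + 2]) <= max_dispersion:
                j += 1
            window = samples[i:j + 1]
            cx = sum(s[1] for s in window) / len(window)
            cy = sum(s[2] for s in window) / len(window)
            fixations.append((samples[i][0], samples[j][0], cx, cy))
            i = j + 1
        else:
            i += 1  # No fixation starts here; slide the window forward.
    return fixations


# Synthetic 60 Hz trace: a fixation near (10, 10), a short saccade,
# then a fixation near (20, 15), each with a little noise.
random.seed(0)
samples, t = [], 0.0
for _ in range(20):
    samples.append((t, 10 + random.uniform(-0.2, 0.2),
                    10 + random.uniform(-0.2, 0.2)))
    t += 1 / 60
for step in range(1, 5):  # saccade: (12, 11) ... (18, 14)
    samples.append((t, 10 + 2 * step, 10 + step))
    t += 1 / 60
for _ in range(20):
    samples.append((t, 20 + random.uniform(-0.2, 0.2),
                    15 + random.uniform(-0.2, 0.2)))
    t += 1 / 60

fixations = detect_fixations(samples)  # two fixations expected
```

Real pipelines layer calibration, blink handling and area-of-interest mapping on top of this step, but the dispersion/duration trade-off shown here is the core of dispersion-based fixation detection.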

Biosensors

Digital, precise and portable/wearable biosensors can be used to objectively analyse physical and cognitive stress scenarios, especially in combination with eye-tracking analysis.
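Quantities typically derived from the ECG channel of such biosensors include heart rate and heart-rate variability. A minimal sketch, assuming R-peaks have already been detected in the raw signal (the peak times below are invented for illustration):

```python
# Toy derivation of mean heart rate and SDNN (a standard heart-rate
# variability index, here as population standard deviation) from ECG
# R-peak timestamps. The peak times are illustrative assumptions; a real
# pipeline would first detect R-peaks in the raw ECG channel.

def hr_and_sdnn(r_peaks):
    """r_peaks: sorted R-peak times in seconds -> (mean HR in bpm, SDNN in ms)."""
    ibis = [b - a for a, b in zip(r_peaks, r_peaks[1:])]  # inter-beat intervals
    mean_ibi = sum(ibis) / len(ibis)
    hr_bpm = 60.0 / mean_ibi
    variance = sum((x - mean_ibi) ** 2 for x in ibis) / len(ibis)
    return hr_bpm, 1000.0 * variance ** 0.5


peaks = [0.00, 0.80, 1.62, 2.40, 3.22, 4.00]  # ~75 bpm with mild variability
hr, sdnn = hr_and_sdnn(peaks)
```

Lower variability under sustained load is one of the cues such stress analyses look for, which is why the interval statistics, not just the rate, are reported.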

Emotion Analytics

For emotion analytics, biosensors (analysis of facial muscles) and video analysis (Affectiva system) are used.




The infrastructure for human factors measurements includes in detail:

  • Portable eye-tracking glasses: (i) Pupil Labs Invisible: eye cameras at 200 Hz and 192 x 192 px with IR illumination, scene camera at 30 Hz and 1088 x 1080 px with a 90° x 90° field of view, real-time streaming of gaze data with video at 55 fps, microphone, accelerometer and gyroscope; (ii) SMI ETG1 and ETG2: 30-60 Hz, parallax compensation, HD 1280 x 960 px scene video with gaze annotation, positioning accuracy of 0.5-1°, field of view 70° horizontal x 55° vertical.
  • Stationary precision eye tracking (RED 500): 500 Hz binocular, spatial resolution 0.03°.
  • BeGaze eye-tracking analysis software, MPEG-4 scene video with gaze annotation.
  • Psychophysiological measurements with BIOPAC portable biosensor technology (EDA, ECG, PPG, EMG, respiration), emotion measurements (Facial EMG) for real-time measurement in the lab and off-line measurements for up to 3 persons indoors/outdoors.
  • Cyberith Virtualizer Elite 2 for mobility research in virtual reality environments.
  • 2 socially assistive robots (SoftBank Robotics, Pepper) for measuring social interactions.
  • Virtual-reality-based measurements with eye tracking (HTC Vive Pro Eye, Oculus); software for VR-based estimation of cognitive state and mindfulness score.
  • Augmented reality (AR) with eye tracking: 2 Microsoft HoloLens 2, HoloLens with Pupil Labs eye tracking; AR-HMD-based eye-tracking measurements of concentration, stress and cognitive multi-tasking load.
  • 10 OptiTrack IR cameras with high precision position tracking and inlet eye tracking device.
  • iMotions multi-sensor time synchronisation software.
  • Emotion classification software (iMotions Affectiva) based on video-supported classification of facial emotions.
  • Eye tracking and analysis with Tablet PCs: Mobile Eye Tracking Hardware for Tablet PCs (Microsoft Surface); web camera based software WebGazer for Android Tablets.
  • “Human Factors Analysis Multi-Sensor Measuring System” for precise time-synchronous multi-channel analysis of psychophysiological measurements and simultaneous measurement of eye movements.
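As an illustration of what time-synchronous multi-channel analysis involves at its core, the following sketch aligns a slowly sampled channel onto the time grid of a faster one by linear interpolation. The two streams are illustrative stand-ins, not real sensor data or the measuring system's actual method.

```python
# Sketch of the core operation behind time-synchronous multi-channel
# analysis: resampling a stream onto a common time grid by linear
# interpolation. The channels below are illustrative stand-ins.

def resample(times, values, grid):
    """Linearly interpolate the series (times, values) onto grid.

    times and grid must be sorted ascending; grid points outside the
    recorded range are clamped to the first/last value.
    """
    out, j = [], 0
    for t in grid:
        while j + 1 < len(times) and times[j + 1] < t:
            j += 1
        if t <= times[0]:
            out.append(values[0])
        elif t >= times[-1]:
            out.append(values[-1])
        else:
            t0, t1 = times[j], times[j + 1]
            w = (t - t0) / (t1 - t0)
            out.append(values[j] * (1 - w) + values[j + 1] * w)
    return out


# A slow "biosensor" channel at 2 Hz, aligned onto the 10 Hz time grid
# of a faster eye-tracking channel.
slow_t = [0.0, 0.5, 1.0, 1.5, 2.0]
slow_v = [0.0, 1.0, 2.0, 3.0, 4.0]
grid = [0.1 * k for k in range(21)]  # 0.0 .. 2.0 s in 0.1 s steps
aligned = resample(slow_t, slow_v, grid)
```

Once every channel shares one timebase, per-sample comparisons (e.g. gaze position against electrodermal activity at the same instant) become straightforward; commercial tools such as iMotions perform this alignment, plus clock-drift correction, across many devices at once.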

Research activities and objectives

The Human Factors Lab is particularly active in the development of assistance systems in the AAL and Digital Care research area. One focus is on "ICT and Dementia", with the development of gaze-based dementia training within the MIRA component (Mobile Instrumental Review of Attention) as well as VR-supported relaxation for expanding cognitive reserve.

Reference Projects and Products

  • multimodAAL - Playful multimodal intervention, monitoring and decision support for activation of people with Alzheimer’s dementia
  • PLAYTIME - Playful Multimodal Daily Training, Diagnostics and Recommendation System within a Social Network
  • AMIGO - Analysis and motivation of training activities for persons with dementia by social robotics with dialogue-based coaching
  • CollRob - Collaborative Robotics
  • MMASSIST - Assistance Systems in Production in the Context of Man-Machine Cooperation
  • SIXTHSENSE - Wearable Biosensors and Decision Support for Risk Monitoring of First Responders


digitAAL Life is our project partner and offers digital solutions for health and care.

Explorative studies and analytics are conducted using state-of-the-art methods of international research:


  • Cooperation with University of Graz, Institute for Psychology (psychophysiological measurements, stress analysis).
  • Cooperation with Medical University Graz, Institute for Nursing Sciences (assistance systems for people with dementia, nursing staff).
  • Cooperation with University of Augsburg, Chair of Embedded Intelligence for Health Care and Wellbeing (classification of audio data).

For further information please contact