
RNN-based Human Pose Prediction for Human-Robot Interaction

Publication from Robotics – Industrie-Robotersystem-Technologien (Industrial Robot System Technologies)

Chris Torkar, Saeed Yahyanejad, Horst Pichler, Michael Hofbaur, Bernhard Rinner

Proceedings of the Joint ARW & OAGM Workshop, pp. 76-80, May 2019

Abstract:

In human-robot collaborative scenarios, human workers operate alongside and with robots to jointly perform allocated tasks within a shared work environment. One of the basic requirements in these scenarios is to ensure safety, which can be significantly improved when the robot is able to predict and prevent potential hazards such as imminent collisions. In this paper, we apply a recurrent neural network (RNN) to model and learn human joint positions and movements in order to predict their future trajectories. Existing human motion prediction techniques have previously been explored in a pseudo scenario to predict human motions during task execution. Building upon this work, we examined their applicability to our own recorded dataset, which represents a more industrially oriented scenario. We used one second of motion data to predict one second ahead. To improve performance, we modified the existing architecture by introducing a different output layer, as opposed to the structures commonly used in recurrent neural networks. Finally, we evaluated the performance of the artificial neural network in terms of absolute positional errors. Using our method, we were able to predict joint motion over a one-second horizon with a mean error of less than 10 cm.
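
For illustration only, the sketch below shows one way such an RNN-based predictor could be structured in PyTorch. The frame rate, joint count, hidden size, and the plain linear read-out are assumptions made for the sketch; in particular, the paper's modified output layer is not reproduced here.

# Minimal sketch (not the authors' code): a GRU encoder maps one second of
# observed joint positions to a one-second-ahead prediction.
import torch
import torch.nn as nn

FPS = 30                 # assumed capture rate (frames per second)
N_JOINTS = 15            # assumed number of tracked joints
IN_FRAMES = FPS          # one second of observed motion
OUT_FRAMES = FPS         # one second to predict
FEAT = N_JOINTS * 3      # x, y, z coordinates per joint

class PosePredictor(nn.Module):
    def __init__(self, hidden=256):
        super().__init__()
        self.encoder = nn.GRU(FEAT, hidden, batch_first=True)
        # Plain linear read-out of all future frames at once; the paper's
        # specific output-layer modification is not reproduced here.
        self.readout = nn.Linear(hidden, OUT_FRAMES * FEAT)

    def forward(self, x):
        # x: (batch, IN_FRAMES, FEAT) -- observed joint positions
        _, h = self.encoder(x)               # final hidden state
        y = self.readout(h[-1])              # (batch, OUT_FRAMES * FEAT)
        return y.view(-1, OUT_FRAMES, FEAT)  # predicted future positions

# Training would minimise positional error, e.g. mean squared error
# between predicted and recorded future joint positions:
model = PosePredictor()
x = torch.randn(8, IN_FRAMES, FEAT)          # dummy observed batch
y_true = torch.randn(8, OUT_FRAMES, FEAT)    # dummy ground truth
loss = nn.functional.mse_loss(model(x), y_true)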

 
