Current manufacturing facilities are already reaching the limits of the throughput required for their products, driven by growing product diversity and short-term customer demands. Despite a high level of automation, production chains lack the flexibility to adapt to rapidly changing demand. To counteract this deficiency, partial processes are implemented as islands whose spatial arrangement and concatenation within the overall process can be changed dynamically. Because production sequences and locations change constantly, transport between the islands and their machines is difficult to automate. FlexIFF introduces intralogistics task teams consisting of people, mobile robots, and mobile manipulators. These cyber-physical systems handle, in a coordinated manner, the transport steps necessary to execute the production plan. Human team members as well as operators keep an overview through assistance systems, e.g., augmented reality (AR)-supported interaction methods, and can intervene to provide support, solve new problems, and optimize processes.
The project is coordinated by JOANNEUM RESEARCH ROBOTICS; the project leader is Dr Horst Pichler.
EFFECTIVELY SUPPORTING WORKERS IN HUMAN-ROBOT COLLABORATION
The main objective of this work package is the research and development of an intelligent interface framework for the intuitive and efficient coordination of teamwork in a collaborative team of human and cyber-physical workers, such as mobile manipulators, using augmented reality devices and human-factors technologies, in the industrial intra-logistics environment of a highly flexible, automated manufacturing shop floor. Main project development activities:
- Creation of mobile interfaces using innovative augmented reality (AR) devices for close collaboration and mixed reality (MR) head-mounted devices (HMD) for remote assistive interaction
- Evaluation of optimized information ergonomics for collaborative human-robot tasks
- Research on the optimization of interaction driven by innovative real-time sensing of human factors such as situation awareness, stress, conflict detection, and cognitive workload
- Intuitive interfaces using gaze, gesture, visual, and speech communication, aimed at efficiently reducing the complexity of joint operations involving humans and autonomous robots
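To illustrate the last activity, the sketch below shows one way gaze and speech input could be fused into a single robot command: a gaze fixation selects the target object and a spoken verb selects the action, and the two are combined only if they occur close together in time. All names, types, and the time-window rule are illustrative assumptions, not the project's actual interface design.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical input events: a gaze fixation selects a target object,
# a speech command selects the action to perform on it.
@dataclass
class GazeFixation:
    target_id: str      # object the worker is looking at
    timestamp: float    # seconds

@dataclass
class SpeechCommand:
    verb: str           # e.g. "pick", "deliver"
    timestamp: float    # seconds

def fuse(gaze: GazeFixation, speech: SpeechCommand,
         max_skew: float = 2.0) -> Optional[dict]:
    """Fuse a gaze fixation and a speech command into one robot task,
    provided both events occurred within max_skew seconds of each other."""
    if abs(gaze.timestamp - speech.timestamp) > max_skew:
        return None  # modalities too far apart to express the same intent
    return {"action": speech.verb, "target": gaze.target_id}

# Worker looks at "pallet_7" and says "pick" shortly afterwards:
task = fuse(GazeFixation("pallet_7", 10.1), SpeechCommand("pick", 10.8))
```

The point of such fusion is complexity reduction: neither modality alone specifies a complete command, so the worker never has to spell out both the action and the target explicitly.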
Intelligent user interfaces are intended to contribute significantly to safety, interaction ergonomics, and performance optimization within global, human-centered automation processes, at workplaces where people work in collaborative teams side by side with autonomous robots. The latest AR/VR technologies, e.g., Microsoft HoloLens, Meta One, Atheer, HTC Vive, and FOVE, will be investigated and evaluated for their potential to improve human-robot collaboration. The goals of this work package are:
- Model collaborative interaction in flexible intra-logistics environments (D5.1)
- Assess and select advanced devices and knowledge infrastructure for MR interaction (D5.2)
- Implement gaze-based situation awareness, information and interaction ergonomics in an intra-logistics context (D5.3)
- Efficiently realize assistance for the resolution of mixed-team conflicts and enable the transfer of learning from human to robot operators (D5.4 and D5.5)
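As a minimal sketch of what mixed-team conflict assistance could look like (all names and the first-claimant rule are hypothetical, not the project's actual resolution strategy): when several team members, human or robotic, claim the same transport task, a resolver assigns the task and flags the contested cases for operator review.

```python
# Hypothetical mixed-team conflict detection: claims maps each task to
# the list of agents (humans or robots) that have claimed it.
def resolve_claims(claims: dict) -> tuple:
    """Assign each task to its first claimant; collect contested tasks
    so an operator (e.g. via an AR assistance system) can review them."""
    assignment, conflicts = {}, []
    for task, agents in claims.items():
        assignment[task] = agents[0]      # simple illustrative rule
        if len(agents) > 1:
            conflicts.append((task, agents))
    return assignment, conflicts

assignment, conflicts = resolve_claims({
    "transport_42": ["robot_A", "worker_1"],  # contested task
    "transport_43": ["robot_B"],
})
```

In the spirit of D5.5, the conflicts list could also serve as training data: the operator's manual resolutions of contested tasks would be examples from which robot team members learn future assignments.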