The widespread use of aerial robots in inspection, search and rescue, and monitoring has created a growing need for intuitive human-drone interfaces. Such interfaces aim to streamline user interaction and collaboration during drone navigation, expediting mission success while accommodating user input. In this paper, we present a novel human-drone mixed reality interface that aims to (a) increase human-drone spatial awareness by sharing relevant spatial information and representations between the human, equipped with a Head Mounted Display (HMD), and the robot, and (b) enable safer and more intuitive human-drone interactive and collaborative navigation in unknown environments, going beyond the simple command-and-control or teleoperation paradigm. Our framework is validated through extensive user studies and experiments conducted in simulated post-disaster scenarios, with performance compared against traditional First-Person View (FPV) control systems. Results across multiple users underscore the advantages of the proposed solution, which offers intuitive and natural interaction with the system and demonstrates its ability to assist humans during a drone navigation mission, ensuring safe and effective execution.
Intuitive Human-Drone Collaborative Navigation in Unknown Environments Through Mixed Reality
14 May 2025
3,504,689 bytes
Conference paper
Electronic resource
English
Intuitive Human-Drone Collaborative Navigation in Unknown Environments through Mixed Reality
ArXiv | 2025
Development of a Virtual Reality Environment (VRE) for Intuitive Drone Operations
SAE Technical Papers | 2017
Development of a Virtual Reality Environment (VRE) for Intuitive Drone Operations
British Library Conference Proceedings | 2017
Human-inspired Robot Navigation in Unknown Dynamic Environments
British Library Conference Proceedings | 2016