In theory, controlling a flying drone should not be much different from operating a helicopter, but without a person in the cockpit it is difficult to visualize the drone's current position relative to obstacles in its environment, especially when relying on two-dimensional camera feeds and joysticks.
Researchers Chuhao Liu and Shaojie Shen of the Hong Kong University of Science and Technology this week revealed an intriguing new way of generating live 3D maps that drone pilots can view over a flat surface and use to easily trace out flight goals.
On the display side, a Microsoft HoloLens headset renders augmented-reality content, a colorful voxel map that can be viewed from any perspective, built from the autonomous drone's onboard depth cameras and real-time localization data. The system provides a vivid, highly spatial sense of the environment's elevation and depth, allowing the drone to be easily observed and re-positioned from a third-person perspective.
The HoloLens then feeds commands back to the drone, translating the operator's hand gestures and gaze into point-and-click controls that set the drone's next target within the holographic map. The autonomous drone then flies to the new location and updates the 3D map along the way.
At least on the holography side, a demo video provided by the researchers looks straight out of a science-fiction film. Due to bandwidth constraints, the drone streams only 3D map data to the AR interface, without accompanying first-person video.
The HKUST team still has a way to go before the holographic drone-control system is ready to deploy. Initially, the drone's data was shared over Wi-Fi in an indoor test space, but 5G cellular connections will likely handle outdoor operation once 5G networks progress beyond their currently limited deployment stage. The researchers also noted that the HoloLens' "very limited field of view in AR" drew frequent complaints from a group of device testers. Furthermore, despite their familiarity with AR hardware, testers needed practice before they could issue gesture commands reliably, and imperfect gestures could send the wrong signals in three-dimensional space.
It should be noted that 3D mapping data is far more bandwidth-efficient than first-person live video: streaming map updates 10 times per second required only 272 MB of data, compared with 1.39 GB for streaming first-person video at 30 frames per second. The team wants to include both types of streams for the user's benefit while optimizing the data to minimize network bandwidth.
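To put those two figures side by side, a quick back-of-the-envelope calculation shows roughly how much heavier the video stream is than the map stream. This is only a sketch based on the numbers quoted above; it assumes both figures cover the same flight duration and uses a decimal (1 GB = 1000 MB) conversion:

```python
# Data volumes quoted by the researchers for the same (unspecified) duration.
map_stream_mb = 272            # 3D map updates at 10 Hz
video_stream_mb = 1.39 * 1000  # first-person video at 30 fps, converted to MB

# How many times larger the video stream is than the 3D map stream.
ratio = video_stream_mb / map_stream_mb
print(f"First-person video uses ~{ratio:.1f}x more data than the 3D map stream")
```

Under these assumptions, the 3D map stream needs only about a fifth of the bandwidth of the first-person video feed, which explains why the researchers chose to transmit the map alone.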
Despite these problems, the holographic AR system holds a lot of potential. Beyond the visual novelty of the interface, a portable standalone AR headset is extremely useful for controlling a remote vehicle, replacing the need for an integrated computer, monitor, and joystick. The researchers will formally present their "first steps" in combining AR with autonomous drones at the International Conference on Intelligent Robots and Systems, scheduled for October 25-29, 2020.