Complex environments are nothing new to unmanned aerial vehicles (UAVs), commonly known as drones. Drones are already tackling challenging environments and monitoring areas all around the world. From agriculture to surveillance, mapping, search and rescue, and aerial photogrammetry, the use of UAVs in highly dynamic environments has increased drastically.
A major challenge, however, is the continuity of UAV operation when GPS is degraded or denied, such as when flying near tall buildings and trees or moving from outdoors to indoors. This is the main subject of a new paper in which authors Thanabadee Bulunseechart and Pruittikorn Smithmaitrie present an algorithm for 3D localization during transitions between indoor and outdoor environments. The paper, titled “A method for UAV multi-sensor fusion 3D-localization under degraded or denied GPS situation”, is published in the Journal of Unmanned Vehicle Systems.
As the authors state in the introduction of the paper:
“The main goal of this work is to develop an algorithm for 3D UAV localization between indoor–outdoor transition environment similar to works presented by Nyholm (2015) and Shen et al. (2014). However, proprioceptive and exteroceptive methods are adopted to apply to low-cost sensors.”
The anticipated benefit of their work, as they say, is to expand the frontier of semi-autonomous UAV operations to more applications involving degraded GPS environments. One of them is search and rescue, in which a semi-autonomous UAV surveys a variety of areas including indoor spaces, high-rise buildings, and open terrain.
The paper first presents multi-sensor fusion for estimating 3D position, velocity, and attitude, and then separately discusses a method for quantifying GPS measurement quality, a pre-scale process for SLAM, and the state estimation technique for UAV positioning.
As the authors describe it, multi-sensor fusion (MSF) is “a signal processing technique to combine information from different sensors with an aim to provide robust and complete navigation information for UAV localization.”
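To make the idea concrete, here is a minimal sketch of combining two noisy sensor readings of the same quantity by inverse-variance weighting, a standard building block behind sensor fusion. This is an illustrative toy, not the authors' filter; the example readings and variances are invented.

```python
import numpy as np

def fuse(estimates, variances):
    """Inverse-variance weighted fusion of independent estimates of the
    same quantity (e.g., altitude from a barometer and from GPS).
    Lower-variance (more trusted) sensors get more weight."""
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = 1.0 / variances
    fused = np.sum(weights * estimates) / np.sum(weights)
    fused_var = 1.0 / np.sum(weights)  # fused estimate is less uncertain than either input
    return fused, fused_var

# Hypothetical example: barometer reads 10.2 m (variance 0.5),
# GPS altitude reads 9.8 m (variance 2.0)
alt, var = fuse([10.2, 9.8], [0.5, 2.0])
```

The fused altitude lands closer to the barometer reading because the barometer is the more certain sensor, and the fused variance is smaller than either input variance.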
Within the scope of the IEKF, the authors fuse measurements from onboard sensors such as the barometer, and also focus on optical flow measurement to cover the terrain measurement models from different perspectives.
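As a rough illustration of how optical flow relates to motion over terrain: for a downward-facing camera over approximately flat ground, angular optical flow times height above ground gives translational ground speed. This toy relation is an assumption for illustration only; real pipelines (including the paper's) also compensate for body rotation and lens geometry.

```python
def groundspeed_from_flow(flow_rad_per_s, altitude_m):
    """Toy model: for a nadir-pointing camera over flat terrain,
    ground speed (m/s) = angular optical flow (rad/s) * height (m).
    Ignores body rotation rates, which must be subtracted in practice."""
    return flow_rad_per_s * altitude_m
```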
Another part of their measurement model is the GPS quality indicator, which, as they say, should be “taken into account only when its variance is less than a specific value.”
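The gating idea can be sketched as follows: accept GPS fixes only while recent readings are steady, and reject them when they scatter (as happens near buildings). This is a simplified stand-in for the paper's quality indicator; the function name and threshold are hypothetical.

```python
import statistics

def gps_is_usable(position_samples, variance_threshold):
    """Hypothetical GPS quality gate: use the GPS fix only when the
    sample variance of recent position readings (here 1D, in metres)
    is below a threshold; otherwise fall back to other sensors."""
    if len(position_samples) < 2:
        return False  # not enough history to judge quality
    return statistics.variance(position_samples) < variance_threshold

# Steady open-sky readings pass; jumpy readings near a building fail.
open_sky = gps_is_usable([10.0, 10.1, 9.9, 10.05], 0.5)
near_building = gps_is_usable([10.0, 14.0, 6.0, 12.0], 0.5)
```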
Next, the authors describe the pre-scale process for vision measurements and the seamless-transition position measurement, in which they check the consistency between the new position sensor measurement and the current estimated position, so that it is properly handled before being fed into the MSF as a measurement for the next state estimation.
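A minimal sketch of what such a pre-scale and handover step might look like, under two assumptions not taken from the paper: that the vision track comes from monocular odometry (known only up to scale), and that a metric reference is available to recover that scale. The function and its numbers are hypothetical.

```python
import numpy as np

def prescale_and_align(vo_positions, metric_positions, current_estimate):
    """Hypothetical pre-scale step for a monocular vision track.
    1) Estimate the unknown scale by least-squares fit of vision
       displacements against metric reference displacements.
    2) Shift the scaled track so its latest point matches the filter's
       current position estimate, avoiding a jump at the handover."""
    vo = np.asarray(vo_positions, dtype=float)
    ref = np.asarray(metric_positions, dtype=float)
    d_vo = np.diff(vo, axis=0).ravel()
    d_ref = np.diff(ref, axis=0).ravel()
    scale = float(d_vo @ d_ref) / float(d_vo @ d_vo)  # least-squares scale
    scaled = vo * scale
    offset = np.asarray(current_estimate, dtype=float) - scaled[-1]
    return scaled + offset, scale

# Hypothetical 2D example: vision track at half the true scale,
# filter currently estimates the UAV at (5, 1).
aligned, s = prescale_and_align([[0, 0], [1, 0], [2, 0]],
                                [[0, 0], [2, 0], [4, 0]],
                                [5.0, 1.0])
```

After this step the vision measurement agrees with the current estimate at the handover instant, so feeding it into the fusion does not introduce a position jump.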
In the implementation and setup phases, the authors focus on the UAV platform, the real-time onboard software, control systems and determination of system parameters. As they state in their experimental results:
“Because this work is to design a smooth navigation system during transition between outdoor and indoor flight, the test emphasizes the UAV state in the transition period. To show the algorithm performance, the UAV is tested over a large open space and the result obtained from the proposed algorithms is compared with the conventional GPS-based cut-off algorithm. The test field encompasses an eight-storey building and another 10 m tall building to ascertain that GPS signals would diminish when the UAV is close to the buildings.”
In the end, the authors conclude that their work proposes a multi-sensor fusion (MSF) algorithm, a GPS quality indicator for the indoor-outdoor transition, and pre-scale vision handling, and they note that the MSF could be further improved by adding more sensor measurements to assist UAV state estimation in complex tasks such as avoiding other moving objects.
Citation: A method for UAV multi-sensor fusion 3D-localization under degraded or denied GPS situation, Thanabadee Bulunseechart, Pruittikorn Smithmaitrie, Journal of Unmanned Vehicle Systems, 2018, 6:155-176, https://doi.org/10.1139/juvs-2018-0007