Let’s face it – we are now used to seeing drones everywhere, from parks and mountains to events and film sets. Not so long ago, however, drones were a novelty. And even though most of them are still manually operated, they increasingly come with functionality based on machine learning and computer vision.
These capabilities have emerged over the past few years, opening up new applications for UAVs and expanding their potential. One of the areas where this potential can be put to work is UAV filming – an industry that still faces several challenges.
The Need for Improvement in the Field of UAV Cinematography
To put it simply, autonomous UAV shooting today operates under many constraints. Among the most important is the interplay between the UAV’s motion type and the camera’s focal length: each motion type limits the maximum usable focal length (and therefore the range of feasible shot types), because pushing the zoom too far causes visual target tracking to fail.
A new paper by four authors at the Department of Informatics at the Aristotle University of Thessaloniki in Greece opens up a lot of possibilities for UAVs, or drones, equipped with professional cameras. Not only have drones like these become an essential tool for cinematographers over the past few years, they have also been adopted in TV and movie production, newsgathering, advertising and outdoor event coverage – mostly because of their ability to capture shots that would otherwise incur higher production costs (using helicopters or cranes).
This cost advantage is what UAV operators enjoy today. As drones become more autonomous, however, that growing automation brings with it a new set of features powered by machine learning and computer vision modules.
Improving the Visual Target Tracking Through Five New (Remodelled) Camera Motion Types
The aim of the paper is to improve the shooting side of UAV operation – specifically, the visual target tracking that UAVs need in order to film autonomously. In cinematography, the shot type defines the percentage of the video frame covered by the target (or subject) being filmed, and it is mainly adjusted via the zoom level, i.e. the camera focal length. The catch is that zooming in too far can break visual target tracking, so the technology cannot always deliver both the desired framing and reliable tracking.
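To make the link between focal length and shot type concrete, here is a minimal sketch using the standard pinhole projection model. The function name, parameter names and the default full-frame sensor height are illustrative assumptions, not values taken from the paper:

```python
def frame_coverage(focal_length_mm, target_height_m, distance_m,
                   sensor_height_mm=24.0):
    """Approximate fraction of the frame height covered by a target,
    using the pinhole projection model.

    The 24 mm default (full-frame sensor height) and all names here
    are illustrative assumptions, not the paper's notation.
    """
    # Height of the target's projection on the sensor, in mm.
    projected_mm = focal_length_mm * (target_height_m / distance_m)
    # Fraction of the sensor (and thus of the video frame) covered,
    # capped at 1.0 once the target fills the frame.
    return min(projected_mm / sensor_height_mm, 1.0)

# A 1.8 m subject filmed from 30 m: doubling the focal length
# doubles the coverage, i.e. produces a "tighter" shot type.
print(frame_coverage(50.0, 1.8, 30.0))
print(frame_coverage(100.0, 1.8, 30.0))
```

This is why the zoom level and the shot type are two sides of the same coin: at a fixed tracking distance, choosing a shot type amounts to choosing a focal length.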
As the authors outline, the paper models five target-tracking UAV/camera motion types – ones that can refine UAV cinematography techniques as we know them:
- Lateral Tracking Shot (LTS)
- Vertical Tracking Shot (VTS)
- Fly-Over (FLYOVER)
- Fly-By (FLYBY)
- Chase/Follow Shot (CHASE)
Modelling these camera motion types enables better subject tracking in flight. More importantly, it makes the focal length constraints of each motion type explicit: for the five industry-standard target-tracking UAV/camera motion types, the authors derive the maximum focal length constraints for computer-vision-assisted physical target following.
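As a toy illustration of how such constraints could be used as low-level rules in a shooting planner, the sketch below inverts the pinhole coverage relation to get a maximum focal length per shot type. The shot-type names, coverage bands and all thresholds are assumptions made for this example; the paper derives its actual formulas per motion type:

```python
# Illustrative shot-type coverage bands (fraction of frame height
# covered by the target); the thresholds are assumptions for this
# sketch, not values from the paper.
SHOT_TYPES = {
    "LONG": (0.05, 0.30),
    "MEDIUM": (0.30, 0.60),
    "CLOSE_UP": (0.60, 1.00),
}

def max_focal_length_mm(shot_type, target_height_m, distance_m,
                        sensor_height_mm=24.0):
    """Largest focal length that keeps the target inside the shot
    type's coverage band at the given tracking distance, obtained by
    inverting coverage = f * (target_height / distance) / sensor_height."""
    _, max_coverage = SHOT_TYPES[shot_type]
    return max_coverage * sensor_height_mm * distance_m / target_height_m

def feasible_shot_types(focal_length_mm, target_height_m, distance_m,
                        sensor_height_mm=24.0):
    """Shot types whose coverage band contains the current coverage."""
    coverage = focal_length_mm * target_height_m / (distance_m * sensor_height_mm)
    return [name for name, (lo, hi) in SHOT_TYPES.items() if lo <= coverage < hi]
```

A planner could then check, before commanding a zoom, whether the requested shot type is feasible for the current motion type and tracking distance – exactly the kind of low-level rule the paper has in mind.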
In numerous cinematography applications this is essential, since the maximum focal length determines the range of permissible shot types. As the paper concludes, “the derived formulas can be readily employed as low-level rules in the intelligent UAV shooting and cinematography planning systems.”
Citation: “Bottle Detection in the Wild Using Low-Altitude Unmanned Aerial Vehicles,” J. Wang, W. Guo, T. Pan, H. Yu, L. Duan and W. Yang, 2018 21st International Conference on Information Fusion (FUSION), Cambridge, United Kingdom, 2018, pp. 439-444. doi: 10.23919/ICIF.2018.8455565 | URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=8455565&isnumber=8454975