Researchers at Simon Fraser University’s Autonomy Lab have used artificial intelligence to allow drones to respond to the facial expressions and hand gestures of humans.
For first responders, that could mean saving precious time – and lives.
A case in point: earlier this year, two teenagers caught in rough surf off a beach on Australia's east coast were rescued after observers spotted them waving their arms and a surf lifesaving drone dropped them a life raft.
Had the observers not seen the boys, they could have drowned.
Computing science professor Richard Vaughan, who leads the research, notes that drones able to autonomously recognise human gestures would also be useful in situations where no pilot is available to fly them.
“Most commercial drones today come with controllers which work really well, but sometimes you may find yourself in a situation where your hands are busy,” he says. “Or maybe you weren’t expecting to interact with a drone today so you don’t have special equipment with you.”
Because drones can be deployed very quickly, they make ideal tools for assistance in many emergency situations, such as fire fighting and rescue missions.
The use of drones can reduce a task that used to take hours and hundreds of people – such as a search and rescue mission – to minutes.
A rescuer whose hands are otherwise occupied, for instance, could benefit from being able to guide a drone without a traditional remote control or joystick.
To better adapt drone technology to such tasks, Vaughan’s team is creating a drone that can be directed to turn midflight – or even flip upside down – simply by the pilot waving their arms.
Speaking in a video released by the university, Vaughan explains that they want the drones to understand our intentions in ‘a natural and intuitive way’.
They are also developing a drone that can recognise a facial expression – what the researchers refer to as a ‘trigger face’. When the drone ‘sees’ the facial expression, it responds accordingly by performing an action such as taking a photo.
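The idea behind a ‘trigger face’ can be sketched in a few lines of code. The names, expressions and confidence threshold below are illustrative assumptions, not the Autonomy Lab's actual implementation: a classifier labels the face it sees, and the drone only acts when the classification is confident enough.

```python
# Hypothetical trigger-face logic (not the Autonomy Lab's code):
# map a classified facial expression to a drone action.
TRIGGER_ACTIONS = {
    "smile": "take_photo",
    "surprise": "start_video",
}

def action_for_expression(expression, confidence, threshold=0.8):
    """Return the drone action for a classified expression.

    When the classifier's confidence is below the threshold, or the
    expression is not a known trigger, return None and keep hovering.
    """
    if confidence < threshold:
        return None
    return TRIGGER_ACTIONS.get(expression)

# A confident smile triggers a photo; a hesitant one does not.
print(action_for_expression("smile", 0.95))  # take_photo
print(action_for_expression("smile", 0.50))  # None
```

In a real system, the expression label and confidence would come from a face-analysis model running on each camera frame; the point of the sketch is only the mapping from a recognised ‘trigger face’ to an action.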
Eventually, Vaughan hopes that we will be able to work with drones and other robots as if they were our work colleagues.
“Our work is to try and make robots more capable and able to look after themselves and survive in interesting, complex environments and work around people,” he continues.
“We would like to get to the point where interacting with a robot is as easy as working with a co-worker or a trained animal,” says Vaughan.