Sure, the market for unmanned aerial vehicles (UAVs), more commonly known as drones, is booming. Revenues should reach $6 billion this year, an increase of 34 percent, and climb to $11.2 billion by 2020, according to Gartner. And the number of vehicles produced in 2017 will grow by 39 percent, to nearly 3 million units, potentially used for everything from evaluating forest fire risk to monitoring traffic conditions.
But, according to many technology experts, for drones to become ubiquitous they need a sophisticated, fool-proof ability both to sense items in their way and to avoid crashing into them. Specifically, they have to know how to avoid imminent collisions with other airborne vehicles (collision avoidance), as well as how to steer clear of branches, walls and other obstacles before a collision is close at hand.
“We need UAVs to make real-time navigational decisions on their own,” says Alexander Harmsen, co-founder of Iris Automation, a startup with a visual sensing system for drones.
Now, more researchers are trying to develop increasingly advanced collision and conflict avoidance systems, addressing everything from sensor limitations to unpredictable obstacles.
Small sensors and an unpredictable nature
Some of the biggest challenges for researchers include the limited sensors on small drones flying at low altitudes, doing anything from package delivery to crop monitoring. Another issue is the unpredictable nature of other aircraft.
“The other vehicle you’re trying to avoid could, say, suddenly turn left and the drone needs to be able to react,” says Mykel Kochenderfer, assistant professor of aeronautics and astronautics at Stanford University.
With that in mind, researchers are experimenting with a variety of computer-based solutions to these problems. One emerging area of work involves the "partially observable Markov decision process," or POMDP, according to Kochenderfer. That's a mathematical formulation for situations where there's uncertainty about the trajectories of vehicles with complex and potentially competing objectives. Solving it produces a decision-making strategy the computer can use to control the vehicle.
“The computer decides exactly when to turn left or right or when to not alert anyone at all,” says Kochenderfer. “So the computer, not a human operator, figures it out.” A potentially competing objective might include, for example, a need to be safe without taking too many steps.
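To make the idea concrete, here is a minimal, illustrative sketch of the POMDP ingredients described above. It is not the FAA's or Stanford's actual system; the states, sensor noise, and costs are all invented for illustration. The drone keeps a belief (a probability distribution) over an intruder's relative altitude, updates it with Bayes' rule as noisy readings arrive, and picks the action with the lowest expected cost, trading collision risk against the cost of an unnecessary maneuver (the "competing objective" Kochenderfer describes).

```python
# Toy POMDP-style collision avoidance sketch (illustrative values, not a real system).
# The true state (intruder's relative altitude) is only partially observable,
# so the drone maintains a belief over it and acts on that belief.

ALTS = [-2, -1, 0, 1, 2]            # relative altitude bins (0 = co-altitude, collision risk)
ACTIONS = {"climb": +1, "stay": 0, "descend": -1}
ALERT_COST = 0.1                    # penalty for maneuvering (avoid nuisance alerts)
COLLISION_COST = 100.0

def sensor_likelihood(obs, state, noise=0.2):
    """P(observation | true state): correct bin most of the time, neighbors otherwise."""
    if obs == state:
        return 1.0 - noise
    if abs(obs - state) == 1:
        return noise / 2.0
    return 0.0

def update_belief(belief, obs):
    """Bayes update of the belief over the intruder's relative altitude."""
    posterior = {s: belief[s] * sensor_likelihood(obs, s) for s in ALTS}
    total = sum(posterior.values()) or 1.0
    return {s: p / total for s, p in posterior.items()}

def expected_cost(belief, action):
    """One-step lookahead: expected cost of taking `action` under the belief."""
    move = ACTIONS[action]
    cost = ALERT_COST if move != 0 else 0.0
    for s, p in belief.items():
        # Moving our drone by `move` shifts the relative altitude by -move.
        new_rel = max(-2, min(2, s - move))
        if new_rel == 0:
            cost += p * COLLISION_COST
    return cost

def best_action(belief):
    return min(ACTIONS, key=lambda a: expected_cost(belief, a))

# Uniform prior, then two noisy readings suggesting the intruder is co-altitude.
belief = {s: 1.0 / len(ALTS) for s in ALTS}
for obs in (0, 0):
    belief = update_belief(belief, obs)
print(best_action(belief))  # a maneuver (climb or descend), not "stay"
```

Note how the decision emerges from the numbers rather than from a human rule: with the belief concentrated at co-altitude, staying level carries a large expected collision cost, so the small maneuver penalty is worth paying.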
A team led by the FAA has been testing a collision avoidance system derived from the solution to a POMDP for large manned and unmanned aircraft over the past few years; a major test, in fact, recently took place. (The FAA is focused on collision avoidance, while NASA is concentrating on conflict avoidance, according to Kochenderfer.)
To create computerized collision avoidance strategies, Kochenderfer and other researchers are looking at using deep neural networks, which basically are computer simulations of the way the human brain works. The issue: Neural nets have been successfully used for a variety of non-safety critical tasks, like speech recognition. But there’s a concern these networks aren’t reliable in all situations and, thus, can’t meet the needs of collision avoidance systems.
To that end, Kochenderfer and other researchers are working on automated theorem proving of certain properties of neural networks. A property, in this instance, is a statement that can be proved true or false, according to Kochenderfer. Example: if you follow the net's recommendation, you won't collide with another aircraft. They're now exploring whether the neural net's advice will be safe under certain assumptions about the physical dynamics of an aircraft.
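One simple technique behind such proofs is bounding a network's outputs over an entire set of inputs at once. The sketch below propagates input intervals through a tiny ReLU network; the network, its weights, and the property are all made up for illustration, and the researchers' actual tools handle far larger networks with much more sophisticated solvers. Still, it shows what "proving a property" means here: if the lower bound of the "maneuver" score exceeds the upper bound of the "stay course" score for every input in a box, the net provably always advises a maneuver there.

```python
# Sketch: proving a simple property of a tiny ReLU network with interval
# arithmetic. Weights and the property are invented for illustration only.

def affine_bounds(lo, hi, weights, biases):
    """Propagate per-dimension input intervals through y = W x + b."""
    out_lo, out_hi = [], []
    for w_row, b in zip(weights, biases):
        l = b + sum(w * (lo[i] if w >= 0 else hi[i]) for i, w in enumerate(w_row))
        h = b + sum(w * (hi[i] if w >= 0 else lo[i]) for i, w in enumerate(w_row))
        out_lo.append(l)
        out_hi.append(h)
    return out_lo, out_hi

def relu_bounds(lo, hi):
    """ReLU is monotone, so it maps interval endpoints directly."""
    return [max(0.0, l) for l in lo], [max(0.0, h) for h in hi]

# Hypothetical 2-2-2 network: inputs (range, closing speed), outputs
# (score for "stay course", score for "maneuver"). Weights are made up.
W1, b1 = [[1.0, -1.0], [0.5, 1.0]], [0.0, -1.0]
W2, b2 = [[1.0, 0.0], [0.0, 2.0]], [0.0, 0.5]

def output_bounds(in_lo, in_hi):
    lo, hi = affine_bounds(in_lo, in_hi, W1, b1)
    lo, hi = relu_bounds(lo, hi)
    return affine_bounds(lo, hi, W2, b2)

# Property: for every input in this box, the "maneuver" score exceeds the
# "stay course" score -- i.e. the net always advises an avoidance maneuver.
lo, hi = output_bounds([0.0, 2.0], [0.5, 3.0])
property_holds = lo[1] > hi[0]   # min(maneuver) > max(stay course)
print(property_holds)            # True
```

The appeal of this style of analysis is that it covers infinitely many inputs in one pass, which is exactly what a safety argument needs and what spot-testing a network cannot provide.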
While conflict avoidance generally requires human air traffic controllers, a lot of current research involves unmanned traffic management (UTM) systems, which provide, in effect, a layer of protection ensuring that problems never reach the point of a potential collision. Working with funding from NASA Ames, for example, researchers recently flight-tested small hexacopter (six-propeller) drones on campus. Using NASA protocols, they developed algorithms for a conflict avoidance system able to integrate with NASA's framework and send alerts to operators in real time.
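As an illustration of the kind of check such a system might run (a hypothetical sketch, not NASA's actual UTM protocol), the code below projects two drones' constant-velocity tracks forward, finds their closest point of approach, and raises an alert when the predicted separation falls below a minimum.

```python
# Hypothetical UTM-style conflict check: predict the closest point of
# approach (CPA) between two constant-velocity drones and alert operators
# if separation will fall below a minimum. Not NASA's actual protocol.

def time_of_cpa(p1, v1, p2, v2):
    """Time at which the two drones are closest (clamped to the future, t >= 0)."""
    dp = [a - b for a, b in zip(p1, p2)]
    dv = [a - b for a, b in zip(v1, v2)]
    dv2 = sum(d * d for d in dv)
    if dv2 == 0.0:               # same velocity: separation never changes
        return 0.0
    t = -sum(a * b for a, b in zip(dp, dv)) / dv2
    return max(0.0, t)

def min_separation(p1, v1, p2, v2):
    """Predicted distance between the drones at the closest point of approach."""
    t = time_of_cpa(p1, v1, p2, v2)
    d = [(a + va * t) - (b + vb * t) for a, va, b, vb in zip(p1, v1, p2, v2)]
    return sum(x * x for x in d) ** 0.5

def conflict_alert(p1, v1, p2, v2, min_sep=10.0):
    """True if the predicted closest approach violates the separation minimum."""
    return min_separation(p1, v1, p2, v2) < min_sep

# Two drones converging head-on along the x axis, offset 5 m laterally:
print(conflict_alert([0, 0], [5, 0], [100, 5], [-5, 0]))  # True: miss distance ~5 m
```

The point of running this check inside a UTM layer, rather than on board, is that alerts go out to operators while there is still plenty of time to replan, long before an onboard collision avoidance maneuver would be needed.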
Then there’s the matter of “situational awareness”, or the ability for a drone to be aware of other vehicles in its space. It’s a key ingredient in conflict avoidance. One approach is to track a drone’s location using GPS, then communicate the information to operators. Researchers at NASA Ames are investigating how to do that, looking into different radio frequencies and protocols for sending messages between the aircraft, UTM system and operator.
All about sparsity
Another area of research: drone swarms, or groups of multiple UAVs, and how they interact without colliding with one another. To that end, Mac Schwager, director of Stanford's Multi-robot Systems Lab, is studying what he calls "sparsity" and "dynamic obstacles": quickly determining an object's trajectory with a minimum of data.
“If some drones are trying to deliver packages in a neighborhood and there’s a kid playing, you need to know where that child might move,” says Schwager.
He's working on systems that can predict such a child's potential movements and, even before that, categorize the kinds of obstacles the drones are likely to encounter. When an object is recognized as belonging to a specific category, the system can pick the motion model that should be applied to assess its possible paths. If it's a child, for example, the assumption would be a high degree of unpredictability, and that, in turn, influences how cautiously the drones need to move while staying as efficient as possible.
Schwager readily acknowledges that researchers have a long road ahead. “This is a really new area,” he says. “There’s a lot of work to be done.”
The contents or opinions in this feature are independent and may not necessarily represent the views of Cisco. They are offered in an effort to encourage continuing conversations on a broad range of innovative technology subjects. We welcome your comments and engagement.
We welcome the re-use, republication, and distribution of “The Network” content. Please credit us with the following information: Used with the permission of http://thenetwork.cisco.com/.