Sense and Avoid for Drones is No Easy Feat

But development is vibrant, and you’ll see it work first in prosumer drones

THE FACTS:

“Sense and avoid” for drones is a popular topic in the press right now, but the phrase can mean different things in different contexts and for different people. To clarify, there is a difference between solving the problem of “sense” and solving the problem of “avoid.”  Also, there is a difference between “airborne collision avoidance” (which is what most concerns the FAA) and “obstacle avoidance” (which is the problem that most manufacturers are trying to solve right now). With that in mind, this post looks at what a few manufacturers and software providers are doing to solve obstacle avoidance.

WHAT’S COOL AND WHAT’S NOT:

DJI – DJI was one of the first to release a drone that could sense and avoid obstacles. In June 2015, they announced Guidance, a combination of ultrasonic sensors and stereo cameras that allows a drone to detect objects up to 65 feet (20 meters) away and maintain a preconfigured distance from them. The kit was immediately available for the Matrice 100 drone development platform. They subsequently incorporated that technology into their flagship Phantom 4 prosumer drone but not their new professional drone, the Matrice 600.

The Phantom 4 has front obstacle sensors combined with advanced computer vision and processing that allow it to react to and avoid obstacles in its path. The secret sauce behind the Phantom 4’s ability to sense and avoid obstacles in real time, and to hover in a fixed position without a GPS signal, is a set of specialized software algorithms for spatial computing and 3D depth sensing, coupled with an onboard Movidius vision processing unit (VPU). In the flight control app’s “TapFly” mode, the obstacle-sensing system is supposed to let you fly a path while the drone automatically moves around objects in its way. But you can find several real-world tests like this one that show it’s not a perfect system.
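To make the general idea concrete, here is a minimal sketch in Python with OpenCV of how stereo obstacle sensing works: compute a disparity map from two forward-facing cameras, convert it to depth, and flag anything closer than a stop distance. This is not DJI’s algorithm; the image files, focal length, baseline, and stop distance are all assumptions for illustration.

```python
import cv2
import numpy as np

FOCAL_PX = 700.0      # focal length in pixels (assumed)
BASELINE_M = 0.12     # distance between the two cameras in meters (assumed)
STOP_DIST_M = 2.0     # stop if an obstacle is closer than this

# Rectified grayscale frames from the left and right cameras (placeholder files).
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # StereoBM returns fixed-point disparities

# Convert disparity to metric depth where the match is valid.
valid = disparity > 0
depth_m = np.full(disparity.shape, np.inf, dtype=np.float32)
depth_m[valid] = FOCAL_PX * BASELINE_M / disparity[valid]

# Only look at a central "flight corridor" window directly ahead of the drone.
h, w = depth_m.shape
corridor = depth_m[h // 3 : 2 * h // 3, w // 3 : 2 * w // 3]
if corridor.min() < STOP_DIST_M:
    print("Obstacle ahead -- brake / hold position")
else:
    print("Path clear")
```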

Intel – Intel is all over sense and avoid, and they accomplish it with active sensors. In 2015 at the Consumer Electronics Show (CES), they gave this sneak peek at what they were working on. In January 2016, they acquired German drone manufacturer Ascending Technologies (AscTec) and dazzled CES with an on-stage demo of their Intel® RealSense™ technology integrated into an AscTec drone, showing the aircraft avoiding obstacles while continuing to follow its subject. They recently announced their Aero Ready-to-Fly Drone, a fully functional quadcopter powered by the Intel® Aero Compute Board, equipped with Intel® RealSense™ depth and vision capabilities, and running an open-source Linux operating system. It is geared for developers, researchers, and UAV enthusiasts.
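For developers curious what working with RealSense depth data looks like, here is a hedged sketch using Intel’s librealsense Python bindings (pyrealsense2) to read depth frames and check the distance to whatever sits directly ahead. It is illustrative only; a real avoidance loop would feed this measurement into the flight controller, and the resolution, frame count, and 2-meter threshold are assumptions.

```python
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

try:
    for _ in range(30):                       # sample a few frames
        frames = pipeline.wait_for_frames()
        depth = frames.get_depth_frame()
        if not depth:
            continue
        # Distance (in meters) at the center pixel of the depth image.
        dist_m = depth.get_distance(320, 240)
        if 0.0 < dist_m < 2.0:
            print(f"Obstacle {dist_m:.2f} m ahead")
finally:
    pipeline.stop()
```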

It’s clear Intel understands the importance of sense and avoid technology for ready-to-fly prosumer and commercial drones, too. In June 2016, Intel announced the addition of a factory-installed Intel RealSense R200 camera and an Intel Atom processor module for Yuneec’s Typhoon H. The module maps the Typhoon H’s surroundings in 3D, and the drone uses that map to navigate its environment autonomously, including rerouting itself around obstacles. Yuneec’s Typhoon H camera drone already had the ability to stop itself before colliding with large objects, but now it should avoid obstacles and keep moving right around them. We’ll see if that comes true in the real world. Let’s hope it does. Otherwise, Intel’s $60 million investment in Yuneec may not deliver the expected return.

Either way, Intel has hedged its bets. In July 2016, a team from Intel and Airbus used a modified AscTec Falcon 8 fitted with RealSense cameras to demonstrate a visual inspection of an Airbus passenger airliner at the Farnborough International Airshow in England. Additionally, in September 2016, Intel acquired DJI’s VPU vendor Movidius, which means they may have cornered the market for sense-and-avoid technology.

Parrot – Parrot’s S.L.A.M.dunk integrates advanced software applications based on the robotic mapping technique called “simultaneous localization and mapping,” or SLAM. The name of Parrot’s solution is a play on the words “slam dunk,” but really it’s anything but that. SLAM is the computational problem of constructing or updating a map of an unknown environment while simultaneously keeping track of an agent’s location within it. Parrot’s use of SLAM enables a drone to understand and map its surroundings in 3D and to localize itself in environments with multiple barriers and where GPS signals are not available. In other words, it performs obstacle avoidance. Their solution depends on active sensors. You can read more here.
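To give a flavor of the localization half of SLAM, here is a toy one-dimensional “pose graph” sketch: refine a drone’s positions from noisy odometry plus one loop-closure measurement by least-squares over the whole trajectory. Real SLAM systems, Parrot’s included, work in 3D with camera and ultrasound data; every number and the 1-D setup here are made up purely for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

odometry = [1.00, 1.10, 0.90, 1.05]     # measured forward steps (noisy), meters
loop_closure = (0, 4, 4.00)             # "pose 4 measured 4.00 m from pose 0"

def residuals(x):
    r = [x[0]]                                               # anchor pose 0 at the origin
    r += [(x[i + 1] - x[i]) - step for i, step in enumerate(odometry)]
    i, j, dist = loop_closure
    r.append((x[j] - x[i]) - dist)                           # loop-closure constraint
    return r

x_init = np.cumsum([0.0] + odometry)    # initial guess: raw dead reckoning
result = least_squares(residuals, x_init)
print("dead reckoning:", np.round(x_init, 3))
print("optimized:     ", np.round(result.x, 3))
```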

Neurala – Neurala offers a software solution that analyzes images from off-the-shelf cameras to enhance drone navigation. Unlike Parrot’s solution, Neurala’s technology is passive. It runs artificial-intelligence neural network software on GPU-based hardware. While commercial-grade GPS can fly a drone close to its objectives, Neurala’s software can help it identify safe areas to travel and land. At InterDrone, Neurala announced the launch of its Bots Software Development Kit, which will allow manufacturers to install artificial-intelligence “neural” software directly into their applications without the need for additional hardware. That said, full collision avoidance is still under development.
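The passive, camera-only approach boils down to scoring what the camera sees, frame by frame. The toy sketch below runs a tiny, untrained neural network over image patches and labels each one “safe” or “obstacle.” It is a generic stand-in for this class of technique, not Neurala’s software; the frame, weights, and labels are all synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
frame = rng.random((120, 160))            # stand-in for a grayscale camera frame

# Toy two-layer network over flattened 40x40 patches (random, untrained weights).
W1 = rng.standard_normal((1600, 16)) * 0.01
W2 = rng.standard_normal((16, 2)) * 0.01  # two outputs: [safe, obstacle]

def score_patch(patch):
    x = patch.reshape(-1)
    h = np.maximum(x @ W1, 0.0)           # ReLU hidden layer
    logits = h @ W2
    p = np.exp(logits - logits.max())
    return p / p.sum()                    # softmax: [P(safe), P(obstacle)]

for r in range(0, 120, 40):
    for c in range(0, 160, 40):
        p_safe, p_obst = score_patch(frame[r:r + 40, c:c + 40])
        print(f"patch ({r},{c}): P(obstacle) = {p_obst:.2f}")
```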

LeddarTech – LeddarTech just announced its modular Vu8, whose specs make it well suited to autonomous drone use. The Vu8 is a compact solid-state LiDAR sensor that detects targets at a range of up to 705 feet (215 meters) and weighs 75 grams. It is an active sensor that “could be” used for collision avoidance, navigation, and as an altimeter for drones. According to LeddarTech, the Vu8 LiDAR is “immune to ambient light” and was designed to provide “highly accurate multi-target detection over eight independent segments.” There are some cool details in this video, but no demo of real-life use on a drone just yet.
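As a rough illustration of how an eight-segment LiDAR reading might feed a stop decision, here is a short sketch. The read_segments() function is a hypothetical stand-in for whatever driver or API the sensor actually exposes, and the segment split and thresholds are assumptions.

```python
import random

NUM_SEGMENTS = 8
STOP_DIST_M = 3.0

def read_segments():
    # Hypothetical stand-in: one range reading (meters) per angular segment.
    return [random.uniform(1.0, 50.0) for _ in range(NUM_SEGMENTS)]

ranges = read_segments()
# Treat the middle segments as the flight path and the outer ones as periphery.
forward = ranges[2:6]
if min(forward) < STOP_DIST_M:
    print(f"Obstacle at {min(forward):.1f} m in the flight path -- hold / reroute")
else:
    print("Forward segments clear")
```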

BOTTOM LINE:

At this time, the drone industry appears to be rich with R&D and solutions that attempt to tackle the obstacle avoidance problem. But a simple search on YouTube for successful real-world examples reveals we still have a way to go before anyone claims victory. I like what LeddarTech says:

Available drone sensing solutions for position and range measurements as well as for collision avoidance are still far from perfect: GPSs and barometers aren’t foolproof—even outdoors—and can’t be relied upon when navigating indoors. Ultrasonic altimeters have very limited range. Optical flow sensors require good lighting and textured surfaces, and camera vision is still a work in progress and tends to be processing-intensive.

As with any technology, there are always trade-offs. It’s still not clear to me who has the category-killing solution; I think that’s going to take more R&D investment. One thing is for sure—we’ll see more new sense-and-avoid product and tech announcements this year. As with DJI, I believe the technology will continue to be released first in prosumer drones, because that’s the only place where sales volumes and margins are strong enough to recoup the investment.

Image credit: Intel

This post first appeared on DRONELIFE.com

Can PrecisionHawk Tame Drone Traffic in the Sky?

PrecisionHawk’s LATAS delivers an innovative air traffic control system for drones, but it’s one of several that depend on the not-so-imminent success of all aircraft using ADS-B.

THE FACTS:

This past week (August 29, 2016), the FAA granted PrecisionHawk a waiver from Part 107.31’s visual line of sight (VLOS) limitations, which gives them the ability to continue their research and to train those who want to offer extended visual line of sight (EVLOS) flights as a service. The waiver was granted based on over a year’s worth of testing under the FAA Pathfinder program. Under Pathfinder Phase 1 research, PrecisionHawk determined that the extension in range offered by EVLOS operations supports a significant expansion in the area each drone flight can cover, possibly up to 12 times what is achievable within line of sight.
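A quick back-of-the-envelope check shows why an extended visual range multiplies coverage so dramatically: area grows with the square of the operating radius. The radii below are hypothetical, not PrecisionHawk’s figures, but a roughly 3.5x radius extension is enough to yield about 12x the area.

```python
import math

vlos_radius_km = 1.0                    # assumed visual-line-of-sight radius
evlos_radius_km = 3.5                   # assumed extended (EVLOS) radius

vlos_area = math.pi * vlos_radius_km ** 2
evlos_area = math.pi * evlos_radius_km ** 2
print(f"Coverage multiplier: {evlos_area / vlos_area:.1f}x")   # ~12.3x
```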

To do this, PrecisionHawk uses their airspace display technology called LATAS, which stands for Low-Altitude Tracking and Avoidance System. LATAS is an onboard system that connects airspace management technologies, such as sense and avoid, geo-fencing, and aircraft tracking, into a service package for commercial and recreational drone operators as well as regulators and air traffic controllers. Developed to be plug and play or integrated into a drone’s circuitry during manufacturing, LATAS is small (3 in by 2 in by 1 in), light (less than 100 grams), and operational on network speeds as low as 2G. While it is not required to receive an EVLOS waiver, LATAS plays a key role in PrecisionHawk’s own operations. The LATAS web application is a free tool available on www.flylatas.com and is intended to provide an extra layer of safety and protection for operators flying under Part 107.
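Geo-fencing, one of the services LATAS bundles, reduces at its core to testing whether a reported position falls inside a restricted polygon. Here is a minimal, generic sketch of that check using ray casting; the polygon coordinates and drone position are invented, and this is not PrecisionHawk’s implementation.

```python
def point_in_polygon(lat, lon, polygon):
    """Ray-casting point-in-polygon test; polygon is a list of (lat, lon) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        if (lon1 > lon) != (lon2 > lon):                       # edge crosses the point's longitude
            intersect_lat = lat1 + (lon - lon1) / (lon2 - lon1) * (lat2 - lat1)
            if lat < intersect_lat:
                inside = not inside
    return inside

# Hypothetical no-fly polygon and drone position.
no_fly_zone = [(35.85, -78.70), (35.85, -78.66), (35.88, -78.66), (35.88, -78.70)]
drone_position = (35.86, -78.68)

if point_in_polygon(*drone_position, no_fly_zone):
    print("Inside geo-fence: block takeoff / trigger return-to-home")
```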

WHAT’S COOL AND WHAT’S NOT:

As I have noted in Market Impact of the FAA Small Drone Rule, the inability to fly EVLOS restricts some high-margin operations. This new capability improves a drone’s economic efficiency by allowing it to cover the acreage required by a large percentage of agricultural fields, mining operations, and large infrastructure sites.

One problem we see with this type of system is that it may not be reliable in remote areas. Even though cell network companies are working to extend their networks by partnering with rural carriers, everyone who uses a cell phone knows about gaps in service that happen unexpectedly. If they happen to a small drone, these gaps could have much more serious consequences than a dropped call.

Additionally, we see integration with the Harris ADS-B (Automatic Dependent Surveillance – Broadcast) network data as a good thing, but, as we have written in our in-depth research study ADS-B and Its Use for Small Drone Traffic Management, the FAA’s NextGen mandate for ADS-B has inherent limitations. For one, use of ADS-B “Out” (the signal that says “here I am”) is not required in Class G airspace, where most small drones fly; for another, the FAA did not mandate ADS-B “In” (the ability to see other traffic). Together, these limitations undermine its effectiveness. Aircraft (including drones) can push all the “Out” signals they want, but if other aircraft can’t receive or “see” them, they won’t know where your aircraft is, and no avionics system can overcome that.

THE COMPETITION:

PrecisionHawk is not alone in this endeavor, and we’re beginning to see others create ADS-B-based solutions for drones.

For example, in July 2016, DJI and uAvionix announced the release of an ADS-B collision avoidance developer kit. The uAvionix “Ping” sensors are among the smallest and lightest ADS-B-based hardware available for unmanned aircraft. Their Ping ADS-B receiver allows a drone to “see” surrounding aircraft and initiate collision avoidance maneuvers based on that information.
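Once an onboard receiver has decoded the surrounding traffic, the “see and react” step is conceptually simple: compute the separation between the drone and each reported aircraft and flag anything inside an alert volume. The sketch below is a generic illustration of that step with invented traffic data and thresholds; it is not uAvionix’s or DJI’s logic.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

drone = {"lat": 35.86, "lon": -78.68, "alt_ft": 300}
traffic = [                                   # decoded ADS-B "Out" reports (made up)
    {"icao": "A1B2C3", "lat": 35.90, "lon": -78.65, "alt_ft": 1200},
    {"icao": "D4E5F6", "lat": 35.87, "lon": -78.69, "alt_ft": 500},
]

for ac in traffic:
    dist_km = haversine_km(drone["lat"], drone["lon"], ac["lat"], ac["lon"])
    alt_sep = abs(ac["alt_ft"] - drone["alt_ft"])
    if dist_km < 5.0 and alt_sep < 1000:
        print(f"Traffic alert: {ac['icao']} {dist_km:.1f} km away, {alt_sep} ft vertical separation")
```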

Sagetech has created a family of transponders well matched to the size, weight, and power requirements of unmanned systems applications. Their XP transponders can output data via RS-232 serial communications to a wide range of compatible flight computers.

Other drone traffic management paradigms have been proposed – for example, Google’s SkyBender and Amazon’s “Good, Better, Best.” I could go on, but you get the point. The pot is beginning to boil.

BOTTOM LINE:

The current FAA plan emphasizes operating small UAS away from airports, which should be geo-fenced to keep drones from interfering with the landing and take-off activities of larger aircraft. But in all of these drone traffic management plans, ADS-B technology (or ADS-B-like signal integration) is a key element for tracking and reporting a drone’s position.

NASA knows that someday unmanned vehicles will share low-altitude airspace with general aviation aircraft such as airplanes, helicopters, and gliders. That is why it created the Unmanned Aircraft System (UAS) Traffic Management (UTM) initiative. Agreeing on a safe and efficient system that will manage both manned and unmanned traffic is a vital concern for the FAA, NASA, private companies, and academic users.

But given the inherent limitations of ADS-B, will any of these systems work as intended?

Image credit: PrecisionHawk

This post first appeared on DRONELIFE.com