Radar is an ideal sensor for autonomous machines navigating outdoor environments. The gold standard in high-performance radar is the electronically scanned array (ESA), but ESAs have historically been unavailable to machine perception developers because of their size, weight, power, and cost. That changed when Echodyne unveiled MESA, enabling a new line of high-performance ESA radars for autonomous flying and driving machines.
Some UAV applications can be conducted within the operator’s line of sight, but many of the high-value applications for UAVs require flying beyond the operator’s line of sight. For missions beyond visual line of sight (BVLOS), airspace situational awareness is essential for safety.
UAVs come in many sizes, for many purposes, and operate at varying altitudes. Larger aircraft, such as air taxis and cargo transport drones, can easily support a robust onboard sensor suite.
EchoFlight onboard a drone from Virginia Tech and the Mid-Atlantic Aviation Partnership testing NASA's airborne detect-and-avoid AI.
Smaller aircraft, such as medical or package delivery drones, might not have the payload budget for large onboard sensors. Remote ID will provide awareness of cooperative UAVs in the airspace, but an additional solution is needed to provide full situational awareness of non-cooperative aircraft.
For dense flight areas, such as cities, corridors, and droneports, a network of ground-based sensors generating rich airspace situational awareness data benefits all fliers and has substantial cost advantages over equipping sUAS with multiple onboard sensors. Radar data is easily consumed by UAS Service Suppliers (USS) and integrated with UAS Traffic Management (UTM) data feeds, producing managed airspace that accelerates the business of airborne machines.
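To make the data flow concrete, here is a minimal sketch of how a ground-based radar track might be packaged for a USS/UTM feed. The field names, message envelope, and schema are illustrative assumptions for this sketch, not Echodyne's actual output format or any USS provider's real API.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical track record: field names are assumptions for illustration,
# not an actual radar or UTM schema.
@dataclass
class RadarTrack:
    track_id: str
    lat_deg: float
    lon_deg: float
    alt_m: float          # altitude, meters
    speed_mps: float      # ground speed, meters per second
    heading_deg: float    # course over ground, degrees from true north
    timestamp_utc: str    # ISO 8601 time of the measurement

def to_utm_message(track: RadarTrack) -> str:
    """Wrap a radar track in a minimal JSON envelope a USS might ingest."""
    envelope = {
        "message_type": "non_cooperative_track",  # radar sees aircraft with no Remote ID
        "source": "ground_radar",
        "track": asdict(track),
    }
    return json.dumps(envelope)

track = RadarTrack("trk-0042", 64.84, -147.72, 120.0, 18.5, 270.0,
                   "2021-06-01T17:45:00Z")
msg = to_utm_message(track)
```

The key point the sketch illustrates is that radar tracks of non-cooperative aircraft and Remote ID reports of cooperative aircraft can share one airspace picture once both are normalized into a common feed.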
EchoGuard radars deployed for ground-based airspace management by University of Alaska Fairbanks for the FAA's IPP, testing along the Alyeska pipeline.
Our most recent product introduction is for autonomous vehicles (AV) and is inspired by the way the human eye and brain work together to understand their surroundings. When a person's peripheral vision detects something ambiguous, the brain directs the eye to examine it further and resolve it.
The brain of the AV is referred to as the AV stack. It is currently limited by one-way data flows from the sensor and fusion layers: today's AV stacks have no ability to focus sensor resources on a particular ambiguous element of the driving scene to resolve it.
We believe this inability to focus sensor resources on ambiguity is a fundamental blocker to true "eyes-off, hands-off" vehicle transportation. Echodyne has developed a first-of-its-kind high-resolution imaging radar that allows the AV stack to dynamically task the radar for data on specific regions of a driving scene.
EchoDrive radar integrated into sensor array for L4+ autonomous vehicle testing.
The idea of multiple package delivery companies operating fleets of drones in urban airspace rightly causes concern. Just as cities use advanced traffic management sensor and information networks to optimize the roadway system, so will cities need a similar sensor and information network for the unmanned aerial future.
Similar principles hold true for ground transportation. Autonomous vehicles of various sizes and functions are part of our future. Optimizing the flow of people and goods will be an increasingly data-driven real-time logistics challenge.
Echodyne offers a suite of products all related to improving transportation and transportation-dependent services while ensuring safety through data-driven situational awareness. If your company is involved in Smart Cities programs and would like to learn more about the role radars will play in safely managed Smart Cities, we would welcome the conversation.