By using simulated sensor data, researchers can virtually test unmanned aerial vehicles in settings and scenarios that would be either too costly or too difficult to replicate in the field.

In 2018, the U.S. Air Force released a memo restricting the use of unmanned aerial vehicles (UAVs) on military installations such as Hanscom Air Force Base. The restriction was prompted by a Department of Defense report that found military units were not sufficiently prepared to detect or evaluate the cybersecurity risks of small drone systems.

Since then, Laboratory staff have been finding creative ways to test UAV systems and capabilities without actually flying them in the test environment. To do this, they fly the UAV in an unrestricted location, such as the Laboratory’s indoor Autonomous Systems Development Facility, while providing it with simulated sensor data that tricks the system into thinking it is in a different location.

"The UAVs use their own accelerometers and gyroscopes to measure how they are oriented and accelerating, but their camera data is simulated, with the help of a motion capture system that tells the simulation where the UAV is and how it is moving," said Kevin Leahy of the Advanced Capabilities and Systems Group.

Through their experiments, the researchers are finding several benefits to flying UAVs in virtual environments.

Using simulated sensor data, UAVs can be virtually tested in settings and scenarios that would be either too costly or too difficult to replicate in the field, such as a partially demolished building in a search-and-rescue scenario. Because both the cost and the risk are lower, this approach could be a practical way to pretrain UAVs to perform certain tasks.

"We're using virtual environments with drones to help develop algorithms that execute tasks in the real world," said Dan Griffith, who is a staff member in the Artificial Intelligence Software Architectures and Algorithm Group and is working on a similar project to Leahy's. "These tasks may involve searching a large area for a specific target, exploring a building, or flying a race." An example of such a search scenario is shown in the video below.

A computer-generated image of a UAV flying over a forest
Virtual environments can help staff develop algorithms that enable UAV swarms to execute tasks. In this notional example, seven UAVs search a forested area for a vehicle of interest, in this case a red pickup truck. Video: Constantine Frost

One Laboratory program, funded by the Technology Office, is developing new techniques that enable autonomous systems to execute tasks in data-starved environments. Another Technology Office program is creating a map shared among multiple autonomous agents operating in challenging conditions, such as those following a natural disaster, that can interfere with perception and communications.
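One simple way a shared map can be assembled is by fusing each agent's local occupancy grid, letting any confirmed observation override uncertainty. The sketch below is an illustrative assumption about how such fusion might look, not the program's actual method; cells are coded as unknown (-1), free (0), or occupied (1).

```python
import numpy as np

def merge_maps(local_maps):
    """Merge per-agent occupancy grids into one shared map.

    A cell is marked occupied if any agent observed it occupied,
    free if any agent observed it free and none observed it occupied,
    and unknown otherwise.
    """
    stacked = np.stack(local_maps)             # shape: (agents, H, W)
    merged = np.full(stacked.shape[1:], -1)    # start all-unknown
    merged[(stacked == 0).any(axis=0)] = 0     # any agent saw it free
    merged[(stacked == 1).any(axis=0)] = 1     # occupied overrides free
    return merged
```

In practice, each agent would exchange its grid only when communications allow, so the shared map improves opportunistically as links come and go.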

Staff are also considering how modern video game engines (software-development environments that game creators use while building video games) could provide insights that may be useful for developing UAV machine-learning algorithms.

"Game engines have far outpaced robotics simulators when it comes to generating photorealistic environments and are starting to incorporate other high-fidelity sensors, such as lidar, for self-driving cars. We're leveraging state-of-the-art gaming technology to simulate common sensors and adapting that technology to more bespoke ones," said Griffith. "Along with improving simulated sensor fidelity, we've been experimenting with AI techniques to help bridge the gap between simulated and real-world data. We're hoping that we can bring these technology breakthroughs to researchers at the Laboratory who are developing state-of-the-art AI algorithms for drones."

One of the challenges to simulating environments for UAVs is relaying data to the system in real time. Since the UAV needs to make control decisions based on what it sees, it also needs to receive the data quickly enough to make those decisions.
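One way to enforce such a latency budget is to timestamp every simulated frame and discard any frame that arrives too late for the control loop to act on. The sketch below is an illustrative assumption, not the Laboratory's implementation.

```python
import time

class FrameGate:
    """Filter simulated sensor frames by age.

    max_age_s is the latency budget: frames older than this are
    discarded rather than used for control decisions.
    """
    def __init__(self, max_age_s: float = 0.05):
        self.max_age_s = max_age_s

    def latest_usable(self, frames, now=None):
        """frames: iterable of (timestamp, data) pairs.
        Returns the newest frame within the latency budget, or None."""
        now = time.monotonic() if now is None else now
        fresh = [(t, d) for t, d in frames if now - t <= self.max_age_s]
        return max(fresh, key=lambda f: f[0]) if fresh else None
```

With a 50 ms budget, a frame rendered 40 ms ago would still be usable, while one rendered 200 ms ago would be dropped so the vehicle never steers on stale imagery.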

Another challenge is that large, high-quality environments are time-consuming to create and memory-intensive to use. For these reasons, staff are developing a pipeline to create new environments and a streaming system to navigate those environments in real time. So far, they have used this process to build environments modeled on the Fort Devens Range Control and the area surrounding Joint Base Cape Cod.
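A streaming system of this kind can be sketched as a cache that keeps in memory only the tiles of the environment near the vehicle, evicting the least recently used tiles as the vehicle moves. The class and parameter names below are illustrative assumptions, not the staff's actual pipeline.

```python
from collections import OrderedDict

class TileStreamer:
    """Keep a bounded set of environment tiles resident around the vehicle.

    load_tile is any callable that fetches tile data for integer tile
    coordinates; the OrderedDict acts as a least-recently-used cache.
    """
    def __init__(self, load_tile, tile_size=100.0, radius=1, max_tiles=32):
        self.load_tile = load_tile
        self.tile_size = tile_size
        self.radius = radius
        self.max_tiles = max_tiles
        self.cache = OrderedDict()

    def update(self, x, y):
        """Ensure all tiles within `radius` of position (x, y) are loaded."""
        cx, cy = int(x // self.tile_size), int(y // self.tile_size)
        for tx in range(cx - self.radius, cx + self.radius + 1):
            for ty in range(cy - self.radius, cy + self.radius + 1):
                key = (tx, ty)
                if key in self.cache:
                    self.cache.move_to_end(key)        # mark recently used
                else:
                    self.cache[key] = self.load_tile(tx, ty)
                while len(self.cache) > self.max_tiles:
                    self.cache.popitem(last=False)     # evict least recently used
```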

Leahy and his colleague Mark Mazumder worked together on a roadmap study surveying how virtual environments can accelerate micro–air vehicle research at the Laboratory. The two staff members have also been working to facilitate collaborative research in this field between Laboratory and MIT campus researchers.

"We have a data connection between campus and the Laboratory's Autonomous Systems Development Facility [ASDF] so that we can have vehicles operating in the same virtual environment while they are in totally separate buildings," said Leahy. "The plan is to have two UAVs suspend a hoop in the air on campus, while a RACECAR [an autonomous robot] platform 'sees' the hoop in the ASDF. When the hoop is lined up correctly, the car will jump off of a ramp and through the hoop." Leahy plans to complete this demonstration before summer's end.