For self-flying aircraft to take to the skies, they need to be able to sense their environments and avoid hazards. NASA aeronautics researchers recently developed a camera pod with sensors to help meet this challenge by advancing computer vision for autonomous aviation.
This pod is called the Airborne Instrumentation for Real-world Video of Urban Environments (AIRVUE). It was developed and built at NASA’s Armstrong Flight Research Center in Edwards, California. Researchers recently flew it on a piloted helicopter at NASA’s Kennedy Space Center in Florida for initial testing.
The team hopes to use the pod to collect large, diverse, and accessible visual datasets of weather and other obstacles. They will then use that information to create a cloud-based data repository that manufacturers of self-flying air taxis, drones, and other similar aircraft can access. Developers can use this data to evaluate how well their aircraft can “see” the complex world around them.
“Data is the fuel for machine learning,” said Nelson Brown, lead NASA researcher for the AIRVUE project. “We hope to inspire innovation by providing the computer vision community with realistic flight scenarios. Accessible datasets have been essential to advances in driver aids and self-driving cars, but so far, we haven’t seen open datasets like this in aviation.”
The computer algorithms that will enable these aircraft to sense their environment must be reliable and proven to work across many flight conditions. NASA data promises that fidelity, making it an important resource for industry. When a company collects data on its own, it is unlikely to share that data with other manufacturers. By making this dataset accessible to all companies in the Advanced Air Mobility industry, NASA helps ensure the United States stays at the forefront of innovation.
Once the design is refined through evaluation and additional testing, the team hopes to build more pods that can ride along on various types of aircraft, collecting more visuals and growing the digital data repository.