HYPE2GO: Cost Maps from Hyperspectral Imaging

The HYPE2GO project seeks to augment autonomous system terrain understanding with hyperspectral imaging.
Illustration: An autonomous four-legged robot scans a flooded dirt road in a forest using hyperspectral imaging to understand surface conditions and to generate traversability cost maps for autonomous mobility. Illustration: Coralee Lazebnikova

Autonomous robots operating in unstructured, uncertain environments must rapidly interpret terrain to make safe navigation decisions. Many robotic systems are optimized for on-road conditions, but austere environments, such as unmaintained rural roads or post-disaster landscapes, demand richer and more nuanced sensing. Traditional robot perception pipelines rely heavily on RGB (red, green, blue) cameras and lidar. Although effective in many settings, these modalities do not directly capture the physical properties of terrain that matter for mobility. As a result, machine-learning models must compensate with added complexity, which can make their decisions difficult to interpret and less reliable in degraded environments.

Lincoln Laboratory’s HYPE2GO project explores whether alternative spectral sensing approaches, particularly hyperspectral imaging (HSI), can provide complementary information that improves navigation performance while requiring smaller machine-learning models. HSI captures dense spectral signatures that enable physics-based inference of material properties such as moisture content, vegetation health, and surface composition. For this project, the research team developed a multimodal sensing platform that integrates snapshot HSI with lidar and inertial sensing, and collected large-scale multimodal datasets across diverse terrain and lighting conditions. Research staff then trained machine-learning models on these data to generate traversability cost maps with minimal manual labeling, demonstrating a practical pathway for incorporating hyperspectral sensing into autonomous navigation.
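To make the pipeline concrete, the sketch below illustrates one way dense spectral signatures could be turned into a traversability cost map: compute standard remote-sensing indices from a hyperspectral cube (NDVI as a vegetation-health proxy, NDWI as a surface-water and moisture proxy), map them to a per-pixel cost, and accumulate those costs into a metric grid. This is a minimal sketch under stated assumptions, not HYPE2GO's actual method: the band positions, the hand-tuned cost rule, and the grid geometry are all illustrative, the project trains machine-learning models rather than hand-coding the mapping, and pixel-to-ground registration would come from the lidar and inertial sensors rather than the placeholder coordinates used here.

```python
import numpy as np

# Hypothetical snapshot-HSI cube: H x W pixels, B spectral bands
# (random placeholder reflectance values stand in for real data).
H, W, B = 120, 160, 25
rng = np.random.default_rng(0)
cube = rng.uniform(0.0, 1.0, size=(H, W, B))
wavelengths_nm = np.linspace(450, 950, B)  # assumed VNIR coverage

def band(cube, wavelengths, target_nm):
    """Return the band image closest to a target wavelength."""
    idx = int(np.argmin(np.abs(wavelengths - target_nm)))
    return cube[:, :, idx]

# Classic physics-motivated spectral indices (standard remote-sensing
# formulas; HYPE2GO's learned features may differ).
red = band(cube, wavelengths_nm, 670.0)
nir = band(cube, wavelengths_nm, 860.0)
green = band(cube, wavelengths_nm, 560.0)
eps = 1e-6
ndvi = (nir - red) / (nir + red + eps)      # vegetation-health proxy
ndwi = (green - nir) / (green + nir + eps)  # water/moisture proxy

# Toy cost rule: wet or water-covered pixels are expensive, vegetated
# pixels somewhat cheaper. A learned model would replace this mapping.
pixel_cost = np.clip(0.5 + 2.0 * ndwi - 0.5 * ndvi, 0.0, 1.0)

# Project per-pixel costs into a 2D grid using per-pixel ground
# coordinates. These are faked here; in practice they come from lidar
# depth fused with camera extrinsics and inertial pose estimates.
ground_xy = rng.uniform(0.0, 10.0, size=(H, W, 2))  # meters, placeholder
cell_size = 0.5                                     # 0.5 m grid cells
grid = np.zeros((20, 20))
counts = np.zeros_like(grid)
ij = np.clip((ground_xy / cell_size).astype(int), 0, 19)
np.add.at(grid, (ij[..., 0], ij[..., 1]), pixel_cost)
np.add.at(counts, (ij[..., 0], ij[..., 1]), 1.0)
cost_map = np.divide(grid, counts,
                     out=np.full_like(grid, np.nan),
                     where=counts > 0)
print(cost_map.shape)  # (20, 20) traversability cost map
```

Averaging many pixel observations per cell, as done above, is one simple way such a map could smooth over per-pixel spectral noise before a planner consumes it; cells never observed remain unknown (NaN) rather than being assumed traversable.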

The team’s ongoing efforts focus on expanding spectral coverage into the shortwave infrared, increasing the spatial resolution of the sensors, and ultimately demonstrating closed-loop autonomous navigation driven by on-the-go hyperspectral sensing. If successful, this work will enable autonomous systems to operate more safely and reliably in complex, degraded, and unpredictable environments where traditional perception methods fail.

The team welcomes collaboration from partners interested in scaling hyperspectral perception, expanding operational demonstrations, and transitioning this technology to real-world autonomous systems. Please contact Nathaniel Hanson for more information.