Self-driving cars should look down, not just ahead
Self-driving cars are like snowbird retirees. Given the freedom to live, or operate, anywhere in the country, they turn their backs on wintry states and flock to the sun. There's a reason Waymo, Uber, and even grocery giant Kroger are testing their shiny new autonomous vehicles in southwestern cities like Phoenix. Yes, Arizona's regulations are friendly to them, but the year-round good weather is the major draw. As sophisticated as these machines are becoming, they still struggle when fog reduces visibility or snow covers lane markings.
But eventually, AVs will have to learn to navigate the wintry mix. That's where WaveSense, a Boston-based startup, sees an opportunity. The company, launching formally today, wants to take technology developed at MIT for the military and use it to give self-driving cars an extra sense. It says equipping vehicles with a downward-looking, ground-penetrating radar will give them a new way to pinpoint exactly where they are in the world, without relying on visual cues or GPS.
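The general idea behind this kind of positioning is map matching: the vehicle compares its live subsurface radar signature against a prior map of signatures recorded at known positions, and the best match tells it where it is. The sketch below illustrates that concept in miniature with normalized correlation; the function names, data shapes, and matching method are illustrative assumptions, not WaveSense's actual system.

```python
import numpy as np

def localize(scan: np.ndarray, map_scans: np.ndarray) -> int:
    """Return the index of the mapped position whose stored radar
    signature best matches the live scan (normalized correlation)."""
    def norm(x):
        # Zero-mean, unit-norm each signature so matching is
        # insensitive to overall signal strength.
        x = x - x.mean(axis=-1, keepdims=True)
        return x / (np.linalg.norm(x, axis=-1, keepdims=True) + 1e-12)
    scores = norm(map_scans) @ norm(scan)  # correlation with each map entry
    return int(np.argmax(scores))

# Toy example: a "map" of 5 positions, each a 64-sample signature.
rng = np.random.default_rng(0)
map_scans = rng.normal(size=(5, 64))
# A noisy rescan taken at position 3 should still match position 3.
live_scan = map_scans[3] + 0.1 * rng.normal(size=64)
print(localize(live_scan, map_scans))  # → 3
```

Because subsurface features (soil layers, pipes, rocks) are stable and unaffected by snow or fog on the surface, a match against the prior map works even when lane markings are invisible.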
The tech comes out of MIT's Lincoln Laboratory, a defense R&D center, and was first deployed in 2013 to help troops navigate in Afghanistan, where staying on the path and avoiding landmines can be a matter of life and death.