CPS:MEDIUM: Radar-based Perception and Control for Small Autonomous Robots
Lead PI: Deepak Vasisht
Abstract
This project aims to establish millimeter wave (mmWave) radar as a first-class perception and control tool for small robots and drones. Small robots and drones are key enablers for many emerging applications, including precision agriculture, inventory management in smart warehouses, drone-based delivery of goods, and search and rescue operations. Perception and control of such robots and drones is fundamentally challenged by their over-reliance on optical sensors, their low power budgets, and their limited computational capabilities. Optical sensors such as cameras and lidars cannot see through smoke, dust, and fog, and they miss transparent surfaces like glass. In contrast, mmWave radar sensors can see through smoke, dust, and dark conditions, and can detect glass and low-texture surfaces. Because of their small wavelengths, these sensors can be packaged in small form-factor devices and are increasingly commoditized, with each sensor costing less than a hundred dollars. Despite these advantages, mmWave radars have seen limited use in robotic perception and control. This project aims to build new perception and control pipelines for radar sensors that are performant on the limited onboard compute of small robots and accessible to the robotics community. The project will demonstrate these pipelines on robots in real-world scenarios such as agriculture, and will create new educational and outreach materials, including open-source software, a research workshop, and student-training modules.

The project outlines a new approach that bridges signal processing techniques with modern machine learning methods and tightly integrates perception and control. Specifically, the project will use this approach to enable four new capabilities: (1) a radar-based localization and mapping pipeline that combines neural networks with antenna array processing to create high-fidelity maps of the environment; (2) passive mmWave markers that interact seamlessly with off-the-shelf radars and allow robots to identify unseen objects for semantic scene understanding; (3) neural fields that generate a realistic 3D model of the environment and enable realistic simulation of radar signals within it; and (4) active perception strategies, built on these techniques, in which the robot or drone seeks optimal viewpoints or modifies hardware parameters for enhanced perception. Together, these components allow small robots and drones to sense and react to their environment, even in visually degraded conditions, improving the robustness of robot operation.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
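For readers unfamiliar with the antenna-array processing referenced above, the sketch below illustrates the classical FMCW range-angle step that mmWave mapping front-ends commonly build on. It is an illustrative assumption, not the project's pipeline: the data-cube shape, chirp parameters, and the function name range_angle_map are hypothetical.

```python
import numpy as np

# Minimal sketch, assuming a raw radar cube of shape
# (num_chirps, num_rx_antennas, num_adc_samples) from an FMCW mmWave radar
# with a half-wavelength uniform linear receive array. All parameters below
# are illustrative placeholders, not values from the award.

C = 3e8  # speed of light, m/s


def range_angle_map(cube, chirp_slope=60e12, adc_rate=10e6, num_angle_bins=64):
    """Return (power_map, range_axis_m, angle_axis_deg) from a raw radar cube."""
    num_chirps, num_rx, num_samples = cube.shape

    # 1) Range FFT over fast time (ADC samples): beat frequency maps to distance.
    windowed = cube * np.hanning(num_samples)
    range_fft = np.fft.fft(windowed, axis=2)

    # 2) Angle FFT across the receive antennas: phase progression maps to azimuth.
    angle_fft = np.fft.fftshift(
        np.fft.fft(range_fft, n=num_angle_bins, axis=1), axes=1)

    # 3) Non-coherent averaging over chirps to suppress noise.
    power_map = np.abs(angle_fft).mean(axis=0).T  # (range_bin, angle_bin)

    # Physical axes: beat frequency -> range; for half-wavelength spacing,
    # sin(theta) = 2 * m / num_angle_bins for shifted FFT bin index m.
    beat_freqs = np.arange(num_samples) * adc_rate / num_samples
    range_axis = C * beat_freqs / (2.0 * chirp_slope)
    sin_theta = 2.0 * (np.arange(num_angle_bins) - num_angle_bins // 2) / num_angle_bins
    angle_axis = np.degrees(np.arcsin(np.clip(sin_theta, -1.0, 1.0)))
    return power_map, range_axis, angle_axis


if __name__ == "__main__":
    # Random complex samples stand in for real ADC data in this sketch.
    fake_cube = np.random.randn(32, 4, 256) + 1j * np.random.randn(32, 4, 256)
    pmap, rng, ang = range_angle_map(fake_cube)
    print(pmap.shape, rng[:3], ang[:3])
```

A learned mapping pipeline of the kind the abstract describes would replace or augment steps like these with neural components, but the range-angle spectrum above is the conventional signal-processing starting point.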
Performance Period: 07/15/2024 - 06/30/2027
Award Number: 2414227