CAREER: High Integrity Navigation for Autonomous Vehicles
Lead PI:
Grace Gao
Abstract

The number of systems developed for applications such as package delivery via small unmanned aerial vehicles (UAVs) and self-driving cars is growing. To ensure safe and reliable positioning, it is critical to address not only positioning accuracy, but also confidence in that accuracy, defined as integrity. Most positioning and navigation studies for autonomous vehicles have focused on accuracy alone, not integrity. However, navigating autonomous vehicles equipped with relatively low-cost sensors in complex and rapidly changing environments (e.g., urban areas with Global Positioning System (GPS) signal blockage) poses great challenges compared to flying aircraft in the open sky, where positioning integrity has been well addressed by the Federal Aviation Administration (FAA)-regulated aviation industry.

This project aims to assess, monitor and improve positioning integrity for autonomous vehicles, such as UAVs and self-driving cars, and integrate the proposed research into education and outreach. The project involves a novel positioning integrity assessment and monitoring solution that is robust in GPS-challenged environments and is suitable for navigation sensor fusion. The investigator will (1) derive a new algorithm to directly assess and monitor GPS integrity in urban environments; (2) design an integrity monitoring framework for GPS sensor fusion using camera vision, LiDAR and inertial measurements; and (3) improve integrity by turning unwanted multi-path signals into a useful navigational source based on physical interaction with the environment. 
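As a concrete illustration of what snapshot integrity monitoring involves (a classical RAIM-style residual test, not the project's proposed algorithm), the sketch below fits a least-squares position/clock correction to pseudorange residuals and then applies a chi-square test to the post-fit residuals. The geometry matrix, noise level, and detection threshold in this sketch are illustrative assumptions.

```python
import numpy as np

def raim_residual_test(G, rho_residuals, sigma=1.0, threshold=7.81):
    """Toy snapshot RAIM fault-detection test (illustrative parameters).

    G: (n, 4) linearized geometry matrix (unit line-of-sight rows + clock column)
    rho_residuals: (n,) pseudorange residuals about a nominal position
    threshold: chi-square critical value (here, 95% with 3 degrees of freedom)
    Returns (state_correction, fault_detected).
    """
    # Least-squares estimate of the position/clock correction
    dx, *_ = np.linalg.lstsq(G, rho_residuals, rcond=None)
    # Post-fit residuals: the part the 4-state model cannot explain
    r = rho_residuals - G @ dx
    test_stat = float(r @ r) / sigma**2
    return dx, test_stat > threshold
```

A consistent set of pseudoranges leaves near-zero post-fit residuals, while a large bias on a single satellite inflates the test statistic and trips the fault flag.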

This CAREER development plan will also integrate an education plan with the research goals: broadening participation of under-represented groups, such as women, by fostering a female researcher community through social events at technical conferences; educating and informing the public about FAA rules and safety issues regarding flying UAVs; and reaching K-12 students by demonstrating the results of the proposed research at the Illinois Engineering Open House and leading hands-on activities at camps for school girls.
 

Performance Period: 07/15/2019 - 04/30/2024
Institution: Stanford University
Award Number: 2006162
CPS: Medium: Aerial Co-Workers: Augmenting Physical and Cognitive Human Capabilities
Lead PI:
Giuseppe Loianno
Abstract

This project studies the algorithmic foundations and methodological frameworks to augment human capabilities via a novel form of physical and cognitive collaboration between humans and multi-agent robotic systems, creating Aerial Co-Workers. These machines will actively collaborate with each other and with humans, tackling the fundamental gaps in human-MAV (micro aerial vehicle) collaboration at both the physical and cognitive levels. The project is organized along two main thrust areas: Physical Collaboration and Cognitive Collaboration. The first thrust aims to significantly augment the physical ability of human workers by taking advantage of physical collaboration between the operator and a network of interconnected quadrotors, equipped with a set of "flying hands," transporting objects. This will produce novel scientific solutions for human-robot collaboration that account for the complex legibility of the motions and the variability of the relative positions of the agents. The second thrust aims to address two perception consensus problems to enable MAV-assisted augmented reality (AR) that augments the cognitive ability of operators. The key is to consistently collect, analyze, and display contextual information via multiple MAVs for effective and natural human-robot visual interactions. Aerial Co-Workers will obtain vantage viewpoints of environment regions occluded from the humans, which can be customized and augmented directly in the workspace to facilitate human actions via novel metric-semantic collaborative space mapping.

This project will have a strong societal impact as a disruptive technology for industry as well as the construction market, which is in urgent need of innovative solutions for enhancing the efficacy while maximizing safety. The outcome will enable safer, faster, and simpler task execution in scenarios including maintenance, inspection, transportation, and search and rescue. The project will contribute to lowering the barriers for new researchers in robotics, computer vision, and machine learning by making hardware designs, algorithms, datasets, and code available on open-source forums. The playful nature of AR tools and quadrotors employed in this project will contribute to engaging K-12 and undergraduate audiences.
 

Performance Period: 01/01/2022 - 12/31/2024
Institution: New York University
Award Number: 2121391
Collaborative Research: CPS: Medium: Spatio-Temporal Logics for Analyzing and Querying Perception Systems
Lead PI:
Georgios Fainekos
Abstract

The goals of Automated Driving Systems (ADS) and Advanced Driver Assistance Systems (ADAS) include reduction in accidental deaths, enhanced mobility for differently abled people, and an overall improvement in the quality of life for the general public. Such systems typically operate in open and highly uncertain environments for which robust perception systems are essential. However, despite the tremendous theoretical and experimental progress in computer vision, machine learning, and sensor fusion, the form and conditions under which guarantees should be provided for perception components is still unclear. The state-of-the-art is to perform scenario-based evaluation of data against ground truth values, but this has only limited impact. The lack of formal metrics to analyze the quality of perception systems has already led to several catastrophic incidents and a plateau in ADS/ADAS development. This project develops formal languages for specifying and evaluating the quality and robustness of perception sub-systems within ADS and ADAS applications. To enable broader dissemination of this technology, the project develops graduate and undergraduate curricula to train engineers in the use of such methods, and new educational modules to explain the challenges in developing safe and robust ADS for outreach and public engagement activities. To broaden participation in computing, the investigators target the inclusion of undergraduate women in research and development phases through summer internships.

The formal language developed in this project is based on a new spatio-temporal logic pioneered by the investigators. This logic allows one to simultaneously perform temporal reasoning about streaming perception data and spatially reason about objects both within a single frame of the data and across frames. The project also develops quantitative semantics for this logic, which provides the user with quantifiable quality metrics for perception sub-systems. These semantics enable comparisons between different perception systems and architectures. Crucially, the formal language facilitates the process of abstracting away implementation details, which in turn allows system designers and regulators to specify assumptions and guarantees for system performance at a higher level of abstraction. An interesting benefit of this formal language is that it enables querying of databases with perception data for specific driving scenarios without the need for the highly manual process of creating ground truth annotations. No such formal language currently exists, and its absence is a major impediment to building a thriving marketplace for perception components used in safety-critical systems. This framework sets the foundation for a requirements language between suppliers of perception components and automotive companies. The open source and publicly available software tools developed in this project will assist with testing of perception systems by engineers and governmental agencies.
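To make the quantitative-semantics idea concrete, here is a minimal, hypothetical sketch (not the investigators' logic): a spatial predicate scores each frame by the margin between the detection/ground-truth IoU and a threshold, and temporal operators reduce the per-frame margins over the stream. A positive robustness value means the property holds with room to spare; a negative value quantifies the violation.

```python
def iou(a, b):
    """Intersection-over-union of axis-aligned boxes (x1, y1, x2, y2)."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union if union > 0 else 0.0

def always(margins):      # G(phi): worst per-frame margin over the trace
    return min(margins)

def eventually(margins):  # F(phi): best per-frame margin over the trace
    return max(margins)

# Invented two-frame trace: (detected box, ground-truth box) per frame.
frames = [((0, 0, 2, 2), (0, 0, 2, 2)),
          ((0, 0, 2, 2), (1, 1, 3, 3))]
# Per-frame requirement: IoU with ground truth of at least 0.5.
margins = [iou(det, gt) - 0.5 for det, gt in frames]
```

On this trace `always(margins)` is negative (the second frame's IoU is only 1/7), quantifying a violation, while `eventually(margins)` is positive because the first frame matches perfectly.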

Performance Period: 01/01/2021 - 12/31/2024
Institution: Arizona State University
Award Number: 2038666
Collaborative Research: CPS: Medium: A3EM: Animal-borne Adaptive Acoustic Environmental Monitoring
Lead PI:
George Wittemyer
Abstract

The application of acoustic monitoring in ecological sciences has grown exponentially in the last two decades. It has been used to answer many questions, including detecting the presence or absence of animal species in an environment, evaluating animal behavior, and identifying ecological stressors and illegal activities. However, current uses are limited to the coverage of relatively small geographic areas with a fixed number of sensors. Animal-borne GPS-based location trackers paired with other sensors are another widely used tool in aiding wildlife conservation and ecosystem monitoring. Since capturing and collaring wild animals is a traumatic event for them, as well as expensive and resource-intensive, multiyear deployments are required. Opportunities to recharge batteries are severely limited, making relatively power-hungry sensing, such as acoustic monitoring, out of reach for existing tracking collars. The aim of the A3EM project is to devise an animal-borne adaptive acoustic monitoring system to enable long-term, real-time observation of the environment and behavior of wildlife. Animal-borne acoustic monitoring will be a novel tool that may provide new insights into biodiversity loss, a severe but underappreciated problem of our time. Combining acoustic monitoring with location tracking collars will enable entirely new applications that will facilitate census gathering and monitoring of threatened and endangered species, detecting poachers of elephants in Africa or caribou in Alaska, and evaluating the effects of mining and logging on wildlife, among many others. All data, hardware designs, and software source code will be released to the public domain, enabling tracking collar manufacturers to include the technology within their products.

A3EM constitutes a complex cyber-physical architecture involving humans, animals, distributed sensing devices, intelligent environmental monitoring agents, and limited power and network connectivity. This intermittently connected CPS, with a power budget an order of magnitude lower than typical, calls for novel approaches with a high level of autonomy and adaptation to the physical environment. A3EM will employ a unique combination of supervised and semi-supervised embedded machine learning to identify new and unexplored event classes in a given environment, dynamically control and adjust parameters related to data acquisition and storage, opportunistically share knowledge and data between distributed sensing devices, and optimize the management of storage and communication to minimize resource needs. These methods will be evaluated through the creation of a wearable acoustic monitoring system used to support ecological applications such as enhanced wildlife protection, rare species identification, and human impact studies on animal behavior.
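One simple instance of the kind of adaptive, power-aware acquisition control described above (an illustrative sketch, not the A3EM design) is an energy-based wake trigger: the device maintains a running noise-floor estimate and stores or classifies a frame only when its energy clearly exceeds that floor. The frame length, smoothing factor, and threshold multiplier here are arbitrary assumptions.

```python
def adaptive_trigger(samples, frame_len=4, alpha=0.95, k=3.0):
    """Energy-based wake trigger with an adaptive noise-floor estimate.

    Keeps a frame only when its energy exceeds k times the running noise
    floor; the floor is updated only during quiet frames, so loud events
    do not contaminate it.
    """
    noise = None
    kept = []
    for i in range(0, len(samples) - frame_len + 1, frame_len):
        frame = samples[i:i + frame_len]
        energy = sum(x * x for x in frame) / frame_len
        if noise is None:
            noise = energy                      # bootstrap from the first frame
        if energy > k * noise:
            kept.append((i, frame))             # wake: store / classify this frame
        else:
            noise = alpha * noise + (1.0 - alpha) * energy
    return kept
```

Only frames that rise well above the ambient level are retained, which is the basic mechanism that lets a power budget an order of magnitude below typical support acoustic sensing at all.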

Performance Period: 08/01/2023 - 07/31/2026
Institution: Colorado State University
Award Number: 2312392
Conference: 2022 Cyber-Physical Systems Principal Investigators Meeting
Lead PI:
Frankie King
Abstract

The purpose of this project is to plan and organize the 2022 National Science Foundation (NSF) Cyber-Physical Systems (CPS) Principal Investigator (PI) Meeting. This meeting convenes all PIs of the NSF CPS Program for the 13th time since the program began. The PI Meeting is to take place during the Fall of 2022 in Alexandria, Virginia. The PI meeting is an annual opportunity for NSF-sponsored CPS researchers, industry representatives, and Federal agency representatives to gather and review new CPS developments, identify new and emerging applications, and discuss technology gaps and barriers. The program agenda is community-driven and includes presentations (oral and poster) from PIs, reports of the past year's program activities, and showcases of new CPS innovations and results. This will be a hybrid PI meeting, including both in-person and virtual elements, and the first CPS PI meeting with extensive in-person attendance since the pandemic. The virtual component will also enable a larger community of researchers spanning academia, industry, and Government to participate.

The annual PI Meeting serves as the only opportunity where the NSF-funded CPS Principal Investigators meet to share their research, discuss new research opportunities and challenges, and explore new ideas and partnerships for future work. Furthermore, the PI meeting is also an opportunity for the academic research community to interact with industry entities and government agencies with vested interest in CPS research and development. The PI Meeting is a forum for sharing ideas across the CPS community. It has played a major role in growing the community across a broad range of sectors and technologies, and performing outreach to others who have interest in learning about the program and participating as future proposers, transition partners, or sponsors. The 2022 PI meeting will feature lightning talks from researchers, poster sessions, special topic workshops, demonstrations and keynotes from leaders in the research community.
 

Frankie King

Frankie Denise King is the Assistant Director of the Annapolis Technical Coordination Project Office at Vanderbilt University’s Institute for Software Integrated Systems (VU-ISIS), where she is responsible for managing the coordination of collaborative R&D activities on the Cyber-Physical Systems-Virtual Organization that are sponsored by Federal agencies belonging to the Networking and Information Technology R&D (NITRD) Program. Before joining VU-ISIS, King served as the Technical Coordinator for the High Confidence Software and Systems (HCSS) Program Component Area (PCA) at the National Coordination Office (NCO) for NITRD for nearly seven years. Ms. King has over twenty-eight years of program development and management experience in domestic and international policy affairs, where she has served in high-level capacities in the executive and legislative branches of the U.S. government and the private sector. Ms. King’s work experience spans several domains, including the areas of information technology R&D, economics, agriculture, trade, and foreign assistance. Ms. King received an MA degree from the University of Notre Dame in 1984, and a BA degree from Fisk University in 1983, where she graduated Summa Cum Laude.

Performance Period: 06/15/2022 - 05/31/2024
Institution: Vanderbilt University
Award Number: 2228183
Conference: 2024 CPS PI Meeting
Lead PI:
Frankie King
Abstract

The purpose of this project is to plan and organize the 2024 National Science Foundation (NSF) Cyber-Physical Systems (CPS) Principal Investigator (PI) Meeting, scheduled for the Spring of 2024 in Nashville, Tennessee. This meeting convenes PIs with active CPS Program awards for the 14th time since the program began. PI Meetings are annual opportunities for CPS stakeholders, comprising NSF-sponsored CPS researchers, industry representatives, and Federal agency representatives, to gather and review new CPS developments, identify new and emerging applications, and discuss technology gaps and barriers. The 2024 program agenda is community-driven, comprising oral project presentations from PIs via panels and talks, poster presentations and demos of new CPS innovations and results, and networking sessions for PIs to interact. Federal agency program officers will provide status reports on the CPS program solicitation and address questions about preparing successful grant proposals. The meeting will also feature keynote presentations from invited leaders in the research community.

PI Meetings serve as the only opportunity where all of the NSF-funded CPS Principal Investigators meet to share their research, discuss new research opportunities and challenges, explore new ideas and partnerships for future work, and network over the two meeting days. Specific to networking, PIs will have the opportunity to interact with industry representatives, government agency program officers, and non-governmental organizations with a vested interest in CPS research and development at networking sessions scheduled for the arrival night, throughout the two full meeting days, and after the meeting adjourns. PI Meetings are forums for sharing ideas across the CPS community; they play a significant role in growing the community across a broad range of sectors and technologies and in performing outreach to others who have an interest in learning about the program and participating as future proposers, transition partners, or sponsors.


Performance Period: 08/01/2023 - 07/31/2024
Institution: Vanderbilt University
Award Number: 2333932
Collaborative Research: CPS: TTP Option: Medium: Sharing Farm Intelligence via Edge Computing
Lead PI:
Flavio Esposito
Abstract

In the era of data sharing, it is still challenging, insecure, and time-consuming for scientists to share lessons learned from agricultural data collection and data processing. The focus of this project is to mitigate such challenges by intersecting expertise in plant science, secure networked systems, software engineering, and geospatial science. The proposed cyber-physical system will be evaluated in the laboratory and deployed on real crop farms in Missouri, Illinois, and Tennessee. All results will be shared with international organizations whose goal is to increase food security and improve human health and nutrition.

The proposed system will securely orchestrate data gathered using sensors, such as hyperspectral and thermal cameras, to collect imagery of soybean, sorghum, and other crops. Preprocessed plant datasets will then be offered to scientists and farmers in different formats via a web-based system, ready to be processed by deep learning algorithms or consumed by thin clients. Data collected from different crop farms will be used to train distributed deep learning systems, using novel architectures that optimize privacy and training time. These machine learning systems will be used to predict plant stress and detect pathogens. Finally, the cyber-physical system will integrate novel data processing software with existing NSF-funded hardware platforms, introducing novel algorithmic contributions in edge computing and giving feedback to farmers, closing the loop. The results of this project will impact research on high-value crops with significant levels of automation, such as those in protected agriculture and fish-crop hydroponic systems in desert farming. Planned outreach activities will impact solutions for smallholder farmers supported by collaborators at the International Center for Agricultural Research in the Dry Areas (ICARDA). Although this work will focus on enabling data science for farming applications, it will also inform the management of other IoT applications, e.g., smart and connected healthcare or other cyber-human systems.
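The privacy-preserving distributed training mentioned above can be illustrated with a minimal federated-averaging sketch (an assumption about the general approach, not the project's architecture): each farm takes a local gradient step on its own data, and only model weights, never raw imagery, are shared with the aggregator.

```python
def local_step(w, data, lr=0.1):
    """One gradient step of a least-squares fit y ~ w * x on local data."""
    grad = sum(2.0 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def federated_average(w, farm_datasets, rounds=50):
    """FedAvg sketch: farms train locally; only weights reach the server."""
    for _ in range(rounds):
        local_weights = [local_step(w, data) for data in farm_datasets]
        w = sum(local_weights) / len(local_weights)  # raw data stays on-farm
    return w
```

With toy data drawn from y = 2x split across two farms, the averaged model converges to the shared slope even though neither party ever sees the other's samples.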
 

Performance Period: 10/01/2022 - 09/30/2025
Institution: Saint Louis University
Award Number: 2133407
CPS: Medium: Collaborative Research: Transforming Connected and Automated Transportation with Smart Networking, Cooperative Sensing, and Edge Computing
Lead PI:
Feng Qian
Abstract

This NSF Cyber-Physical Systems (CPS) grant will advance the state-of-the-art of Connected and Automated Vehicle (CAV) systems by innovating in the three key areas of networking, sensing, and computation, as well as the synergy among them. This work leverages several emerging technology trends that are expected to transform the ground transportation system: much higher-speed wireless connectivity, improved on-vehicle and infrastructure-based sensing capabilities, and advances in machine learning algorithms. So far, most related research and development has focused on individual technologies, leading to limited benefits. This project will develop an integrated platform that jointly handles networking, sensing, and computation, by addressing key challenges associated with the operating conditions of CAVs: e.g., safety-critical operation, high mobility, scarce on-board computing resources, fluctuating network conditions, and limited sensor capabilities. The research team will study how to use the integrated platform to enable real-world CAV applications, such as enhancement of public service personnel's safety, alleviation of congestion at bottleneck areas, and protection of vulnerable road users (VRUs). Given its interdisciplinary nature, this project will yield a broad impact in multiple research communities including transportation engineering, mobile/edge computing, and machine learning. The outcome of this research will benefit multiple stakeholders in the CAV ecosystem: drivers, pedestrians, CAV manufacturers, transportation government agencies, mobile network carriers, etc., ultimately improving the safety and mobility of the nation's transportation system. This project will also provide a platform to conduct various education and outreach activities.

The intellectual core of this research consists of several foundational contributions to the ground transportation CPS domain. First, it innovates vehicle-to-everything (V2X) communications through strategically aggregating 4G/5G/WiFi/DSRC technologies to enhance network performance. Second, it develops a cooperative sensing and perception framework where nearby vehicles share raw sensor data with an edge node to create a global view, providing extended perceptual range and detection of occluded objects. The key technical contribution is to ensure good scalability: allowing many moving vehicles to efficiently share their data despite limited, fluctuating network resources. Third, it enables partitioning computation across vehicles and the infrastructure to meet the real-time requirements of CAV applications. Fourth, integrating the above building blocks of networking, sensing, and computation, the research team will develop an optimization framework that makes adaptive, resource-aware decisions on what computation needs to be performed where, and at what quality, to maximize the service quality of CAV applications.
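A toy version of the fourth contribution, deciding where to run perception and at what quality, might look like the following (a hypothetical sketch; the option names, accuracies, sizes, and timings are all invented): pick the highest-accuracy placement whose end-to-end latency, compute time plus upload time under the current bandwidth, meets the application deadline.

```python
def choose_placement(options, bandwidth_mbps, deadline_ms):
    """Pick the highest-accuracy option whose latency fits the deadline."""
    best = None
    for opt in options:
        upload_ms = opt["upload_mb"] * 8.0 / bandwidth_mbps * 1000.0
        latency = opt["compute_ms"] + upload_ms
        if latency <= deadline_ms and (best is None or opt["accuracy"] > best["accuracy"]):
            best = opt
    return best

# Invented example options: run on-vehicle, or offload compressed/raw frames.
options = [
    {"name": "on_vehicle",      "compute_ms": 80, "upload_mb": 0.0, "accuracy": 0.70},
    {"name": "edge_compressed", "compute_ms": 30, "upload_mb": 0.3, "accuracy": 0.85},
    {"name": "edge_raw",        "compute_ms": 30, "upload_mb": 5.0, "accuracy": 0.95},
]
```

With ample bandwidth the edge-compressed option wins on accuracy within the deadline; as bandwidth drops, the decision gracefully falls back to on-vehicle processing, which is exactly the kind of adaptive, resource-aware behavior the framework targets.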
 

Performance Period: 06/01/2021 - 05/31/2024
Institution: University of Minnesota-Twin Cities
Award Number: 2038559
CAREER: Distributionally Robust Learning, Control, and Benefits Analysis of Information Sharing for Connected and Autonomous Vehicles
Lead PI:
Fei Miao
Abstract

The rapid evolution of ubiquitous sensing, communication, and computation technologies has contributed to the revolution of cyber-physical systems (CPS). Learning-based methodologies are being integrated into the control of physical systems and have demonstrated impressive performance in many CPS domains; the connected and autonomous vehicle (CAV) system is one such example, enabled by the development of vehicle-to-everything communication technologies. However, the existing literature still lacks an understanding of the tridirectional relationship among communication, learning, and control. The main challenges to be solved include (1) how to model dynamic system states and state uncertainties with shared information, (2) how to make robust learning and control decisions under model uncertainties, (3) how to integrate learning and control to guarantee the safety of networked CPS, and (4) how to quantify the benefits of communication.

To address these challenges, this CAREER proposal aims to design integrated communication, learning, and control rules that are robust to hybrid system model uncertainties, for safe operation and system efficiency of CAVs. The key intellectual merit is the design of an integrated distributionally robust multi-agent reinforcement learning (DRMARL) and control framework with rigorous safety guarantees, considering hybrid system state uncertainties predicted with shared information, and the development of a scientific foundation for analyzing and quantifying the benefits of communication. The fundamental theory and algorithmic principles will be validated using simulators, small-scale testbeds, and full-scale CAV field demonstrations, to form a new framework for future connectivity, learning, and control of CAVs and networked CPS. The technical contributions are as follows. (1) With shared information, we will design a cooperative prediction algorithm to improve the hybrid system state and model uncertainty representations needed by learning and control. (2) Given enhanced prediction, we will design an integrated DRMARL and control framework with rigorous safety guarantees, and a computationally tractable algorithm to calculate the hybrid system decision-making policy. This integrates the strengths of both learning and control to improve system safety and efficiency. (3) We will formally define and quantify the value of communication and propose a novel learn-to-communicate approach that utilizes learning and control to improve communication actions. This project will also integrate an educational plan with the research goals by developing a learning platform of "ssCAVs" as an education tool and new interdisciplinary courses on "learning and control," undertaking outreach to the general public and K-12 students and teachers, and directly involving high-school scholars, undergraduate students, and graduate students in research.
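The distributionally robust idea can be illustrated with a tiny single-step example (a toy sketch, not the proposed DRMARL framework; the scenario costs and candidate distributions are invented): instead of minimizing expected cost under one nominal distribution over the other agent's behavior, the controller minimizes the worst-case expected cost over an ambiguity set of candidate distributions.

```python
def worst_case_cost(action_costs, dists):
    """Worst-case expected cost of one action over an ambiguity set."""
    return max(sum(p * c for p, c in zip(dist, action_costs)) for dist in dists)

def dr_action(costs, dists):
    """Distributionally robust choice: minimize the worst-case expectation."""
    return min(costs, key=lambda a: worst_case_cost(costs[a], dists))

# Invented example: an intersection where the other vehicle may brake or accelerate.
costs = {"go": [0.0, 10.0], "yield": [1.0, 1.5]}  # cost of each action per scenario
dists = [(0.9, 0.1), (0.6, 0.4)]                  # ambiguity set over the two scenarios
```

Under the first (nominal) distribution alone, "go" looks cheaper in expectation, but the robust choice is "yield" because "go" can become far more costly under the second candidate distribution; this is how distributional robustness trades nominal performance for safety under model uncertainty.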

Performance Period: 06/01/2021 - 05/31/2026
Institution: University of Connecticut
Award Number: 2047354
Collaborative Research: CPS: Medium: Wildland Fire Observation, Management, and Evacuation using Intelligent Collaborative Flying and Ground Systems
Lead PI:
Fatemeh Afghah
Abstract

Increasing wildfire costs, a reflection of climate variability and development within wildlands, drive calls for new national capabilities to manage wildfires. The great potential of unmanned aerial systems (UAS) has not yet been fully utilized in this domain due to the lack of holistic, resilient, flexible, and cost-effective monitoring protocols. This project will develop UAS-based fire management strategies to use autonomous unmanned aerial vehicles (UAVs) in an optimal, efficient, and safe way to assist first responders during the fire detection, management, and evacuation stages. The project is a collaborative effort between Northern Arizona University (NAU), Georgia Institute of Technology (GaTech), Desert Research Institute (DRI), and the National Center for Atmospheric Research (NCAR). The team has established ongoing collaborations with the U.S. Forest Service (USFS) Pacific Northwest Research Station, Kaibab National Forest (NF), and the Arizona Department of Forestry and Fire Management to perform multiple field tests during prescribed and managed fires. This proposal's objective is to develop an integrated framework satisfying unmet wildland fire management needs, with key advances in scientific and engineering methods, by using a network of low-cost, small autonomous UAVs along with ground vehicles during different stages of fire management operations, including: (i) early detection in remote and forested areas using autonomous UAVs; (ii) fast active geo-mapping of the fire heat map on flying drones; (iii) real-time video streaming of the fire spread; and (iv) finding optimal evacuation paths using autonomous UAVs to guide the ground vehicles and firefighters for fast and safe evacuation.

This project will advance the frontier of disaster management by developing: (i) an innovative drone-based forest fire detection and monitoring technology for rapid intervention in hard-to-access areas with minimal human intervention to protect firefighter lives; (ii) multi-level fire modeling to offer strategic, event-scale, and new on-board, low-computation tactics using fast fire mapping from UAVs; and (iii) a bounded reasoning-based planning mechanism where the UAVs identify the fastest and safest evacuation roads for firefighters and fire trucks in highly dynamic and uncertain danger zones. The developed technologies will be translational to a broad range of applications such as disaster management (flooding, fire, mudslides, terrorism), where quick search, surveillance, and response are required with limited human intervention. This project will also contribute to future engineering curricula and pursue a substantial integration of research and education, while engaging female and underrepresented minority students and developing hands-on research experiments for K-12 students.
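The evacuation-routing component can be illustrated with a minimal grid sketch (hypothetical, not the proposed bounded-reasoning planner): breadth-first search for the shortest path that avoids cells marked as burning, the simplest instance of computing a safe route through a partially blocked area.

```python
from collections import deque

def safe_evacuation_path(grid, start, goal):
    """Shortest fire-avoiding path on a grid (1 = burning cell) via BFS."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:                       # reconstruct the path
            path, node = [], goal
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None                                  # no safe route exists
```

The real problem is far harder, with fire fronts that move while vehicles are en route, but BFS on a static snapshot conveys the core routing primitive such a planner would repeatedly invoke as the UAVs refresh the fire map.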

This project is in response to the NSF Cyber-Physical Systems 20-563 solicitation.
 

Performance Period: 10/01/2021 - 04/30/2024
Institution: Clemson University
Award Number: 2204445