The goals of Automated Driving Systems (ADS) and Advanced Driver Assistance Systems (ADAS) include reduction in accidental deaths, enhanced mobility for differently abled people, and an overall improvement in the quality of life for the general public. Such systems typically operate in open and highly uncertain environments for which robust perception systems are essential. However, despite the tremendous theoretical and experimental progress in computer vision, machine learning, and sensor fusion, the form and conditions under which guarantees should be provided for perception components are still unclear. The state-of-the-art is to perform scenario-based evaluation of data against ground truth values, but this has only limited impact. The lack of formal metrics to analyze the quality of perception systems has already led to several catastrophic incidents and a plateau in ADS/ADAS development. This project develops formal languages for specifying and evaluating the quality and robustness of perception sub-systems within ADS and ADAS applications. To enable broader dissemination of this technology, the project develops graduate and undergraduate curricula to train engineers in the use of such methods, and new educational modules to explain the challenges in developing safe and robust ADS for outreach and public engagement activities. To broaden participation in computing, the investigators target the inclusion of undergraduate women in research and development phases through summer internships.
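One common formalism for specifying and evaluating such requirements is a temporal-logic robustness score, which replaces a pass/fail check against ground truth with a signed margin of satisfaction. The sketch below is illustrative only: the predicate, the 2 m threshold, and the trace are assumptions, not the project's actual specification language.

```python
# Hedged sketch: an STL-style robustness score for a perception
# requirement such as "the estimated distance to a lead vehicle stays
# within 2 m of ground truth at every frame".  All names, thresholds,
# and data here are illustrative assumptions.

def robustness_always(trace, predicate_margin):
    """Robustness of 'always phi': the worst-case (minimum) margin.

    trace: list of (estimate, ground_truth) pairs
    predicate_margin: maps one pair to a signed margin
                      (positive = satisfied, negative = violated)
    """
    return min(predicate_margin(est, gt) for est, gt in trace)

# Example predicate: the estimate is within 2.0 m of ground truth.
def within_2m(est, gt):
    return 2.0 - abs(est - gt)

trace = [(10.1, 10.0), (12.4, 12.0), (15.9, 15.0)]
rho = robustness_always(trace, within_2m)
# rho > 0 means the requirement holds over the whole trace, and its
# magnitude quantifies how robustly it holds -- the kind of graded
# metric that scenario-based pass/fail evaluation does not provide.
```

A positive score here plays the role of a formal quality metric: it can be thresholded for acceptance, tracked across software releases, or minimized by a falsifier searching for failure scenarios.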
Shared electric micromobility (SEM) services, such as shared electric bikes and scooters, are an emerging example of mobile cyber-physical systems and have become increasingly popular in recent years for short-distance trips, such as from a bus stop to home, enabling convenient multi-modal transportation and reducing environmental impact by easing emissions from traffic congestion. However, the success of such services depends on effective and efficient management of thousands of electric vehicles (e.g., bikes or scooters). Existing management frameworks have mainly focused on balancing demand and supply while accounting for energy recharging. However, most of them, if not all, ignore human interactions with the system (e.g., how users select and use vehicles), which leads to a significant gap between experimental and real-world effectiveness. The objective of this project is to develop an interaction-aware management framework for mobile cyber-physical systems.
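To make the gap concrete, consider a toy model of one user interaction: a rider does not take the nearest scooter, but the nearest one with enough charge. The selection rule, field names, and battery threshold below are hypothetical illustrations, not the project's framework.

```python
# Hedged sketch: a toy user-interaction model for shared e-scooter
# selection.  Positions are scalar distances along a street for
# simplicity; all fields and the 20% battery threshold are assumptions.

def choose_vehicle(user_pos, vehicles, min_battery=0.2):
    """Predict the vehicle a user picks: nearest one whose battery
    level is above a minimum threshold (else None)."""
    candidates = [v for v in vehicles if v["battery"] >= min_battery]
    if not candidates:
        return None
    return min(candidates, key=lambda v: abs(v["pos"] - user_pos))

vehicles = [
    {"id": "a", "pos": 5.0, "battery": 0.1},
    {"id": "b", "pos": 8.0, "battery": 0.6},
    {"id": "c", "pos": 2.0, "battery": 0.9},
]
picked = choose_vehicle(user_pos=6.0, vehicles=vehicles)
# Vehicle "a" is nearest but below the battery threshold, so this
# model predicts the user walks to "b" instead.
```

A rebalancing policy that ignores this behavior would count vehicle "a" as usable supply; an interaction-aware framework would not, which is exactly the kind of discrepancy between experimental and real-world effectiveness the project targets.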
The use of artificial intelligence in cyber-physical systems is limited by challenges such as data availability, task environment complexity, and the need for expressive and interpretable high-level knowledge representations. To address these challenges, this project aims to develop a set of neuro-symbolic learning and control tools by integrating machine learning, control theory, and formal methods. The results are expected to find application across cyber-physical systems such as robotic systems, autonomous systems, and networked cyber-physical systems. Validation in testbed environments should facilitate safe deployments in real-world physical environments with provable guarantees and robustness against potential adversaries.
This NSF Cyber-Physical Systems (CPS) grant will advance the state-of-the-art of Connected and Automated Vehicle (CAV) systems by innovating in the three key areas of networking, sensing, and computation, as well as the synergy among them. This work leverages several emerging technology trends that are expected to transform the ground transportation system: much higher-speed wireless connectivity, improved on-vehicle and infrastructure-based sensing capabilities, and advances in machine learning algorithms. So far, most related research and development has focused on individual technologies, leading to limited benefits. This project will develop an integrated platform that jointly handles networking, sensing, and computation, by addressing key challenges associated with the operating conditions of CAVs: e.g., safety-critical operation, high mobility, scarce on-board computing resources, fluctuating network conditions, and limited sensor capabilities. The research team will study how to use the integrated platform to enable real-world CAV applications, such as enhancement of public service personnel's safety, alleviation of congestion at bottleneck areas, and protection of vulnerable road users (VRUs). Given its interdisciplinary nature, this project will yield a broad impact in multiple research communities including transportation engineering, mobile/edge computing, and machine learning. The outcome of this research will benefit multiple stakeholders in the CAV ecosystem: drivers, pedestrians, CAV manufacturers, transportation government agencies, mobile network carriers, etc., ultimately improving the safety and mobility of the nation's transportation system. This project will also provide a platform to conduct various education and outreach activities.
To accommodate rapidly growing food demands and increase the quality and quantity of agricultural production, it is necessary to improve farm management practices and advance technological development in agricultural fields. This project will synergize expertise in Control, Robotics, Remote Sensing, and Agricultural Engineering to develop new approaches for automated monitoring of smart agricultural systems as an important class of cyber-physical systems (CPSs). This award supports fundamental research to develop innovative techniques for smart agricultural systems by employing a distributed airborne networked sensor system in which a team of Unmanned Aerial Vehicles (UAVs) surveys a farm. Unlike traditional crop management methods that use ground operators or vehicles for monitoring farms, the proposed approach of airborne monitoring of agricultural fields minimizes deployment of on-the-ground operations, avoiding damage to crops in healthy parts of the farm.
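A standard building block for such airborne surveys is to partition the field among the UAV team and have each vehicle fly a back-and-forth (boustrophedon) sweep of its own strip. The decomposition below is a minimal sketch under strong assumptions: a rectangular field, equal-width strips, and a fixed sensor swath; none of these are stated in the project description.

```python
# Hedged sketch: partitioning a rectangular field among a UAV team for
# airborne monitoring.  Each UAV covers one strip with a boustrophedon
# (back-and-forth) sweep.  Field size, swath width, and the equal-strip
# decomposition are illustrative assumptions.

def partition_field(width, num_uavs):
    """Split the field's width into equal strips, one per UAV."""
    strip = width / num_uavs
    return [(i * strip, (i + 1) * strip) for i in range(num_uavs)]

def sweep_waypoints(x_min, x_max, length, swath):
    """Back-and-forth waypoints covering one strip, with rows spaced
    one sensor swath apart and inset half a swath from the strip edges."""
    waypoints, y, leftward = [], 0.0, False
    x_lo, x_hi = x_min + swath / 2, x_max - swath / 2
    while y <= length:
        row = [(x_hi, y), (x_lo, y)] if leftward else [(x_lo, y), (x_hi, y)]
        waypoints.extend(row)
        y += swath           # advance one sensor footprint per pass
        leftward = not leftward  # alternate sweep direction
    return waypoints

strips = partition_field(width=300.0, num_uavs=3)   # three 100 m strips
plan = [sweep_waypoints(lo, hi, length=200.0, swath=50.0)
        for lo, hi in strips]
```

Real deployments would replace the equal-width split with load balancing over irregular field boundaries and add networking constraints so the team stays connected, but the strip-sweep structure is the common starting point.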
This project explores a new vision of cyber-physical systems (CPSs) in which computing power and control methods are jointly considered. The approach is carried out through exploration of new theories for the modeling, analysis, and design of CPSs that operate under computational constraints. The tight coupling between computation, communication, and control pervades the design and application of CPSs. Due to the complexity of such systems, advanced design procedures that cope with the variability and uncertainty introduced by computing resources are essential; because design choices span many disciplines, uncoordinated procedures may result in over-design of a system. The project will have significant impact through the reduction in design and development time for complex cyber-physical systems including ground, air, and maritime vehicles.
The objective of this work is to generate new fundamental science for computer-controlled complex physical systems, a broad class of cyber-physical systems (CPS), and demonstrate this science in aerial vehicles and walking robots. The new science enables autonomous planning and control in the presence of failures and abrupt changes in system variables. A framework will be generated for the design of algorithms that exploit awareness of the physical and design constraints to autonomously self-adapt their motion plan and control actions. The approach exploits elements from geometry, adaptive control, and hybrid control to advance the knowledge on modeling, planning, and design of CPS with constraints and with non-smooth, intertwined continuous and discrete dynamics. Unlike current approaches, which separate the task of planning the motion from the design of the algorithm used for control, the algorithms to emerge from this project self-learn and self-adapt in real time to cope with unexpected changes in motion and specification constraints, so as to enable autonomous systems to perform robustly and safely, and degrade gracefully under failure conditions. Specifically, the new algorithms will learn and monitor the physical and design constraints in real time and adapt both planner and controller by selecting the appropriate constraints to enforce, with robustness and safety guarantees. The capabilities of the new tools will be demonstrated on multi-legged robots in harsh environments that make them prone to failures, and on aerial vehicles in contested/adversarial environments.
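The idea of selecting which constraints to enforce after an abrupt change can be sketched very simply: monitor each constraint's enforceability in the current state and re-plan against the surviving set. The constraint names, the priority field, and the leg-failure scenario below are hypothetical illustrations, not the project's actual algorithms.

```python
# Hedged sketch: selecting the constraints to enforce after an abrupt
# change, e.g. a leg failure on a walking robot.  Constraint names,
# priorities, and enforceability tests are illustrative assumptions.

def select_constraints(constraints, state):
    """Drop constraints that are no longer enforceable in the current
    state; return the rest ordered by priority (0 = most critical)."""
    enforceable = [c for c in constraints if c["enforceable"](state)]
    return sorted(enforceable, key=lambda c: c["priority"])

constraints = [
    {"name": "obstacle_clearance", "priority": 0,
     "enforceable": lambda s: True},                      # always enforced
    {"name": "nominal_gait", "priority": 1,
     "enforceable": lambda s: s["legs_working"] >= 4},    # needs all legs
    {"name": "degraded_gait", "priority": 2,
     "enforceable": lambda s: s["legs_working"] >= 2},    # fallback gait
]

state = {"legs_working": 3}          # one leg has failed
active = select_constraints(constraints, state)
# The nominal gait constraint is dropped, while obstacle clearance and
# a degraded gait remain; planner and controller would then be
# re-synthesized against this reduced constraint set.
```

Dropping the nominal-gait constraint while retaining the safety-critical ones is one concrete reading of "degrade gracefully under failure conditions": the system gives up performance constraints before safety constraints.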