Collaborative Research: CPS: Medium: Closing the Teleoperation Gap: Integrating Scene and Network Understanding for Dexterous Control of Remote Robots
Lead PI:
Keith Winstein
Abstract

The aim of this proposal is to enable people to control robots remotely using virtual reality. Using cameras mounted on the robot and a virtual reality headset, a person can see the environment around the robot. However, controlling the robot with existing technologies is hard: there is a time delay because it is slow to send high-quality video over the Internet. In addition, the fidelity of the image is worse than looking through human eyes, with a fixed and narrow view. This proposal will address these limitations by creating a new system that understands the geometry and appearance of the robot's environment. Instead of sending high-quality video over the Internet, this new system will send only a smaller amount of information about how the environment's geometry and appearance have changed over time. Further, understanding the geometry and appearance will let us expand the view visible to the person. Together, these advances will improve a human's ability to remotely control the robot by increasing fidelity and responsiveness. We will demonstrate this technology on household tasks, on assembly tasks, and by manipulating small objects.

The aim of this proposal is to test the hypothesis that integrating scene and network understanding can enable efficient transmission and rendering for dexterous control of remote robots through virtual reality interfaces. This system will result in dexterous teleoperation that enables remote human operators to perform complex tasks with remote robot manipulators, such as cleaning a room or repairing a machine. Such tasks have not previously been demonstrated via teleoperation for two reasons: 1) lack of an intuitive awareness and understanding of the scene around the remote robot, and 2) lack of an effective low-latency interface to control the robot. We will address these problems by creating new scene- and network-aware algorithms that tightly couple sensing, display, interaction, and transmission, enabling the operator to quickly and intuitively understand the environment around the robot. This project will research new interfaces that allow the operator to use their hand to directly specify the robot's end-effector pose in six degrees of freedom, combined with spatial- and semantic-object-based models that allow safe high-level commands. This project will evaluate the proposed system by assessing the speed and accuracy with which the remote operator can complete complex tasks, including assembly tasks; the aim will be to complete unstructured assembly tasks that have never before been demonstrated via remote teleoperation.
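The idea of transmitting only changes to the scene, rather than full video frames, can be sketched as follows. This is a minimal illustration of delta encoding over a tracked scene model; the data structures and the `scene_delta` function are hypothetical, not part of the proposed system.

```python
# Minimal sketch (all structures hypothetical): instead of streaming full
# frames, transmit only the scene objects whose pose changed since the last
# acknowledged state, as the abstract describes.

def scene_delta(prev, curr, tol=1e-3):
    """Return only the objects whose pose changed by more than tol."""
    delta = {}
    for obj_id, pose in curr.items():
        old = prev.get(obj_id)
        if old is None or any(abs(a - b) > tol for a, b in zip(old, pose)):
            delta[obj_id] = pose
    return delta

prev = {"mug": (0.1, 0.2, 0.0), "table": (0.0, 0.0, 0.0)}
curr = {"mug": (0.1, 0.5, 0.0), "table": (0.0, 0.0, 0.0)}
update = scene_delta(prev, curr)   # only the mug moved, so only it is sent
```

When most of the scene is static, such an update is far smaller than a video frame, which is the intuition behind the bandwidth and latency gains the abstract claims.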

This project is in response to the NSF Cyber-Physical Systems 20-563 solicitation.

Keith Winstein
Performance Period: 02/15/2021 - 01/31/2025
Institution: Stanford University
Sponsor: National Science Foundation
Award Number: 2039070
Collaborative Research: CPS: Medium: Mutualistic Cyber-Physical Interaction for Self-Adaptive Multi-Damage Monitoring of Civil Infrastructure
Lead PI:
Kaijian Liu
Co-Pi:
Abstract

This project aims to enable mutualistic interaction of cyber damage prognostics and physical reconfigurable sensing for mutualistic and self-adaptive cyber-physical systems (CPS). Drawing inspiration from mutualism in biology, where two species interact in a way that benefits both, the cyber and the physical interact in a way that they simultaneously benefit from and contribute to each other to enhance the ability of the CPS to predict, reconfigure, and adapt. Such interaction is generalizable, allowing it to enhance CPS applications in various domains. In the civil infrastructure systems domain, the mutualistic interaction-enabled CPS will allow for reconfiguring a single type of sensor, adaptively based on damage prognostics, to monitor multiple classes of infrastructure damage, thereby improving the cost-effectiveness of multi-damage infrastructure monitoring by reducing the types and number of sensors needed while maximizing the timeliness and accuracy of damage assessment and prediction. Enabling cost-effective multi-damage monitoring promises to leapfrog the development of safer, more resilient, and sustainable infrastructure, which would stimulate economic growth and social welfare for the benefit of the nation and its people. This project will also contribute to NSF's commitment to broadening participation in engineering (BPE) by developing innovative, interdisciplinary, and inclusive BPE programs to attract, train, and reward the next generation of engineering researchers and practitioners who are capable creators of CPS technology, not merely passive consumers, thereby enhancing the U.S. economy, security, and well-being.

The envisioned CPS includes three integrated components: (1) data-driven, knowledge-informed deep learning methods for generalizable damage prognostics to predict the onset and propagation of infrastructure damages, providing information about target damages to inform reconfigurable sensing, (2) signal difference maximization theory-based reconfigurable sensing methods to optimize and physically control the configurations of the sensors to actively seek to monitor each of the predicted target damages, providing damage-seeking feedback to inform damage prognostics, and (3) quality-aware edge cloud computing methods for efficient and effective damage information extraction from raw sensing signals, serving as the bridge between damage prognostics and reconfigurable sensing. The proposed CPS will be tested in multi-damage monitoring of bridges using simulation-based and actual CPS prototypes, and would be generalized to monitoring other civil infrastructure in the future. The proposed CPS methods have the potential to transform the way we design, create, and operate CPS to enable the next-generation CPS that have greater predictive ability, reconfigurability, and adaptability.

Kaijian Liu
Performance Period: 08/01/2023 - 07/31/2026
Institution: Stevens Institute of Technology
Sponsor: National Science Foundation
Award Number: 2305882
CAREER: A Framework for Logic-based Requirements to guide Safe Deep Learning for Autonomous Mobile Systems
Lead PI:
Jyotirmoy Deshmukh
Abstract

The future where non-autonomous systems like human-driven cars are replaced by autonomous, driverless cars is now within reach. This reduction in human effort comes at a cost: in existing systems, human operators implicitly define high-level system objectives through their actions; autonomous systems lack this guidance. Popular design techniques for autonomy, such as those based on deep reinforcement learning, obtain such guidance from user-specified, state-based reward functions or user-provided demonstrations. Unfortunately, such techniques generally do not provide guarantees on the safe behavior of the trained controllers. This project argues for a different approach, in which mathematically unambiguous, system-level behavioral specifications expressed in temporal logic are used to guide deep reinforcement learning algorithms to train neural network-based controllers. This approach allows reasoning about the safety of learning-based control through scalable methods for formal verification of the trained controllers against the given specifications.
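The core mechanism by which a temporal-logic specification can guide reinforcement learning is its quantitative robustness: a real-valued score that is positive when a trajectory satisfies the specification and negative when it violates it. The sketch below is an illustrative example, not the project's algorithm; the predicate, distance field, and `d_min` threshold are all assumptions made for illustration.

```python
# Hypothetical sketch: the robustness of a temporal-logic safety spec
# ("always stay at least d_min from the obstacle") can serve as an RL reward.
# The "always" operator takes the worst-case margin over the trajectory.

def distance_robustness(state, d_min=1.0):
    """Robustness of the predicate dist >= d_min (positive = satisfied)."""
    return state["dist_to_obstacle"] - d_min

def always_robustness(trajectory, d_min=1.0):
    """Robustness of G(dist >= d_min): the minimum margin over all states."""
    return min(distance_robustness(s, d_min) for s in trajectory)

trajectory = [{"dist_to_obstacle": d} for d in (3.0, 2.1, 1.4, 2.5)]
reward = always_robustness(trajectory)  # 0.4: satisfied, with margin 0.4
```

Maximizing such a reward pushes the learned controller toward trajectories that satisfy the specification with margin, which is what makes the trained policy amenable to verification against the same formula.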

To address the lack of explainability of neural controllers, this project devises new techniques to distill the neural-network-controlled autonomous system into human-interpretable symbolic automata. The project blends methods from statistical learning, control theory, optimization, and formal methods to give deterministic or probabilistic guarantees on the safe behavior of autonomous systems. It integrates education and research through new graduate courses on verifiable reinforcement learning. The investigator will broadly disseminate the scientific outcomes of the project through technology transfer to industrial partners and through publications at top research conferences and journals. The expected societal impact is improved safety and explainable control for future autonomous cyber-physical systems in various application domains.

Jyotirmoy Deshmukh
Performance Period: 03/01/2021 - 02/28/2026
Institution: University of Southern California
Sponsor: National Science Foundation
Award Number: 2048094
Collaborative Research: CPS: Medium: Spatio-Temporal Logics for Analyzing and Querying Perception Systems
Lead PI:
Jyotirmoy Deshmukh
Abstract

The goals of Automated Driving Systems (ADS) and Advanced Driver Assistance Systems (ADAS) include reduction in accidental deaths, enhanced mobility for differently abled people, and an overall improvement in the quality of life for the general public. Such systems typically operate in open and highly uncertain environments for which robust perception systems are essential. However, despite the tremendous theoretical and experimental progress in computer vision, machine learning, and sensor fusion, the form and conditions under which guarantees should be provided for perception components are still unclear. The state of the art is to perform scenario-based evaluation of data against ground-truth values, but this has only limited impact. The lack of formal metrics to analyze the quality of perception systems has already led to several catastrophic incidents and a plateau in ADS/ADAS development. This project develops formal languages for specifying and evaluating the quality and robustness of perception sub-systems within ADS and ADAS applications. To enable broader dissemination of this technology, the project develops graduate and undergraduate curricula to train engineers in the use of such methods, and new educational modules to explain the challenges in developing safe and robust ADS for outreach and public engagement activities. To broaden participation in computing, the investigators target the inclusion of undergraduate women in research and development phases through summer internships.

The formal language developed in this project is based on a new spatio-temporal logic pioneered by the investigators. This logic allows one to simultaneously perform temporal reasoning about streaming perception data and spatially reason about objects, both within a single frame of the data and across frames. The project also develops quantitative semantics for this logic, which provides the user with quantifiable quality metrics for perception sub-systems. These semantics enable comparisons between different perception systems and architectures. Crucially, the formal language facilitates the process of abstracting away implementation details, which in turn allows system designers and regulators to specify assumptions and guarantees for system performance at a higher level of abstraction. An interesting benefit of this formal language is that it enables querying of databases with perception data for specific driving scenarios without the need for the highly manual process of creating ground-truth annotations. No such formal language currently exists, and its absence is a major impediment to building a thriving marketplace for perception components used in safety-critical systems. This framework sets the foundation for a requirements language between suppliers of perception components and automotive companies. The open-source and publicly available software tools developed in this project will assist with testing of perception systems by engineers and governmental agencies.
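To make the idea of quantitative semantics over streaming perception data concrete, here is a toy sketch. It is not the investigators' logic: the requirement ("in every frame, every pedestrian detection has confidence at least 0.8"), the detection format, and the threshold are all invented for illustration.

```python
# Illustrative sketch (requirement, data format, and threshold are assumed):
# a quantitative robustness score for a stream-level perception requirement.
# Positive score = requirement satisfied; the magnitude is the worst margin.

def frame_robustness(frame, threshold=0.8):
    """Spatial part: worst confidence margin among pedestrian detections."""
    peds = [d["conf"] for d in frame if d["cls"] == "pedestrian"]
    if not peds:                       # no pedestrians: vacuously satisfied
        return float("inf")
    return min(c - threshold for c in peds)

def stream_robustness(frames, threshold=0.8):
    """Temporal "always": the worst frame over the whole stream."""
    return min(frame_robustness(f, threshold) for f in frames)

frames = [
    [{"cls": "pedestrian", "conf": 0.95}, {"cls": "car", "conf": 0.50}],
    [{"cls": "pedestrian", "conf": 0.85}],
]
score = stream_robustness(frames)   # 0.05: satisfied, but with a thin margin
```

The same machinery supports the querying use case the abstract mentions: filtering a database of logged frames by the sign or magnitude of such a score selects scenarios of interest without ground-truth annotation.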
 

Jyotirmoy Deshmukh
Performance Period: 01/01/2021 - 12/31/2023
Institution: University of Southern California
Sponsor: National Science Foundation
Award Number: 2039087
CPS Medium: Collaborative Research: Physics-Informed Learning and Control of Passive and Hybrid Conditioning Systems in Buildings
Lead PI:
Sandipan Mishra
Co-Pi:
Abstract

This Cyber-Physical Systems (CPS) project will develop advanced artificial intelligence and machine-learning (AI/ML) techniques to harness the extensive untapped climatic resources that exist for direct solar heating, natural ventilation, and radiative and evaporative cooling in buildings. Although these mechanisms for building environment conditioning are colloquially termed "passive," their performance depends strongly on the intelligent control of operable elements such as windows and shading, as well as fans in hybrid systems. Towards this goal, this project will create design methodologies for climate- and occupant-responsive strategies that control these operable elements intelligently in coordination with existing building heating ventilation and air conditioning systems, based on sensor measurements of the indoor and outdoor environments, weather and energy forecasts, occupancy, and occupant preferences. The solutions developed in this project can potentially result in substantial reduction in greenhouse gas emissions generated from space heating, cooling, and ventilation. The developed techniques may be particularly valuable in affordable housing by reducing energy costs under normal conditions and improving passive survivability during extreme events and power outages.

Sandipan Mishra
Performance Period: 06/01/2023 - 05/31/2026
Institution: Rensselaer Polytechnic Institute
Sponsor: NSF
Award Number: 2241795
CPS: Medium: Making Every Drop Count: Accounting for Spatiotemporal Variability of Water Needs for Proactive Scheduling of Variable Rate Irrigation Systems
Lead PI:
Sangmi Pallickara
Co-Pi:
Abstract

We all depend on agriculture for sustenance. Compared with seafood and livestock, cropping systems provide the primary source of nutrition. Yields and productivity of cropping systems must grow to meet the demands of a growing population. Once seeds are available, a successful cropping season is determined by water, which comes from two sources: irrigation and precipitation. Irrigation water is a major input to agriculture, especially in semi-arid and arid regions. In a recent appraisal for the Soil and Water Resources Conservation Act, the USDA identified irrigation water conservation as a national need. Under-watering induces stresses and adversely impacts both crop growth and yields. Over-watering, on the other hand, leads to nutrient runoff, soil erosion, and water waste. Farms are also impacted by the adverse effects of droughts, variability in precipitation, and lengthening of the growing season. The proposed effort, with its emphasis on water management and conservation, represents an adaptation to the headwinds often encountered at farms. The effort addresses the interrelated aspects of over-watering (soil erosion and nutrient runoff) and under-watering (adverse crop yields and stress) while ensuring sustainability and profitability of agricultural systems.

Sangmi Pallickara
Performance Period: 08/01/2023 - 07/31/2026
Institution: Colorado State University
Sponsor: NSF
Award Number: 2312319
CPS: Medium: Real-Time Learning and Control of Stochastic Nanostructure Growth Processes Through in situ Dynamic Imaging
Lead PI:
Sarbajit Banerjee
Co-Pi:
Abstract

This Cyber-Physical Systems (CPS) grant will support research that will contribute new knowledge related to emerging techniques for monitoring and controlling the growth of nanomaterials, which are crucial for applications such as new types of batteries and photovoltaic devices, because precise structuring of matter is essential to realize the desired charge, mass, and energy flow patterns that underpin energy conversion and storage. With the rapid arrival of tremendous amounts of data produced by dynamic nanoscale imaging, the National Nanotechnology Initiative has identified the lack of in-process monitoring and control as a grand challenge impeding the design and discovery of new materials, because "existing methods are time-consuming, expensive, and require high-tech infrastructure and high skill levels to perform." This grant supports a multidisciplinary team, comprising experts from data science, control, circuit design, and materials science, aiming to tackle this challenge by designing a cyber-physical system that can reliably convert dynamic imaging data into machine-intelligible information for process monitoring and control. The results from this research will benefit nanomaterial discovery and pave a path to scalable production. The multidisciplinary approach will help broaden participation of underrepresented groups in research and positively impact science and engineering education.

Sarbajit Banerjee
Performance Period: 01/01/2021 - 12/31/2024
Institution: Texas A&M Engineering Experiment Station
Sponsor: NSF
Award Number: 2038625
SHF: Small: Probabilistic Programming and Statistical Verification for Safe Autonomy
Lead PI:
Sasa Misailovic
Co-Pi:
Abstract

Autonomous systems such as drones and self-driving cars are quickly entering human-dominated fields and becoming tangible technologies that will impact the human experience. However, as these systems share space and operate among humans, safety and reliability of autonomous systems become primary concerns. An important challenge for safety and reliability in autonomous systems is coping with uncertainty. This project focuses on three important forms of uncertainty: (1) noisy data from sensors, (2) asynchrony of distributed computation, and (3) heuristic computation of decision-making software. Each brings distinct challenges for developing and validating the software modules of autonomous systems.
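The first form of uncertainty, noisy sensor data, illustrates why statistical (rather than purely deterministic) verification is natural here. The sketch below is a hedged toy example, not the project's method: the braking rule, noise model, and all numeric parameters are assumptions chosen for illustration.

```python
# Hedged sketch (hypothetical scenario and numbers): statistically estimating
# a safety property under sensor noise via Monte Carlo sampling. Question:
# if the true obstacle distance is 1.2 m and the controller brakes when the
# (noisy) reading drops below 1.5 m, how often does it actually brake?
import random

random.seed(0)  # fixed seed so the estimate is reproducible

def noisy_reading(true_dist, sigma=0.3):
    """Sensor model: true distance corrupted by Gaussian noise."""
    return true_dist + random.gauss(0.0, sigma)

def brakes_applied(reading, threshold=1.5):
    """Decision rule: brake when the sensed distance is below the threshold."""
    return reading < threshold

true_dist = 1.2   # braking is genuinely required at this distance
samples = 10_000
hits = sum(brakes_applied(noisy_reading(true_dist)) for _ in range(samples))
p_brake = hits / samples   # Monte Carlo estimate of P(brake | true_dist)
```

For this toy model the true probability is Phi((1.5 - 1.2) / 0.3) = Phi(1.0), roughly 0.84, so the estimate quantifies how often sensor noise masks a genuinely unsafe situation, which is the kind of probabilistic question statistical verification answers.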

Sasa Misailovic
Performance Period: 07/01/2020 - 06/30/2024
Institution: University of Illinois at Urbana-Champaign
Award Number: 2008883
NSF Workshop on State-of-the-Art and Challenges in Resilience
Lead PI:
Saurabh Bagchi
Abstract

Society depends on the interconnection of systems comprising hardware and software. They make up the built environment and the infrastructure that we depend upon. Today's systems are subject to an increasing number of hazards and disasters, both natural and man-made, often leading to failures that have a major impact on society. We want to know how to design systems to avoid such failures, and how to bounce back quickly if such failures occur.
The workshop will reflect on the current state of the art and state of practice for the above two questions. It will then bring out the research and translational challenges involved in making our infrastructures truly resilient. This workshop will be hosted in a hybrid mode, with both in-person and virtual participation. The workshop will bring together external thought and action leaders in the area of resilient systems, drawn from universities, federal laboratories, and commercial organizations, providing multi-disciplinary and convergent perspectives. The workshop will be broad-based, considering areas of resilient and adaptive cyberinfrastructures, resilient cyber-physical systems, and scientific foundations of resilient socio-technical systems. The workshop will be hosted by Purdue's Center for Resilient Infrastructures, Systems, and Processes and will address three broad technical themes. An objective is to develop research concepts suitable for near-term (1-5 years) and mid-term (5-10 years) advancement, considering both theoretical and practical developments in technology.

Saurabh Bagchi
Performance Period: 10/01/2021 - 05/31/2024
Institution: Purdue University
Sponsor: NSF
Award Number: 2140139
CAREER: InteractiveRF: Fully-Adaptive, Physics-Aware RF-Enabled Cyber-Physical Human Systems
Lead PI:
Sevgi Zubeyde Gurbuz
Abstract

As technology advances and an increasing number of devices enter our homes and workplaces, humans have become an integral component of cyber-physical systems (CPS). One of the grand challenges of cyber-physical human systems (CPHS) is how to design autonomous systems where human-system collaboration is optimized through improved understanding of human behavior. A new frontier within this landscape is afforded by the advent of low-cost, low-power millimeter-wave radio frequency (RF) transceivers, which can be exploited almost anywhere as part of the Internet of Things, smart environments, and personal devices. RF sensors provide a unique, information-rich dataset of high-resolution measurements of distance, direction of arrival, and micro-Doppler signature in a non-contact, non-intrusive fashion in most weather conditions and in the dark. This CAREER project aims to pave the way for new and innovative RF-enabled CPHS applications in service of society and a better quality of life by transforming current fixed-transmission RF sensors into intelligent devices that can autonomously respond to human and environmental dynamics to optimize CPHS performance. Due to the burgeoning commercial sector utilizing radar across a variety of fields, such as transportation, health, and human-computer interaction, this project features integrated academic preparation for multi-disciplinary, convergence research at both undergraduate and graduate levels to educate a new generation of engineers with experience in RF sensing, machine learning, signal processing, and CPHS applications. Through K-12 outreach activities and recruiting at local historically black colleges and universities (HBCUs), this project will enrich and motivate students to study STEM fields, laying the foundations for a diverse and globally competitive STEM workforce for the future.

Sevgi Zubeyde Gurbuz
Sevgi Z. Gurbuz (S’01–M’10–SM’17) received the B.S. degree in electrical engineering with a minor in mechanical engineering and the M.Eng. degree in electrical engineering and computer science from the Massachusetts Institute of Technology, Cambridge, MA, USA, in 1998 and 2000, respectively, and the Ph.D. degree in electrical and computer engineering from the Georgia Institute of Technology, Atlanta, GA, USA, in 2009. From February 2000 to January 2004, she worked as a Radar Signal Processing Research Engineer with the U.S. Air Force Research Laboratory, Sensors Directorate, Rome, NY, USA. Formerly an Assistant Professor in the Department of Electrical-Electronics Engineering at TOBB University, Ankara, Turkey, and a Senior Research Scientist with the TUBITAK Space Technologies Research Institute, Ankara, Turkey, she is currently an Assistant Professor in the Department of Electrical and Computer Engineering at the University of Alabama, Tuscaloosa. Her current research interests include RF sensor-enabled cyber-physical human systems (CPHS) for biomedical engineering and remote health monitoring, autonomous vehicles, and human-computer interaction (HCI) applications. She received a patent in April 2022 relating to radar-based American Sign Language (ASL) recognition. Dr. Gurbuz is a recipient of the 2023 NSF CAREER Award, the 2022 American Association of University Women Research Publication Grant in Engineering, Medicine and Science, the IEEE Harry Rowe Mimno Award for the Best IEEE AES Magazine Paper of 2019, the 2020 SPIE Rising Researcher Award, an EU Marie Curie Research Fellowship, and the 2010 IEEE Radar Conference Best Student Paper Award. Dr. Gurbuz also serves as a member of the IEEE Radar Systems Panel and is an Associate Editor for the IEEE Transactions on Aerospace and Electronic Systems (T-AES) and the IEEE Transactions on Radar Systems (T-RS). She is a member of the Editorial Board of the IET Radar, Sonar & Navigation (RSN) journal. Dr. Gurbuz is a Senior Member of the IEEE, and a member of the SPIE and ACM.
Performance Period: 05/01/2023 - 04/30/2028
Institution: University of Alabama Tuscaloosa
Sponsor: NSF
Award Number: 2238653