CPS: Synergy: Collaborative Research: Architectural and Algorithmic Solutions for Large Scale PEV Integration into Power Grids
Lead PI:
Vijay Gupta
Abstract
This project designs algorithms for the integration of plug-in hybrid electric vehicles (PEVs) into the power grid. Specifically, the project will formulate and solve optimization problems critical to various entities in the PEV ecosystem -- PEV owners, commercial charging station owners, aggregators, and distribution companies -- at the distribution / retail level. Charging both at commercial charging stations and at residences will be considered, for both the case when PEVs function only as loads and the case when they can also function as sources, equipped with vehicle-to-home (V2H) or vehicle-to-grid (V2G) energy reinjection capability. The focus of the project is on distributed decision making by the various individual players with analytical system-level performance guarantees. Electrification of the transportation market offers revenue growth for utility companies and automobile manufacturers, lower operational costs for consumers, and benefits to the environment. By addressing problems that will arise as PEVs impose extra load on the grid, and by solving challenges that currently impede the use of PEVs as distributed storage resources, this research will directly benefit society. The design principles gained will also be applicable to other cyber-physical infrastructural systems. A close collaboration with industrial partners will ground the research in real problems and ensure quick dissemination of results to the marketplace. A strong educational component will integrate the proposed research into the classroom to allow better training of both undergraduate and graduate students.
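As a toy illustration of the kind of charging-scheduling problem described above, the sketch below applies a simple valley-filling heuristic that places each vehicle's energy demand into the least-loaded feasible hours. The hourly baseline profile, vehicle parameters, and function names are assumptions made for this sketch only; the project's actual formulations cover distributed decision making across many interacting entities.

```python
# Hypothetical illustration: valley-filling heuristic for scheduling PEV charging
# against a baseline load profile. All names and numbers are assumptions, not
# taken from the project itself.

def schedule_charging(baseline_load, evs, slot_hours=1.0):
    """Greedy valley-filling: each increment of demand is placed in the currently
    least-loaded slot the vehicle is allowed to use and can still accept."""
    load = list(baseline_load)
    schedules = []
    for ev in evs:
        alloc = [0.0] * len(load)
        remaining = ev["energy_kwh"]
        cap = ev["max_kw"] * slot_hours          # energy one slot can absorb
        while remaining > 1e-9:
            # pick the feasible slot with the lowest current total load
            candidates = [t for t in ev["slots"] if alloc[t] < cap]
            if not candidates:
                break                            # infeasible request; stop
            t = min(candidates, key=lambda s: load[s])
            step = min(remaining, cap - alloc[t])
            alloc[t] += step
            load[t] += step
            remaining -= step
        schedules.append(alloc)
    return schedules, load

if __name__ == "__main__":
    baseline = [30, 28, 27, 27, 29, 35, 45, 55, 60, 58, 57, 55,
                54, 53, 55, 58, 62, 70, 72, 68, 60, 50, 40, 33]  # kW, hourly
    fleet = [
        {"energy_kwh": 10, "max_kw": 3.3, "slots": range(18, 24)},  # evening plug-in
        {"energy_kwh": 16, "max_kw": 6.6, "slots": range(0, 8)},    # overnight plug-in
    ]
    plans, total = schedule_charging(baseline, fleet)
    print("peak load before/after:", max(baseline), max(total))
```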
Performance Period: 10/01/2012 - 01/31/2013
Institution: California Institute of Technology
Sponsor: National Science Foundation
Award Number: 1238984
RAPID: Realization of a Medical Cyber-Physical System to Enhance Safety of Ebola Workers
Lead PI:
Taskin Padir
Abstract
Motivated by the fact that the 2014 Ebola outbreak is the largest in history and that there is a pressing need to understand how to improve delivery of care with the right technological interventions at the right place, this Rapid Response Research project is aimed at realizing a human-in-the-loop medical cyber-physical system (CPS) for monitoring patients, ensuring compliance with relevant safety protocols, and collecting data for advancing multidisciplinary research on infectious disease control. The ultimate goal is to enhance the safety of Ebola workers by minimizing their contact with potentially contaminated surfaces and materials through the integration of methods and technologies to realize smart and connected treatment clinics. This project could impact the response to infectious disease outbreaks by augmenting existing treatment clinics with cost-effective, modular, reconfigurable and open-design CPS technologies. The project will train a new cadre of engineering students, researchers and innovators to be sensitive to societal needs and national priorities by involving K-Gray, undergraduate and graduate students in all aspects of the project, especially at the co-ideation and co-design stages. The project will bring together a multidisciplinary team of engineers, scientists, technologists, medical experts, and humanitarian aid workers to develop holistic solutions to infectious disease control. The broader impacts also include operational cost savings in treatment clinics by reducing the need for and use of personal protective equipment and by preserving resources such as water through reduced consumption. In order to prevent, detect and respond to the current Ebola outbreak and similar future infectious disease outbreaks, the research plan has the following interconnected aims: (1) contribute new knowledge, methods, and tools to better understand the operational procedures in an infectious disease treatment clinic; (2) design, implement and validate a treatment ward augmented with a medical CPS for patient monitoring; (3) apply intuitive control interfaces and data visualization tools for practical human-robot interaction; (4) realize traded, coordinated and collaborative shared control techniques for safe and effective mobile robot navigation inside a treatment facility; and (5) assess acceptability and effectiveness of the technology among health care workers and patients. The team will develop a self-contained, modular and reconfigurable system composed of a connected sensor network for patient monitoring and a mobile robot platform for telemedicine, focusing primarily on the interoperability and integration of existing standardized hardware and software systems to realize a testbed for verification and validation of a medical CPS. Medical, emergency response and humanitarian aid experts will be engaged to critically assess user experiences and acceptability among medical staff and to develop pathways for fielding the system in a treatment clinic. This RAPID project will lead the way in designing the next generation of human-in-the-loop medical CPS for empowering health care workers worldwide in treating patients during infectious disease outbreaks.
Performance Period: 12/01/2014 - 08/31/2016
Institution: Worcester Polytechnic Institute
Sponsor: National Science Foundation
Award Number: 1509782
CPS: Synergy: Collaborative Research: Real-time Data Analytics for Energy Cyber-Physical Systems
Lead PI:
Mihai Anitescu
Abstract
Inadequate system understanding and inadequate situational awareness have caused large-scale power outages in the past. With the increased reliance on variable energy supply sources, system understanding and situational awareness of a complex energy system become more challenging. This project leverages the power of big data analytics to directly improve system understanding and situational awareness. The research provides the methodology for detecting anomalous events in real time, thereby allowing control centers to take appropriate control actions before minor events develop into major blackouts. The significance for society and for the power industry is profound. Energy providers will be able to prevent large-scale power outages and reduce revenue losses, and customers will benefit from reliable energy delivery with service guarantees. Students, including women and members of underrepresented groups, will be trained for the future workforce in this area. The project includes four major thrusts: 1) real-time anomaly detection from measurement data; 2) real-time event diagnosis and interpretation of changes in the state of the network; 3) real-time optimal control of the power grid; and 4) scientific foundations underpinning cyber-physical systems. The major outcomes of this project are practical solutions for event and fault detection and diagnosis in the power grid, as well as prediction and prevention of large-scale power outages.
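As a minimal illustration of thrust 1 only, the sketch below flags anomalous samples in a synthetic frequency stream using a baseline estimate and a z-score gate. The signal, thresholds, and function names are assumptions made for this sketch; the project's detection methodology operates on far richer grid measurement data.

```python
# Hypothetical illustration of real-time anomaly detection: flag samples that
# deviate strongly from an exponentially weighted estimate of the recent mean
# and variance. Parameters and the synthetic signal are assumptions only.
import math
import random

def detect_anomalies(samples, alpha=0.05, z_thresh=4.0, warmup=50):
    xs = list(samples)
    base = xs[:warmup]
    mean = sum(base) / len(base)
    var = sum((x - mean) ** 2 for x in base) / len(base) + 1e-12
    events = []
    for i, x in enumerate(xs[warmup:], start=warmup):
        if abs(x - mean) > z_thresh * math.sqrt(var):
            events.append((i, x))
            continue                      # skip update so the event does not mask itself
        mean = (1 - alpha) * mean + alpha * x
        var = (1 - alpha) * var + alpha * (x - mean) ** 2
    return events

if __name__ == "__main__":
    random.seed(0)
    freq = [60.0 + random.gauss(0, 0.01) for _ in range(600)]   # nominal ~60 Hz
    freq[400] -= 0.2                                            # injected event
    for idx, val in detect_anomalies(freq):
        print(f"possible event at sample {idx}: {val:.3f} Hz")
```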
Performance Period: 09/15/2015 - 08/31/2019
Institution: University of Chicago
Sponsor: National Science Foundation
Award Number: 1545046
CPS/Synergy/Collaborative Research: Safe and Efficient Cyber-Physical Operation System for Construction Equipment
Lead PI:
Chinemelu Anumba
Abstract
Equipment operation represents one of the most dangerous tasks on a construction site, and accidents related to such operation often result in death and property damage on the construction site and in the surrounding area. Such accidents can also cause considerable delays and disruption, and negatively impact the efficiency of operations. This award will conduct research to improve the safety and efficiency of cranes by integrating advances in robotics, computer vision, and construction management. It will create tools for quick and easy planning of crane operations and incorporate them into a safe and efficient system that can monitor a crane's environment and provide control feedback to the crane and the operator. Resulting gains in safety and efficiency will reduce fatal and non-fatal crane accidents. Partnerships with industry will also ensure that these advances have a positive impact on construction practice, and can be extended broadly to smart infrastructure, intelligent manufacturing, surveillance, traffic monitoring, and other application areas. The research will involve undergraduates and includes outreach to K-12 students. The work is driven by the hypothesis that the monitoring and control of cranes can be performed autonomously using robotics and computer vision algorithms, and that detailed and continuous monitoring and control feedback can lead to improved planning and simulation of equipment operations. It will particularly focus on developing methods for (a) planning construction operations while accounting for safety hazards through simulation; (b) estimating and providing analytics on the state of the equipment; (c) monitoring the crane's surrounding operating environment, including detection of safety hazards and proximity analysis to dynamic resources including materials, equipment, and workers; (d) controlling crane stability in real time; and (e) providing feedback to the user and equipment operators in a "transparent cockpit" using visual and haptic cues. It will address the underlying research challenges by improving the efficiency and reliability of planning through failure effects analysis and creating methods for contact state estimation and equilibrium analysis; improving monitoring through model-driven and real-time 3D reconstruction techniques, context-driven object recognition, and forecasting motion trajectories of objects; enhancing reliability of control through dynamic crane models, measures of instability, and algorithms for finding optimal controls; and, finally, improving efficiency of feedback loops through methods for providing visual and haptic cues.
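As a minimal illustration of the proximity-analysis idea in item (c), the sketch below checks whether a tracked crane load comes within a safety radius of any tracked worker over a short constant-velocity look-ahead. The object states, threshold, and names are assumptions for this sketch; the actual system would work from 3D reconstruction and object-recognition outputs.

```python
# Hypothetical illustration of proximity analysis between a tracked crane load
# and nearby workers, with a short constant-velocity look-ahead. Positions,
# velocities, and thresholds are assumptions for this sketch only.
import math

def too_close(p, q, threshold_m):
    return math.dist(p, q) < threshold_m

def proximity_alerts(load, workers, threshold_m=5.0, horizon_s=3.0, dt=0.5):
    """Return worker ids whose current or short-term predicted position
    comes within threshold_m of the (predicted) load position."""
    alerts = set()
    steps = int(horizon_s / dt) + 1
    for k in range(steps):
        t = k * dt
        lp = [load["pos"][i] + load["vel"][i] * t for i in range(3)]
        for w in workers:
            wp = [w["pos"][i] + w["vel"][i] * t for i in range(3)]
            if too_close(lp, wp, threshold_m):
                alerts.add(w["id"])
    return sorted(alerts)

if __name__ == "__main__":
    load = {"pos": [0.0, 0.0, 3.0], "vel": [1.0, 0.0, -0.5]}        # m, m/s
    workers = [
        {"id": "w1", "pos": [4.0, 0.5, 0.0], "vel": [0.0, 0.0, 0.0]},
        {"id": "w2", "pos": [30.0, 20.0, 0.0], "vel": [0.0, 0.0, 0.0]},
    ]
    print("proximity alerts:", proximity_alerts(load, workers))
```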
Performance Period: 01/01/2016 - 04/30/2017
Institution: Pennsylvania State University
Sponsor: National Science Foundation
Award Number: 1544973
CPS: TTP Option: Synergy: Collaborative Research: Nested Control of Assistive Robots through Human Intent Inference
Lead PI:
Paolo Bonato
Abstract
Part 1: Upper-limb motor impairments arise from a wide range of clinical conditions, including amputations, spinal cord injury, and stroke. Addressing lost hand function, therefore, is a major focus of rehabilitation interventions, and research in robotic hands and hand exoskeletons aimed at restoring fine motor control has gained significant momentum recently. Integration of these robots with neural control mechanisms is also an ongoing research direction. We will develop prosthetic and wearable hands controlled via nested control that seamlessly blends neural control based on human brain activity and dynamic control based on sensors on the robots. These Hand Augmentation using Nested Decision (HAND) systems will also provide rudimentary tactile feedback to the user. The HAND design framework will contribute to the assistive and augmentative robotics field. The resulting technology will improve the quality of life for individuals with lost limb function. The project will help train engineers skilled in addressing multidisciplinary challenges. Through outreach activities, STEM careers will be promoted at the K-12 level, and individuals from underrepresented groups in engineering will be recruited to engage in this research project, contributing to the diversity of the STEM workforce. Part 2: The team previously introduced the concept of human-in-the-loop cyber-physical systems (HILCPS). Using the HILCPS hardware-software co-design and automatic synthesis infrastructure, we will develop prosthetic and wearable HAND systems that are robust to uncertainty in human intent inference from physiological signals. One challenge arises from the fact that the human and the cyber system jointly operate on the same physical element. Synthesis of networked real-time applications from algorithm design environments poses a framework challenge. These challenges will be addressed by a tightly coupled optimal nested control strategy that relies on EEG-EMG-context fusion for human intent inference. Custom distributed embedded computational and robotic platforms will be built and iteratively refined. This work will enhance the HILCPS design framework while simultaneously making novel contributions to body/brain interface technology and assistive/augmentative robot technology. Specifically, we will (1) develop a theoretical EEG-EMG-context fusion framework for agile HILCPS application domains; (2) develop theory for and design novel control-theoretic solutions to handle uncertainty and blend motion/force planning with high-level human intent and ambient intelligence to robustly execute daily manipulation activities; (3) further develop and refine the HILCPS domain-specific design framework to enable rapid deployment of HILCPS algorithms onto distributed embedded systems, empowering a new class of real-time algorithms that achieve distributed embedded sensing, analysis, and decision making; and (4) develop new paradigms to replace, retrain, or augment hand function via the prosthetic/wearable HAND by optimizing performance on a subject-by-subject basis.
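As a toy illustration of EEG-EMG-context fusion, the sketch below combines a context prior with per-modality evidence over a small set of hand intents in a naive-Bayes fashion. The intent labels, probabilities, and function names are assumptions; the project's fusion framework and nested control strategy are far richer than this.

```python
# Hypothetical illustration of EEG-EMG-context fusion: naive-Bayes style
# combination of per-modality evidence over a small set of hand intents.
# Labels and example probabilities are assumptions for this sketch only.
import math

INTENTS = ["rest", "grasp", "release"]

def fuse(context_prior, eeg_likelihood, emg_likelihood):
    """Combine a context prior with per-modality likelihoods in log space
    and return a normalized posterior over intents."""
    log_post = {}
    for intent in INTENTS:
        log_post[intent] = (math.log(context_prior[intent])
                            + math.log(eeg_likelihood[intent])
                            + math.log(emg_likelihood[intent]))
    m = max(log_post.values())
    unnorm = {k: math.exp(v - m) for k, v in log_post.items()}
    z = sum(unnorm.values())
    return {k: v / z for k, v in unnorm.items()}

if __name__ == "__main__":
    context = {"rest": 0.2, "grasp": 0.6, "release": 0.2}   # e.g., object within reach
    eeg = {"rest": 0.3, "grasp": 0.5, "release": 0.2}       # noisy, weak evidence
    emg = {"rest": 0.1, "grasp": 0.8, "release": 0.1}       # stronger evidence
    posterior = fuse(context, eeg, emg)
    print(max(posterior, key=posterior.get), posterior)
```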
Performance Period: 10/01/2015 - 09/30/2019
Institution: Spaulding Rehabilitation Hospital
Sponsor: National Science Foundation
Award Number: 1544815
CPS: TTP Option: Synergy: Collaborative Research: The Science of Activity-Predictive Cyber-Physical Systems (APCPS)
Lead PI:
Diane Cook
Abstract
This project aims to design algorithmic techniques to perform activity discovery, recognition, and prediction from sensor data. These techniques will form the foundation for the science of Activity-Predictive Cyber-Physical Systems (APCPS), including potential improvements in the responsiveness and adaptiveness of such systems. The outcome of this work is also anticipated to have important implications in the specific application areas of health care and sustainability, two priority areas of societal importance. The first application will allow health interventions to be provided that adapt to an individual's daily routine and operate in that person's everyday environment. The second application will offer concrete tools for building automation that improve sustainability without disrupting an individual's current or upcoming activities. The project investigators will leverage existing training programs to involve students from underrepresented groups in this research. Bi-annual tours and a museum exhibit will reach K-12 teachers, students, and visitors, and ongoing commercialization efforts will ensure that the designed technologies are made available for the public to use. Deploying activity-predictive cyber-physical systems "in the wild" requires a number of robust computational components for activity learning, knowledge transfer, and human-in-the-loop computing that are introduced as part of this project. These components then create cyber-physical systems that funnel information from a sensed environment (the physical setting as well as humans in the environment), to activity models in the cloud, to mobile device interfaces, to the smart grid, and then back to the environment. The proposed research centers on defining the science of activity-predictive cyber-physical systems, organized around the following thrusts: (1) the design of scalable and generalizable algorithms for activity discovery, recognition, and prediction; (2) the design of transfer learning methods to increase the ability to generalize activity-predictive cyber-physical systems; (3) the design of human-in-the-loop computing methods to increase the sensitivity of activity-predictive cyber-physical systems; (4) the introduction of evaluation metrics for activity-predictive cyber-physical systems; and (5) transition of activity-predictive cyber-physical systems to practical applications including health monitoring/intervention and smart/sustainable cities.
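As a toy illustration of the prediction component only, the sketch below fits a first-order Markov model to a labeled activity sequence and uses it to predict the most likely next activity. The activity labels, the daily sequence, and the function names are illustrative assumptions, not artifacts of the project.

```python
# Hypothetical illustration of activity prediction: a first-order Markov model
# estimated from an observed activity sequence, then used to predict the most
# likely next activity. Labels and the toy sequence are assumptions only.
from collections import Counter, defaultdict

def fit_transitions(sequence):
    """Estimate P(next | current) from an observed activity sequence."""
    counts = defaultdict(Counter)
    for cur, nxt in zip(sequence, sequence[1:]):
        counts[cur][nxt] += 1
    return {cur: {nxt: c / sum(ctr.values()) for nxt, c in ctr.items()}
            for cur, ctr in counts.items()}

def predict_next(transitions, current):
    dist = transitions.get(current)
    if not dist:
        return None, 0.0
    nxt = max(dist, key=dist.get)
    return nxt, dist[nxt]

if __name__ == "__main__":
    day = ["sleep", "hygiene", "breakfast", "work", "lunch", "work",
           "commute", "dinner", "relax", "sleep",
           "sleep", "hygiene", "breakfast", "commute", "work", "lunch",
           "work", "commute", "dinner", "relax", "sleep"]
    model = fit_transitions(day)
    activity, prob = predict_next(model, "breakfast")
    print(f"after 'breakfast', most likely next: {activity} (p={prob:.2f})")
```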
Performance Period: 10/01/2015 - 09/30/2019
Institution: Washington State University
Sponsor: National Science Foundation
Award Number: 1543656
CPS/Synergy/Collaborative Research: Cybernizing Mechanical Structures through Integrated Sensor-Structure Fabrication
Lead PI:
Yu Ding
Abstract
The timely and accurate in-service identification of faults in mechanical structures, such as airplanes, can play a vitally important role in avoiding catastrophes. One major challenge, however, is that the sensing system relies on high-frequency signals, the coordination of which is difficult to achieve throughout a large structure. To tackle this fundamental issue, the research team will take advantage of 3D printing technology to fabricate integrated sensor-structure components. Specifically, the team plans to develop a novel printing scheme that can embed piezoelectric transducers (namely, sensor/actuator coupled elements) into layered composites. As the transducers are densely distributed throughout the entire structure, they function like a nerve system embedded into the structure. Such a sensor nerve system, when combined with new control and command systems and advanced data and signal processing capability, can fully unleash the latest computing power to pinpoint the fault location. The new framework of utilizing emerging additive manufacturing technology to produce a structural system with integrated, densely distributed active sensing elements will potentially lead to paradigm-shifting progress in structural self-diagnosis. This advancement may allow the acquisition of high-quality, active interrogation data throughout the entire structure, which can then be used to facilitate highly accurate and robust decision-making. It will lead to intellectual contributions including: 1) development of a new sensing modality with mechanical-electrical dual-field adaptivity that yields rich and high-quality data throughout the structure; 2) design of an additive manufacturing scheme that inserts piezoelectric micro transducer arrays throughout the structure to enable active interrogation; and 3) formulation of new data analytics and inverse analysis that can accurately identify the fault location/severity and guide the fine-tuning of the sensor system.
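As a simplified illustration of the inverse-analysis step, the sketch below localizes a fault by grid search over candidate positions that best explain wave arrival times at four embedded transducers, assuming a known uniform wave speed and a known emission time. The geometry, wave speed, and names are assumptions for this sketch only, not the project's analytics.

```python
# Hypothetical illustration of fault localization from embedded transducers:
# grid search over candidate fault positions that best explain measured times
# of flight, assuming a uniform wave speed and a known emission time.
import math

def locate_fault(sensors, arrival_times, wave_speed, grid_step=0.01, size=1.0):
    """Return the grid point whose predicted arrival times best match the
    measurements (least squares over sensors)."""
    best, best_err = None, float("inf")
    steps = int(size / grid_step) + 1
    for i in range(steps):
        for j in range(steps):
            x, y = i * grid_step, j * grid_step
            err = 0.0
            for (sx, sy), t_meas in zip(sensors, arrival_times):
                t_pred = math.hypot(x - sx, y - sy) / wave_speed
                err += (t_pred - t_meas) ** 2
            if err < best_err:
                best, best_err = (x, y), err
    return best, best_err

if __name__ == "__main__":
    speed = 5000.0                       # m/s, assumed wave speed in the laminate
    sensors = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]   # corner transducers
    true_fault = (0.62, 0.31)
    times = [math.hypot(true_fault[0] - sx, true_fault[1] - sy) / speed
             for sx, sy in sensors]
    est, _ = locate_fault(sensors, times, speed)
    print("estimated fault location:", est)
```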
Performance Period: 01/01/2016 - 12/31/2018
Institution: Texas A&M Engineering Experiment Station
Sponsor: National Science Foundation
Award Number: 1545038
CPS: TTP Option: Synergy: Traffic Signal Control with Connected and Autonomous Vehicles in the Traffic Stream
Lead PI:
Lily-Ageliki Elefteriadou
Abstract
Recent progress in autonomous and connected vehicle technologies, coupled with Federal and State initiatives to facilitate their widespread use, provides significant opportunities for enhancing mobility and safety in highway transportation. This project develops signalized intersection control strategies and other enabling sensor mechanisms for jointly optimizing vehicle trajectories and signal control by taking advantage of existing advanced technologies (connected vehicles and vehicle-to-infrastructure communications, sensors, autonomous vehicle technologies, etc.). Traffic signal control is a critical component of the existing transportation infrastructure, and it has a significant impact on transportation system efficiency, as well as on energy consumption and environmental impacts. In addition to advanced vehicle technologies, the strategies developed consider the presence of conventional vehicles in the traffic stream to facilitate the transition to these new strategies in a mixed vehicle environment. The project also develops and uses simulation tools to evaluate these strategies, as well as to provide tools that can be used in practice to consider the impacts of automated and connected vehicles in arterial networks. The project involves two industry partners (ISS and Econolite) to help facilitate new product development in anticipation of increased market penetration of connected and autonomous vehicles. The approach will be tested through simulation at the University of Florida, through field tests at the Turner-Fairbank Highway Research Center (TFHRC), and through control algorithms that will also be deployed and tested in the field. The project will support multiple graduate students and the creation of on-line classes. The project is at the intersection of several disciplines (optimization, sensors, automated vehicles, transportation engineering) required to produce a real-time engineered system that depends on the seamless integration of several components: sensor functionality, connected and autonomous vehicle information communication, signal control optimization strategy, handling of missing and erroneous information, etc. The project develops and implements optimization processes and strategies based on a seamless fusion of multiple data sources, as well as a mixed vehicle stream (autonomous, connected, and conventional vehicles), under real-world conditions of uncertain and missing data. Since trajectories for connected and conventional vehicles cannot be optimized or guaranteed, the project examines the impacts of the presence of automated vehicles on the following vehicles in a queue. The project also integrates the advanced sensing technology needed to control a mixed vehicle stream and addresses malfunctioning communications in connected and autonomous vehicles.
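As a toy illustration of signal optimization from connected-vehicle information, the sketch below enumerates green splits for a two-phase intersection and picks the split minimizing a crude uniform-delay estimate computed from predicted arrival rates. The cycle structure, saturation flow, demands, and names are assumptions for this sketch; the project's joint trajectory-signal optimization is substantially more general.

```python
# Hypothetical illustration of signal optimization from connected-vehicle data:
# enumerate green splits for a two-phase intersection and pick the one that
# minimizes a crude total-delay estimate from predicted arrival rates.
# Cycle length, saturation flow, and demands are assumptions for this sketch.

def phase_delay(arrival_rate, green_s, cycle_s, sat_flow=0.5):
    """Approximate uniform delay (veh*s per cycle) for one approach:
    vehicles arriving on red queue up and discharge at the saturation flow."""
    red_s = cycle_s - green_s
    queue = arrival_rate * red_s                       # vehicles queued when green starts
    clear_s = queue / max(sat_flow - arrival_rate, 1e-6)
    if clear_s > green_s:
        return float("inf")                            # oversaturated split
    return 0.5 * queue * (red_s + clear_s)             # area of the queue triangle

def best_split(rate_ns, rate_ew, cycle_s=60, min_green=10):
    candidates = range(min_green, cycle_s - min_green + 1)
    return min(candidates,
               key=lambda g: phase_delay(rate_ns, g, cycle_s)
                             + phase_delay(rate_ew, cycle_s - g, cycle_s))

if __name__ == "__main__":
    # arrival rates in vehicles per second, e.g., estimated from CV trajectories
    g_ns = best_split(rate_ns=0.15, rate_ew=0.30)
    print(f"north-south green: {g_ns}s, east-west green: {60 - g_ns}s")
```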
Performance Period: 09/01/2015 - 08/31/2020
Institution: University of Florida
Sponsor: National Science Foundation
Award Number: 1446813
CAREER: Secure Perception for Autonomous Systems
Lead PI:
Todd Humphreys
Abstract
Today's automobiles are increasingly autonomous. The latest Mercedes S-class sedan applies corrective action when its driver strays out of lane or tailgates too closely. Semi-autonomy will soon yield to full autonomy. Nissan has promised a line of self-driving cars by 2020. Maritime craft are likewise moving from rudimentary autopilots to full autonomy, and autonomous aerial vehicles will doubtless play a significant role in the future economy. Current versions of these vehicles are cocooned in an array of sensors, but neither the sensors nor the timing, navigation, and collision avoidance algorithms they feed have been designed for security against malicious attacks. Radar and acoustic sensors transmit predictable, uncoded signals; vehicle-to-vehicle communication protocols are either unauthenticated or critically dependent on insecure civil GPS signals (or both); and vehicle state estimators are designed for robustness but not security. These vulnerabilities are not merely conceptual: GPS spoofing attacks have been demonstrated against a drone and an ocean vessel, causing the drone to crash and the vessel to veer off course; likewise, it appears possible to cause road accidents by fooling a car's radar sensor into thinking a crash is imminent, thus triggering automatic braking. This proposal seeks funding to fix these vulnerabilities by developing sensors and high-level decision-making algorithms that are hardened against such so-called field attacks. The goal of secure control systems is to survive and operate safely despite sensor measurements or control commands being compromised. This proposal focuses on an emergent category of cyber-physical attack that has seen little scrutiny in the secure control literature. Like cyber attacks, these attacks are hard to detect and can be executed from a distance, but unlike cyber attacks, they are effective even against control systems whose software, data, and communications networks are secure, and so can be considered a more menacing long-term threat. These are attacks on the physical fields (electromagnetic, magnetic, acoustic, etc.) measured by system sensors. As specialized sensor attacks, field attacks seek to compromise a system's perception of reality non-invasively from without, not from within. We emphasize field attacks against navigation, collision avoidance, and synchronization sensors, as these are of special importance to the rise of autonomous vehicles and the smart grid. This proposal's goal is to develop a coherent analytical foundation for secure perception in the presence of field attacks and to develop a suite of algorithms and tools to detect such attacks. A key insight behind this proposal's approach is that the physics of field attacks impose fundamental difficulties on the attacker that can be exploited and magnified to enable attack detection. This work will progressively build security into navigation, collision avoidance, and timing perception from the physical sensory layer to the top-level state estimation algorithms. The outcome of this work will be smarter, more skeptical sensor systems for autonomous vehicles and other autonomous systems.
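As a minimal illustration of one way the physics of an attack can be exploited for detection, the sketch below gates each reported GNSS fix against a dead-reckoned prediction from wheel odometry and raises an alarm when the innovation is implausibly large. The noise levels, spoofing profile, and function names are assumptions made for this sketch, not the proposal's detection algorithms.

```python
# Hypothetical illustration of residual-based attack detection: compare each
# reported GNSS fix against a dead-reckoned prediction and flag fixes whose
# innovation is implausibly large. All numbers are assumptions for this sketch.
import math
import random

def spoof_alarms(fixes, velocities, dt=1.0, sigma=1.5, gate=4.0):
    """Flag fix indices whose distance from the dead-reckoned prediction
    exceeds gate * sigma (sigma ~ combined fix + prediction uncertainty)."""
    alarms = []
    est = list(fixes[0])                                # trust the first fix
    for k in range(1, len(fixes)):
        pred = [est[0] + velocities[k - 1][0] * dt,
                est[1] + velocities[k - 1][1] * dt]
        innovation = math.dist(pred, fixes[k])
        if innovation > gate * sigma:
            alarms.append(k)
            est = pred                                  # coast on dead reckoning
        else:
            est = list(fixes[k])                        # accept the fix
    return alarms

if __name__ == "__main__":
    random.seed(1)
    vel = [(10.0, 0.0)] * 60                            # m/s, straight travel
    truth = [(10.0 * k, 0.0) for k in range(61)]
    fixes = [(x + random.gauss(0, 1.0), y + random.gauss(0, 1.0)) for x, y in truth]
    for k in range(30, 61):                             # spoofed fixes walk off course
        fixes[k] = (fixes[k][0], fixes[k][1] + 8.0 * (k - 29))
    print("alarms at fix indices:", spoof_alarms(fixes, vel))
```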
Todd Humphreys

Dr. Humphreys specializes in the application of optimal detection and estimation techniques to problems in satellite navigation, autonomous systems, and signal processing. He directs the Radionavigation Laboratory and is associate director of UT SAVES. His recent focus has been on assured perception for autonomous systems, including navigation, timing, and collision avoidance, and on centimeter-accurate location for the mass market.

Dr. Humphreys is also on the graduate study committee of the UT Department of Electrical and Computer Engineering and a faculty member of the Wireless Networking and Communications Group (WNCG). He received the UT Regents' Outstanding Teaching Award in 2012, the NSF CAREER Award in 2015, the Institute of Navigation Thurlow Award in 2015, and the Presidential Early Career Award for Scientists and Engineers (PECASE, via the National Science Foundation) in 2019. He is a Fellow of the Institute of Navigation. Dr. Humphreys joined the faculty of the Cockrell School of Engineering in Fall 2009.

Performance Period: 04/01/2015 - 03/31/2020
Institution: University of Texas at Austin
Sponsor: National Science Foundation
Award Number: 1454474
CPS: Synergy: Collaborative Research: Adaptive Intelligence for Cyber-Physical Automotive Active Safety - System Design and Evaluation
Lead PI:
Laurent Itti
Abstract
The automotive industry finds itself at a crossroads. Current advances in MEMS sensor technology, the emergence of embedded control software, the rapid progress in computer technology, digital image processing, machine learning, and control algorithms, along with ever-increasing investment in vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) technologies, are about to revolutionize the way we use vehicles and commute in everyday life. Automotive active safety systems, in particular, have been used with enormous success in the past 50 years and have helped keep traffic accidents in check. Still, more than 30,000 deaths and 2,000,000 injuries occur each year in the US alone, and many more worldwide. The impact of traffic accidents on the economy is estimated to be as high as $300B/yr in the US alone. Further improvement in driving safety (and comfort) necessitates that the next generation of active safety systems be more proactive (as opposed to reactive) and able to comprehend and interpret driver intent. Future active safety systems will have to account for the diversity of drivers' skills, the behavior of drivers in traffic, and the overall traffic conditions. This research aims at improving the current capabilities of automotive active safety control systems (ASCS) by taking into account the interactions between the driver, the vehicle, the ASCS, and the environment. Beyond solving a fundamental problem in the automotive industry, this research will have ramifications in other cyber-physical domains where humans manually control vehicles or equipment, including flying, operation of heavy machinery, mining, tele-robotics, and robotic medicine. Making autonomous/automated systems that feel and behave "naturally" to human operators is not always easy. As these systems and machines participate more in everyday interactions with humans, the need to make them operate in a predictable manner is more urgent than ever. To achieve the goals of the proposed research, this project will use estimates of the driver's cognitive state to adapt the ASCS accordingly, in order to achieve seamless operation with the driver. Specifically, new methodologies will be developed to infer long-term and short-term behavior of drivers via the use of Bayesian networks and neuromorphic algorithms that estimate the driver's skills and current state of attention from eye movement data, together with dynamic motion cues obtained from steering and pedal inputs. This information will be injected into the ASCS operation in order to enhance its performance by taking advantage of recent results from the theory of adaptive and real-time, model-predictive optimal control. The correct level of autonomy and workload distribution between the driver and the ASCS will ensure that no conflicts arise between the driver and the control system, and that safety and passenger comfort are not compromised. A comprehensive plan will be used to test and validate the developed theory by collecting measurements from several human subjects while operating a virtual-reality driving simulator.
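As a toy illustration of inferring driver attention from eye-movement cues, the sketch below runs a two-state forward filter over a binary eyes-on-road observation stream; the resulting posterior could, for example, scale how early the active safety controller intervenes. The states, transition and emission probabilities, and names are assumptions for this sketch, not the project's Bayesian-network or neuromorphic models.

```python
# Hypothetical illustration of driver-state inference: a two-state forward
# filter (attentive vs. distracted) driven by a binary "eyes on road" cue.
# Transition and emission probabilities are assumptions for this sketch only.

TRANS = {"attentive": {"attentive": 0.95, "distracted": 0.05},
         "distracted": {"attentive": 0.20, "distracted": 0.80}}
EMIT = {"attentive": {True: 0.90, False: 0.10},     # P(eyes on road | state)
        "distracted": {True: 0.30, False: 0.70}}

def filter_attention(observations, prior=None):
    belief = dict(prior or {"attentive": 0.9, "distracted": 0.1})
    history = []
    for eyes_on_road in observations:
        # predict through the transition model, then weight by the observation
        pred = {s: sum(belief[p] * TRANS[p][s] for p in belief) for s in belief}
        post = {s: pred[s] * EMIT[s][eyes_on_road] for s in pred}
        z = sum(post.values())
        belief = {s: v / z for s, v in post.items()}
        history.append(belief["distracted"])
    return history

if __name__ == "__main__":
    gaze = [True] * 10 + [False] * 6 + [True] * 4     # 1 Hz eye-tracker samples
    for t, p in enumerate(filter_attention(gaze)):
        print(f"t={t:2d}s  P(distracted)={p:.2f}")
```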
Performance Period: 09/15/2015 - 08/31/2018
Institution: University of Southern California
Sponsor: National Science Foundation
Award Number: 1545089