Motivated by the fact that the 2014 Ebola outbreak is the largest in history, and by the pressing need to understand how to improve the delivery of care with the right technological interventions at the right place, this Rapid Response Research project aims to realize a human-in-the-loop medical cyber-physical system (CPS) for monitoring patients, ensuring compliance with relevant safety protocols, and collecting data to advance multidisciplinary research on infectious disease control. The ultimate goal is to enhance the safety of Ebola workers by minimizing their contact with potentially contaminated surfaces and materials through the integration of methods and technologies that realize smart and connected treatment clinics. This project could impact the response to infectious disease outbreaks by augmenting existing treatment clinics with cost-effective, modular, reconfigurable and
 open-design CPS technologies. The project will train a new cadre of engineering students, researchers and innovators to be 
sensitive to societal needs and national priorities by involving K-Gray, undergraduate and graduate students in all aspects of the project, especially at the co-ideation and co-design stages. The project will bring together a multidisciplinary team of engineers, scientists, technologists, medical experts, and humanitarian aid workers to develop holistic solutions to infectious disease control. The broader impacts also include operational cost savings in treatment clinics by reducing the need for and use of personal protective equipment, and by preserving resources such as water through reduced consumption. In order to prevent, detect and respond to the current Ebola outbreak and to similar future infectious disease outbreaks, this research plan has the following interconnected aims: (1) contribute new knowledge, methods, and tools to better understand the operational procedures in an infectious disease treatment clinic, (2) design, implement and validate a treatment ward augmented with a medical CPS for patient monitoring, (3) apply intuitive control interfaces and data visualization tools for practical human-robot interaction, (4) realize traded, coordinated and collaborative shared control techniques for safe and effective mobile robot navigation inside a treatment facility, (5) assess acceptability and effectiveness of the technology among health care workers and patients. The team will develop a self-contained, modular and reconfigurable system composed of a connected sensor network for patient monitoring and a mobile robot platform for telemedicine that will primarily focus on the interoperability and integration of existing standardized 
hardware and software systems to realize a testbed for verification and validation of a medical CPS. Medical, emergency response and humanitarian aid experts will be engaged to critically assess user-experiences and acceptability among medical staff to develop pathways for fielding the system in a treatment clinic. This RAPID project will lead the way in designing the next generation of human-in-the-loop medical CPS for empowering health care workers worldwide in treating patients during infectious disease outbreaks.
Worcester Polytechnic Institute
National Science Foundation
Sonia Chernova
Michael Gennert
Jeanine Skorinko
Taskin Padir
Submitted by Taskin Padir on September 28th, 2016
Inadequate system understanding and inadequate situational awareness have caused large-scale power outages in the past. With increased reliance on variable energy supply sources, system understanding and situational awareness of a complex energy system become more challenging. This project leverages the power of big data analytics to directly improve system understanding and situational awareness. The research provides a methodology for detecting anomalous events in real time, and therefore allows control centers to take appropriate control actions before minor events develop into major blackouts. The significance for society and for the power industry is profound. Energy providers will be able to prevent large-scale power outages and reduce revenue losses, and customers will benefit from reliable energy delivery with service guarantees. Students, including women and underrepresented groups, will be trained for the future workforce in this area. The project includes four major thrusts: 1) real-time anomaly detection from measurement data; 2) real-time event diagnosis and interpretation of changes in the state of the network; 3) real-time optimal control of the power grid; 4) scientific foundations underpinning cyber-physical systems. The major outcome of this project is practical solutions to event or fault detection and diagnosis in the power grid, as well as prediction and prevention of large-scale power outages.
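Thrust 1, real-time anomaly detection from measurement data, can be sketched with a classic change detector. The following is a minimal illustration, not the project's actual method: a two-sided CUSUM test over a stream of frequency measurements, with the nominal value, drift, and alarm threshold chosen purely for the example.

```python
# Hedged sketch: a two-sided CUSUM detector flagging anomalous deviations
# in a stream of grid measurements (here, bus frequency in Hz). All
# parameter values are illustrative assumptions.

def cusum_alarms(samples, target, drift=0.05, threshold=0.5):
    """Return indices at which the CUSUM statistic crosses the threshold."""
    s_pos, s_neg, alarms = 0.0, 0.0, []
    for i, x in enumerate(samples):
        s_pos = max(0.0, s_pos + (x - target) - drift)
        s_neg = max(0.0, s_neg + (target - x) - drift)
        if s_pos > threshold or s_neg > threshold:
            alarms.append(i)
            s_pos, s_neg = 0.0, 0.0  # reset after each alarm
    return alarms

# A nominal 60 Hz signal with a sustained dip starting at sample 5:
freq = [60.0, 60.01, 59.99, 60.0, 60.0, 59.7, 59.6, 59.65, 59.7, 59.6]
print(cusum_alarms(freq, target=60.0))  # → [6, 8]
```

Because the statistic accumulates small deviations over time, a sustained dip triggers an alarm even though no single sample is far from nominal, which is exactly the property needed to catch slowly developing events before they cascade.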
University of Chicago
National Science Foundation
Submitted by Mihai Anitescu on September 24th, 2016
Equipment operation represents one of the most dangerous tasks on a construction site, and accidents related to such operations often result in death and property damage on the construction site and in the surrounding area. Such accidents can also cause considerable delays and disruption, and negatively impact the efficiency of operations. This award will conduct research to improve the safety and efficiency of cranes by integrating advances in robotics, computer vision, and construction management. It will create tools for quick and easy planning of crane operations and incorporate them into a safe and efficient system that can monitor a crane's environment and provide control feedback to the crane and the operator. Resulting gains in safety and efficiency will reduce fatal and non-fatal crane accidents. Partnerships with industry will also ensure that these advances have a positive impact on construction practice, and can be extended broadly to smart infrastructure, intelligent manufacturing, surveillance, traffic monitoring, and other application areas. The research will involve undergraduates and includes outreach to K-12 students. The work is driven by the hypothesis that the monitoring and control of cranes can be performed autonomously using robotics and computer vision algorithms, and that detailed and continuous monitoring and control feedback can lead to improved planning and simulation of equipment operations. 
It will particularly focus on developing methods for (a) planning construction operations while accounting for safety hazards through simulation; (b) estimating and providing analytics on the state of the equipment; (c) monitoring equipment surrounding the crane operating environment, including detection of safety hazards, and proximity analysis to dynamic resources including materials, equipment, and workers; (d) controlling crane stability in real-time; and (e) providing feedback to the user and equipment operators in a "transparent cockpit" using visual and haptic cues. It will address the underlying research challenges by improving the efficiency and reliability of planning through failure effects analysis and creating methods for contact state estimation and equilibrium analysis; improving monitoring through model-driven and real-time 3D reconstruction techniques, context-driven object recognition, and forecasting motion trajectories of objects; enhancing reliability of control through dynamic crane models, measures of instability, and algorithms for finding optimal controls; and, finally, improving efficiency of feedback loops through methods for providing visual and haptic cues.
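Item (c), proximity analysis to dynamic resources, reduces at its core to distance checks between tracked objects in the crane's operating environment. The sketch below assumes 3D positions supplied by the monitoring system; the 5-metre danger radius and the coordinates are illustrative assumptions, not values from the project.

```python
# Hedged sketch: a minimal proximity-hazard check between a tracked crane
# load and workers on site. Positions are (x, y, z) in metres; the danger
# radius is an illustrative assumption.
import math

DANGER_RADIUS_M = 5.0

def proximity_hazards(load_pos, worker_positions, radius=DANGER_RADIUS_M):
    """Return the indices of workers within `radius` metres of the load."""
    hazards = []
    for i, w in enumerate(worker_positions):
        if math.dist(load_pos, w) < radius:
            hazards.append(i)
    return hazards

workers = [(0.0, 3.0, 0.0), (20.0, 5.0, 0.0), (2.0, 2.0, 1.0)]
print(proximity_hazards((0.0, 0.0, 2.0), workers))  # → [0, 2]
```

A fielded system would run this check continuously against forecast trajectories, not just instantaneous positions, so that warnings reach the operator before a hazard materializes.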
Pennsylvania State University
National Science Foundation
John Messner
Submitted by Chinemelu Anumba on September 24th, 2016
Part 1: Upper-limb motor impairments arise from a wide range of clinical conditions including amputations, spinal cord injury, or stroke. Addressing lost hand function, therefore, is a major focus of rehabilitation interventions, and research on robotic hands and hand exoskeletons aimed at restoring fine motor control has gained significant momentum recently. Integration of these robots with neural control mechanisms is also an ongoing research direction. We will develop prosthetic and wearable hands controlled via nested control that seamlessly blends neural control based on human brain activity and dynamic control based on sensors on robots. These Hand Augmentation using Nested Decision (HAND) systems will also provide rudimentary tactile feedback to the user. The HAND design framework will contribute to the assistive and augmentative robotics field. The resulting technology will improve the quality of life for individuals with lost limb function. The project will help train engineers skilled in addressing multidisciplinary challenges. Through outreach activities, STEM careers will be promoted at the K-12 level, and individuals from underrepresented groups in engineering will be recruited to engage in this research project, contributing to the diversity of the STEM workforce. Part 2: The team previously introduced the concept of human-in-the-loop cyber-physical systems (HILCPS). Using the HILCPS hardware-software co-design and automatic synthesis infrastructure, we will develop prosthetic and wearable HAND systems that are robust to uncertainty in human intent inference from physiological signals. One challenge arises from the fact that the human and the cyber system jointly operate on the same physical element. Synthesis of networked real-time applications from algorithm design environments poses a framework challenge. 
These will be addressed by a tightly coupled optimal nested control strategy that relies on EEG-EMG-context fusion for human intent inference. Custom distributed embedded computational and robotic platforms will be built and iteratively refined. This work will enhance the HILCPS design framework, while simultaneously making novel contributions to body/brain interface technology and assistive/augmentative robot technology. Specifically we will (1) develop a theoretical EEG-EMG-context fusion framework for agile HILCPS application domains; (2) develop theory for and design novel control theoretic solutions to handle uncertainty, blend motion/force planning with high-level human intent and ambient intelligence to robustly execute daily manipulation activities; (3) further develop and refine the HILCPS domain-specific design framework to enable rapid deployment of HILCPS algorithms onto distributed embedded systems, empowering a new class of real-time algorithms that achieve distributed embedded sensing, analysis, and decision making; (4) develop new paradigms to replace, retrain or augment hand function via the prosthetic/wearable HAND by optimizing performance on a subject-by-subject basis.
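In its simplest form, EEG-EMG-context fusion for human intent inference could combine independent classifier posteriors with a context prior. The sketch below applies a naive-Bayes product rule under that independence assumption; the intent classes and all probabilities are invented for illustration and do not reflect the project's actual fusion framework.

```python
# Hedged sketch: fusing independent EEG- and EMG-based classifier
# posteriors over a small set of hand-intent classes with a context prior.
# Class names and probability values are illustrative assumptions.

def fuse_intent(eeg_post, emg_post, context_prior):
    """Combine per-class scores multiplicatively and renormalize."""
    classes = eeg_post.keys()
    scores = {c: eeg_post[c] * emg_post[c] * context_prior[c] for c in classes}
    total = sum(scores.values())
    return {c: s / total for c, s in scores.items()}

eeg = {"grasp": 0.6, "release": 0.3, "rest": 0.1}
emg = {"grasp": 0.5, "release": 0.4, "rest": 0.1}
ctx = {"grasp": 0.5, "release": 0.2, "rest": 0.3}  # e.g., object within reach
fused = fuse_intent(eeg, emg, ctx)
print(max(fused, key=fused.get))  # → grasp
```

The point of blending the three sources is robustness: a noisy EMG reading alone is ambiguous, but agreement between brain activity, muscle activity, and ambient context sharpens the decision the nested controller acts on.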
Spaulding Rehabilitation Hospital
National Science Foundation
Submitted by Paolo Bonato on September 24th, 2016
This project aims to design algorithmic techniques to perform activity discovery, recognition, and prediction from sensor data. These techniques will form the foundation for the science of Activity-Predictive Cyber-Physical Systems, including potential improvement in the responsiveness and adaptiveness of the systems. The outcome of this work is also anticipated to have important implications in the specific application areas of health care and sustainability, two priority areas of societal importance. The first application will allow for health interventions to be provided that adapt to an individual's daily routine and operate in that person's everyday environment. The second application will offer concrete tools for building automation that improve sustainability without disrupting an individual's current or upcoming activities. The project investigators will leverage existing training programs to involve students from underrepresented groups in this research. Bi-annual tours and a museum exhibit will reach K-12 teachers, students and visitors, and ongoing commercialization efforts will ensure that the designed technologies are made available for the public to use. Deploying activity-predictive cyber-physical systems "in the wild" requires a number of robust computational components for activity learning, knowledge transfer, and human-in-the-loop computing that are introduced as part of this project. These components then create cyber-physical systems that funnel information from a sensed environment (the physical setting as well as humans in the environment), to activity models in the cloud, to mobile device interfaces, to the smart grid, and then back to the environment. 
The proposed research centers on defining the science of activity-predictive cyber-physical systems, organized around the following thrusts: (1) the design of scalable and generalizable algorithms for activity discovery, recognition, and prediction; (2) the design of transfer learning methods to increase the ability to generalize activity-predictive cyber-physical systems; (3) the design of human-in-the-loop computing methods to increase the sensitivity of activity-predictive cyber-physical systems; (4) the introduction of evaluation metrics for activity-predictive cyber-physical systems; and (5) transition of activity-predictive cyber-physical systems to practical applications including health monitoring/intervention and smart/sustainable cities.
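Activity prediction, the core of thrust (1), can be illustrated in its most stripped-down form by a first-order Markov model over activity labels. A real activity-predictive CPS would learn far richer sensor-driven models; the activity labels and routine below are illustrative assumptions.

```python
# Hedged sketch: a first-order Markov model for next-activity prediction,
# learned from a labelled activity sequence. Only the prediction step is
# illustrated; labels are invented for the example.
from collections import Counter, defaultdict

def learn_transitions(sequence):
    """Count how often each activity is followed by each other activity."""
    counts = defaultdict(Counter)
    for cur, nxt in zip(sequence, sequence[1:]):
        counts[cur][nxt] += 1
    return counts

def predict_next(counts, current):
    """Most frequent successor of `current`; None if the activity is unseen."""
    if current not in counts:
        return None
    return counts[current].most_common(1)[0][0]

log = ["sleep", "cook", "eat", "work", "cook", "eat", "relax", "sleep"]
model = learn_transitions(log)
print(predict_next(model, "cook"))  # → eat
```

Even this toy model shows why prediction enables proactive behavior: knowing that "cook" is reliably followed by "eat" lets a building controller pre-condition the dining area before the occupant moves.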
Washington State University
National Science Foundation
Maureen Schmitter-Edgecombe
Janardhan Rao Doppa
Submitted by Diane Cook on September 24th, 2016
The timely and accurate in-service identification of faults in mechanical structures, such as airplanes, can play a vitally important role in avoiding catastrophes. One major challenge, however, is that the sensing system relies on high frequency signals, the coordination of which is difficult to achieve throughout a large structure. To tackle this fundamental issue, the research team will take advantage of 3D printing technology to fabricate integrated sensor-structure components. Specifically, the team plans to innovate a novel printing scheme that can embed piezoelectric transducers (namely, sensor/actuator coupled elements) into layered composites. As the transducers are densely distributed throughout the entire structure, they function like a nerve system embedded into the structure. Such a sensor nerve system, when combined with new control and command systems and advanced data and signal processing capability, can fully unleash the latest computing power to pinpoint the fault location. The new framework of utilizing emerging additive manufacturing technology to produce a structural system with integrated, densely distributed active sensing elements will potentially lead to paradigm-shifting progress in structural self-diagnosis. This advancement may allow the acquisition of high-quality, active interrogation data throughout the entire structure, which can then be used to facilitate highly accurate and robust decision-making. It will lead to intellectual contributions including: 1) development of a new sensing modality with mechanical-electrical dual-field adaptivity, that yields rich and high-quality data throughout the structure; 2) design of an additive manufacturing scheme that inserts piezoelectric micro transducer arrays throughout the structure to enable active interrogation; and 3) formulation of new data analytics and inverse analysis that can accurately identify the fault location/severity and guide the fine-tuning of the sensor system.
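Contribution 3, the inverse analysis that identifies fault location, can be illustrated by locating an acoustic-emission source from wave arrival times at the embedded transducers. The sketch below performs a grid search over candidate positions, matching arrival-time differences relative to a reference sensor; the sensor layout, wave speed, and grid resolution are illustrative assumptions, not the project's formulation.

```python
# Hedged sketch: locating a source from wave arrival times at embedded
# transducers, via grid search over candidate positions. Units and the
# homogeneous-medium assumption are illustrative.
import math

WAVE_SPEED = 5.0  # mm/us, assumed uniform throughout the composite

def locate_fault(sensors, arrival_times, grid):
    """Pick the grid point whose predicted arrival-time differences
    (anchored on sensor 0) best match the measured ones."""
    meas = [t - arrival_times[0] for t in arrival_times]
    best, best_err = None, float("inf")
    for p in grid:
        pred = [math.dist(p, s) / WAVE_SPEED for s in sensors]
        pred = [t - pred[0] for t in pred]
        err = sum((m - q) ** 2 for m, q in zip(meas, pred))
        if err < best_err:
            best, best_err = p, err
    return best

sensors = [(0, 0), (100, 0), (0, 100), (100, 100)]
true_src = (30, 70)
times = [math.dist(true_src, s) / WAVE_SPEED for s in sensors]
grid = [(x, y) for x in range(0, 101, 10) for y in range(0, 101, 10)]
print(locate_fault(sensors, times, grid))  # → (30, 70)
```

Using arrival-time *differences* removes the unknown emission instant from the problem, which is why at least three well-separated transducers are needed to pin down a location in the plane.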
Texas A&M Engineering Experiment Station
National Science Foundation
Submitted by Yu Ding on September 24th, 2016
Recent progress in autonomous and connected vehicle technologies, coupled with Federal and State initiatives to facilitate their widespread use, provides significant opportunities for enhancing mobility and safety in highway transportation. This project develops signalized intersection control strategies and other enabling sensor mechanisms for jointly optimizing vehicle trajectories and signal control by taking advantage of existing advanced technologies (connected vehicles and vehicle-to-infrastructure communications, sensors, autonomous vehicle technologies, etc.). Traffic signal control is a critical component of the existing transportation infrastructure and it has a significant impact on transportation system efficiency, as well as energy consumption and environmental impacts. In addition to advanced vehicle technologies, the strategies developed consider the presence of conventional vehicles in the traffic stream to facilitate transition to these new strategies in a mixed vehicle environment. The project also develops and uses simulation tools to evaluate these strategies as well as to provide tools that can be used in practice to consider the impacts of automated and connected vehicles in arterial networks. The project involves two industry partners (ISS and Econolite) to help facilitate new product development in anticipation of increased market penetration of connected and autonomous vehicles. The approach will be tested through simulation at the University of Florida, through field tests at the Turner Fairbank Highway Research Center (TFHRC), and through control algorithms that will also be deployed and tested in the field. The project will support multiple graduate students and will support creation of on-line classes. 
The project is at the intersection of several different disciplines (optimization, sensors, automated vehicles, transportation engineering) required to produce a real-time engineered system that depends on the seamless integration of several components: sensor functionality, connected and autonomous vehicle information communication, signal control optimization strategy, missing and erroneous information, etc. The project develops and implements optimization processes and strategies considering a seamless fusion of multiple data sources, as well as a mixed vehicle stream (autonomous, connected, and conventional vehicles) under real-world conditions of uncertain and missing data. Since trajectories for connected and conventional vehicles cannot be optimized or guaranteed, the project examines the impacts of the presence of automated vehicles on the following vehicles in a queue. The project also integrates the advanced sensing technology needed to control a mixed vehicle stream, and addresses malfunctioning communications in connected and autonomous vehicles.
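One concrete piece of joint vehicle-trajectory and signal optimization is a speed advisory: given the distance to the stop bar and the timing of the next green window received over V2I, choose a steady speed that arrives during green without stopping. The sketch below is an illustrative simplification, not the project's optimization strategy; the speed limits and timings are assumptions.

```python
# Hedged sketch: a green-window speed advisory for a connected vehicle
# approaching a signal. Distances in metres, times in seconds, speeds in
# m/s; v_max of 16.7 m/s (~60 km/h) is an illustrative limit.

def speed_advisory(dist_m, green_start_s, green_end_s,
                   v_min=5.0, v_max=16.7):
    """Return a steady speed reaching the stop bar during the green
    window (fastest feasible), or None if no legal speed works."""
    earliest = dist_m / v_max   # soonest possible arrival
    latest = dist_m / v_min     # latest reasonable arrival
    if latest < green_start_s or earliest > green_end_s:
        return None             # window cannot be hit at a steady speed
    arrival = max(earliest, green_start_s)
    return dist_m / arrival

adv = speed_advisory(200.0, 20.0, 40.0)
print(round(adv, 1))  # → 10.0 (cover 200 m in 20 s)
```

Broadcasting such advisories smooths trajectories approaching the intersection, which is precisely the coupling between signal timing and vehicle control that the project optimizes jointly rather than treating each side in isolation.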
University of Florida
National Science Foundation
Carl Crane
Submitted by Lily-Ageliki Elefteriadou on September 24th, 2016
Today's automobiles are increasingly autonomous. The latest Mercedes S-class sedan applies corrective action when its driver strays out of lane or tailgates too closely. Semi-autonomy will soon yield to full autonomy. Nissan has promised a line of self-driving cars by 2020. Maritime craft are likewise moving from rudimentary autopilots to full autonomy, and autonomous aerial vehicles will doubtless play a significant role in the future economy. Current versions of these vehicles are cocooned in an array of sensors, but neither the sensors nor the timing, navigation, and collision avoidance algorithms they feed have been designed for security against malicious attacks. Radar and acoustic sensors transmit predictable, uncoded signals; vehicle-to-vehicle communication protocols are either unauthenticated or critically dependent on insecure civil GPS signals (or both); and vehicle state estimators are designed for robustness but not security. These vulnerabilities are not merely conceptual: GPS spoofing attacks have been demonstrated against a drone and an ocean vessel, causing the drone to crash and the vessel to veer off course; likewise, it appears possible to cause road accidents by fooling a car's radar sensor into thinking a crash is imminent, thus triggering automatic braking. This proposal seeks funding to fix these vulnerabilities by developing sensors and high-level decision-making algorithms that are hardened against such so-called field attacks. The goal of secure control systems is to survive and operate safely despite sensor measurements or control commands being compromised. This proposal focuses on an emergent category of cyber-physical attack that has seen little scrutiny in the secure control literature. 
Like cyber attacks, these attacks are hard to detect and can be executed from a distance, but unlike cyber attacks, they are effective even against control systems whose software, data, and communications networks are secure, and so can be considered a more menacing long-term threat. These are attacks on the physical fields (electromagnetic, magnetic, acoustic, etc.) measured by system sensors. As specialized sensor attacks, field attacks seek to compromise a system's perception of reality non-invasively from without, not from within. We emphasize field attacks against navigation, collision avoidance, and synchronization sensors, as these are of special importance to the rise of autonomous vehicles and the smart grid. This proposal's goal is to develop a coherent analytical foundation for secure perception in the presence of field attacks and to develop a suite of algorithms and tools to detect such attacks. A key insight behind this proposal's approach is that the physics of field attacks impose fundamental difficulties on the attacker that can be exploited and magnified to enable attack detection. This work will progressively build security into navigation, collision avoidance, and timing perception from the physical sensory layer to the top-level state estimation algorithms. The outcome of this work will be smarter, more skeptical sensor systems for autonomous vehicles and other autonomous systems.
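A simple instance of exploiting attack physics for detection is an innovation (residual) monitor on navigation fixes: a spoofed position jump is inconsistent with the motion predicted from the previous state. The sketch below uses a scalar dead-reckoning prediction; the noise level, gate, and trajectory are illustrative assumptions, not the proposal's algorithms.

```python
# Hedged sketch: a normalized-innovation spoofing monitor. Each new
# position fix is compared against a dead-reckoning prediction; fixes
# whose innovation is far beyond the sensor noise level are flagged.
# Positions in metres along-track; sigma and gate are illustrative.

def innovation_monitor(fixes, velocity, dt=1.0, sigma=2.0, gate=3.0):
    """Flag fix indices whose innovation exceeds `gate` * `sigma`."""
    flagged = []
    pred = fixes[0]
    for i, z in enumerate(fixes[1:], start=1):
        pred = pred + velocity * dt   # predict forward one step
        if abs(z - pred) > gate * sigma:
            flagged.append(i)         # reject the fix, keep the prediction
        else:
            pred = z                  # accept the fix
    return flagged

# A vehicle moving at 10 m/s; the fix at index 3 jumps ~50 m (spoofed):
fixes = [0.0, 10.5, 19.8, 80.0, 40.2]
print(innovation_monitor(fixes, velocity=10.0))  # → [3]
```

The monitor illustrates the key insight in miniature: a spoofer must walk the victim's perceived state away gradually to stay under the gate, and that physical constraint buys the defender detection time.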
University of Texas at Austin
National Science Foundation
Submitted by Todd Humphries on September 23rd, 2016
The automotive industry finds itself at a crossroads. Current advances in MEMS sensor technology, the emergence of embedded control software, the rapid progress in computer technology, digital image processing, machine learning and control algorithms, along with an ever increasing investment in vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) technologies, are about to revolutionize the way we use vehicles and commute in everyday life. Automotive active safety systems, in particular, have been used with enormous success over the past 50 years and have helped keep traffic accidents in check. Still, more than 30,000 deaths and 2,000,000 injuries occur each year in the US alone, and many more worldwide. The impact of traffic accidents on the economy is estimated to be as high as $300B/yr in the US alone. Further improvement in terms of driving safety (and comfort) necessitates that the next generation of active safety systems be more proactive (as opposed to reactive) and able to comprehend and interpret driver intent. Future active safety systems will have to account for the diversity of drivers' skills, the behavior of drivers in traffic, and the overall traffic conditions. This research aims at improving the current capabilities of automotive active safety control systems (ASCS) by taking into account the interactions between the driver, the vehicle, the ASCS and the environment. Beyond solving a fundamental problem in the automotive industry, this research will have ramifications in other cyber-physical domains where humans manually control vehicles or equipment, including flying, operation of heavy machinery, mining, tele-robotics, and robotic medicine. Making autonomous/automated systems that feel and behave "naturally" to human operators is not always easy. As these systems and machines participate more in everyday interactions with humans, the need to make them operate in a predictable manner is more urgent than ever. 
To achieve the goals of the proposed research, this project will use the estimation of the driver's cognitive state to adapt the ASCS accordingly, in order to achieve a seamless operation with the driver. Specifically, new methodologies will be developed to infer long-term and short-term behavior of drivers via the use of Bayesian networks and neuromorphic algorithms to estimate the driver's skills and current state of attention from eye movement data, together with dynamic motion cues obtained from steering and pedal inputs. This information will be injected into the ASCS operation in order to enhance its performance by taking advantage of recent results from the theory of adaptive and real-time, model-predictive optimal control. The correct level of autonomy and workload distribution between the driver and ASCS will ensure that no conflicts arise between the driver and the control system, and the safety and passenger comfort are not compromised. A comprehensive plan will be used to test and validate the developed theory by collecting measurements from several human subjects while operating a virtual reality-driving simulator.
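Estimating the driver's state of attention from eye-movement data can be illustrated with a recursive Bayesian filter over a binary attentive/distracted state, updated by gaze-on-road observations. The transition and emission probabilities below are invented for the example and are not values from the project, which uses richer Bayesian networks and neuromorphic algorithms.

```python
# Hedged sketch: a two-state recursive Bayesian filter for driver
# attention, driven by whether each gaze sample falls on the road.
# All probabilities are illustrative assumptions.

P_STAY = 0.9  # P(state unchanged between consecutive samples)
P_ON_ROAD = {"attentive": 0.9, "distracted": 0.2}  # P(gaze on road | state)

def update_belief(p_attentive, gaze_on_road):
    """One predict-correct step; returns the new P(attentive)."""
    # Predict: the driver may switch state with probability 1 - P_STAY.
    p = p_attentive * P_STAY + (1 - p_attentive) * (1 - P_STAY)
    # Correct with the gaze observation.
    like_a = P_ON_ROAD["attentive"] if gaze_on_road else 1 - P_ON_ROAD["attentive"]
    like_d = P_ON_ROAD["distracted"] if gaze_on_road else 1 - P_ON_ROAD["distracted"]
    num = like_a * p
    return num / (num + like_d * (1 - p))

belief = 0.5
for gaze in [False, False, False]:  # three consecutive off-road samples
    belief = update_belief(belief, gaze)
print(belief < 0.2)  # belief in attentiveness collapses → True
```

Feeding such a belief into the ASCS is what lets the controller shift autonomy smoothly: a high attentive belief leaves the driver in charge, while a collapsing one justifies earlier, stronger intervention.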
University of Southern California
National Science Foundation
Submitted by Laurent Itti on September 23rd, 2016
As information technology has transformed physical systems such as the power grid, the interface between these systems and their human users has become both richer and much more complex. For example, from the perspective of an electricity consumer, a whole host of devices and technologies are transforming how they interact with the grid: demand response programs; electric vehicles; "smart" thermostats and appliances; etc. These novel technologies are also forcing us to rethink how the grid interacts with its users, because critical objectives such as stability and robustness require effective integration among the many diverse users in the grid. This project studies the complex interweaving of humans and physical systems. Traditionally, a separation principle has been used to isolate humans from physical systems. This principle requires users to have preferences that are well-defined, stable, and quickly discoverable. These assumptions are increasingly violated in practice: users' preferences are often not well-defined, are unstable over time, and take time to discover. Our project articulates a new framework for interactions between physical systems and their users, where users' preferences must be repeatedly learned over time while the system continually operates with respect to imperfect preference information. We focus on the area of power systems. Our project has three main thrusts. First, user models are rethought to reflect this new dynamic view of user preferences, in which even the users are learning over time. The second thrust focuses on developing a new system model that learns about users, since we cannot understand users in a "single shot"; rather, repeated interaction with the user is required. We then focus on the integration of these two new models. How do we control and operate a physical system, in the presence of the interacting "learning loops", while mediating between many competing users? 
We apply ideas from mean field games and optimal power flow to capture, analyze, and transform the interaction between the system and the ongoing preference discovery process. Our methods will yield guidance for market design in power systems where user preferences are constantly evolving. If successful, our project will usher in a fundamental change in interfacing physical systems and users. For example, in the power grid, our project directly impacts how utilities design demand response programs; how smart devices learn from users; and how the smart grid operates. In support of this goal, the PIs intend to develop avenues for knowledge transfer through interactions with industry. The PIs will also change their education programs to reflect a greater entanglement between physical systems and users.
Stanford University
National Science Foundation
Submitted by Ramesh Johari on September 23rd, 2016