CPS: Synergy: Collaborative Research: Cyber-Physical Sensing, Modeling, and Control with Augmented Reality for Smart Manufacturing Workforce Training and Operations Management
Lead PI:
Zhaozheng Yin
Abstract
Smart manufacturing integrates information, technology, and human ingenuity to inspire the next revolution in the manufacturing industry. Manufacturing has been identified as a key strategic investment area by the U.S. government, private sector, and university leaders to spur innovation and keep America competitive. However, the lack of new methodologies and tools is challenging continuous innovation in the smart manufacturing industry. This award supports fundamental research to develop a cyber-physical sensing, modeling, and control infrastructure, coupled with augmented reality, to significantly improve the efficiency of future workforce training, the performance of operations management, and the safety and comfort of workers in smart manufacturing. Results from this research are expected to transform the practice of worker-machine-task coordination and provide a powerful tool for operations management. This research involves several disciplines, including sensing, data analytics, modeling, control, augmented reality, and workforce training, and will provide unique interdisciplinary training opportunities for students and future manufacturing engineers. An effective way for manufacturers to tackle and outpace the increasing complexity of product designs and ever-shortening product lifecycles is to effectively develop and assist the workforce. Yet the current management of manufacturing workforce systems relies mostly on traditional methods of data collection and modeling, such as subjective observations and after-the-fact statistics of workforce performance, which have reached a bottleneck in effectiveness. The goal of this project is to investigate an integrated set of cyber-physical system methods and tools to sense, understand, characterize, model, and optimize the learning and operation of manufacturing workers, so as to achieve significantly improved efficiency in worker training, effectiveness of behavioral operations management, and safety of front-line workers.
The research team will instrument a suite of sensors to gather real-time data about individual workers, worker-machine interactions, and the working environment; develop advanced methods and tools to track and understand workers' actions and physiological status; and detect their knowledge and skill deficiencies or assistance needs in real time. The project will also establish mathematical models that encode the manufacturing process in the researched sensing and analysis framework; characterize the efficiency of worker-machine-task coordination; model the learning curves of individual workers; investigate various multi-modal augmented reality-based visualization, guidance, control, and intervention schemes to improve task efficiency and worker safety; and deploy, test, and conduct comprehensive performance assessments of the researched technologies.
Performance Period: 02/01/2017 - 01/31/2020
Institution: Missouri University of Science and Technology
Sponsor: National Science Foundation
Award Number: 1646162
CPS: Frontier: Collaborative Research: Data-Driven Cyberphysical Systems
Lead PI:
Mario Sznaier
Abstract
Data-driven cyber-physical systems are ubiquitous in many sectors, including manufacturing, automotive, transportation, utilities, and health care. This project develops the theory, methods, and tools necessary to answer the central question "how can we, in a data-rich world, design and operate cyber-physical systems differently?" The resulting data-driven techniques will transform the design and operation process into one in which data and models - and human designers and operators - continuously and fluently interact. This integrated view promises capabilities beyond its parts. Explicitly integrating data will lead to more efficient decision-making and help reduce the gap from model-based design to system deployment. Furthermore, it will blend design- and run-time tasks, and help develop cyber-physical systems not only for their initial deployment but also for their lifetime. While the proposed theory, methods, and tools will cut across the spectrum of cyber-physical systems, the project focuses on their implications in the emerging application of additive manufacturing. Despite substantial engineering effort, additive manufacturing processes often fail to produce acceptable geometric, material, or electro-mechanical properties. Currently, there is no mechanism for predicting and correcting these systematic, repetitive errors, or for adapting the design process to encompass the peculiarities of this manufacturing style. A data-driven cyber-physical systems perspective has the potential to overcome these challenges in additive manufacturing. The project's education plan focuses on the much-needed transformation of the undergraduate and graduate curricula to train engineers and computer scientists who will create the next generation of cyber-physical systems with a data-driven mindset.
The team will reach out to K-12 students and educators through a range of activities, and to undergraduate students from underrepresented groups through year-long research projects. All educational material generated by the project will be shared publicly.
Performance Period: 10/01/2017 - 09/30/2020
Institution: Northeastern University
Sponsor: National Science Foundation
Award Number: 1646121
CPS: Synergy: Collaborative Research: Mapping and Querying Underground Infrastructure Systems
Lead PI:
Goce Trajcevski
Abstract
One of the challenges toward achieving the vision of smart cities is improving the state of the underground infrastructure. For example, large US cities have thousands of miles of aging water mains, resulting in hundreds of breaks every year and a large percentage of water consumption that is unaccounted for. The goal of this project is to develop models and methods to generate, analyze, and share data on underground infrastructure systems, such as water, gas, electricity, and sewer networks. The interdisciplinary team of investigators from the University of Illinois at Chicago, Brown University, and Northwestern University will leverage partnerships with the cities of Chicago and Evanston, Illinois, to make the approach and findings relevant to their stakeholders. Research results will be incorporated in courses at the three institutions. Outreach efforts include events for K-12 students to develop awareness about underground infrastructure from a data and computational perspective. The results of the project will ultimately help municipalities maintain and renovate civil infrastructure more effectively. Cities are cyber-physical systems on a grand scale, and developing precise knowledge of their infrastructure is critical to building a foundation for the future smart city. This project takes an information-centric approach, based on the complex interaction among thematic data layers, to developing, visualizing, querying, analyzing, and providing access to a comprehensive representation of the urban underground infrastructure, starting from incomplete and imprecise data.
Specifically, the project has the following main technical components: (1) generation of accurate GIS-based representations of underground infrastructure systems from paper maps, CAD drawings, and other legacy data sources; (2) visualization of multi-layer networks, combining schematic overview diagrams with detailed geometric representations; (3) query processing algorithms for integrating spatial, temporal, and network data about underground infrastructure systems; (4) data analytics spanning heterogeneous geospatial data sources and incorporating uncertainty and constraints; (5) selective access for stakeholders on a need-to-know basis to facilitate data sharing; and (6) evaluation in collaboration with the cities of Chicago and Evanston.
Performance Period: 09/01/2016 - 08/31/2019
Institution: Northwestern University
Sponsor: National Science Foundation
Award Number: 1646107
CPS: Synergy: Collaborative Research: Cyber-Physical Sensing, Modeling, and Control with Augmented Reality for Smart Manufacturing Workforce Training and Operations Management
Lead PI:
Zhihai He
Abstract
Smart manufacturing integrates information, technology, and human ingenuity to inspire the next revolution in the manufacturing industry. Manufacturing has been identified as a key strategic investment area by the U.S. government, private sector, and university leaders to spur innovation and keep America competitive. However, the lack of new methodologies and tools is challenging continuous innovation in the smart manufacturing industry. This award supports fundamental research to develop a cyber-physical sensing, modeling, and control infrastructure, coupled with augmented reality, to significantly improve the efficiency of future workforce training, the performance of operations management, and the safety and comfort of workers in smart manufacturing. Results from this research are expected to transform the practice of worker-machine-task coordination and provide a powerful tool for operations management. This research involves several disciplines, including sensing, data analytics, modeling, control, augmented reality, and workforce training, and will provide unique interdisciplinary training opportunities for students and future manufacturing engineers. An effective way for manufacturers to tackle and outpace the increasing complexity of product designs and ever-shortening product lifecycles is to effectively develop and assist the workforce. Yet the current management of manufacturing workforce systems relies mostly on traditional methods of data collection and modeling, such as subjective observations and after-the-fact statistics of workforce performance, which have reached a bottleneck in effectiveness. The goal of this project is to investigate an integrated set of cyber-physical system methods and tools to sense, understand, characterize, model, and optimize the learning and operation of manufacturing workers, so as to achieve significantly improved efficiency in worker training, effectiveness of behavioral operations management, and safety of front-line workers.
The research team will instrument a suite of sensors to gather real-time data about individual workers, worker-machine interactions, and the working environment; develop advanced methods and tools to track and understand workers' actions and physiological status; and detect their knowledge and skill deficiencies or assistance needs in real time. The project will also establish mathematical models that encode the manufacturing process in the researched sensing and analysis framework; characterize the efficiency of worker-machine-task coordination; model the learning curves of individual workers; investigate various multi-modal augmented reality-based visualization, guidance, control, and intervention schemes to improve task efficiency and worker safety; and deploy, test, and conduct comprehensive performance assessments of the researched technologies.
Performance Period: 02/01/2017 - 01/31/2020
Institution: University of Missouri-Columbia
Sponsor: National Science Foundation
Award Number: 1646065
CPS: Synergy: Collaborative Research: Foundations of Secure Cyber-Physical Systems of Systems
Lead PI:
Stephen Checkoway
Abstract
Factories, chemical plants, automobiles, and aircraft have come to be described today as cyber-physical systems of systems: distinct systems connected to form a larger and more complex system. For many such systems, correct operation is critical to safety, making their security of paramount importance. Unfortunately, because of their heterogeneous nature and special purpose, it is very difficult to determine whether a malicious attacker can make them behave in a manner that causes harm. This type of security analysis is an essential step in building and certifying secure systems. Unfortunately, today's state-of-the-art security analysis tools are tailored to the analysis of server, desktop, and mobile software. We currently lack the tools for analyzing the security of cyber-physical systems of systems. The proposed work will develop new techniques for testing and analyzing security properties of such systems. These techniques will be used to build a new generation of tools that can handle the complexity of modern cyber-physical systems and thus make these critical systems more secure. The technical approach taken by the investigators is to apply proven dynamic analysis techniques, including dynamic information flow tracking and symbolic execution, to this problem. Existing tools, while powerful, are monolithic, designed to apply a single technique to a single system. Scaling them to multiple heterogeneous systems is the main contribution of the proposed work. To do so, the investigators will develop a common platform for cross-system dynamic analysis supporting arbitrary combinations of component execution modes (physical, simulated, and emulated), requiring new coordination mechanisms. Second, building on the platform above, they will implement cross-system dynamic information flow tracking, allowing dynamic information flow tracking across simulated, emulated, and potentially physical components.
Third, they will extend existing symbolic/concrete execution techniques to execution across multiple heterogeneous systems. Fourth, they will introduce new ways of handling special-purpose hardware, a problem faced by dynamic analysis tools in general.
Performance Period: 10/01/2016 - 09/30/2019
Institution: University of Illinois at Chicago
Sponsor: National Science Foundation
Award Number: 1646063
CPS: Synergy: Connected Testbeds for Connected Vehicles
Lead PI:
Tulga Ersal
Abstract
This research team envisions that connected testbeds, i.e., remotely accessible testbeds integrated over a network in closed loop, will provide an affordable, repeatable, scalable, and high-fidelity solution for early cyber-physical evaluation of connected automated vehicle (CAV) technologies. Engineering testbeds are critical for empirical validation of new concepts and transitioning new theory to practice. However, the high cost of establishing new testbeds or scaling the existing ones up hinders their wide utilization. This project aims to develop a scientific foundation to support this vision and demonstrate its utility for developing CAV technologies. This application is significant, because a synergistic combination of connected vehicles and automated driving technologies is poised to transform the sustainability of our transportation system; automated driving technologies can leverage the information available from vehicle-to-vehicle (V2V) connectivity in optimal ways to dramatically reduce fuel consumption and emissions. However, state-of-the-art simulation and experimental capabilities fall short of addressing the need for realistic, repeatable, scalable, and affordable means to evaluate new CAV concepts and technologies. The goal of this project is to enable a high-fidelity integration of geographically dispersed powertrain testbeds and use this novel experimental capability to develop and test powertrain-level strategies to increase sustainability benefits of CAVs. To realize this vision, the first objective of this research is to develop a cyber-integration interface to increase coupling fidelity in connected testbeds. This objective will be pursued through a model-free predictor framework to compensate for network delays robustly. The second objective is to leverage this cyber-integration interface to create a connected testbed for CAVs. 
To this end, existing powertrain testbeds distributed across the University of Michigan campus and the Environmental Protection Agency will be leveraged. The third objective is to use this connected testbed for (i) developing powertrain-level strategies to minimize fuel consumption and emissions in CAV platoons of mixed vehicle types, including light-, medium-, and heavy-duty vehicles, (ii) uncovering the untapped potential of aggressively downsized powertrains, and (iii) understanding the limits of the benefits of connectivity due to various V2V communication issues. This research area provides a rich space to advance the science of cyber-physical systems and demonstrate their impact, as it spans multiple disciplines including time delay systems, system dynamics and control, hardware-in-the-loop simulation, engine control, powertrain management, and communication networks. The potential of CAVs to improve the sustainability of transportation is an outstanding example of how cyber-physical systems can have a societal impact. The connected testbeds concept, on the other hand, can benefit not only CAVs, but also a wide range of applications such as telerobotics, haptics, networked control systems, earthquake engineering, manufacturing, and aerospace. It can open new doors for researchers to perform unparalleled integrative collaborations by enabling them to leverage each other's testbeds remotely.
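The predictor-based delay compensation mentioned above can be illustrated with a toy sketch. This is not the project's actual predictor framework, only a minimal model-free illustration of the idea: a receiver that gets signal samples delayed by the network extrapolates forward from the most recent samples to estimate the signal's current value.

```python
def predict_current(history, dt, delay):
    """Estimate the signal's current value from delayed samples by
    linear extrapolation (a simple model-free predictor).

    history: most recent received samples, oldest first; the last
             sample is `delay` seconds old.
    dt:      sampling interval of the received samples.
    delay:   network delay to compensate for.
    """
    if len(history) < 2:
        return history[-1]
    slope = (history[-1] - history[-2]) / dt  # finite-difference rate estimate
    return history[-1] + slope * delay        # project forward across the delay

# A ramp signal x(t) = 2t sampled every 0.1 s, received with 0.3 s delay.
received = [2 * t / 10 for t in range(10)]    # samples at t = 0.0 ... 0.9 s
estimate = predict_current(received, dt=0.1, delay=0.3)
# For this linear signal the extrapolation is exact: x(1.2 s) = 2.4.
```

For a linear signal the extrapolation is exact; for real testbed signals with noise and curvature, robust predictor designs such as those the project investigates are needed.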
Performance Period: 10/01/2016 - 09/30/2019
Institution: University of Michigan Ann Arbor
Sponsor: National Science Foundation
Award Number: 1646019
CPS: Synergy: Collaborative Research: Closed-loop Hybrid Exoskeleton utilizing Wearable Ultrasound Imaging Sensors for Measuring Fatigue
Lead PI:
Nitin Sharma
Abstract
The goal of this project is to develop an automated assistive device capable of restoring walking and standing functions in persons with motor impairments. Although research on assistive devices, such as active and passive orthoses and exoskeletons, has been ongoing for several decades, the improvements in mobility have been modest due to a number of limitations. One major challenge has been the limited ability to sense and interpret the state of the human, including volitional motor intent and fatigue. The proposed device will combine powered electric motors with the power generated by the person's own muscles via electrical stimulation. This work proposes to develop novel sensors to monitor muscle function, and, when muscle fatigue is identified, the system will switch to the electric motors until the muscles recover. Through research on methods of seamless automated control of a hybrid assistive device while minimizing muscle fatigue, this study addresses significant limitations of prior work. The proposed project has the long-term potential to significantly improve the walking ability and quality of life of individuals with spinal cord injuries and stroke. The proposed work will also contribute to the new science of cyber-physical systems by integrating wearable image-based biosensing with physical exoskeleton systems through computational algorithms. This project will provide immersive interdisciplinary training for graduate and undergraduate students to integrate computational methods with imaging, robotics, human functional activity, and artificial devices for solving challenging public health problems. A strong emphasis will be placed on involving undergraduate students in research as part of structured programs at our institutions. Additionally, students with disabilities will be involved in these research activities by leveraging an ongoing NSF-funded project.
This project includes the development of wearable ultrasound imaging sensors and real-time image analysis algorithms that can provide direct measurement of the function and status of the underlying muscles. This will allow development of dynamic control allocation algorithms that utilize this information to distribute control between actuation and stimulation. This approach for closed-loop control based on muscle-specific feedback represents a paradigm shift from conventional lower extremity exoskeletons that rely only on joint kinematics for feedback. As a testbed for this new approach, the team will utilize a hybrid exoskeleton that combines active joint actuators with functional electrical stimulation of a person's own muscles. Repetitive electrical stimulation leads to the rapid onset of muscle fatigue that limits the utility of these hybrid systems and potentially increases risk of injury. The goals of the project are to: develop novel ultrasound sensing technology and image analysis algorithms for real-time sensing of muscle function and fatigue; investigate closed-loop control allocation algorithms utilizing measured muscle contraction rates to minimize fatigue; and integrate sensing and control methods into a closed-loop hybrid exoskeleton system and evaluate it on patients with spinal cord injury. The proposed approach will lead to innovative CPS science by (1) integrating a human-in-the-loop physical exoskeleton system with novel image-based real-time robust sensing of complex time-varying physical phenomena, such as dynamic neuromuscular activity and fatigue, and (2) developing novel computational models to interpret such phenomena and effectively adapt control strategies. This research will enable practical wearable image-based biosensing, with broader applications in healthcare.
This framework can be widely applicable in a number of medical CPS problems that involve a human in the loop, including upper and lower extremity prostheses and exoskeletons, rehabilitation and surgical robots. The new control allocation algorithms relying on sensor measurements could have broader applicability in fault-tolerant and redundant actuator systems, and reliable fault-tolerant control of unmanned aerial vehicles.
Performance Period: 01/01/2017 - 12/31/2020
Institution: University of Pittsburgh
Sponsor: National Science Foundation
Award Number: 1646009
CPS: Frontier: Collaborative Research: Data-Driven Cyberphysical Systems
Lead PI:
Alberto Sangiovanni Vincentelli
Abstract
Data-driven cyber-physical systems are ubiquitous in many sectors including manufacturing, automotive, transportation, utilities and health care. This project develops the theory, methods and tools necessary to answer the central question "how can we, in a data-rich world, design and operate cyber-physical systems differently?" The resulting data-driven techniques will transform the design and operation process into one in which data and models - and human designers and operators - continuously and fluently interact. This integrated view promises capabilities beyond its parts. Explicitly integrating data will lead to more efficient decision-making and help reduce the gap from model-based design to system deployment. Furthermore, it will blend design- and run-time tasks, and help develop cyber-physical systems not only for their initial deployment but also for their lifetime. While proposed theory, methods and tools will cut across the spectrum of cyber-physical systems, the project focuses on their implications in the emerging application of additive manufacturing. Even though a substantial amount of engineering time is spent, additive manufacturing processes often fail to produce acceptable geometric, material or electro-mechanical properties. Currently, there is no mechanism for predicting and correcting these systematic, repetitive errors nor to adapt the design process to encompass the peculiarities of this manufacturing style. A data-driven cyber-physical systems perspective has the potential to overcome these challenges in additive manufacturing. The project's education plan focuses on the already much needed transformation of the undergraduate and graduate curricula to train engineers and computer scientists who will create the next-generation of cyber-physical with a data-driven mindset. 
The team will reach out to K-12 students and educators through a range of activities, and to undergraduate students from underrepresented groups through year-long research projects. All educational material generated by the project will be shared publicly.
Performance Period: 10/01/2017 - 09/30/2020
Institution: University of California-Berkeley
Sponsor: National Science Foundation
Award Number: 1645964
CPS: Breakthrough: Multi-Sensory Event Detection for Cross-Platform Coordination and Verification
Lead PI:
Patrick Tague
Abstract
As researchers and developers move from digital to cyber-physical systems, a gap is emerging that is revealing challenges to performance and security in many different cyber-physical system domains. In particular, cybersecurity protections in the digital domain provide desirable protection and verification mechanisms that currently have no analog in the physical domain, limiting verification capabilities in cyber-physical systems. To illustrate this gap, suppose two devices are deployed in the same physical space where they should be allowed to coordinate despite lacking any previous physical or digital relationship. Instead of relying on risky human involvement or specialized hardware, automating the pairing process using measurable and verifiable details of common context should allow them to bootstrap a trust relationship. The project's goal of enabling this verification based on context measurement will fill an important gap in cyber-physical systems. The concept of "context fingerprinting" is offered as an approach to bridge this gap without the need for computationally intensive cryptography. It is a technique that allows two dissimilar devices to observe events, process measurement data, and create and exchange contextual fingerprints to verify a shared property. This project explores the CPS foundations for context fingerprinting of devices. The concept will be empirically evaluated, showing how to facilitate usable and secure bootstrapping of trust among IoT devices.
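The context fingerprinting idea can be illustrated with a minimal sketch. This is not the project's actual scheme, only an illustration under simplifying assumptions: each device coarsely quantizes its raw sensor observations of a shared event (so that small sensor noise does not change the result) and hashes them into a fixed-size fingerprint that can be exchanged and compared.

```python
import hashlib

def quantize(samples, step=0.5):
    """Coarsely quantize raw sensor samples so that two devices
    observing the same physical event agree despite sensor noise."""
    return tuple(round(s / step) for s in samples)

def fingerprint(samples, step=0.5):
    """Hash the quantized observation into a fixed-size context fingerprint."""
    return hashlib.sha256(repr(quantize(samples, step)).encode()).hexdigest()

# Two co-located devices observe the same event with small sensor noise.
device_a = [1.02, 3.48, 2.51, 0.97]
device_b = [0.98, 3.52, 2.49, 1.03]
# A device elsewhere observes a different event.
device_c = [5.10, 0.20, 4.70, 2.60]

assert fingerprint(device_a) == fingerprint(device_b)  # shared context confirmed
assert fingerprint(device_a) != fingerprint(device_c)  # different context rejected
```

A practical protocol must also handle quantization boundary effects and an adversary who guesses low-entropy observations, which is part of what makes the research problem hard.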
Performance Period: 01/01/2017 - 12/31/2019
Institution: Carnegie-Mellon University
Sponsor: National Science Foundation
Award Number: 1645759
CPS: Synergy: Image-Based Indoor Navigation for Visually Impaired Users
Lead PI:
Marco Duarte
Abstract
Severe visual impairment and blindness preclude many essential activities of daily living. Among these is independent navigation in unfamiliar indoor spaces without the assistance of a sighted companion. We propose to develop PERCEPT-V: an organic vision-driven, smartphone-based indoor navigation system, in which the user can navigate in open spaces without requiring retrofit of the environment. When the user seeks to obtain navigation instructions to a chosen destination, the smartphone will record observations from multiple onboard sensors in order to perform user localization. Once the location and orientation of the user are estimated, they are used to calculate the coordinates of the navigation landmarks surrounding the user. The system can then provide directions to the chosen destination, as well as an optional description of the landmarks around the user. We will focus on addressing the cyber-physical systems technology shortcomings usually encountered in the development of indoor navigation systems. More specifically, our project will consider the following transformative aspects in the design of PERCEPT-V: (i) Image-Based Indoor Localization and Orientation: PERCEPT-V will feature new computer vision-based localization algorithms that reduce the dependence on highly controlled image capture and richly informative images, increasing the reliability of localization from images taken by blind subjects in crowded environments; (ii) Customized Navigation Instructions: PERCEPT-V will deliver customized navigation instructions for visually impaired users that account for diverse levels of confidence and operator capabilities. A thorough final evaluation study featuring visually impaired participants will assess the hypotheses driving the design and refinements of PERCEPT-V using rigorous statistical analysis.
Performance Period: 02/01/2017 - 01/31/2020
Institution: University of Massachusetts Amherst
Sponsor: National Science Foundation
Award Number: 1645737