These terms denote educational areas that are part of cyber-physical systems (CPS) technology.
This research investigates a cyber-physical framework for scalable, long-term monitoring and maintenance of civil infrastructure. With the growth of the world economy and population, there has been an ever-increasing dependency on larger and more complex networks of civil infrastructure, as evident in the billions of dollars spent by federal, state, and local governments to upgrade or repair transportation systems and utilities. Despite these large expenditures, the nation continues to suffer staggering consequences from infrastructural decay. Paramount to the concept of a smart city of the future, therefore, is smart civil infrastructure that can monitor itself to predict impending failures and, in the case of extreme events (e.g., earthquakes), identify portions that require immediate repair and prioritize areas for emergency response. A goal of this research project is to make significant progress toward this grand vision by investigating a framework of infrastructural Internet-of-Things (i-IoT) using a network of self-powered, embedded health monitoring sensors. The collaborative and interdisciplinary nature of this research would provide opportunities for unique outreach programs involving undergraduate and graduate students in technical areas such as sensors, IoT, and structural health monitoring. The project would also provide avenues for disseminating the results of this research to stakeholders in state governments and for translating the results into field-deployable prototypes. This research addresses different elements of the proposed i-IoT framework by bringing together expertise from three universities in the areas of self-powered sensors, energy-scavenging processors, structural health monitoring, and earthquake engineering. At the fundamental level, the project involves investigating self-powered sensors that require zero maintenance and can operate continuously over the useful lifespan of the structure without experiencing any downtime. The challenge is that the sensors must occupy a small enough volume that an array of these devices can be easily embedded and still provide accurate spatial resolution for structural imaging. This research also investigates techniques that would enable real-time wireless collection of data from an array of self-powered sensors embedded inside a structure, without taking the structure out of service. The methods to be explored combine the physics of energy scavenging, transduction, rectification, and logic computation to improve the system's energy efficiency and reduce system latency. At the algorithmic level, the project explores novel structural failure prediction and structural forensic algorithms based on historical data collected from self-powered sensors embedded at different spatial locations. This includes kernel algorithms that can exploit the data to quickly identify the most vulnerable parts of a structure after a man-made or natural crisis (for example, an earthquake). Finally, the technology translation plan for this research is to validate the proposed i-IoT framework in real-world deployments, including buildings, multi-span bridges, and highways.
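As an illustration of how historical data from embedded sensors might feed a kernel algorithm for vulnerability ranking, the sketch below trains a one-class SVM on baseline readings and scores post-event readings per sensor location. It is a minimal sketch under assumed data (the feature layout, sensor counts, and damage pattern are hypothetical), not the project's actual algorithm.

```python
# Hypothetical illustration: rank sensor locations by how anomalous their
# post-event readings look relative to a healthy baseline, using a kernel
# one-class SVM. Feature layout, sensor counts, and damage pattern are made up.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Rows = sensor locations; columns = cumulative strain-event counts in a few
# amplitude bins (the kind of histogram a self-powered strain sensor might log).
baseline = rng.normal(loc=100.0, scale=5.0, size=(40, 6))    # healthy history
post_event = baseline + rng.normal(0.0, 5.0, size=(40, 6))
post_event[[3, 17]] += 60.0                                   # two damaged zones

scaler = StandardScaler().fit(baseline)
model = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05)
model.fit(scaler.transform(baseline))

# Lower decision-function values mean more anomalous, i.e. higher repair priority.
scores = model.decision_function(scaler.transform(post_event))
priority = np.argsort(scores)
print("Top candidate locations for inspection:", priority[:5])
```

In a deployed system, the ranking would be recomputed from the sensors' accumulated event logs after each extreme event.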
Washington University in St. Louis
National Science Foundation
Xuan Zhang
Submitted by Shantanu Chakrabartty on July 12th, 2017
The objective of this research is to (1) gain insights into the challenges of securing interactions in Internet of Things (IoT) deployments, (2) develop a practical framework that mitigates security and privacy threats to IoT interactions, and (3) validate the proposed framework in a medium-scale IoT testbed and through user studies. The emerging IoT computing paradigm promises novel applications in almost all sectors by enabling interactions between users, sensors, and actuators. These interactions can take the form of device-to-device (e.g., Bluetooth Low Energy (BLE)) or human-to-device (e.g., voice control) communication. By exploiting vulnerabilities in these interaction surfaces, an adversary can gain unauthorized access to the IoT, enabling tracking, profiling, and harm to the user. With thousands of diverse IoT manufacturers, developers, and devices, it is very challenging, if not impossible, to ensure that all devices are properly secured at production and kept up to date afterwards. IoT users and administrators have to place their trust in a set of devices, with the least secure device breaking the security chain. By shifting the trust base from the various manufacturers and developers to a single framework under the user's control, deploying IoT devices will become more feasible and less vulnerable. The proposed framework will help advance the national health, prosperity, and welfare, and also secure the national defense. Case studies on securing IoT interaction surfaces will be integrated into graduate-level courses and used to train students (especially underrepresented and female students) in interdisciplinary topics that require a balanced mix of theory and practice, thus developing human resources in nationally needed areas. The proposed research will also significantly advance the understanding of the challenges of securing IoT interaction surfaces in practice, thus promoting the progress of science. This project will establish a general direction for securing interactions in current and future IoT deployments. It will offer an additional protection layer in cases where security cannot be properly built in and maintained.
University of Michigan Ann Arbor
National Science Foundation
Kang Shin
Submitted by Kang Shin on July 12th, 2017
Due to their increasing use by civil and federal authorities and vast commercial and amateur applications, Unmanned Aerial Systems (UAS) will be introduced into the National Airspace System (NAS); the question is only how this can be done safely. Today, NASA and the FAA are designing a new (NextGen) automated air traffic control system for all aircraft, manned or unmanned. New algorithms and tools will need to be developed to answer the complex questions inherent in designing such a system while proving adherence to rigorous safety standards. Researchers must develop the tools of formal analysis to be able to address the UAS-in-the-NAS problem, reason about UAS integration during the design phase of NextGen, and tie this design to on-board capabilities that provide runtime System Health Management (SHM), ensuring the safety of people and property on the ground. This proposal takes a holistic view and integrates advances in the state of the art from three intertwined perspectives to address safe integration of unmanned systems into the national airspace: from on-board the vehicle, from the environment (the NAS), and from the underlying theory enabling their formal analysis. There has been rapid development of new UAS technologies, yet few of them have the formal mathematical rigor needed for FAA safety-critical system certification. This project bridges that gap, integrating new UAS and air traffic control designs with advances in formal analysis. Within the wealth of promising directions for autonomous UAS capabilities, this project fills a unique need, providing a direct synergy between on-board UAS SHM, the NAS environment in which the vehicles must operate, and the theoretical foundations common to both. This research will help to build a safer NAS with increased capacity for UAS and create broadly impactful capabilities for SHM on-board UAS. Advancements will require theoretical research into more scalable model checking and debugging of safety properties. Safety properties express the sentiment that "something bad does not happen" during any system execution; they represent the vast majority of the requirements for NextGen designs and all of the requirements researchers can monitor on-board a UAS for system health management at runtime. This research will tackle new frontiers in embedding health management capabilities on-board UAS. Collaborations with aerospace system designers at the National Aeronautics and Space Administration and tool designers at the Bruno Kessler Foundation will aid real-life utility and technology transfer. Broader impact will be achieved by involving undergraduate students in the design of an open-source, affordable, all-COTS, 3D-printable UAS, which will facilitate flight testing of this project's research advances. An open UAS design for academia will be useful both for classroom demonstrations and as a research platform. Further impact will be achieved by using this UAS and the research it enables in interactive teaching experiences for K-12, undergraduate, and graduate students and in mentoring outreach specifically targeted at girls achieving in Science, Technology, Engineering, and Mathematics (STEM) subjects.
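To make the notion of runtime monitoring of a safety property concrete, the sketch below checks a property of the form "something bad never happens" over a stream of telemetry samples. The property, thresholds, and telemetry fields are hypothetical examples, not requirements from this project.

```python
# Minimal sketch of an on-board runtime monitor for a safety property of the
# form "G (bad condition never holds)". Thresholds and fields are placeholders.
from dataclasses import dataclass

@dataclass
class Telemetry:
    altitude_m: float       # altitude above ground level
    geofence_dist_m: float  # distance to the geofence boundary (negative = outside)
    link_ok: bool           # command-and-control link status

def violates_safety(t: Telemetry) -> bool:
    """True as soon as the 'bad thing' happens; a safety property is falsified
    by a finite prefix of the execution, so one bad sample is enough."""
    return t.altitude_m < 10.0 or t.geofence_dist_m < 0.0 or not t.link_ok

def monitor(stream):
    """Yield a verdict for every telemetry sample; latch the failure verdict."""
    failed = False
    for t in stream:
        failed = failed or violates_safety(t)
        yield "VIOLATION" if failed else "OK"

# Example trace: the third sample leaves the geofence, so every verdict from
# that point on stays a violation (the property can never be restored).
trace = [Telemetry(50, 120, True), Telemetry(45, 80, True),
         Telemetry(40, -5, True), Telemetry(60, 30, True)]
print(list(monitor(trace)))   # ['OK', 'OK', 'VIOLATION', 'VIOLATION']
```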
University of Cincinnati
National Science Foundation
Submitted by Kristin Yvonne Rozier on May 30th, 2017
This Frontier award supports the SONYC project, a smart cities initiative focused on developing a cyber-physical system (CPS) for the monitoring, analysis, and mitigation of urban noise pollution. Noise pollution is one of the top quality-of-life issues for urban residents in the U.S., with proven effects on health, education, the economy, and the environment. Yet most cities lack the resources to continuously monitor noise and understand the contribution of individual sources, the tools to analyze patterns of noise pollution at city scale, and the means to empower city agencies to take effective, data-driven action for noise mitigation. The SONYC project advances novel technological and socio-technical solutions that help address these needs. SONYC includes a distributed network of both sensors and people for large-scale noise monitoring. The sensors use low-cost, low-power technology and cutting-edge machine listening techniques to produce calibrated acoustic measurements and recognize individual sound sources in real time. Citizen science methods are used to help urban residents connect to city agencies and each other, understand their noise footprint, and facilitate reporting and self-regulation. Crucially, SONYC utilizes big data solutions to analyze, retrieve, and visualize information from sensors and citizens, creating a comprehensive acoustic model of the city that can be used to identify significant patterns of noise pollution. These data can in turn be used to drive the strategic application of noise code enforcement by city agencies in a way that optimally reduces noise pollution. The entire system, integrating cyber, physical, and social infrastructure, forms a closed loop of continuous sensing, analysis, and actuation on the environment. SONYC is an interdisciplinary collaboration between researchers at New York University and Ohio State University. It provides multiple educational opportunities to students at all levels, including an outreach initiative for K-12 STEM education. The project uses New York City as its focal point, involving partnerships with the city's Department of Environmental Protection, Department of Health and Mental Hygiene, the business improvement district of Lower Manhattan, and ARUP, one of the world's leaders in environmental acoustics. SONYC is an innovative and high-impact application of cyber-physical systems to the realm of smart cities, and potentially a catalyst for new CPS research at the intersection of engineering, data science, and the social sciences. It provides a blueprint for the mitigation of noise pollution that can be applied to cities in the US and abroad, potentially affecting the quality of life of millions of people.
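As a rough illustration of the calibrated-measurement side of such a sensor (not SONYC's actual pipeline), the sketch below converts raw microphone frames into sound pressure levels in decibels using an assumed calibration offset.

```python
# Toy sketch of calibrated noise measurement: raw mono samples in [-1, 1] are
# converted to one decibel value per one-second frame. Sample rate, frame
# length, and calibration offset are placeholder values.
import numpy as np

SAMPLE_RATE = 44100            # Hz
FRAME_LEN = SAMPLE_RATE        # one-second frames
CAL_OFFSET_DB = 114.0          # placeholder: dB SPL corresponding to digital
                               # full scale, obtained from a calibrator tone

def frame_levels(samples: np.ndarray) -> np.ndarray:
    """Return one dB value per non-overlapping one-second frame."""
    n_frames = len(samples) // FRAME_LEN
    frames = samples[: n_frames * FRAME_LEN].reshape(n_frames, FRAME_LEN)
    rms = np.sqrt(np.mean(frames ** 2, axis=1))
    dbfs = 20.0 * np.log10(np.maximum(rms, 1e-12))   # level relative to full scale
    return dbfs + CAL_OFFSET_DB                      # shift into calibrated dB SPL

# Example: a quiet 1 kHz tone followed by one 20 dB louder.
t = np.arange(2 * SAMPLE_RATE) / SAMPLE_RATE
signal = np.concatenate([0.01 * np.sin(2 * np.pi * 1000 * t[:SAMPLE_RATE]),
                         0.10 * np.sin(2 * np.pi * 1000 * t[SAMPLE_RATE:])])
print(frame_levels(signal))    # second frame reads about 20 dB higher
```

Source recognition would sit alongside this measurement step, classifying each frame into sound categories; that part is omitted here.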
New York University
National Science Foundation
Claudio Silva
Roger DuBois
Juan Bello
Anish Arora
Submitted by Anish Arora on May 26th, 2017
The recent increase in the variety and usage of wearable sensing systems allows for the continuous monitoring of the health and wellness of users. The output of these systems enables individuals to make changes to their personal routines in order to minimize exposure to pollutants and maintain healthy levels of exercise. Furthermore, medical practitioners are using these systems to monitor proper activity levels for rehabilitation purposes and to monitor threatening conditions such as heart arrhythmias. However, substantial work remains to facilitate the processing and interpretation of such information in order to maximize impact. This project develops a computational framework that models the complex interactions between physiological and environmental factors contributing to an individual's health. The contributions of this award will facilitate the broad adoption of wearable sensing platforms and innovative analytical tools by individuals and medical practitioners. This award develops methodology for the estimation and prediction of physiological responses and environmental factors, with the objective of enabling users to efficiently change their behavior. To accomplish this objective, the framework builds on tools from statistical analysis, topological data analysis, optimization theory, and human behavior analysis. This novel framework will not only develop new formal techniques but also serve as a bridge between these cross-disciplinary fields. In particular, the proposed hierarchical computational framework has the potential to provide a trade-off between accuracy and computational flexibility based on the choice of granularity of the representation. This award will: (1) develop methodology for the concurrent representation of physiological, kinematic, and environmental states for inference purposes; (2) develop techniques for mapping representations between different systems to enable information sharing; and (3) develop techniques to maximize the impact on the behavior of individuals by building on the proposed data representation. The algorithm development will be informed by the limitations of embedded platforms, including memory, computational, and power capabilities, as well as transmission costs when off-board processing is required. The proposed techniques will empower users and medical practitioners to understand, analyze, and make decisions based on patterns in the data. The outcomes of this project will empower medical practitioners by providing innovative and effective tools for wearable sensing systems that enable efficient pattern identification, data representation, and visualization. Besides training the students directly working on this project, the data sets and algorithms developed will be incorporated into a new graduate course on computational techniques for physiological and environmental sensing. Undergraduate students will be engaged through participation in data collection experiments, REUs, and local demonstrations. Underrepresented undergraduate student communities will be exposed to the research at the national level through demos presented at well-known diversity conferences in the STEM fields. Furthermore, local K-12 student communities will be engaged via summer workshops prepared for students and educators.
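The granularity trade-off mentioned above can be illustrated with a toy example: one simulated heart-rate stream summarized at several window sizes, trading reconstruction accuracy against the number of stored values. The signal, window sizes, and error metric are hypothetical, not the project's representation.

```python
# Toy illustration of the granularity trade-off on a wearable device: coarser
# summaries need less memory and compute but reconstruct the signal less
# accurately. All numbers below are made up for this sketch.
import numpy as np

rng = np.random.default_rng(1)
hr = 70 + 10 * np.sin(np.linspace(0, 6 * np.pi, 3600)) + rng.normal(0, 2, 3600)

def summarize(signal: np.ndarray, window: int) -> np.ndarray:
    """Mean value per non-overlapping window of `window` samples."""
    n = len(signal) // window
    return signal[: n * window].reshape(n, window).mean(axis=1)

for window in (10, 60, 600):           # finer -> coarser representations
    coarse = summarize(hr, window)
    recon = np.repeat(coarse, window)  # what the coarse summary can reconstruct
    err = np.abs(recon - hr[: len(recon)]).mean()
    print(f"window={window:4d}  stored={len(coarse):4d}  mean abs error={err:.2f} bpm")
```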
North Carolina State University
National Science Foundation
Submitted by Edgar Lobaton on May 26th, 2017
Equipment operation represents one of the most dangerous tasks on a construction site, and accidents related to such operation often result in death and property damage on the construction site and in the surrounding area. Such accidents can also cause considerable delays and disruption and negatively impact the efficiency of operations. This award will conduct research to improve the safety and efficiency of cranes by integrating advances in robotics, computer vision, and construction management. It will create tools for quick and easy planning of crane operations and incorporate them into a safe and efficient system that can monitor a crane's environment and provide control feedback to the crane and the operator. The resulting gains in safety and efficiency will reduce fatal and non-fatal crane accidents. Partnerships with industry will also ensure that these advances have a positive impact on construction practice and can be extended broadly to smart infrastructure, intelligent manufacturing, surveillance, traffic monitoring, and other application areas. The research will involve undergraduates and includes outreach to K-12 students. The work is driven by the hypothesis that the monitoring and control of cranes can be performed autonomously using robotics and computer vision algorithms, and that detailed and continuous monitoring and control feedback can lead to improved planning and simulation of equipment operations. It will particularly focus on developing methods for (a) planning construction operations while accounting for safety hazards through simulation; (b) estimating and providing analytics on the state of the equipment; (c) monitoring the crane's surrounding operating environment, including detection of safety hazards and proximity analysis to dynamic resources including materials, equipment, and workers; (d) controlling crane stability in real time; and (e) providing feedback to the user and equipment operators in a "transparent cockpit" using visual and haptic cues. It will address the underlying research challenges by improving the efficiency and reliability of planning through failure effects analysis and creating methods for contact state estimation and equilibrium analysis; improving monitoring through model-driven and real-time 3D reconstruction techniques, context-driven object recognition, and forecasting of the motion trajectories of objects; enhancing the reliability of control through dynamic crane models, measures of instability, and algorithms for finding optimal controls; and, finally, improving the efficiency of feedback loops through methods for providing visual and haptic cues.
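As a simple illustration of the proximity-analysis step in (c), the sketch below flags tracked workers who enter assumed warning or danger zones around the suspended load. The zone radii, coordinates, and track IDs are hypothetical; in the actual system such positions would come from the 3D reconstruction and object recognition pipeline described above.

```python
# Minimal sketch of proximity analysis: flag tracked workers inside warning or
# danger zones around the crane load. All values are placeholders.
import math

DANGER_M, WARNING_M = 3.0, 7.0   # placeholder zone radii around the load

def proximity_alerts(load_xyz, worker_tracks):
    """worker_tracks: {track_id: (x, y, z)} -> {track_id: 'danger' | 'warning'}"""
    alerts = {}
    for track_id, pos in worker_tracks.items():
        d = math.dist(load_xyz, pos)         # Euclidean distance in meters
        if d < DANGER_M:
            alerts[track_id] = "danger"
        elif d < WARNING_M:
            alerts[track_id] = "warning"
    return alerts

# Example frame: one worker well clear, one in the warning zone, one too close.
print(proximity_alerts((0.0, 0.0, 12.0),
                       {"w1": (20.0, 5.0, 0.0),
                        "w2": (4.0, 2.0, 10.0),
                        "w3": (1.0, 0.5, 11.0)}))
# {'w2': 'warning', 'w3': 'danger'}
```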
University of Illinois at Urbana-Champaign
National Science Foundation
Mani Golparvar-Fard
Submitted by Mani Golparvar-Fard on May 25th, 2017
Computation is everywhere. Greeting cards have processors that play songs. Fireworks have processors for precisely timing their detonation. Computers are in engines, monitoring combustion and performance. They are in our homes, hospitals, offices, ovens, planes, trains, and automobiles. These computers, when networked, will form the Internet of Things (IoT). The resulting applications and services have the potential to be even more transformative than the World Wide Web. The security implications are enormous. Internet threats today steal credit cards. Internet threats tomorrow will disable home security systems, flood fields, and disrupt hospitals. The root problem is that these applications consist of software on tiny low-power devices and cloud servers, involve difficult networking, and collect sensitive data that deserves strong cryptography, but are usually written by developers who have expertise in none of these areas. The goal of the research is to make it possible for two developers to build a complete, secure Internet of Things application in three months. The research focuses on four important principles. The first is "distributed model view controller." A developer writes an application as a distributed pipeline of model-view-controller systems. A model specifies what data the application generates and stores, while a new abstraction called a transform specifies how data moves from one model to another. The second is "embedded-gateway-cloud." A common architecture dominates Internet of Things applications: embedded devices communicate with a gateway over low-power wireless, and the gateway processes data and communicates with cloud systems in the broader Internet. Focusing distributed model view controller on this dominant architecture constrains the problem sufficiently to make problems such as system security tractable. The third is "end-to-end security." Data emerges encrypted from embedded devices and can only be decrypted by end-user applications. Servers can compute on encrypted data, and many parties can collaboratively compute results without learning the inputs. Analysis of the data processing pipeline allows the system and runtime to assert and verify security properties of the whole application. The final principle is "software-defined hardware." Because designing new embedded device hardware is time consuming, developers rely on general, overkill solutions and ignore the resulting security implications. The data processing pipeline can be compiled into a prototype hardware design and supporting software, as well as test cases, diagnostics, and a debugging methodology for a developer to bring up the new device. These principles are grounded in Ravel, a software framework that the team collaborates on, jointly contributes to, and integrates into its courses and curricula on cyber-physical systems.
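The model/transform abstraction described above can be sketched in a few lines. The code below is an illustrative toy in Python, not Ravel's actual API; all class, field, and function names are hypothetical.

```python
# Toy sketch of "distributed model view controller": models declare what data
# exists at a tier (embedded, gateway, cloud), and transforms declare how
# records flow from one model to the next. Not Ravel's real API.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Model:
    name: str                     # e.g. "embedded.samples" or "cloud.averages"
    records: List[dict] = field(default_factory=list)

    def store(self, record: dict):
        self.records.append(record)

@dataclass
class Transform:
    source: Model
    sink: Model
    fn: Callable[[List[dict]], List[dict]]   # how data moves between models

    def run(self):
        for record in self.fn(self.source.records):
            self.sink.store(record)

# Embedded-gateway-cloud style pipeline: raw samples -> per-device averages.
embedded = Model("embedded.samples")
cloud = Model("cloud.averages")
for v in (20.5, 21.0, 22.1):
    embedded.store({"device": "sensor-1", "temp_c": v})

def average_by_device(records):
    temps = [r["temp_c"] for r in records]
    return [{"device": records[0]["device"], "avg_c": sum(temps) / len(temps)}]

Transform(embedded, cloud, average_by_device).run()
print(cloud.records)   # [{'device': 'sensor-1', 'avg_c': 21.2}]
```

In the framework described above, such a declarative pipeline is what would be analyzed for security properties and compiled toward prototype hardware; this sketch only shows the data-flow shape of the abstraction.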
University of California at Berkeley
National Science Foundation
Submitted by Bjoern Hartmann on May 4th, 2017
The Cyber Resilient Energy Delivery Consortium (CREDC) is excited to present the 2017 Summer School program to be held June 12-16, 2017! Our opening reception will be on Sunday, June 11.
Submitted by Adam Hahn on May 1st, 2017
Event
mLearn 2017
16th World Conference on Mobile and Contextual Learning (mLearn 2017) Golden Bay Beach Hotel, Larnaca, Cyprus | 30 October - 1 November, 2017 | http://iamlearn.org/mlearn/ In Cooperation with ACM SIGAPP
Submitted by Anonymous on April 14th, 2017

Call for Student Posters on Unmanned Aircraft Systems (UAS) Communications and Networking Research

 

Kamesh Namuduri
Submitted by Kamesh Namuduri on January 24th, 2017