Applications of CPS technologies in health care.
A wide range of health outcomes is affected by air pollution. In March 2014 the World Health Organization (WHO) reported that in 2012 alone a staggering 7 million people died as a result of air pollution exposure, one in eight of total global deaths. A major component of this pollution is airborne particulate matter, which also drives allergic disease: approximately 50 million Americans have allergic diseases. Airborne particulate matter particularly affects the citizens of Chattanooga, TN. This project will develop and field the first integrated IoT in-situ sensor package tracking pollution and pollen to provide airborne particulate mapping for Chattanooga. Longer term, it is hoped that the data collection approach and initial visualization tools developed in Chattanooga can support a nationwide, open-access dissemination platform on the order of Google's StreetView, called PollutionView. Scaling the project's pilot results through such a PollutionView tool would contribute significantly to a transformation of the Environmental Public Health field in the United States. The project involves real-time big-data analysis at a fine geographic grain, which will require trade-offs between sensing and computing, especially if the sensor package is to be deployed at scale. The project will help determine whether real-time allergen collection and visualization can improve health and wellness. Thus, this project combines Cyber-Physical Systems (CPS) and gigabit networks to address major health concerns due to air pollution. A working demonstration of this project will be presented during the Global City Teams meeting in June 2015, with an update in June 2016.
The objectives of this project are twofold: first, to develop and deploy an array of Internet of Things (IoT) in-situ sensors within Chattanooga capable of comprehensively characterizing air quality in real time, including location, temperature, pressure, humidity, the abundance of six criteria pollutants (O3, CO, NO, NO2, SO2, and H2S), and the abundance of airborne particulates, both pollen-sized particles (10-40 µm) and smaller PM2.5 (<2.5 µm) particles; and second, to conduct a pollen validation campaign by deploying an in-situ pollen air sampler in Chattanooga to identify specific pollen types.
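As a concrete illustration, the record such a sensor package might report can be sketched as follows. The field names, JSON layout, and default threshold are illustrative assumptions, not the project's actual schema (the 35 µg/m3 default reflects the US EPA 24-hour PM2.5 standard of the period):

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class AirQualityReading:
    lat: float                       # location
    lon: float
    temp_c: float                    # temperature, deg C
    pressure_hpa: float              # barometric pressure
    humidity_pct: float              # relative humidity
    gases_ppb: dict                  # O3, CO, NO, NO2, SO2, H2S abundances
    pm2_5_ugm3: float                # fine particulates (<2.5 um)
    pollen_per_m3: float             # pollen-sized particulates (10-40 um)

    def to_json(self) -> str:
        """Serialize one reading for transmission over the network."""
        return json.dumps(asdict(self))

def pm25_exceeds_daily_standard(reading, limit_ugm3=35.0):
    """Flag readings above an assumed 24-hour PM2.5 limit (ug/m3)."""
    return reading.pm2_5_ugm3 > limit_ugm3
```

A mapping tool such as the envisioned PollutionView would aggregate many such records by location and time.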
University of Texas at Dallas
National Science Foundation
Submitted by David Lary on December 22nd, 2015
The project investigates a formal verification framework for artificial pancreas (AP) controllers that automate the delivery of insulin to patients with type-1 diabetes (T1D). AP controllers are safety critical: excessive insulin delivery can lead to serious, potentially fatal, consequences. The verification framework under development allows designers of AP controllers to check that their control algorithms will operate safely and reliably against large disturbances that include patient meals, physical activities, and sensor anomalies such as noise, delays, and sensor attenuation. The intellectual merits of the project lie in the development of state-of-the-art formal verification tools that reason over mathematical models of the closed loop, including external disturbances and the insulin-glucose response. These tools perform an exhaustive exploration of the closed-loop system behaviors, generating potentially adverse situations for the control algorithm under verification. In addition, automatic techniques are being investigated to help AP designers improve the control algorithm by tuning controller parameters to eliminate harmful behaviors and optimize performance. The broader significance and importance of the project are to minimize the manual testing effort for AP controllers, integrate formal tools in the certification process, and ultimately ensure the availability of safe and reliable devices to patients with type-1 diabetes. The framework is made available to researchers who are developing AP controllers to help them verify and iteratively improve their designs. The team is integrating the research into the educational mission by designing hands-on courses that train undergraduate students in the science of Cyber-Physical Systems (CPS) using the design of AP controllers as a motivating example.
Furthermore, educational material that explains the basic ideas, current challenges, and promise of the AP concept is being made available to a wide audience that includes patients with T1D, their families, interested students, and researchers. The research is being carried out collaboratively by teams of experts in formal verification for Cyber-Physical Systems, control system experts with experience designing AP controllers, mathematical modeling experts, and clinical experts who have clinically evaluated AP controllers. To enable the construction of the verification framework from the current state-of-the-art verification tools, the project is addressing major research challenges, including (a) building plausible mathematical models of disturbances from available clinical datasets characterizing human meals, activity patterns, and continuous glucose sensor anomalies, and integrating the resulting models into a formal verification framework; (b) simplifying existing models of the insulin-glucose response using smaller but more complex delay differential models; (c) automating the process of abstracting the controller implementation for the purposes of verification; (d) producing verification results that can be interpreted by control engineers and clinical researchers without their necessarily understanding formal verification techniques; and (e) partially automating the process of design improvements to eliminate severe faults and improve performance. The framework is evaluated on a set of promising AP controller designs that are currently at various stages of clinical evaluation.
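To make the exhaustive-exploration idea concrete, the following sketch sweeps a grid of disturbances (meal size and a constant CGM sensor bias) through a deliberately crude discrete-time glucose model under a naive proportional controller, and reports any run that leaves a safe glucose range. The model, controller, and constants are toy stand-ins, not the project's verified AP algorithms or physiological models:

```python
def simulate(meal_mgdl, sensor_bias, steps=200):
    """Toy closed loop: glucose g (mg/dL) under a naive P-controller whose
    CGM reading carries a constant bias; a meal disturbance hits at step 20."""
    g, insulin = 120.0, 0.0
    trace = []
    for t in range(steps):
        measured = g + sensor_bias                    # sensor anomaly
        dose = max(0.0, 0.05 * (measured - 110.0))    # naive P-control
        insulin = 0.9 * insulin + dose                # first-order decay
        meal = meal_mgdl if t == 20 else 0.0
        g += meal - 0.5 * insulin + 0.01 * (100.0 - g)  # crude dynamics
        trace.append(g)
    return trace

def find_unsafe(meals, biases, low=70.0, high=300.0):
    """Exhaustively sweep the disturbance grid; return (meal, bias) pairs
    whose trace leaves the safe glucose range [low, high]."""
    unsafe = []
    for m in meals:
        for b in biases:
            tr = simulate(m, b)
            if min(tr) < low or max(tr) > high:
                unsafe.append((m, b))
    return unsafe
```

Even this crude sweep exhibits the failure mode the abstract describes: a large positive sensor bias makes the controller over-dose insulin and drive the (toy) patient into hypoglycemia.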
University of Texas at El Paso
National Science Foundation
Submitted by Fraser Cameron on December 22nd, 2015
Cyber-physical systems (CPSs) are merging into major mobile systems of our society, such as public transportation, supply chains, and taxi networks. Past researchers have accumulated significant knowledge for designing cyber-physical systems, such as for military surveillance, infrastructure protection, scientific exploration, and smart environments, but primarily in relatively stationary settings, i.e., where spatial and mobility diversity is limited. In contrast, mobile CPSs interact with phenomena of interest at different locations and in different environments, where the context information (e.g., network availability and connectivity) about these physical locations might not be available. This unique feature calls for new solutions that seamlessly integrate mobile computing with the physical world, including dynamic access to multiple wireless technologies. The required solutions are addressed by (i) creating a network control architecture based on novel predictive hierarchical control that accounts for the characteristics of wireless communication, (ii) developing formal network control models based on in-situ network system identification and cross-layer optimization, and (iii) designing and implementing a reference implementation on a small-scale wireless and vehicular test-bed based on law enforcement vehicles. The results can improve mobile transportation systems such as future taxi control and dispatch systems. In this application the advantages include: (i) reducing the time for drivers to find customers; (ii) reducing the time passengers wait; (iii) avoiding and preventing traffic congestion; (iv) reducing gas consumption and operating cost; (v) improving driver and vehicle safety; and (vi) enforcing municipal regulations. Class modules developed on mobile computing and CPS will be used at the four participating universities and then made available via the Web.
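The predictive, receding-horizon flavor of this network control can be sketched in miniature: given rough historical throughput estimates per (waypoint, network), a vehicle repeatedly picks the wireless technology with the best predicted throughput over its next few waypoints. The data, horizon, and selection rule are illustrative assumptions, not the project's actual control architecture:

```python
def predict_throughput(history, network, waypoints):
    """Mean historical throughput (Mbps) of `network` over upcoming waypoints;
    unseen (waypoint, network) pairs are pessimistically treated as 0."""
    samples = [history.get((w, network), 0.0) for w in waypoints]
    return sum(samples) / len(samples)

def select_network(history, networks, route, horizon=3):
    """Receding-horizon policy: at each waypoint, pick the network with the
    best predicted throughput over the next `horizon` waypoints."""
    plan = []
    for i in range(len(route)):
        ahead = route[i:i + horizon]
        best = max(networks,
                   key=lambda n: predict_throughput(history, n, ahead))
        plan.append(best)
    return plan
```

Looking ahead, rather than at the current waypoint only, lets the vehicle avoid committing to a network that is about to become unavailable along its route.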
SUNY at Stony Brook
National Science Foundation
Submitted by Shan Lin on December 22nd, 2015
Millions of mobile applications (apps) are being developed in domains such as energy, health, security, and entertainment. The US FDA expects that there will be 500 million smartphone users downloading healthcare-related apps by the end of 2015. Many of these apps will perform interventions to control human physiological parameters such as blood pressure and heart rate. The intervention aspects of the apps can cause dependency problems; e.g., multiple interventions from multiple apps can increase or decrease each other's effects, some of which can be harmful to the user. Detecting and resolving these dependencies are the main goals of this project. Such dependency problems occur mainly because (i) each app is developed independently, without knowledge of how other apps work, and (ii) when an app performs an intervention to control its target parameters (e.g., blood pressure), it may unknowingly affect other physiological parameters (e.g., kidney function). Success in this research can significantly improve the safety of home health care. This project will develop EyePhy, a completely new approach to primary and secondary dependency analysis for wellness and mobile medical apps based on smartphones. The approach offers personalized dependency analysis and accounts for time-dependent interventions, such as the time interval for which a drug or other intervention is effective. To do so, EyePhy uses a physiological simulator called HumMod, which was developed by the medical community to model the complex interactions of human physiology using over 7,800 variables. Among the goals of EyePhy are reducing app developers' effort in specifying dependency metadata compared to state-of-the-art solutions, offering personalized dependency analysis for the user, and identifying problems in real time, as medical app products are being used.
A priori proofs that individual cyber-physical system (CPS) app devices are safe cannot guarantee how they will be used or with which other (future) apps they may be run concurrently. It is becoming more common for people to use multiple apps, and the average person will not understand how multiple apps might affect his or her health through hidden dependencies among a large number of parameters. Consequently, a tool such as EyePhy is critical to future deployments of safe mobile medical apps.
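A toy version of the dependency check can be sketched as follows: each app declares the parameters it intends to drive and in which direction, and a checker flags app pairs whose interventions compound or oppose each other on a shared parameter. EyePhy itself infers such couplings through the HumMod simulator rather than from hand-written declarations like these:

```python
def find_conflicts(apps):
    """apps maps app name -> {parameter: '+' or '-'} (intended intervention
    direction). Returns (app1, app2, parameter, kind) tuples, where kind is
    'compounding' (same direction) or 'opposing' (opposite directions)."""
    conflicts = []
    names = sorted(apps)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            for p in sorted(set(apps[a]) & set(apps[b])):
                kind = "compounding" if apps[a][p] == apps[b][p] else "opposing"
                conflicts.append((a, b, p, kind))
    return conflicts
```

Note that both kinds are hazards: two apps that each lower blood pressure can compound into hypotension just as surely as two apps fighting over heart rate can destabilize it.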
University of Virginia Main Campus
National Science Foundation
Submitted by John Stankovic on December 22nd, 2015
This project exploits an early concept of a flexible, low-cost, drone-carried broadband long-distance communication infrastructure and investigates its capability for immediate smart-city application in emergency response. This effort supports the Smart Emergency Response System (SERS) cluster participating in the Global City Teams Challenge. The project will have an immediate impact on firefighting and other smart-city emergency response applications by quickly deploying a broadband communication infrastructure, thus improving the efficiency of first responders and saving lives. This communication infrastructure expands the capability of individual drones, enables broad new multi-drone applications for smart cities, and has the potential to create new businesses and job markets. This interdisciplinary project addresses the following technology issues: 1) development of cyber-physical systems (CPS) technology that enables a robust long-range drone-to-drone communication infrastructure; 2) practical drone system design and performance evaluation for WiFi provision; and 3) a systematic investigation of its capability to address smart-city emergency response needs, through both analysis and participation in fire-fighting exercises as a case study. The project team includes an academic institution, technology companies, and government planners, each of which provides complementary expertise and perspectives crucial to the success of the project. The project also provides exciting interdisciplinary training opportunities for students and the community to learn about CPS technologies and the Global City Teams Challenge efforts.
University of North Texas
National Science Foundation
Submitted by Shengli Fu on December 22nd, 2015
This project will work with national and international medical and disaster professionals to extract formal use cases for ground, aerial, and marine robots for medical response and humanitarian relief in the Ebola (and future) epidemics. A set of detailed use cases is urgently needed to meet the challenges posed by the epidemic and to prepare robotics for assisting with future epidemics. The robotics community cannot provide robots without understanding the needs; engineering mistakes or mismatches will be financially costly and will delay the delivery of effective solutions. This is a rare opportunity to work with responders as they plan for a deployment of more than 3,000 troops plus Centers for Disease Control workers, and a possibly greater number of volunteers through non-governmental organizations such as Doctors Without Borders. The project outcomes will allow robotics companies to confidently pre-position and re-position products and to incorporate the findings into R&D investment strategies. The categorization of problems will guide academia in future research and serve as motivating class projects. The effective use of robots will provide responders with tools for the short term and will set achievable expectations of robotics technology in general. There is no comprehensive statement of the missions robots can be used for during a medical event, and general mission descriptions (e.g., we need a robot to transport bodies) do not capture the design constraints on a robot. Prior work has shown that not understanding the operational envelope, work domain, and culture results in overly expensive robots that cannot be adopted. Robotics has not been considered by health professionals for the entire space of a medical event (hospitals, field medicine, logistics, security from riots), nor have the disaster and medical robotics communities engaged with epidemics.
This project will provide a fundamental understanding of how robots can be used for medical disasters and will design a formal process for projecting robotics requirements. It will benefit safety, security, and rescue robotics by expanding research from meteorological, geological, and man-made disasters to medical disasters, and it will benefit surgical robotics and telerobotics by pushing the boundaries of how robots are used for biosafety events.
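One way such formal use cases might be recorded, so that a mission description carries its design constraints and operational envelope with it, is sketched below; the fields and example values are illustrative assumptions, not the project's actual use-case template:

```python
from dataclasses import dataclass, field

@dataclass
class RobotUseCase:
    mission: str                     # e.g., "transport bodies"
    modality: str                    # "ground", "aerial", or "marine"
    environment: str                 # operational envelope / work domain
    constraints: list = field(default_factory=list)  # design constraints
    decon_required: bool = True      # biosafety: must survive decontamination

def by_modality(cases, modality):
    """Filter a use-case catalog by robot modality."""
    return [c for c in cases if c.modality == modality]
```

Keeping the constraints attached to the mission is exactly what a bare description such as "we need a robot to transport bodies" fails to do.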
Texas A&M Engineering Experiment Station
National Science Foundation
Submitted by Robin Murphy on December 22nd, 2015
Human-in-the-loop control strategies with which the user performs a task better, and feels more confident doing so, are an important area of research for cyber-physical systems. Humans are very adept at learning to control complex systems, particularly those with non-intuitive kinematic constraints (e.g., cars, bicycles, wheelchairs, steerable needles). With the advent of cyber-physical systems (physical systems integrated with a cyber control layer), human control is no longer constrained to system inputs. Users can also control system outputs through a number of different teleoperation mappings. Given all this flexibility, what is the most intuitive way for a human user to control an arbitrary system, and how is intuitiveness quantified? The project focuses on human-in-the-loop control for medical needles, which steer with bicycle-like kinematics. These needles could be used in a variety of medical interventions, including tissue biopsy, tumor ablation, abscess drainage, and local drug delivery. We have explored a variety of teleoperation mappings for human control of these steerable needles; yet we have found inconsistencies between objective performance metrics (e.g., task time and error) and post-experimental surveys on comfort or ease of use. Users occasionally report a preference for control mappings that objectively degrade performance, and vice versa. It is important to measure the real-time engagement of the user with the physical system in order to capture the nuances of how different control mappings affect physical effort, mental workload, distraction, drowsiness, and emotional response. Physiological sensors such as electroencephalography (EEG), galvanic skin response (GSR), and electromyography (EMG) can be used to provide these real-time measurements and to quantitatively classify the intuitiveness of new teleoperation algorithms.
Broader Impacts: Intuitive and natural human-in-the-loop control interfaces will improve human health and well-being through applications in surgery and rehabilitation. The results of this study will be disseminated publicly on the investigator's laboratory website, at a conference workshop, and through a new medical robotics seminar to be held jointly between UT Dallas and UT Southwestern Medical Center. Outreach activities, lab tours, and mentoring of underrepresented students at all levels will broaden participation in STEM. Additionally, the proximity of the investigator's hospital-based lab to medical professionals will engage non-engineers in design and innovation.
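The bicycle-like kinematics mentioned above can be sketched in a minimal planar form: a bevel-tip needle inserted at velocity v follows an arc of fixed curvature, and spinning the shaft 180 degrees flips the direction of curving. This 2-D unicycle-style approximation is a common simplification of the full 3-D model, and the parameter values are illustrative assumptions:

```python
import math

def simulate_needle(kappa, inputs, dt=0.01):
    """kappa: path curvature (1/mm); inputs: list of (v_mm_per_s, bevel_sign)
    per time step, where bevel_sign is +1 or -1 after a 180-degree shaft spin.
    Returns the planar tip trajectory [(x, y), ...] from the origin."""
    x = y = theta = 0.0
    path = [(x, y)]
    for v, sign in inputs:
        theta += sign * kappa * v * dt      # heading turns with arc length
        x += v * dt * math.cos(theta)
        y += v * dt * math.sin(theta)
        path.append((x, y))
    return path

def targeting_error(path, target):
    """Objective performance metric: distance from the final tip position
    to the target (cf. the task time and error metrics in the study)."""
    tx, ty = target
    x, y = path[-1]
    return math.hypot(x - tx, y - ty)
```

A teleoperation mapping then amounts to translating the user's interface motions into the (v, bevel_sign) input stream, which is exactly where different mappings can feel more or less intuitive while producing different targeting errors.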
University of Texas at Dallas
National Science Foundation
Submitted by Ann Majewicz on December 22nd, 2015
This cross-disciplinary research proposes a patient-specific cost-saving approach to the design and optimization of healthcare cyber-physical systems (HCPS). The HCPS computes the patient's physiological state based on sensors, communicates this information via a network from home to hospital for quantifying risk indices, signals the need for critical medical intervention in real time, and controls vital health signals (e.g., cardiac rhythm, blood glucose). The research proposed under the HCPS paradigm will treat the human body as a complex system. It will entail the development of mathematical models that capture the time-dependence and fractal behavior of physiological processes and the design of quality-of-life (QoL) control strategies for medical devices. The research will advance the understanding of the correlations between physiological processes, drug treatment, stress level and lifestyle. To date, the complex interdependence, variability and individual characteristics of physiological processes have not been taken into account in the design of medical devices and artificial organs. The existing mathematical approaches rely on reductionist and Markovian assumptions. This research project will rethink the theoretical foundations for the design of healthcare cyber-physical systems by capturing the interdependencies and fractal characteristics of physiological processes within a highly dynamic network. 
To establish the theoretical foundations of HCPS, a three-step approach will be followed: (i) construct a multi-scale non-equilibrium statistical-physics-inspired framework for patient modeling that captures the time dependence, non-Gaussian behavior, interdependencies, and multi-fractal behavior of physiological processes; (ii) develop adaptive patient-specific and physiology-aware (multi-fractal) closed-loop control algorithms for dynamic complex networks; (iii) design algorithms and methodologies for the HCPS networked components that account for biological and technological constraints. This research will significantly contribute to early chronic disease detection and treatment. Models and implementable algorithms, which can both predict physiological dynamics and assess the risk of acute and chronic diseases, will be valuable instruments for patient-centered healthcare. This in-depth mathematical analysis of physiological complexity facilitates a transformative multimodal and multi-scale approach to CPS design with healthcare applications. The project not only addresses the current scientific and technological gap in CPS, but can also foster new research directions in related fields, such as the study of interdependent networks (with implications for understanding homeostasis and diseases) and the study and control of complex systems. The cyber-physical systems designed under this newly proposed paradigm will have vital social and economic implications, including the improvement of QoL and the reduction of lost productivity due to chronic diseases. The project will offer interdisciplinary training for graduate, undergraduate and K-12 students. The PI will integrate the research results within his courses at the University of Southern California and make them widely available through the project website.
Moreover, the PI will enhance civic engagement by involving college and K-12 students in community outreach activities that will raise awareness of the important role of health monitoring.
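The long-range-dependent, non-Markovian character of physiological signals that motivates this modeling can be quantified with standard tools such as detrended fluctuation analysis (DFA); the minimal implementation below is a generic textbook method, not the project's multi-fractal framework:

```python
import math

def dfa_exponent(signal, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis (DFA). Returns the scaling exponent
    alpha: ~0.5 for uncorrelated noise, >0.5 for persistent long-range
    correlation, ~1.5 for a random walk."""
    mean = sum(signal) / len(signal)
    profile, acc = [], 0.0
    for v in signal:                      # integrate the mean-removed signal
        acc += v - mean
        profile.append(acc)

    log_n, log_f = [], []
    for n in scales:
        sq_sum, count = 0.0, 0
        for start in range(0, len(profile) - n + 1, n):
            seg = profile[start:start + n]
            # least-squares line fit within the window
            mx = (n - 1) / 2.0
            my = sum(seg) / n
            sxx = sum((x - mx) ** 2 for x in range(n))
            sxy = sum((x - mx) * (y - my) for x, y in enumerate(seg))
            slope = sxy / sxx
            for x, y in enumerate(seg):   # accumulate squared residuals
                resid = y - (my + slope * (x - mx))
                sq_sum += resid * resid
            count += n
        log_n.append(math.log(n))
        log_f.append(0.5 * math.log(sq_sum / count))  # log RMS fluctuation

    # slope of log F(n) versus log n is the DFA exponent
    mlx = sum(log_n) / len(log_n)
    mly = sum(log_f) / len(log_f)
    num = sum((a - mlx) * (b - mly) for a, b in zip(log_n, log_f))
    den = sum((a - mlx) ** 2 for a in log_n)
    return num / den
```

An exponent well away from 0.5 on a patient's heart-rate or glucose series is exactly the kind of evidence that reductionist, Markovian models miss; the multi-fractal analysis proposed here generalizes this single exponent to a spectrum.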
University of Southern California
National Science Foundation
Submitted by Paul Bogdan on December 22nd, 2015
All cyber-physical systems (CPS) depend on properly calibrated sensors to sense the surrounding environment. Unfortunately, the current state of the art is that calibration is often a manual and expensive operation; moreover, many types of sensors, especially economical ones, must be recalibrated often. This is typically costly because it is performed in a lab environment and requires that sensors be removed from service. MetaSense will reduce the cost and management burden of calibrating sensors. The basic idea is that if two sensors are co-located, then they should report similar values; if they do not, the least-recently-calibrated sensor is suspect. Building on this idea, this project will provide an autonomous system and a set of algorithms that automate the detection of calibration issues and perform recalibration of sensors in the field, removing the need to take sensors offline and send them to a laboratory for calibration. The outcome of this project will transform the way sensors are engineered and deployed, increasing the scale of sensor network deployments. This in turn will increase the availability of environmental data for research, medical, personal, and business use. MetaSense researchers will leverage this new data to provide early warning for factors that could negatively affect health. In addition, graduate student engagement in the research will help to maintain the STEM pipeline. This project will leverage large networks of mobile sensors connected to the cloud. The cloud will make it possible to use large data repositories and computational power to cross-reference data from different sensors and detect loss of calibration. The theory of calibration will go beyond classical models for the computation and physics of CPS. The project will combine big data, machine learning, and analysis of the physics of sensors to calculate two factors that will be used in the calibration.
First, MetaSense researchers will identify measurement transformations that, applied in software after the data collection, will generate calibrated results. Second, the researchers will compute the input for an on-board signal-conditioning circuit that will improve the sensitivity of the physical measurement. The project will contribute research results in multiple disciplines. In the field of software engineering, the project will contribute a new theory of service reconfiguration that will support new architecture and workflow languages. New technologies are needed because recalibration will happen when the machine learning algorithms discover calibration errors, after the data has already been collected and processed. These technologies will support modifying not only the raw data in the database, by applying new calibration corrections, but also the results of calculations that used the data. In the field of machine learning, the project will provide new algorithms for dealing with spatiotemporal maps of noisy sensor readings. In particular, the algorithms will work with Gaussian processes, and the results of the research will provide more meaningful confidence intervals for these processes, substantially increasing the effectiveness of MetaSense models compared to the current state of the art. In the field of pervasive computing, the project will build on existing techniques for context-aware sensing to increase the amount of information available to the machine learning algorithms for inferring calibration parameters. Adding information about the sensing context is paramount to achieving correct calibration results. For example, a sensor that measures air pollution inside a car on a highway will get very different readings if the car window is open or closed. Finally, the project will contribute innovations in sensor calibration hardware.
Here, the project will contribute innovative signal-conditioning circuits that interact with the cloud system and receive remote calibration parameters identified by the machine learning algorithms. This will be a substantial advance over current circuits based on simple feedback loops, because the new circuits must account for the cloud and machine learning algorithms in the loop and must perform this more complex calibration under power and bandwidth constraints.
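The co-location idea at the heart of MetaSense can be sketched with the simplest possible correction model: fit a linear gain/offset mapping a suspect sensor's raw readings onto a trusted co-located sensor's readings, then apply that correction in software to later data. MetaSense itself combines context-aware machine learning and Gaussian-process confidence estimates; this ordinary least-squares fit is an illustrative stand-in:

```python
def fit_correction(raw, reference):
    """Least-squares gain/offset mapping a suspect sensor's raw readings
    onto a trusted co-located sensor's readings."""
    n = len(raw)
    mx = sum(raw) / n
    my = sum(reference) / n
    sxx = sum((x - mx) ** 2 for x in raw)
    sxy = sum((x - mx) * (y - my) for x, y in zip(raw, reference))
    gain = sxy / sxx
    offset = my - gain * mx
    return gain, offset

def apply_correction(raw, gain, offset):
    """Apply the software correction to later (post-collection) data."""
    return [gain * x + offset for x in raw]
```

Because the correction is applied in software after collection, previously stored readings (and results computed from them) can be re-corrected retroactively, which is the service-reconfiguration challenge the abstract describes.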
University of California at San Diego
National Science Foundation
Submitted by William Griswold on December 22nd, 2015
Brain-computer interfaces (BCIs) are cyber-physical systems (CPSs) that record human brain waves and translate them into control commands for external devices such as computers and robots. They may allow individuals with spinal cord injury (SCI) to assume direct brain control of a lower extremity prosthesis and regain the ability to walk. Since lower extremity paralysis due to SCI leads to as much as $50 billion of health care costs each year in the US alone, the use of a BCI-controlled lower extremity prosthesis to restore walking could have a significant public health impact. Recent results have demonstrated that a person with paraplegia due to SCI can use a non-invasive BCI to regain basic walking. While encouraging, this BCI is unlikely to become a widely adopted solution, since the poor signal quality of non-invasively recorded brain waves may lead to unreliable BCI operation. Moreover, the lengthy and tedious mounting procedures of non-invasive BCI systems are impractical. A permanently implantable BCI CPS can address these issues, but critical challenges must be overcome to achieve this goal, including the elimination of protruding electronics and of reliance on an external computer for brain signal processing. The goal of this study is to develop a benchtop version of a fully implantable BCI CPS, capable of acquiring electrocorticogram signals, recorded directly from the surface of the brain, and analyzing them internally to enable direct brain control of a robotic gait exoskeleton (RGE) for walking. The BCI CPS will be designed as a low-power system with revolutionary adaptive power management in order to meet the stringent heat and power consumption constraints of future human implantation. Comprehensive measurements and benchtop tests will ensure proper function of the BCI CPS. Finally, the system will be integrated with an RGE, and its ability to facilitate brain-controlled walking will be tested in a small group of human subjects.
The successful completion of this project will have broad bioengineering and scientific impact. It will revolutionize medical device technology by minimizing power consumption and heat production while enabling complex operations to be performed in real time. The study will also help deepen the understanding of how the human brain controls walking, which has long been a mystery to neuroscientists. In the benchtop system, the first module will acquire high-density (HD) electrocorticogram (ECoG) signals, while the second module will internally execute optimized BCI algorithms and wirelessly transmit commands to the RGE. System- and circuit-level characterizations will be conducted through comprehensive measurements, and benchtop tests will verify proper system function and conformity to biomedical constraints before the integrated system is evaluated with human subjects.
Finally, this study's broader impact is to promote education and lifelong learning in engineering students and the community, broaden the participation of underrepresented groups in engineering, and increase the scientific literacy of persons with disabilities. Research opportunities will be provided to undergraduate and graduate students, and their findings will be broadly disseminated and integrated into teaching activities. To inspire underrepresented K-12 and community college students to pursue higher education in STEM fields, outreach activities will be undertaken in the form of live scientific exhibits and actual BCI demonstrations.
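A toy version of the on-board decode step such an implant must run, chosen here for its tiny compute footprint, can be sketched as a windowed power detector that issues walk/idle commands to the exoskeleton. Real ECoG decoding uses spectral features and trained classifiers, so the window length, threshold, and command names below are all illustrative assumptions:

```python
def window_power(samples):
    """Mean squared amplitude of one analysis window."""
    return sum(s * s for s in samples) / len(samples)

def decode_commands(signal, window=8, threshold=1.0):
    """Slide non-overlapping windows over the signal and emit 'WALK'
    when the window power exceeds the threshold, else 'IDLE'."""
    cmds = []
    for start in range(0, len(signal) - window + 1, window):
        p = window_power(signal[start:start + window])
        cmds.append("WALK" if p > threshold else "IDLE")
    return cmds
```

The point of the sketch is architectural: everything from sampling to command output happens inside the implant, so every multiply-accumulate counts against the heat and power budget that motivates the adaptive power management described above.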
University of California at Irvine
National Science Foundation
Submitted by Payam Heydari on December 22nd, 2015