This project will work with national and international medical and disaster professionals to extract formal use cases for ground, aerial, and marine robots for medical response and humanitarian relief during the Ebola (and future) epidemics. A set of detailed use cases is urgently needed to meet the challenges posed by the epidemic and to prepare robotics for assisting with future epidemics. The robotics community cannot provide robots without understanding the needs; engineering mistakes or mismatches will be financially costly and will delay the delivery of effective solutions. This is a rare opportunity to work with responders as they plan for a deployment of more than 3,000 troops plus Centers for Disease Control workers, and a possibly greater number of volunteers through non-governmental organizations such as Doctors Without Borders. The project outcomes will allow robotics companies to confidently pre-position or re-position products and to incorporate the findings into R&D investment strategies. The categorization of problems will guide future academic research and serve as a source of motivating class projects. The effective use of robots will provide responders with tools for the short term and will set achievable expectations of robotics technology in general. There is no comprehensive statement of the missions for which robots can be used during a medical event, and general mission descriptions (e.g., "we need a robot to transport bodies") do not capture the design constraints on a robot. Prior work has shown that failing to understand the operational envelope, work domain, and culture results in overly expensive robots that cannot be adopted. Health professionals have not considered robotics for the entire space of a medical event (hospitals, field medicine, logistics, security from riots), nor have the disaster and medical robotics communities been engaged with epidemics.
This project will provide the fundamental understanding of how robots can be used for medical disasters and will design a formal process for projecting robotics requirements. It will benefit safety, security, and rescue robotics by expanding research from meteorological, geological, and man-made disasters to medical disasters, and it will benefit surgical robotics and telerobotics by pushing the boundaries of how robots are used for biosafety events.
Texas A&M Engineering Experiment Station
National Science Foundation
Submitted by Robin Murphy on December 22nd, 2015
Human-in-the-loop control strategies that let the user perform a task better, and feel more confident doing so, are an important area of research for cyber-physical systems. Humans are very adept at learning to control complex systems, particularly those with non-intuitive kinematic constraints (e.g., cars, bicycles, wheelchairs, steerable needles). With the advent of cyber-physical systems (physical systems integrated with a cyber control layer), human control is no longer constrained to system inputs. Users can also control system outputs through a number of different teleoperation mappings. Given all this flexibility, what is the most intuitive way for a human user to control an arbitrary system, and how is intuitiveness quantified? The project focuses on human-in-the-loop control for medical needles, which steer with bicycle-like kinematics. These needles could be used in a variety of medical interventions including tissue biopsy, tumor ablation, abscess drainage, and local drug delivery. We have explored a variety of teleoperation mappings for human control of these steerable needles; yet, we have found inconsistencies between objective performance metrics (e.g., task time and error) and post-experimental surveys on comfort or ease of use. Users occasionally report a preference for control mappings that objectively degrade performance, and vice versa. It is important to measure the real-time engagement of the user with the physical system in order to capture the nuances of how different control mappings affect physical effort, mental workload, distraction, drowsiness, and emotional response. Physiological sensors such as electroencephalography (EEG), galvanic skin response (GSR), and electromyography (EMG) can be used to provide these real-time measurements and to quantitatively classify the intuitiveness of new teleoperation algorithms.
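As a toy illustration of how objective metrics and physiological signals might be fused into a single intuitiveness measure for a teleoperation mapping, the following sketch combines task time, tip error, and (pre-normalized) EEG and GSR readings. All weights, bounds, and signal names here are illustrative assumptions, not the project's actual method.

```python
# Hypothetical sketch: fusing objective task metrics with physiological
# measurements into one "intuitiveness" score for a teleoperation mapping.
# Weights and normalization bounds are illustrative assumptions.

def normalize(value, worst, best):
    """Map a raw measurement onto [0, 1], where 1 is best."""
    return max(0.0, min(1.0, (worst - value) / (worst - best)))

def intuitiveness_score(task_time_s, tip_error_mm, eeg_workload, gsr_arousal,
                        weights=(0.3, 0.3, 0.2, 0.2)):
    """Weighted combination of performance and physiological measures.

    eeg_workload and gsr_arousal are assumed pre-normalized to [0, 1],
    where 0 means low workload/arousal.
    """
    scores = (
        normalize(task_time_s, worst=120.0, best=20.0),  # faster is better
        normalize(tip_error_mm, worst=10.0, best=0.0),   # less error is better
        1.0 - eeg_workload,                              # lower workload is better
        1.0 - gsr_arousal,                               # lower arousal is better
    )
    return sum(w * s for w, s in zip(weights, scores))

# Two hypothetical control mappings with similar task performance but
# different physiological load:
mapping_a = intuitiveness_score(45.0, 2.0, eeg_workload=0.8, gsr_arousal=0.7)
mapping_b = intuitiveness_score(50.0, 2.5, eeg_workload=0.3, gsr_arousal=0.2)
print(mapping_b > mapping_a)  # physiological data can reverse the ranking
```

Such a score lets two mappings with nearly identical task time and error be distinguished by the real-time engagement signals, which is exactly the gap the surveys fail to capture.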
Broader Impacts: Intuitive and natural human-in-the-loop control interfaces will improve human health and well-being through applications in surgery and rehabilitation. The results of this study will be disseminated publicly on the investigator's laboratory website, at a conference workshop, and through a new medical robotics seminar to be held jointly between UT Dallas and UT Southwestern Medical Center. Outreach activities, lab tours, and mentoring of underrepresented students at all levels will broaden participation in STEM. Additionally, the proximity of the investigator's hospital-based lab to medical professionals will engage non-engineers in design and innovation.
University of Texas at Dallas
National Science Foundation
Submitted by Ann Majewicz on December 22nd, 2015
All cyber-physical systems (CPS) depend on properly calibrated sensors to sense the surrounding environment. Unfortunately, the current state of the art is that calibration is often a manual and expensive operation; moreover, many types of sensors, especially economical ones, must be recalibrated often. Recalibration is typically costly because it is performed in a lab environment and requires that sensors be removed from service. MetaSense will reduce the cost and management burden of calibrating sensors. The basic idea is that if two sensors are co-located, then they should report similar values; if they do not, the least-recently-calibrated sensor is suspect. Building on this idea, this project will provide an autonomous system and a set of algorithms that will automate the detection of calibration issues and perform recalibration of sensors in the field, removing the need to take sensors offline and send them to a laboratory for calibration. The outcome of this project will transform the way sensors are engineered and deployed, increasing the scale of sensor network deployment. This in turn will increase the availability of environmental data for research, medical, personal, and business use. MetaSense researchers will leverage this new data to provide early warning for factors that could negatively affect health. In addition, graduate student engagement in the research will help to maintain the STEM pipeline. This project will leverage large networks of mobile sensors connected to the cloud. The cloud will enable using large data repositories and computational power to cross-reference data from different sensors and detect loss of calibration. The theory of calibration will go beyond classical models for computation and physics of CPS. The project will combine big data, machine learning, and analysis of the physics of sensors to calculate two factors that will be used in the calibration.
First, MetaSense researchers will identify measurement transformations that, applied in software after the data collection, will generate calibrated results. Second, the researchers will compute the input for an on-board signal-conditioning circuit that will enable improving the sensitivity of the physical measurement. The project will contribute research results in multiple disciplines. In the field of software engineering, the project will contribute a new theory of service reconfiguration that will support new architecture and workflow languages. New technologies are needed because the recalibration will happen when the machine learning algorithms discover calibration errors, after the data has already been collected and processed. These technologies will support modifying not only the raw data in the database by applying new calibration corrections, but also the results of calculations that used the data. In the field of machine learning, the project will provide new algorithms for dealing with spatiotemporal maps of noisy sensor readings. In particular, the algorithms will work with Gaussian processes and the results of the research will provide more meaningful confidence intervals for these processes, substantially increasing the effectiveness of MetaSense models compared to the current state of the art. In the field of pervasive computing, the project will build on the existing techniques for context-aware sensing to increase the amount of information available to the machine learning algorithms for inferring calibration parameters. Adding information about the sensing context is paramount to achieve correct calibration results. For example, a sensor that measures air pollution inside a car on a highway will get very different readings if the car window is open or closed. Finally, the project will contribute innovations in sensor calibration hardware. 
Here, the project will contribute innovative signal-conditioning circuits that will interact with the cloud system and receive remote calibration parameters identified by the machine learning algorithms. This will be a substantial advance over current circuits based on simple feedback loops because it will have to account for the cloud and machine learning algorithms in the loop and will have to perform this more complex calibration with power and bandwidth constraints. Inclusion of graduate students in the research helps to maintain the STEM pipeline.
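The co-location idea at the core of MetaSense, that two nearby sensors should agree and that a disagreement implicates the least-recently-calibrated one, can be sketched in a few lines. The data layout and tolerance below are illustrative assumptions, not the project's algorithms.

```python
# Minimal sketch of the co-location idea: if two co-located sensors disagree
# beyond a tolerance, the least-recently-calibrated one is suspect.
# Data layout and the tolerance value are illustrative assumptions.

from statistics import mean

def find_suspect(readings_a, readings_b, last_cal_a, last_cal_b, tol=2.0):
    """Compare two co-located sensors' simultaneous readings.

    readings_*: lists of measurements from each sensor over the same window.
    last_cal_*: days since each sensor was last calibrated.
    Returns 'a', 'b', or None when there is no disagreement beyond tol.
    """
    disagreement = abs(mean(readings_a) - mean(readings_b))
    if disagreement <= tol:
        return None
    # The sensor calibrated longest ago is the suspect.
    return 'a' if last_cal_a > last_cal_b else 'b'

# Sensor 'a' was calibrated 90 days ago and now reads consistently high:
print(find_suspect([10.1, 10.3, 10.2], [7.9, 8.1, 8.0],
                   last_cal_a=90, last_cal_b=5))  # -> a
```

A fielded system would replace the fixed tolerance with confidence intervals from the Gaussian process models the abstract describes, but the suspect-selection logic is the same.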
University of California at San Diego
National Science Foundation
Submitted by William Griswold on December 22nd, 2015
All cyber-physical systems (CPS) depend on properly calibrated sensors to sense the surrounding environment. Unfortunately, the current state of the art is that calibration is often a manual and expensive operation; moreover, many types of sensors, especially economical ones, must be recalibrated often. Recalibration is typically costly because it is performed in a lab environment and requires that sensors be removed from service. MetaSense will reduce the cost and management burden of calibrating sensors. The basic idea is that if two sensors are co-located, then they should report similar values; if they do not, the least-recently-calibrated sensor is suspect. Building on this idea, this project will provide an autonomous system and a set of algorithms that will automate the detection of calibration issues and perform recalibration of sensors in the field, removing the need to take sensors offline and send them to a laboratory for calibration. The outcome of this project will transform the way sensors are engineered and deployed, increasing the scale of sensor network deployment. This in turn will increase the availability of environmental data for research, medical, personal, and business use. MetaSense researchers will leverage this new data to provide early warning for factors that could negatively affect health. In addition, graduate student engagement in the research will help to maintain the STEM pipeline. This project will leverage large networks of mobile sensors connected to the cloud. The cloud will enable using large data repositories and computational power to cross-reference data from different sensors and detect loss of calibration. The theory of calibration will go beyond classical models for computation and physics of CPS. The project will combine big data, machine learning, and analysis of the physics of sensors to calculate two factors that will be used in the calibration.
First, MetaSense researchers will identify measurement transformations that, applied in software after the data collection, will generate calibrated results. Second, the researchers will compute the input for an on-board signal-conditioning circuit that will enable improving the sensitivity of the physical measurement. The project will contribute research results in multiple disciplines. In the field of software engineering, the project will contribute a new theory of service reconfiguration that will support new architecture and workflow languages. New technologies are needed because the recalibration will happen when the machine learning algorithms discover calibration errors, after the data has already been collected and processed. These technologies will support modifying not only the raw data in the database by applying new calibration corrections, but also the results of calculations that used the data. In the field of machine learning, the project will provide new algorithms for dealing with spatiotemporal maps of noisy sensor readings. In particular, the algorithms will work with Gaussian processes and the results of the research will provide more meaningful confidence intervals for these processes, substantially increasing the effectiveness of MetaSense models compared to the current state of the art. In the field of pervasive computing, the project will build on the existing techniques for context-aware sensing to increase the amount of information available to the machine learning algorithms for inferring calibration parameters. Adding information about the sensing context is paramount to achieve correct calibration results. For example, a sensor that measures air pollution inside a car on a highway will get very different readings if the car window is open or closed. Finally, the project will contribute innovations in sensor calibration hardware. 
Here, the project will contribute innovative signal-conditioning circuits that will interact with the cloud system and receive remote calibration parameters identified by the machine learning algorithms. This will be a substantial advance over current circuits based on simple feedback loops because it will have to account for the cloud and machine learning algorithms in the loop and will have to perform this more complex calibration with power and bandwidth constraints. Inclusion of graduate students in the research helps to maintain the STEM pipeline.
University of Colorado at Boulder
National Science Foundation
Submitted by Michael Hannigan on December 22nd, 2015
In telerobotic applications, human operators interact with robots through a computer network. This project is developing tools to prevent security threats in telerobotics by monitoring and detecting malicious activities and correcting for them. To develop tools to prevent and mitigate security threats against telerobotic systems, this project adapts cybersecurity methods and extends them to cyber-physical systems. Knowledge about physical constraints and interactions between the cyber and physical components of the system is leveraged for security. A monitoring system is developed which collects operator commands and robot feedback information to perform real-time verification of the operator. Timely and reliable detection of any discrepancy between real and spoofed operator movements enables quick detection of adversarial activities. The results are evaluated on the UW-developed RAVEN surgical robot. This project brings together research in robotics, computer and network security, control theory, and machine learning in order to gain better understanding of complex teleoperated robotic systems and to engineer telerobotic systems that provide strict safety, security, and privacy guarantees. The results are relevant and applicable to a wide range of applications, including telerobotic surgery, search and rescue missions, military operations, underwater infrastructure inspection and repair, cleanup and repair in hazardous environments, mining, and manipulation/inspection of objects in low Earth orbit. The project algorithms, software, and hardware are being made available to the non-profit cyber-physical research community. Graduate and undergraduate students are being trained in cyber-physical systems security topics, and K-12 students, community college students, and under-represented minority students are being engaged.
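One simple form of the command-stream verification described above is a plausibility check on commanded motion: flag command samples whose implied tool velocity exceeds what a human operator could produce. The sampling rate and velocity bound below are illustrative assumptions, not RAVEN parameters.

```python
# Hedged sketch of real-time operator verification: flag command samples whose
# implied tool velocity exceeds a human-plausible bound. A real system would
# also use robot feedback and learned operator models; dt and v_max here are
# illustrative assumptions only.

def flag_spoofed(positions, dt=0.01, v_max=0.5):
    """Return indices of command samples with implausible velocity.

    positions: sequence of 1-D commanded tool positions (metres),
    dt: sampling period (s), v_max: plausible speed bound (m/s).
    """
    suspects = []
    for i in range(1, len(positions)):
        velocity = abs(positions[i] - positions[i - 1]) / dt
        if velocity > v_max:
            suspects.append(i)
    return suspects

# A smooth human motion with one injected jump (possible command injection):
cmds = [0.000, 0.002, 0.004, 0.050, 0.052]
print(flag_spoofed(cmds))  # the 0.004 -> 0.050 jump implies 4.6 m/s
```

Because the check exploits physical constraints on human motion rather than network signatures, it is an example of the cyber-physical extension of cybersecurity methods that the project proposes.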
University of Washington
National Science Foundation
Submitted by Howard Chizeck on December 21st, 2015
Cyber-physical systems (CPSs) are merging into major mobile systems of our society, such as public transportation, supply chains, and taxi networks. Past researchers have accumulated significant knowledge for designing cyber-physical systems, such as for military surveillance, infrastructure protection, scientific exploration, and smart environments, but primarily in relatively stationary settings, i.e., where spatial and mobility diversity is limited. In contrast, mobile CPSs interact with phenomena of interest at different locations and in different environments, where context information (e.g., network availability and connectivity) about these physical locations might not be available. This unique feature calls for new solutions that seamlessly integrate mobile computing with the physical world, including dynamic access to multiple wireless technologies. The required solutions are addressed by (i) creating a network control architecture based on novel predictive hierarchical control that accounts for characteristics of wireless communication, (ii) developing formal network control models based on in-situ network system identification and cross-layer optimization, and (iii) designing and implementing a reference implementation on a small-scale wireless and vehicular test-bed based on law enforcement vehicles. The results can improve mobile transportation systems such as future taxi control and dispatch systems. In this application, the advantages include: (i) reducing the time for drivers to find customers; (ii) reducing the time for passengers to wait; (iii) avoiding and preventing traffic congestion; (iv) reducing gas consumption and operating cost; (v) improving driver and vehicle safety; and (vi) enforcing municipal regulations. Class modules developed on mobile computing and CPS will be used at the four participating universities and then be made available via the Web.
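As a minimal illustration of dynamic access to multiple wireless technologies, the sketch below picks the cheapest technology predicted to be both available and fast enough for an upcoming route segment. The technology names, thresholds, and cost model are assumptions for illustration only, not the project's control architecture.

```python
# Illustrative sketch (not the project's architecture): a mobile node selects
# among wireless technologies using predicted availability along its route.
# Availability threshold, bandwidths, and costs are invented for the example.

def pick_network(predictions, bandwidth_need_mbps):
    """predictions: {tech: (availability 0..1, bandwidth_mbps, cost_per_mb)}.
    Choose the cheapest technology expected to be available and fast enough."""
    feasible = [
        (cost, tech)
        for tech, (avail, bw, cost) in predictions.items()
        if avail >= 0.8 and bw >= bandwidth_need_mbps
    ]
    return min(feasible)[1] if feasible else None

# Predicted conditions for the next route segment of a moving vehicle:
route_segment = {
    "wifi":  (0.4, 50.0, 0.0),   # cheap and fast, but patchy while moving
    "lte":   (0.95, 20.0, 0.02),
    "wimax": (0.9, 5.0, 0.01),
}
print(pick_network(route_segment, bandwidth_need_mbps=10.0))  # -> lte
```

The predictive element, estimating availability before the vehicle reaches a location rather than reacting to signal loss, is what distinguishes this from ordinary handoff and motivates the hierarchical control described above.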
University of Virginia Main Campus
National Science Foundation
Submitted by John Stankovic on December 18th, 2015
Cyber-physical systems (CPSs) are merging into major mobile systems of our society, such as public transportation, supply chains, and taxi networks. Past researchers have accumulated significant knowledge for designing cyber-physical systems, such as for military surveillance, infrastructure protection, scientific exploration, and smart environments, but primarily in relatively stationary settings, i.e., where spatial and mobility diversity is limited. In contrast, mobile CPSs interact with phenomena of interest at different locations and in different environments, where context information (e.g., network availability and connectivity) about these physical locations might not be available. This unique feature calls for new solutions that seamlessly integrate mobile computing with the physical world, including dynamic access to multiple wireless technologies. The required solutions are addressed by (i) creating a network control architecture based on novel predictive hierarchical control that accounts for characteristics of wireless communication, (ii) developing formal network control models based on in-situ network system identification and cross-layer optimization, and (iii) designing and implementing a reference implementation on a small-scale wireless and vehicular test-bed based on law enforcement vehicles. The results can improve mobile transportation systems such as future taxi control and dispatch systems. In this application, the advantages include: (i) reducing the time for drivers to find customers; (ii) reducing the time for passengers to wait; (iii) avoiding and preventing traffic congestion; (iv) reducing gas consumption and operating cost; (v) improving driver and vehicle safety; and (vi) enforcing municipal regulations. Class modules developed on mobile computing and CPS will be used at the four participating universities and then be made available via the Web.
University of Minnesota-Twin Cities
National Science Foundation
Submitted by Tian He on December 18th, 2015
Cyber-physical systems (CPSs) are merging into major mobile systems of our society, such as public transportation, supply chains, and taxi networks. Past researchers have accumulated significant knowledge for designing cyber-physical systems, such as for military surveillance, infrastructure protection, scientific exploration, and smart environments, but primarily in relatively stationary settings, i.e., where spatial and mobility diversity is limited. In contrast, mobile CPSs interact with phenomena of interest at different locations and in different environments, where context information (e.g., network availability and connectivity) about these physical locations might not be available. This unique feature calls for new solutions that seamlessly integrate mobile computing with the physical world, including dynamic access to multiple wireless technologies. The required solutions are addressed by (i) creating a network control architecture based on novel predictive hierarchical control that accounts for characteristics of wireless communication, (ii) developing formal network control models based on in-situ network system identification and cross-layer optimization, and (iii) designing and implementing a reference implementation on a small-scale wireless and vehicular test-bed based on law enforcement vehicles. The results can improve mobile transportation systems such as future taxi control and dispatch systems. In this application, the advantages include: (i) reducing the time for drivers to find customers; (ii) reducing the time for passengers to wait; (iii) avoiding and preventing traffic congestion; (iv) reducing gas consumption and operating cost; (v) improving driver and vehicle safety; and (vi) enforcing municipal regulations. Class modules developed on mobile computing and CPS will be used at the four participating universities and then be made available via the Web.
University of Pennsylvania
National Science Foundation
Submitted by George Pappas on December 18th, 2015
Cyber-physical systems (CPSs) are merging into major mobile systems of our society, such as public transportation, supply chains, and taxi networks. Past researchers have accumulated significant knowledge for designing cyber-physical systems, such as for military surveillance, infrastructure protection, scientific exploration, and smart environments, but primarily in relatively stationary settings, i.e., where spatial and mobility diversity is limited. In contrast, mobile CPSs interact with phenomena of interest at different locations and in different environments, where context information (e.g., network availability and connectivity) about these physical locations might not be available. This unique feature calls for new solutions that seamlessly integrate mobile computing with the physical world, including dynamic access to multiple wireless technologies. The required solutions are addressed by (i) creating a network control architecture based on novel predictive hierarchical control that accounts for characteristics of wireless communication, (ii) developing formal network control models based on in-situ network system identification and cross-layer optimization, and (iii) designing and implementing a reference implementation on a small-scale wireless and vehicular test-bed based on law enforcement vehicles. The results can improve mobile transportation systems such as future taxi control and dispatch systems. In this application, the advantages include: (i) reducing the time for drivers to find customers; (ii) reducing the time for passengers to wait; (iii) avoiding and preventing traffic congestion; (iv) reducing gas consumption and operating cost; (v) improving driver and vehicle safety; and (vi) enforcing municipal regulations. Class modules developed on mobile computing and CPS will be used at the four participating universities and then be made available via the Web.
Temple University
National Science Foundation
Submitted by Shan Lin on December 18th, 2015
Event
VECoS 2015
9th International Workshop on Verification and Evaluation of Computer and Communication Systems (VECoS 2015). Important dates: paper submission May 15, 2015; decision notification July 12, 2015; camera-ready submission July 23, 2015; workshop September 10-11, 2015.
Submitted by Anonymous on March 10th, 2015