Intuitive Human-in-the-Loop Control for Medical Cyber-Physical Systems

INTRODUCTION

Human-in-the-loop cyber-physical systems should be intuitive and natural to use, especially in complex environments, such as the operating room, and in the presence of difficult system properties, such as nonholonomic kinematics. However, what constitutes an intuitive control interface between a human operator and a physical system is not always clear. For example, some nonholonomic systems (e.g., bicycles, cars) seem best controlled in joint space, where the user controls system inputs, such as steering angle and velocity. Recently, the robotics literature has shown that kinematically similar systems (e.g., steerable needles, wheelchairs) are better controlled in Cartesian space, where the user controls a desired system output, such as position. For these systems, the addition of a cyber control layer between the operator and the robot enables exploration of novel teleoperation mappings, which may lead to more intuitive human-in-the-loop control. The goal of this project is to quantify measures of intuitiveness or naturalness, which may then be used to validate novel human control paradigms for cyber-physical systems.
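To make the distinction between the two mappings concrete, the sketch below contrasts joint-space and Cartesian-space teleoperation commands for a simple unicycle-like nonholonomic model. This is hypothetical illustration code, not taken from the project software; the proportional "point-and-go" feedback law and the gains kv and kw are assumptions for the example only.

// Hypothetical sketch: two teleoperation mappings for a unicycle-like
// nonholonomic system (not the project's actual control code).
#include <cmath>

struct Pose  { double x, y, theta; };   // robot state (position, heading)
struct Input { double v, w; };          // forward speed, turning rate

// Joint-space teleoperation: the operator's input is passed through
// directly as the system input.
Input jointSpaceCommand(double operatorSpeed, double operatorTurnRate) {
    return {operatorSpeed, operatorTurnRate};
}

// Cartesian-space teleoperation: the operator specifies a desired (x, y)
// output; a simple proportional law (illustrative gains kv, kw) converts
// that desired position into the system inputs (v, w).
Input cartesianSpaceCommand(const Pose& robot, double xDes, double yDes,
                            double kv = 0.5, double kw = 2.0) {
    double dx = xDes - robot.x, dy = yDes - robot.y;
    double distance = std::hypot(dx, dy);
    double headingError = std::atan2(dy, dx) - robot.theta;
    // Wrap the heading error to [-pi, pi] before applying the gain.
    headingError = std::atan2(std::sin(headingError), std::cos(headingError));
    return {kv * distance, kw * headingError};
}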

KEYWORDS

Cyber-Physical Systems, Surgical Robotics, Teleoperation, Human-in-the-Loop

POSTER SUMMARY

We have integrated a variety of physiological sensors (EEG, EMG, galvanic skin response, heart rate, etc.) with custom C++ code and the Robot Operating System (ROS) to control a haptic device in different teleoperation control scenarios, while recording user performance and physiological response.

For the first phase of this project, we designed a task of known difficulty using Fitts’ Law, a psychomotor law that relates movement time to the width of and separation distance between two targets. In a human subjects study (UTD IRB #14-57), four subjects were recruited to perform the Fitts’ Law task while we collected physiological data and performance metrics and controlled for task difficulty. All metrics except those derived from EEG signals and galvanic skin response showed a statistically significant correlation with task difficulty. We then evaluated several least-squares models of task difficulty based on physiological, kinematic, and performance metrics. Of the models tested, one did not include any performance metrics, relying solely on metrics derived from the human subjects’ movement signatures and physiological response. All of the models had an average prediction error of less than 10%, indicating that knowledge of the task objectives is not required to predict the difficulty of the task.

We are currently conducting the second phase of our research project, which applies our intuitiveness model to subjects performing tasks of unknown difficulty in order to identify the most intuitive control interfaces. As part of an REU supplement project, we identified important muscle groups and calibration procedures for three different needle steering teleoperation algorithms: joint space, Cartesian space, and Cartesian space with force feedback. We also conducted a pilot study with six human subjects, collecting EMG, EEG, galvanic skin response, heart rate, and objective performance metrics for each of the algorithms.
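For reference, the sketch below computes the Fitts’ Law index of difficulty for a pair of targets of the kind used in the study above. It assumes the Shannon formulation of the law, since the poster does not state which formulation was used, and the regression coefficients a and b are illustrative placeholders rather than values fit from the study data.

// Sketch of the Fitts' Law index of difficulty (Shannon formulation assumed).
#include <cmath>
#include <cstdio>

// Index of difficulty in bits for a target of width W at distance D.
double indexOfDifficulty(double distance, double width) {
    return std::log2(distance / width + 1.0);
}

// Predicted movement time MT = a + b * ID, where a and b are per-subject
// regression coefficients (the values below are placeholders only).
double predictedMovementTime(double distance, double width,
                             double a = 0.1, double b = 0.15) {
    return a + b * indexOfDifficulty(distance, width);
}

int main() {
    // Example: two 10 mm wide targets separated by 200 mm.
    double id = indexOfDifficulty(200.0, 10.0);
    std::printf("ID = %.2f bits, predicted MT = %.2f s\n",
                id, predictedMovementTime(200.0, 10.0));
    return 0;
}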

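A minimal sketch of the kind of least-squares difficulty model described above is given here, using Eigen for the solve. The feature set, the trial values, and the use of Eigen are assumptions for illustration only; the poster does not specify the exact metrics or solver used in the study.

// Sketch of a least-squares fit of task difficulty from per-trial metrics
// (hypothetical features and values; Eigen used for illustration).
#include <Eigen/Dense>
#include <iostream>

int main() {
    // 4 trials x 3 metrics (e.g., mean speed, path length, heart rate).
    Eigen::MatrixXd X(4, 3);
    X << 0.12, 1.8, 72,
         0.09, 2.3, 75,
         0.15, 1.5, 70,
         0.07, 2.9, 80;
    Eigen::VectorXd y(4);
    y << 2.0, 3.0, 1.5, 4.0;   // known task difficulty (bits)

    // Ordinary least squares: w = argmin ||X w - y||^2.
    Eigen::VectorXd w = X.colPivHouseholderQr().solve(y);

    // Prediction error on the training trials, as a rough check.
    Eigen::VectorXd err = X * w - y;
    std::cout << "weights: " << w.transpose() << "\n"
              << "mean abs error: " << err.cwiseAbs().mean() << std::endl;
    return 0;
}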
ACKNOWLEDGEMENTS

This work was supported by NSF CRII CPS 1464432 and an REU Supplement.
