CPS Breakthrough: From Whole-Hand Tactile Imaging to Interactive Simulation


Abstract:

This project aims to enable cyber-physical systems that can be worn on the body in order to one day allow their users to touch, feel, and manipulate computationally simulated three-dimensional objects or digital data in physically realistic ways, using the whole hand. It will do this by precisely measuring touch- and movement-induced displacements of the skin of the hand, and by reproducing these signals interactively, via new technologies to be developed in the project. The resulting systems will offer the potential to impact a wide range of human activities that depend on touch and interaction with the hands. The project seeks to enable new applications for wearable cyber-physical interfaces that may have broad applications in health care, manufacturing, consumer electronics, and entertainment. Although human-interactive technologies have advanced greatly, current systems employ only a fraction of the sensorimotor capabilities of their users, greatly limiting applications and usability. The development of whole-hand haptic interfaces that allow their wearers to feel and manipulate digital content has been a longstanding goal of engineering research, but has remained far from reality. The reason can be traced to the difficulty of reproducing, or even characterizing, the complex, action-dependent stimuli that give rise to touch sensations during everyday activities. This project will pioneer new methods for imaging complex haptic stimuli, consisting of movement-dependent skin strain and contact-induced surface waves propagating in the skin, and for modeling the dependence of these signals on hand kinematics during grasping. It will use the resulting fundamental advances to catalyze the development of novel wearable CPS, in the form of whole-hand haptic interfaces. The latter will employ surface wave and skin strain feedback to supply haptic feedback to the hand during interaction with real and computational objects, enabling a range of new applications in VR.
The project will be executed through three main research areas. In the first, it will utilize novel contact and non-contact techniques, based on data acquired through on-body sensor arrays, to measure whole-hand mechanical stimuli and grasping kinematics at high spatial and temporal resolution. In the second, it will undertake data-driven systems modeling and analysis of statistical contingencies between the kinematic and cutaneous signals sensed during everyday activities. In the third, it will engineer and perceptually evaluate novel cyber-physical systems consisting of haptic interfaces for whole-hand interaction. To further advance the applications of these systems in medicine, the project will, through a collaboration with the Drexel College of Medicine, develop new methods for assessing clinical skills of palpation during medical examination, with the aim of improving the efficacy of what is often the first, most common, and best opportunity for diagnosis, using the physician's own sense of touch.
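One way to picture the second research area is as a lagged-correlation analysis between a kinematic channel and a cutaneous channel. The sketch below is purely illustrative and is not the project's actual method: the signal names, the synthetic data, and the choice of normalized cross-correlation as the contingency measure are all assumptions made here for the sake of a concrete example.

```python
# Illustrative sketch (an assumption, not the project's method): estimate a
# statistical contingency between a kinematic signal (e.g., a finger joint
# angle trace) and a cutaneous signal (e.g., skin vibration amplitude) via
# normalized cross-correlation over a range of time lags.
import numpy as np

def normalized_xcorr(kinematic, cutaneous, max_lag):
    """Normalized cross-correlation of two 1-D signals at lags
    -max_lag..max_lag. Negative lag means the cutaneous signal
    trails the kinematic one."""
    k = (kinematic - kinematic.mean()) / kinematic.std()
    c = (cutaneous - cutaneous.mean()) / cutaneous.std()
    n = len(k)
    lags = np.arange(-max_lag, max_lag + 1)
    corr = np.empty(len(lags))
    for i, lag in enumerate(lags):
        if lag < 0:
            # k[i] aligned with c[i + |lag|]
            corr[i] = np.dot(k[:lag], c[-lag:]) / (n + lag)
        elif lag > 0:
            # k[i + lag] aligned with c[i]
            corr[i] = np.dot(k[lag:], c[:-lag]) / (n - lag)
        else:
            corr[i] = np.dot(k, c) / n
    return lags, corr

# Synthetic data: the cutaneous signal is a noisy copy of the kinematic
# signal delayed by 5 samples, so the correlation should peak at lag -5.
rng = np.random.default_rng(0)
t = np.arange(1000)
kin = np.sin(2 * np.pi * t / 50) + 0.1 * rng.standard_normal(1000)
cut = np.roll(kin, 5) + 0.1 * rng.standard_normal(1000)

lags, corr = normalized_xcorr(kin, cut, max_lag=20)
best_lag = lags[np.argmax(corr)]  # expected: -5
```

A real pipeline would of course work on multichannel sensor-array data and richer dependence measures, but the same question is being asked: how strongly, and at what delay, does one signal predict the other.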

License: CC-2.5
Submitted by Yon Visell on