CPS: Breakthrough: From Whole-Hand Tactile Imaging to Interactive Simulation

Project Details
Lead PI: Yon Visell
Performance Period: 09/01/15 - 12/31/18
Institution(s): University of California-Santa Barbara
Sponsor(s): National Science Foundation
Award Number: 1628831
Abstract: This project aims to enable cyber-physical systems that can be worn on the body, one day allowing their users to touch, feel, and manipulate computationally simulated three-dimensional objects or digital data in physically realistic ways, using the whole hand. It will do this by precisely measuring touch- and movement-induced displacements of the skin of the hand, and by reproducing these signals interactively via new technologies developed in the project. The resulting systems could impact a wide range of human activities that depend on touch and manual interaction. The project seeks to enable new wearable cyber-physical interfaces with broad applications in health care, manufacturing, consumer electronics, and entertainment. Although human interactive technologies have advanced greatly, current systems employ only a fraction of the sensorimotor capabilities of their users, greatly limiting applications and usability. The development of whole-hand haptic interfaces that allow their wearers to feel and manipulate digital content has been a longstanding goal of engineering research, but it remains far from reality. The reason can be traced to the difficulty of reproducing, or even characterizing, the complex, action-dependent stimuli that give rise to touch sensations during everyday activities. This project will pioneer new methods for imaging complex haptic stimuli, consisting of movement-dependent skin strain and contact-induced surface waves propagating in the skin, and for modeling the dependence of these signals on hand kinematics during grasping. It will use the resulting fundamental advances to catalyze the development of novel wearable CPS in the form of whole-hand haptic interfaces. The latter will employ surface-wave and skin-strain feedback to supply haptic stimulation to the hand during interaction with real and computational objects, enabling a range of new applications in VR.
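To give a flavor of the kind of measurement involved (this is an illustrative sketch, not the project's actual method), the travel time of a contact-induced surface wave between two skin-mounted vibration sensors can be estimated by cross-correlating their signals. All names, the sample rate, and the synthetic waveforms below are invented for the example:

```python
import numpy as np

def propagation_delay(sig_a, sig_b, fs):
    """Estimate the delay (in seconds) of sig_b relative to sig_a
    from the peak of their cross-correlation."""
    corr = np.correlate(sig_b - sig_b.mean(), sig_a - sig_a.mean(), mode="full")
    lag = corr.argmax() - (len(sig_a) - 1)   # lag in samples
    return lag / fs

# Synthetic example: the same transient arrives 5 ms later at sensor B.
fs = 2000.0                                   # sample rate (Hz), illustrative
t = np.arange(0, 0.1, 1 / fs)
pulse_a = np.exp(-((t - 0.020) ** 2) / 1e-5)  # Gaussian transient at sensor A
pulse_b = np.exp(-((t - 0.025) ** 2) / 1e-5)  # same transient, delayed 5 ms

print(propagation_delay(pulse_a, pulse_b, fs))  # → 0.005
```

Dividing the inter-sensor distance by such a delay would give an apparent wave speed in the skin; a real system would of course need many sensors and careful calibration.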
The project will be executed through three main research areas. In the first, it will use novel contact and non-contact techniques, based on data acquired through on-body sensor arrays, to measure whole-hand mechanical stimuli and grasping kinematics at high spatial and temporal resolution. In the second, it will undertake data-driven systems modeling and analysis of the statistical contingencies between the kinematic and cutaneous signals sensed during everyday activities. In the third, it will engineer and perceptually evaluate novel cyber-physical systems consisting of haptic interfaces for whole-hand interaction. To further advance the applications of these systems in medicine, the project will, through a collaboration with the Drexel College of Medicine, develop new methods for assessing clinical palpation skills during medical examination, with the aim of improving the efficacy of what is often the first, most common, and best opportunity for diagnosis: the physician's own sense of touch.
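The second research area concerns statistical contingencies between kinematic and cutaneous channels. A minimal sketch of such an analysis, assuming synthetic stand-in signals (the names, coupling coefficient, and the use of Pearson correlation are all illustrative, not the project's chosen method), could look like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: a joint-angle trace and a skin-strain trace that
# partially tracks it, corrupted by sensor noise. Purely illustrative.
n = 1000
joint_angle = np.cumsum(rng.normal(size=n))           # smooth kinematic signal
skin_strain = 0.8 * joint_angle + rng.normal(size=n)  # coupled cutaneous signal

# Pearson correlation as the simplest measure of contingency between channels;
# richer models (e.g., mutual information, regression) would follow the same shape.
r = np.corrcoef(joint_angle, skin_strain)[0, 1]
print(f"kinematic-cutaneous correlation r = {r:.2f}")
```

With strong coupling and modest noise, as above, r comes out close to 1; weakly contingent channels would yield values near 0.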