A Novel Biomechatronic Interface Based on Wearable Dynamic Imaging Sensors


Abstract: 

The problem of controlling biomechatronic systems, such as multiarticulating prosthetic hands, involves unique challenges in the science and engineering of Cyber-Physical Systems (CPS), requiring the integration of computational systems for recognizing human functional activity and intent with control of prosthetic devices that interact with the physical world. Research on this problem has been limited by the difficulty of noninvasively acquiring robust biosignals that allow intuitive and reliable control of multiple degrees of freedom (DoF). Traditionally, myoelectric signals based on surface electromyography (sEMG) recordings from the skin are used as control signals. These signals suffer from a poor signal-to-noise ratio and limited specificity for deeper muscles. The objective of this research is to investigate a new sensing paradigm based on ultrasonic imaging of dynamic muscle activity, known as sonomyography. Our approach involves the integration of novel imaging technologies, new computational methods for activity recognition and learning, and high-performance embedded computing to enable robust and intuitive control of dexterous prosthetic hands with multiple DoF. In the first year, we developed a real-time environment for using sonomyography to interact with a virtual reality environment, including a virtual prosthetic hand. In the second year, our focus has been on evaluating the robustness of sonomyography and our image analysis algorithms in classifying complex grasps across various arm and wrist positions. Our results demonstrate that the proposed approach can maintain classification accuracy >85% even when the forearm and wrist are moved to different positions, as long as the training dictionary contains adequate examples of these types of motions. We also performed preliminary experiments with an amputee, achieving offline classification accuracy of 96% and real-time classification accuracy of 77% with just a few minutes of training. We also made progress in developing protocols for designing and fabricating custom piezo-polymer ultrasound transducers that can be readily integrated into a prosthetic shell. Finally, we investigated strategies at the hardware, system, and application levels for optimizing power and performance when mapping image analysis algorithms to multicore embedded processing platforms. These results demonstrate the potential of ultrasound imaging as a sensing strategy for biomechatronic interfaces that provide intuitive, graded control, and they demonstrate the technical feasibility of developing wearable platforms employing these technologies. Ongoing activities involve the integration of ultrasound imaging sensors with a low-power heterogeneous multicore embedded processor, and the development of more sophisticated image analysis and control strategies for interacting with the virtual environment.
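
The abstract describes classifying grasps against a training dictionary of example ultrasound frames. The sketch below is a hypothetical illustration of that idea only; the project's actual feature extraction and classification algorithms are not specified here, and all function names, frame sizes, and the cosine-similarity matching rule are assumptions for illustration.

```python
# Hypothetical sketch (not the authors' published algorithm): match a new
# ultrasound frame to the most similar labeled entry in a training dictionary.
import numpy as np

def frame_to_feature(frame: np.ndarray) -> np.ndarray:
    """Flatten an ultrasound frame into a unit-norm feature vector."""
    v = frame.astype(np.float64).ravel()
    norm = np.linalg.norm(v)
    return v / norm if norm > 0 else v

def build_dictionary(training_frames, labels):
    """Stack labeled training examples (e.g., grasps at several wrist positions)."""
    feats = np.stack([frame_to_feature(f) for f in training_frames])
    return feats, np.asarray(labels)

def classify(frame, dictionary, labels):
    """Return the label of the dictionary entry with highest cosine similarity."""
    scores = dictionary @ frame_to_feature(frame)
    return labels[int(np.argmax(scores))]

# Usage with synthetic data: 3 grasp classes, 10 noisy examples each, 64x64 frames.
rng = np.random.default_rng(0)
templates = rng.random((3, 64, 64))                 # one "activity pattern" per grasp
train = [templates[c] + 0.3 * rng.random((64, 64)) for c in range(3) for _ in range(10)]
labels = [c for c in range(3) for _ in range(10)]
D, y = build_dictionary(train, labels)
test = templates[1] + 0.3 * rng.random((64, 64))    # unseen frame of grasp class 1
print(classify(test, D, y))                         # expected output: 1
```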

License: CC-2.5
Submitted by Siddhartha Sikdar on