Control of Surgical Robots: Network Layer to Tissue Contact


The original goals of this project were to take an existing open surgical robot testbed (the RAVEN robot) and to create a robust infrastructure for cyber-physical systems with which to extend traditional real-time control and teleoperation concepts. These goals were pursued by adding three new interfaces to the system:

  • A robust open network protocol.
  • A low-overhead interface between direct teleoperation and intelligent functions.
  • A tissue-contact control module that is aware of tissue biomechanics.
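The tissue-contact module is described only at a high level here. As one illustration, biomechanics-aware contact control often starts from a viscoelastic (Kelvin-Voigt) force law, in which contact force depends on both penetration depth and penetration rate. The function and parameter values below are hypothetical, not the project's actual model:

```python
def tissue_contact_force(depth, depth_rate, k=300.0, b=2.0):
    """Kelvin-Voigt viscoelastic contact model (illustrative sketch).

    depth      -- tool penetration into tissue (m); <= 0 means no contact
    depth_rate -- rate of change of penetration (m/s)
    k          -- tissue stiffness (N/m), hypothetical value
    b          -- tissue damping (N*s/m), hypothetical value
    Returns the normal contact force in newtons (>= 0).
    """
    if depth <= 0.0:            # tool not touching tissue
        return 0.0
    force = k * depth + b * depth_rate
    return max(force, 0.0)      # tissue cannot pull the tool inward
```

The damping term is what distinguishes viscoelastic tissue from a pure spring: pressing in quickly produces more resistance than pressing in slowly to the same depth.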

In addition, two further goals emerged during the course of the project:

  • Development of real-time haptic rendering methods using non-contact sensors, such as RGB-depth cameras (like the Kinect), and the implementation of virtual fixtures. These include "forbidden region" virtual fixtures (which push back against the surgeon's hand controls when an undesired target is approached) and "guidance" virtual fixtures (which help guide end-effector progress toward a target). Both types of virtual fixtures can increase the accuracy, safety, and speed of telerobotic surgery, as well as other types of teleoperation.
  • A preliminary consideration of security issues for telerobotic systems.
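As a sketch of how a "forbidden region" fixture can push back on the hand controls, the function below renders a spring force whenever the tool enters a safety margin around a forbidden sphere. The geometry, gain, and function name are illustrative assumptions, not the project's implementation:

```python
import math

def forbidden_region_force(tool_pos, center, radius, k=500.0, margin=0.01):
    """Repulsive 'forbidden region' virtual fixture around a sphere.

    When the tool enters the safety margin around the forbidden sphere,
    a spring force proportional to the incursion pushes it back out.
    All gains and geometry are hypothetical.
    Returns a 3-D force vector (tuple) to render on the haptic device.
    """
    dx = [t - c for t, c in zip(tool_pos, center)]
    dist = math.sqrt(sum(d * d for d in dx))
    boundary = radius + margin
    if dist >= boundary or dist == 0.0:
        return (0.0, 0.0, 0.0)         # outside the fixture (or degenerate)
    incursion = boundary - dist        # how far inside the margin we are
    unit = [d / dist for d in dx]      # direction away from the sphere center
    return tuple(k * incursion * u for u in unit)
```

A "guidance" fixture is the mirror image of this idea: an attractive spring toward a target path or point rather than a repulsive one away from a protected region.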

Research contributions of this project include:

  • An open network protocol and interfaces suitable for direct teleoperation (involving multiple operators and robots).

This includes the evaluation of existing open-source software, interfaces, communication protocols, and standards, and their incorporation within a larger open network protocol.
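The protocol itself is not detailed in this summary. Purely as an illustration of the kind of wire format such a teleoperation protocol might standardize, the sketch below packs a sequence number, timestamp, end-effector pose, and gripper command into a fixed-size binary datagram suitable for UDP transport; the field layout is hypothetical:

```python
import struct
import time

# Hypothetical wire format: uint32 sequence number, double timestamp,
# xyz position, wxyz quaternion, gripper opening. This is NOT the
# project's actual protocol, only a sketch of a fixed binary layout.
PACKET_FMT = "<Id3d4dd"     # little-endian, no padding
PACKET_SIZE = struct.calcsize(PACKET_FMT)

def pack_command(seq, pos, quat, grip, t=None):
    """Serialize one teleoperation command into a datagram payload."""
    t = time.time() if t is None else t
    return struct.pack(PACKET_FMT, seq, t, *pos, *quat, grip)

def unpack_command(data):
    """Deserialize a datagram payload back into command fields."""
    seq, t, px, py, pz, qw, qx, qy, qz, grip = struct.unpack(PACKET_FMT, data)
    return seq, t, (px, py, pz), (qw, qx, qy, qz), grip
```

Sequence numbers let the receiver detect lost or reordered datagrams, which matters when commands from multiple operators stream to multiple robots over a lossy network.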

  • Implementation of new and existing control algorithms in code suitable for both testbed mechanical systems and the RAVEN surgical robot.

This includes the implementation of simultaneous parameter and state estimation (using a variant of the Unscented Kalman Filter), and the use of these estimates in several different controllers.
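The filter details are beyond this summary, but the core idea of simultaneous state and parameter estimation is to augment the state vector with the unknown parameters and run the filter on the augmented system. The sketch below applies a basic UKF (not the project's specific variant) to a toy first-order system with an unknown damping rate; all models and values are illustrative:

```python
import numpy as np

def sigma_points(mean, cov, kappa=1.0):
    """Symmetric sigma points and weights for the unscented transform."""
    n = len(mean)
    S = np.linalg.cholesky((n + kappa) * cov)   # matrix square root
    pts = [mean] + [mean + S[:, i] for i in range(n)] \
                 + [mean - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

def ukf_step(mean, cov, y, f, h, Q, R, kappa=1.0):
    """One predict/update cycle of a basic UKF."""
    # --- predict: push sigma points through the dynamics ---
    pts, w = sigma_points(mean, cov, kappa)
    fx = np.array([f(p) for p in pts])
    m = w @ fx
    P = Q + sum(wi * np.outer(p - m, p - m) for wi, p in zip(w, fx))
    # --- update: push new sigma points through the measurement model ---
    pts2, w2 = sigma_points(m, P, kappa)
    hy = np.array([h(p) for p in pts2])
    ym = w2 @ hy
    Pyy = R + sum(wi * np.outer(v - ym, v - ym) for wi, v in zip(w2, hy))
    Pxy = sum(wi * np.outer(p - m, v - ym) for wi, p, v in zip(w2, pts2, hy))
    K = Pxy @ np.linalg.inv(Pyy)
    return m + K @ (y - ym), P - K @ Pyy @ K.T

# Joint estimation by state augmentation: z = [x, b], where b is the
# unknown damping rate of a first-order system x' = -b*x (a toy
# stand-in for unknown robot/tissue parameters; values hypothetical).
dt = 0.05
f = lambda z: np.array([z[0] * (1.0 - dt * z[1]), z[1]])  # b modeled constant
h = lambda z: np.array([z[0]])                             # we measure x only
```

Because the parameter enters the augmented dynamics nonlinearly, a sigma-point filter handles it without deriving Jacobians, which is one reason unscented filters are attractive for this kind of joint estimation.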

  • Experimental evaluation of these controllers using a mechanical testbed and using the two-armed RAVEN II robot.
  • The development of haptic rendering algorithms that can produce 3-DOF or 6-DOF haptic information (for a haptic interface device) based upon streaming RGB-D point-cloud data obtained simultaneously from one or more RGB-D cameras (such as the Kinect). This enables real-time haptic rendering of moving objects.
  • The implementation of virtual fixtures to protect objects from surgical contact, or to guide surgical tools to them. Development of these virtual fixtures includes accurate simulation of the robot models.
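As a minimal illustration of rendering forces directly from point-cloud data, the sketch below computes a 3-DOF penalty force from the nearest cloud point to the haptic probe for a single frame. A practical renderer would add a proxy point, temporal filtering, and spatial indexing; the function name and gains are hypothetical:

```python
import numpy as np

def haptic_force_from_cloud(probe, cloud, contact_radius=0.01, k=400.0):
    """3-DOF penalty force from one point-cloud frame (illustrative).

    Finds the nearest cloud point to the haptic probe; if it lies within
    contact_radius, render a spring force pushing the probe away from it.
    probe -- (3,) probe position; cloud -- (N, 3) point positions.
    """
    diffs = cloud - probe                   # vectors probe -> points
    d2 = np.einsum('ij,ij->i', diffs, diffs)
    i = int(np.argmin(d2))
    dist = float(np.sqrt(d2[i]))
    if dist >= contact_radius or dist == 0.0:
        return np.zeros(3)                  # free space (or degenerate)
    direction = -diffs[i] / dist            # push away from nearest point
    return k * (contact_radius - dist) * direction
```

Because the cloud is re-acquired every frame, the same computation naturally tracks moving surfaces, which is the property that makes streaming RGB-D input attractive for haptics.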

Award ID: 0930930

Creative Commons 2.5
