Image Guided Robot-Assisted Medical Interventions


Introduction: Image-guided and robot-assisted interventions and surgeries are emerging as a highly promising alternative to traditional open surgeries and free-hand procedures for improved patient management. The overall objective of this award is to develop a novel cyber-physical system for performing Multimodal Image-guided RObot-assisted Surgeries (MIROS). MIROS entails the development and integration of robust and scalable (i) multimodal sensing (including real-time imaging), (ii) controlled robotic manipulators, and (iii) the interventionalist. The work is focused on the two primary areas of such a system: the physical world (developing sensors, haptic interfaces, and robots) and the computational core (on-the-fly imaging and sensor data processing for real-time robot control and human-machine interfacing).

Scientific Directions: The scientific directions and research objectives of the MIROS project center on the use of true sensing of the physical world with multi-contrast imaging to maneuver a surgical robotic manipulator inside the beating heart of a patient, while offering a comprehensive perception of the Area of Procedure (AoP) to the operator via a visuo-haptic interface.

Research Objectives: Our work in the third year of the project continued along the areas of the two prior years. (I) On the Sensors side (Physical and Cyber world) we focused on two areas:

  1. (a) We continued the implementation of a novel near-real-time MR data collection strategy for real-time 3D monitoring of anatomical landmarks in the AoP on the beating heart. This approach selectively collects the MR raw data that are significant for reconstructing the AoP, thereby increasing the speed of acquisition. This information is then used by the computational core for the generation of access corridors and for autonomous or semi-autonomous control of the robotic manipulator.

  2. (b) We completed the development and experimental demonstration of a novel 1.8 mm diameter multimodal probe that combines a miniature RF coil (MR sensor) and a laser-induced fluorescence (LIF) endoscopic sensor. We are completing the integration of this in-house developed sensor onto the end-effector of our robot for high-resolution, molecular-level sensing of the pathologic focus during the procedure.
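The selective-acquisition idea of I(a) can be pictured with the toy sketch below. It is an illustrative assumption, not the project's actual pulse sequence: the "significance" criterion here is simply per-line k-space energy, and the phantom is a smooth synthetic image.

```python
import numpy as np

def select_kspace_lines(full_kspace, keep_fraction=0.25):
    """Keep only the phase-encode lines with the highest energy
    (a hypothetical significance criterion, for illustration only)."""
    energy = np.sum(np.abs(full_kspace) ** 2, axis=1)
    n_keep = max(1, int(keep_fraction * full_kspace.shape[0]))
    keep = np.argsort(energy)[-n_keep:]
    sparse = np.zeros_like(full_kspace)
    sparse[keep] = full_kspace[keep]
    return sparse, keep

# Smooth synthetic "anatomy" phantom and its fully sampled k-space.
image = np.outer(np.hanning(64), np.hanning(64))
kspace = np.fft.fftshift(np.fft.fft2(image))

sparse, kept = select_kspace_lines(kspace, keep_fraction=0.25)
recon = np.abs(np.fft.ifft2(np.fft.ifftshift(sparse)))

# Sampling only 25% of the lines still captures most of the
# low-frequency content that dominates a smooth field.
rel_err = np.linalg.norm(recon - image) / np.linalg.norm(image)
```

Collecting only the raw data that matter for reconstructing the AoP is what shortens each acquisition enough to make near-real-time 3D monitoring plausible.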

(II) On the computational core and system integration side we continued the development and integration of:

  1. (a) We completed the development of novel algorithms for the generation of dynamic access corridors (inside the beating heart) from the above-mentioned real-time MR data, for safely and accurately maneuvering the robotic manipulator inside the continuously changing AoP (due to breathing, heartbeat, or unexpected events).

  2. (b) Haptic rendering algorithms were implemented and integrated for force-feedback-based perception of the AoP; combined with the dynamic access corridors, dynamic virtual fixtures are then used to direct the operator in the semi-autonomous or manual mode of robot control.

  3. (c) The real-time tracking algorithm for anatomical landmarks from fast MRI was further refined, based on a Bayesian network of particle filter trackers; its output is used by the robot control module and the visuo-haptic human-information/machine interface.

  4. (d) Parallel computing methods were implemented for combining and, in particular, synchronizing MR data processing, haptic device control, and robot control (the latter two devices are controlled via dedicated embedded systems implemented on the PC/104 form factor for ruggedness and mobility among sites).
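The dynamic access corridors of II(a) can be sketched as a tube that is re-generated every frame from tracked landmarks. The tube model, class names, and update rule below are assumptions for illustration, not the project's actual algorithm.

```python
import math

class DynamicAccessCorridor:
    """Illustrative corridor model: a centerline polyline plus a
    safety radius, regenerated each frame from tracked landmarks."""

    def __init__(self, centerline, radius):
        self.centerline = list(centerline)  # list of (x, y, z) points
        self.radius = radius

    def update(self, entry_point, target_point, n=10):
        # Regenerate the centerline between the tracked entry and
        # target landmarks (e.g. apex and target on the beating heart).
        self.centerline = [
            tuple(e + (t - e) * i / (n - 1)
                  for e, t in zip(entry_point, target_point))
            for i in range(n)
        ]

    def contains(self, tip):
        # The tip is "safe" if it lies within the safety radius of
        # the nearest centerline point.
        d = min(math.dist(tip, p) for p in self.centerline)
        return d <= self.radius

corridor = DynamicAccessCorridor([(0, 0, 0), (0, 0, 10)], radius=2.0)
corridor.update((0, 0, 0), (3, 0, 10))   # landmarks moved with the heartbeat
inside = corridor.contains((1.5, 0, 5))
outside = corridor.contains((4.0, 0, 5))
```

A controller can query `contains()` (or the distance to the corridor wall) each cycle to constrain autonomous motion or to trigger the haptic cues described in II(b).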
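A common way to realize the virtual fixtures of II(b) is a penalty (spring) force that activates when the instrument tip leaves the safe region; the sketch below assumes that model, with a spherical region and gains chosen purely for illustration.

```python
import math

def fixture_force(tip, center, radius, stiffness=200.0):
    """Return a 3-D restoring force pushing the tip back toward a
    spherical safe region around `center` (illustrative spring model)."""
    vec = tuple(t - c for t, c in zip(tip, center))
    dist = math.sqrt(sum(v * v for v in vec))
    penetration = dist - radius
    if penetration <= 0 or dist == 0:
        return (0.0, 0.0, 0.0)          # inside the corridor: no force
    unit = tuple(v / dist for v in vec)
    return tuple(-stiffness * penetration * u for u in unit)

f_inside = fixture_force((0.5, 0.0, 0.0), (0.0, 0.0, 0.0), radius=1.0)
f_outside = fixture_force((1.5, 0.0, 0.0), (0.0, 0.0, 0.0), radius=1.0)
```

Rendered on the haptic device, such a force lets the operator feel the corridor wall, which is how dynamic virtual fixtures can direct manual or semi-autonomous control.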
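A single particle-filter tracker of the kind networked in II(c) can be sketched as follows. This is a 1-D toy with an assumed random-walk motion model and Gaussian observation likelihood, not the project's tracker.

```python
import math
import random

def particle_filter_step(particles, observation, rng,
                         motion_std=0.5, obs_std=1.0):
    # Predict: propagate each particle with a random-walk motion model.
    predicted = [p + rng.gauss(0, motion_std) for p in particles]
    # Update: weight by a Gaussian likelihood of the observation.
    weights = [math.exp(-0.5 * ((p - observation) / obs_std) ** 2)
               for p in predicted]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample particles proportionally to their weights.
    return rng.choices(predicted, weights=weights, k=len(particles))

rng = random.Random(0)
particles = [rng.uniform(-10, 10) for _ in range(500)]
for t in range(20):
    truth = 3.0 + 0.1 * t                # landmark drifting with the heartbeat
    obs = truth + rng.gauss(0, 0.5)      # noisy fast-MRI measurement
    particles = particle_filter_step(particles, obs, rng)

estimate = sum(particles) / len(particles)  # tracked landmark position
```

The filtered landmark estimate, rather than the raw noisy measurement, is what a robot control module and visuo-haptic interface would consume.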
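One standard pattern for the synchronization described in II(d) is a latest-value queue between the MR-processing loop and each control loop, so a controller may skip frames but never acts on stale state. A minimal threaded sketch with illustrative names:

```python
import queue
import threading

state_for_haptics = queue.Queue(maxsize=1)
state_for_robot = queue.Queue(maxsize=1)

def publish_latest(q, item):
    # Drop any stale state so consumers always see the newest frame.
    try:
        q.get_nowait()
    except queue.Empty:
        pass
    q.put(item)

def mr_processing(n_frames):
    # Stand-in for on-the-fly MR reconstruction and corridor update.
    for frame in range(n_frames):
        state = {"frame": frame}
        publish_latest(state_for_haptics, state)
        publish_latest(state_for_robot, state)

received = []

def robot_control(last_frame):
    # The robot loop may skip frames, but frames arrive in order.
    while True:
        frame = state_for_robot.get()["frame"]
        received.append(frame)
        if frame == last_frame:
            break

producer = threading.Thread(target=mr_processing, args=(50,))
consumer = threading.Thread(target=robot_control, args=(49,))
consumer.start()
producer.start()
producer.join()
consumer.join()
```

On the actual system the consumers run on the dedicated PC/104 embedded controllers, but the latest-value semantics is the same.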

(III) On the physical world we have focused our efforts on the following projects:

  1. (a) We completed the development of a new five-DoF force-feedback device that replicates the kinematics of a single-port-access surgical robot or laparoscope. The force feedback exerted by the device is based on on-the-fly analysis of MR data with the algorithms we developed in area II(b) (i.e., MR-data-derived access corridors and safety haptic rendering for force feedback).

  2. (b) We designed, and are currently prototyping, our second seven-degree-of-freedom (DoF) MR-compatible robotic manipulator (the KardioBot II) for MR-guided transapical access into the beating heart (the originally proposed clinical paradigm). This has resulted in the development of a new MR-compatible actuation and control paradigm that we feel, if successful, may change the way MR-compatible robots are actuated. While the physical prototype is under development, a realistic model of the new robot has been interfaced with the computational core components II(a), II(b), and II(c), and used for implementing image-based automated control for maneuvering inside the moving tissue.
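The image-based automated control can be pictured as a simple visual-servoing loop: at each imaging frame, the controller commands a motion proportional to the tracked tip-to-target error. The gain and the point-mass kinematics below are placeholders, not the KardioBot II controller.

```python
def servo_step(tip, target, gain=0.3):
    """Move a fraction of the way toward the tracked target
    (illustrative proportional visual-servo update)."""
    return tuple(p + gain * (t - p) for p, t in zip(tip, target))

tip = (0.0, 0.0, 0.0)
target = (10.0, 5.0, -2.0)   # tracked landmark inside the moving tissue
for _ in range(30):          # one update per imaging frame
    tip = servo_step(tip, target)
```

Because the target position is re-tracked every frame, the same loop follows a landmark that moves with breathing and heartbeat.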

Award ID: 0932272

Creative Commons 2.5
