Eur Radiol (2005) 15:765–771 DOI 10.1007/s00330-004-2487-x

Joachim Kettenbach · Gernot Kronreif · Michael Figl · Martin Fürst · Wolfgang Birkfellner · Rudolf Hanel · Helmar Bergmann

Received: 28 March 2004 Revised: 28 June 2004 Accepted: 6 August 2004 Published online: 24 September 2004 © Springer-Verlag 2004

J. Kettenbach (✉)
Division of Angiography and Interventional Radiology, Department of Radiology, Medical University Vienna, General Hospital, Währinger Guertel 18-20, 1090 Vienna, Austria
e-mail: [email protected]
Tel.: +43-1-404007620
Fax: +43-1-404004898

G. Kronreif · M. Fürst
Robotics Laboratory, ARC Seibersdorf Research, 2444 Seibersdorf, Austria

M. Figl · W. Birkfellner · R. Hanel · H. Bergmann
Department of Biomedical Engineering and Physics, Medical University Vienna, General Hospital, Währinger Guertel 18–20, 1090 Vienna, Austria

EXPERIMENTAL

Robot-assisted biopsy using ultrasound guidance: initial results from in vitro tests

Abstract The purpose of this study was to develop a robotic system for ultrasound (US)-guided biopsy and to validate its feasibility, accuracy and efficacy using phantom tests. Twenty peas (mean diameter 9.3±0.1 mm) embedded within a gel phantom were selected for biopsy. Once the best access was defined, the position of the US transducer was recorded by an optical tracking system. Positional data of the transducer and the corresponding US image were transferred to the robot planning system (a LINUX-based industrial PC equipped with a video capture card). Once the appropriate position, angulation and pitch were calculated, the robotic arm moved automatically with seven degrees of freedom to the planned insertion path, aiming the needle-positioning unit at the center of the target. The biopsy was then performed manually using a coaxial technique. The length of all harvested specimens was measured, and the deviation of the actual needle tract from the center of the target was evaluated sonographically. In all targets, the biopsy specimen (mean length 5.5±1.2 mm) was harvested with only one needle pass required. The mean deviation of the needle tip from the center of the target was 1.1±0.8 mm. Robot-assisted biopsies in vitro using US guidance were feasible and provided high accuracy.

Keywords Biopsies · Interventional procedures · Ultrasound

Introduction

Percutaneous biopsy performed under ultrasound (US) guidance has been shown to be a safe and reliable alternative to excisional surgical biopsy. The high efficacy of this technique, however, depends on the accuracy of the needle placement and the quality of the harvested tissue. At times, access to a target can be technically challenging due to various factors, including limited space at the skin entry site or a difficult, angulated access path. To improve the accessibility of lesions, surgical robots and manipulators have potential advantages that are well known in the clinical and technical community [1].


Medical robotic systems, in particular, can provide accurate needle guidance and stable access, leading to increased precision, accuracy and reproducible sampling of different parts of a lesion. Thus, greater efficacy, particularly for lesions that are difficult to target, can be anticipated. Although CT imaging has been used for robot-assisted procedures by several groups [1–3], to our knowledge only a few groups have used US to guide a robotic system [4, 5]. In contrast to most of the robotic systems mentioned above, our goal was to develop a prototype robotic system (B-Rob I) designed for both imaging modalities, US and CT. In this study, we used US guidance to evaluate the feasibility, accuracy and efficacy of B-Rob I during robot-assisted biopsies in in-vitro phantom tests.

Materials and methods

The complete robotic system, B-Rob I, includes the following components: (1) an optical tracking system (NDI Polaris, Waterloo, Ontario, Canada) for real-time localization of the US transducer, the phantom bag and the robot; (2) a LINUX (SUSE 7.1)-based industrial PC (Pentium III, 1 GHz, 128 MB RAM) equipped with a video capture card (WinTV-PCI-FM 718, Hauppauge Computer Works, Inc.) and running the medical planning software "ROBUST"; (3) a 4-DOF (degree-of-freedom) robotic arm for gross positioning; (4) a 3-DOF needle positioning unit ("NPU"), including a needle holder; and (5) the robot control system (MS Windows 2000-based industrial PC, Pentium III, 1 GHz, 128 MB RAM), which includes custom input devices and safety switches to control the robot kinematics. According to the requirements of clinical practice, the entire robotic system was mounted on a mobile platform, so that it could easily be transferred to different interventional sites. Since the video output is a standardized interface of US scanners, we used the video signal as the input for the robot system; thus, for routine practical use, various US devices could be combined with our robot system.
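Because the planning workstation works directly on the scanner's standard video output, any frame grabber that exposes the signal as a video device can supply the US image. The following is a minimal sketch of grabbing a single frame with OpenCV on Linux; the device index and the use of OpenCV are illustrative assumptions, not details of the original WinTV-based setup.

```python
# Sketch: grab one frame from an analog video capture card exposed as a
# video device (e.g., a V4L2 device on Linux) using OpenCV.
# Device index 0 is an assumption.
import cv2

def grab_us_frame(device_index: int = 0):
    cap = cv2.VideoCapture(device_index)   # open the capture card
    if not cap.isOpened():
        raise RuntimeError("capture device not available")
    ok, frame = cap.read()                 # one BGR frame of the scanner's video output
    cap.release()
    if not ok:
        raise RuntimeError("failed to grab frame")
    return frame

if __name__ == "__main__":
    frame = grab_us_frame()
    cv2.imwrite("us_frame.png", frame)     # store the grabbed image for planning
```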

Fig. 1 a Four-DOF robotic arm with three linear axes (A1–A3) and one rotational axis (A4) was used for gross positioning of the needle positioning unit (NPU). b For fine positioning, the 3-DOF NPU was mounted at the end of the robotic arm

Robot design

A 4-DOF robotic arm with three linear axes (A1–A3) and one rotational axis (A4) was used for gross positioning of the needle positioning unit (NPU) (Fig. 1a). For fine positioning, the 3-DOF NPU was mounted at the end of the robotic arm (Fig. 1b). The NPU consisted of two parallel "fingers" made from carbon-fiber composite material, connected to each other with spherical joints and with a cylindrical needle holder in between (Fig. 1b). Relative movements of the resulting parallelogram kinematics provided fine orientation of the needle. This design enabled the needle-holder axis to be angulated by 15° in the four main directions. A third, linear DOF drive served to move the NPU down to the patient's skin. Thus, fine needle adjustments were possible with 3 DOF without simultaneous movement of the four main axes (A1–A4), which ultimately contributes to system safety. For easy sterilization, the two fingers, including the cylindrical needle holder, could be disconnected from the NPU by means of a rapid-change bayonet connection.

Registration of the system components

In order to determine the spatial relationship between the three involved system components (US transducer, phantom bag and robot), the positional data of each item obtained by the optical tracker was registered to the robot's coordinate system.

Registration of the robot system

In a first step, a rigid-body transformation between a tracker tool attached to the robot's base and the internal coordinate system of the robot had to be defined by means of a point-to-point registration process [6]. The result of this registration procedure is a matrix that describes the coordinate transformation between the tracker coordinates and the robot, which also allows calculation of the needle position and angulation in the tracker coordinates later during the intervention. During the first pre-clinical tests, the resulting "fiducial registration error" (FRE) was between 0.7 and 0.9 mm [7].
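Reference [6] describes Horn's closed-form quaternion solution for this point-to-point step. The following is a minimal sketch of an equivalent SVD-based (Kabsch) formulation together with the root-mean-square fiducial registration error; the fiducial coordinates are synthetic and purely illustrative.

```python
# Sketch: point-to-point rigid registration (SVD/Kabsch formulation,
# equivalent to Horn's closed-form solution [6]) and the resulting FRE [7].
import numpy as np

def register_rigid(tracker_pts: np.ndarray, robot_pts: np.ndarray):
    """Find R, t such that robot_pts ~= R @ tracker_pts + t (both N x 3)."""
    ct, cr = tracker_pts.mean(axis=0), robot_pts.mean(axis=0)
    H = (tracker_pts - ct).T @ (robot_pts - cr)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                                   # proper rotation (no reflection)
    t = cr - R @ ct
    return R, t

def fiducial_registration_error(tracker_pts, robot_pts, R, t):
    residuals = robot_pts - (tracker_pts @ R.T + t)
    return np.sqrt((residuals ** 2).sum(axis=1).mean())  # RMS distance in mm

# Example with synthetic fiducial pairs (illustrative only):
rng = np.random.default_rng(0)
p_tracker = rng.uniform(-100, 100, size=(6, 3))
t_true = np.array([10.0, -5.0, 30.0])
p_robot = p_tracker + t_true + rng.normal(0, 0.5, size=(6, 3))
R, t = register_rigid(p_tracker, p_robot)
print("FRE (mm):", fiducial_registration_error(p_tracker, p_robot, R, t))
```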

Calibration of the US probe

Another tracker tool was firmly attached to the US transducer's case in order to record the location and orientation of the transducer each time a US image was acquired and grabbed by the robotic workstation. In addition, the position of an acrylic glass cube in space had been defined beforehand by means of a registration process. The acrylic glass cube itself formed a water-filled chamber containing a perspex frame with six nylon strings, and each string had two or three plastic fiducials (spheres of 1-mm diameter) attached. For the calibration procedure, the current image of all strings and their fiducials seen on the US monitor was "grabbed." At the same time, the positional data of the US transducer obtained by the optical tracker was transferred to the workstation. Each fiducial marker was then identified by selection at the graphical user interface (GUI) of the calibration system and paired with the corresponding sphere's coordinates. After selection of 20–30 spheres, the calibration software calculated a 6-DOF transformation matrix using the method of singular value decomposition. In this way, a calibration procedure, originally based on the method described by Detmer et al. [8], was implemented to assign a 3D coordinate to each 2D pixel of a given US image. A detailed discussion of the implemented procedure is given in [9].
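Conceptually, the calibration above yields a fixed transformation from US image pixels to the coordinate frame of the tracker tool on the transducer; chaining it with the tracked probe pose and the robot registration maps any selected pixel into robot coordinates. The following is a minimal sketch of this chain in the common homogeneous-matrix formulation; the matrix names, and the assumption that the calibration matrix already contains the mm-per-pixel scale, are ours and not details of the ROBUST implementation.

```python
# Sketch: map a 2D pixel of the grabbed US image to 3D robot coordinates by
# chaining calibration (image -> probe tool), tracking (probe tool -> tracker)
# and registration (tracker -> robot). All matrices are 4x4 homogeneous
# transforms; the calibration matrix is assumed to include the pixel scale.
import numpy as np

def pixel_to_robot(px: float, py: float,
                   T_image_to_probe: np.ndarray,    # from the US probe calibration
                   T_probe_to_tracker: np.ndarray,  # pose of the transducer tool at frame grab
                   T_tracker_to_robot: np.ndarray): # from the robot registration
    p_image = np.array([px, py, 0.0, 1.0])          # the pixel lies in the scan plane (z = 0)
    T = T_tracker_to_robot @ T_probe_to_tracker @ T_image_to_probe
    p_robot = T @ p_image
    return p_robot[:3]

# Usage (illustrative identity transforms only):
# pixel_to_robot(320, 240, np.eye(4), np.eye(4), np.eye(4))
```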


Fig. 2 The transducer's movement is manually "frozen" and the coordinates of the transducer's position and orientation, measured by the optical tracking system, are transferred to the planning workstation

Fig. 3 Using the acquired US plane, the "skin" entry point and the center of the target were selected on the planning workstation

Phantom model and US imaging

For the in vitro tests, a phantom model was prepared, which consisted of a dense plastic enema bag (E-Z-EM enema bag; E-Z-EM, Westbury, NY) filled with gelatin and equally sized peas (mean diameter 9.3±0.1 mm), prepared as described previously [10]. The phantom model was then fixed with plaster strips onto an acrylic glass cube equipped with a tracker tool mounted on its front side. Thus, the spatial relationship to the robot coordinate system could be established, which would be useful in the case of inadvertent movement of the phantom during biopsy. Imaging was accomplished with a 2.0–4.0-MHz curved-array transducer and an Ultramark 9 HDI scanner (Advanced Technology Laboratories, Bothell, WA) equipped with a standard video output interface. Twenty peas (mean diameter 9.3±0.1 mm) were selected for biopsy using US performed by one experienced radiologist (J.K.). Once a pea was selected, its largest diameter in all three orthogonal axes and the distance between the surface of the phantom and the center of the target were documented. The transducer was then angulated until the largest diameter of the target was visible in the transverse plane on the US screen. At this point, the transducer's movement was manually "frozen," and the coordinates of the transducer's position and orientation, measured by the optical tracking system, were transferred to the planning workstation (Fig. 2). Simultaneously, the corresponding image on the US monitor was grabbed from the video output of the US scanner and converted into a digital file using a LINUX-based industrial PC equipped with a commercial video capture card (WinTV, Hauppauge Computer Works).

Planning of the intervention

Planning of the intervention was performed by means of the custom software "ROBUST," developed in the C++ programming language with the Qt library (1.45) under SUSE Linux 7.1. Using the acquired US plane, the "skin" entry point as well as the center of the target were selected on the planning workstation (Fig. 3). After computation of the trajectory, the relevant data, such as the angulation of the needle and the distance to the target, were calculated and sent to the robot controller via a TCP/IP socket connection.
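A minimal sketch of this planning step is given below: it reduces the entry point and target center (assumed to be already expressed in robot coordinates) to an insertion depth and two tilt angles and transmits them over a TCP socket. The JSON message format, the angle convention relative to a vertical insertion and the host address are illustrative assumptions, not the actual ROBUST/controller protocol.

```python
# Sketch: derive insertion depth and needle angulation from the selected
# entry point and target center, then send them to the robot controller
# over a TCP socket. The message format and angle convention are assumptions.
import json
import socket
import numpy as np

def plan_trajectory(entry: np.ndarray, target: np.ndarray) -> dict:
    v = target - entry
    depth_mm = float(np.linalg.norm(v))
    d = v / depth_mm                                   # unit insertion direction
    # Tilt about the two horizontal axes relative to a vertical (-z) insertion.
    tilt_x_deg = float(np.degrees(np.arctan2(d[1], -d[2])))
    tilt_y_deg = float(np.degrees(np.arctan2(d[0], -d[2])))
    return {"entry_mm": entry.tolist(), "depth_mm": depth_mm,
            "tilt_x_deg": tilt_x_deg, "tilt_y_deg": tilt_y_deg}

def send_plan(plan: dict, host: str = "192.168.0.10", port: int = 5000) -> None:
    with socket.create_connection((host, port), timeout=5.0) as sock:
        sock.sendall(json.dumps(plan).encode("utf-8") + b"\n")

plan = plan_trajectory(np.array([0.0, 0.0, 0.0]), np.array([5.0, 10.0, -40.0]))
# send_plan(plan)   # would transmit the plan to the (hypothetical) controller address
print(plan)
```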

Robot control system

A specially designed control system (industrial PC with MS Windows 2000, Pentium III CPU, 1 GHz; robot control interface developed in Delphi 5.0) performed and supervised all required movements of the robot system and was the main interface for all specific sub-systems, i.e., the navigation software, the robot system, the input device and the safety devices. In addition, the system interface allowed manual control of the robot system, if required.

Positioning of the robot and execution of the intervention

After the planned access was confirmed, the 4-DOF robotic arm was moved near the phantom's entry point and locked into a safety position about 4 cm above the "skin" entry point. Fine adjustments of the NPU according to the calculated coordinates were performed by a coordinated motion of the robot kinematics before the NPU was lowered to the "skin" level. Then, after blocking the six main axes (A1–A6), the NPU was finally moved caudally to the skin entry point by activation of the auxiliary linear axis A7 with reduced speed and force (Fig. 4a). At this position, a 17-gauge puncture needle (length 130 mm; Bard, Angiomed, Karlsruhe, Germany) was inserted into the robotic needle holder (Fig. 4b). The small rubber marker of the puncture needle was moved to the position displayed by the planning workstation in order to indicate the calculated insertion depth. Once the puncture needle had been inserted into the phantom, the stylet of the puncture needle was removed and an 18-gauge biopsy needle (Bard, Angiomed; length 160 mm) was inserted (Fig. 4c). Using an automated biopsy device (Magnum Core High Speed, 22-mm excursion), one biopsy sample was obtained coaxially. The complete intervention was monitored and documented in the control window by means of graphical information on the planned and actual biopsy trajectory superimposed on the current US scan (Fig. 3). The actual trajectory of the needle guide was calculated continuously via a standard kinematic transformation based on the internal sensor system of the robot as well as from the position measurements of the optical tracker. Had a correction of the needle angulation been necessary, it could easily have been made manually using the robot input device (Fig. 5).
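As an illustration of such monitoring, the needle axis predicted by the internal kinematics can be cross-checked against the axis measured by the optical tracker. The sketch below computes the angular and tip offsets between the two estimates; the tolerances and coordinates are purely illustrative.

```python
# Sketch: compare the needle axis predicted by the robot kinematics with the
# axis measured by the optical tracker; flag the pose if they disagree.
import numpy as np

def axis_deviation(tip_a, dir_a, tip_b, dir_b):
    """Angle (deg) between the two axis directions and offset (mm) between tips."""
    ca = np.clip(np.dot(dir_a, dir_b) /
                 (np.linalg.norm(dir_a) * np.linalg.norm(dir_b)), -1.0, 1.0)
    angle_deg = float(np.degrees(np.arccos(ca)))
    offset_mm = float(np.linalg.norm(np.asarray(tip_a) - np.asarray(tip_b)))
    return angle_deg, offset_mm

angle, offset = axis_deviation([0, 0, 0], [0, 0, -1], [0.4, 0.2, 0], [0.01, 0, -1])
if angle > 1.0 or offset > 1.0:      # illustrative tolerance of 1 deg / 1 mm
    print("warning: kinematic and optically tracked poses disagree")
```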


Fig. 5 The robot input device used for manual correction of the robot movement

Fig. 6 Once the robot arm had been removed to its resting position, the actual trajectory within the gel phantom was documented in both planes (transverse and orthogonal) in order to measure the deviation of the biopsy trajectory from the center of the target

After biopsy, the puncture needle was removed and the length of the harvested pea specimen was measured independently by two investigators (J.K., M.F.). The average of these two measurements was used for further analysis (Table 2). The workflow of the procedure is shown in Table 1. Once the robot arm had been returned to its resting position, the actual trajectory within the gel phantom was documented in both planes (transverse and orthogonal) in order to measure the deviation of the biopsy trajectory from the center of the target (Fig. 6).
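The deviation reported in this evaluation corresponds to the perpendicular offset of the target center from the documented needle path, decomposed along the image axes. A small sketch of this measurement, assuming the path is described by a point and a direction and that x and z denote the transverse and sagittal axes (the coordinate values are illustrative only):

```python
# Sketch: perpendicular offset of the target center from the documented needle
# path, decomposed into the transverse (x) and sagittal (z) components as
# reported in Table 2. The coordinates and axis labels are assumptions.
import numpy as np

def target_deviation(needle_point, needle_dir, target_center) -> dict:
    d = np.asarray(needle_dir, dtype=float)
    d /= np.linalg.norm(d)
    w = np.asarray(target_center, dtype=float) - np.asarray(needle_point, dtype=float)
    perp = w - np.dot(w, d) * d          # component of w perpendicular to the needle path
    return {"x_mm": float(abs(perp[0])), "z_mm": float(abs(perp[2])),
            "total_mm": float(np.linalg.norm(perp))}

# Needle path along the y-axis through the origin, target slightly off-axis:
print(target_deviation([0, 0, 0], [0, 1, 0], [1.1, 30.0, 1.4]))
```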

Fig. 4a–c Once the puncture needle was inserted into the phantom, the stylet of the puncture needle was removed and the 18-gauge biopsy needle inserted

Safety features

For safety reasons, the above-mentioned tasks were distributed among several hardware and software components of the control system. If the phantom bag or the robot platform had been moved inadvertently during the procedure, this would have been recognized, because the new coordinates of the phantom's base (the acrylic glass cube) or of the robot would have been registered by the optical tracking system and sent to the planning workstation. In the case of a system breakdown or a potential collision, the main control system of the robot would have been de-activated immediately.
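A minimal sketch of the kind of watchdog this implies is shown below: the tracked positions of the phantom base and the robot platform are polled and compared with the values stored at registration, and the robot is stopped if either has moved beyond a tolerance. The polling and stop functions are hypothetical placeholders, as is the 1-mm tolerance.

```python
# Sketch: detect inadvertent movement of the phantom base or robot platform by
# comparing current tracked positions with the ones stored at registration.
# get_tracked_position() and stop_robot() are hypothetical placeholders.
import numpy as np

TOLERANCE_MM = 1.0   # illustrative threshold

def has_moved(reference_mm: np.ndarray, current_mm: np.ndarray) -> bool:
    return float(np.linalg.norm(current_mm - reference_mm)) > TOLERANCE_MM

def watchdog_step(references: dict, get_tracked_position, stop_robot) -> None:
    for name, ref in references.items():
        current = get_tracked_position(name)            # position from the optical tracker
        if current is None or has_moved(ref, current):
            stop_robot(reason=f"{name} moved or lost")  # de-activate robot motion
            break
```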

Results

The 20 targeted peas had a mean transverse diameter of 9.3±0.1 mm, and the median distance of the target center from the surface of the phantom bag ("skin") was 3.7 cm (range, 1.5–6.9 cm).


Table 1 Workflow for the robot-assisted procedure, assuming that the target has already been selected and the insertion point identified, cleaned and prepared for biopsy

1. Ultrasonography of the target and selection of the best access
2. Frame-grabbing of the US image and transfer of positional data to the workstation
3. Planning of the intervention: definition of the best entry point, calculation of needle angle and insertion depth
4. Confirmation of the planning
5. Transfer of data to the robot and positioning of the robotic arm at the appropriate position at skin entry level
6. "Loading" the needle positioning unit with the puncture needle to the calculated depth
7. Biopsy and retrieval of the specimen
8. Retrieval of the puncture needle
9. Removal of the robot to a safe position
10. US check of the biopsy trajectory and measurement of the biopsy specimen

Table 2 Insertion depth of the puncture needle, length of the harvested biopsy specimen and deviation of the biopsy trajectory from the center of the target, evaluated for each biopsy procedure

Procedure         Insertion depth (cm)   Length of harvested specimen (mm)a   Deviation from center, x-axis (mm)   Deviation from center, z-axis (mm)
1                 3.2                    5.9                                  0.0                                  0.8
2                 3.0                    4.6                                  0.9                                  0.5
3                 1.0                    6.5                                  0.0                                  0.6
4                 3.6                    5.6                                  1.9                                  0.6
5                 2.0                    4.1                                  2.9                                  3.0
6                 2.8                    5.1                                  1.7                                  2.1
7                 2.9                    4.0                                  1.3                                  0.2
8                 4.0                    5.3                                  0.5                                  1.1
9                 4.0                    5.5                                  1.6                                  1.0
10                5.9                    3.9                                  0.0                                  1.5
11                4.9                    4.4                                  1.1                                  1.1
12                5.7                    5.6                                  1.1                                  0.8
13                5.4                    5.7                                  0.4                                  0.6
14                7.0                    6.2                                  1.6                                  2.9
15                3.8                    6.4                                  1.4                                  1.7
16                2.4                    6.4                                  1.2                                  0.9
17                2.7                    4.7                                  1.8                                  1.1
18                1.1                    5.0                                  0.7                                  3.6
19                4.0                    9.4                                  0.0                                  0.2
20                4.0                    5.5                                  2.6                                  2.0
Mean±SD (range)   3.7±1.5 (1.0–7.0)      5.5±1.2 (3.9–9.4)                    1.1±0.8 (0.0–2.9)                    1.4±0.9 (0.2–3.6)

a Mean value, evaluated by two observers.

In all cases, only one needle pass was necessary to obtain a biopsy specimen. The mean length of the harvested specimen, as calculated from the measurements obtained by two different observers, was 5.5±1.2 mm (Table 2). Using US control scans, the mean deviation between the needle trajectory and the center of the target was 1.1±0.8 mm along the x-axis (transverse plane) and 1.4±0.9 mm along the z-axis (sagittal plane) (Fig. 6). The mean duration of the procedure, including targeting, planning, biopsy and retrieval of the specimen, was 2.6±1.0 min (range, 1.5–6.0 min); the longest procedure time was due to line-of-sight problems with the optical tracking system. In at least three cases, US examination of the biopsy trajectory revealed a deflection of the biopsy needle by the targeted pea. In one case, the pea was visibly pushed away by the biopsy needle within the gel phantom; in that case, the deviation of the needle path from the center of the target was 3.6 mm in the craniocaudal direction.

Discussion

Robotic systems can enhance surgical and interventional procedures through improved precision, stability and dexterity. Furthermore, a robot is resistant to radiation and infection. In particular, the ability to use detailed, quantitative information from US, CT or MRI allows robots to accurately guide instruments to pathologic structures deep within the body [11].


A review of representative robotic developments demonstrates various clinical applications in neurosurgery and in orthopedic, urologic, ophthalmologic and cardiac surgery [12]. NeuroMate, a six-axis robot for neurosurgery, has been used in over 1,600 procedures since 1989, including tumor biopsies, stereoelectroencephalographic investigations and stereotactic midline surgery [13]. The ROBODOC system was developed for hip-replacement procedures and has been used in over 1,000 cases utilizing CT data of the patient's anatomy [14]. For eye surgery, a "Steady-Hand" robot was developed to provide smooth, tremor-free, precise positioning and force scaling [12]. Experiments testing the ability of humans to position a microsurgical needle with 150-µm accuracy found that success rates improved significantly when the "Steady-Hand" robot was used. Similar techniques have been applied in stapedotomy [15]. More recently, the da Vinci system has been used for more than 1,000 cardiac procedures, such as fully endoscopic coronary bypass grafts, as well as for cholecystectomy and Nissen fundoplication [16].

In this paper, we introduce B-Rob I, a prototype robot system for percutaneous biopsy designed to remotely guide a biopsy needle to small targets localized before the procedure with US or CT imaging. As a first step toward an integrated robotic system for image-guided interventions, we have demonstrated the successful development of B-Rob I. Our phantom tests further demonstrate that robot-assisted biopsy of small targets localized by US imaging is feasible, and the robot design provides safe operation. In this study, the high precision of registration, planning and movement of the robotic NPU enabled an effective biopsy in all cases, with high accuracy even for relatively small targets such as peas. The results of other groups who used CT guidance for robot-assisted interventions support our findings [12]. Tseng et al. used a commercially available robot, CT imaging and a magnetic tracking device to achieve a positioning accuracy of around 2 mm when moving neurosurgical instruments to targets attached to a phantom skull [17]. Fichtinger et al. used a 7-DOF passive arm and a motorized needle-insertion device to simulate prostate biopsy in vitro and achieved an average distance between the needle tip and the target of about 2 mm [1]. Kaiser et al. used a 6-DOF MR-compatible robotic device dedicated to biopsy of breast lesions under MR guidance; during in vitro tests, targets 4 mm in diameter were successfully punctured [18]. More recently, their system has been used clinically, with successful biopsy of 14 breast lesions [19].

However, accuracy could be degraded by several factors: (1) inappropriate calibration of the robotic components before a biopsy; (2) measurement noise and the rather low data acquisition rate of the optical tracking system, which may cause small registration errors; (3) small movements of the transducer head during image capture and registration; and (4) deflection of the biopsy needle by inhomogeneous tissue structures during biopsy.

The potential for missing a lesion because of tissue shifting during needle penetration has been described by others [3]. Although we hit all selected target lesions, US of the trajectory revealed deflections of the biopsy needle in at least three procedures, and the largest deviation of the planned trajectory from the target center was 3.6 mm. One possible way to minimize the risk of needle deflection is to use the shortest biopsy needle possible and to reduce the distance between the guidance tool and the skin surface practically to zero. While the first condition depends on the target localization, the latter was easily provided by our NPU design. However, bending may still occur within the phantom, and we have not yet performed biopsies deeper than 7 cm. Further work will explore techniques to increase the accuracy of targeting the center of a lesion with US. For example, we could improve the planning process by using two angulated US scans; although the optical tracking system cannot recognize every rotation of the transducer head, an angle of 50–70° between the two US scans, each cutting through the center of the target, should be appropriate and provide higher accuracy than the single US scan used in the current setting.

Another major source of inaccuracy is the optical tracking system, which measures the position of all involved system components. Its main advantage is its relatively high positional accuracy (about 0.35 mm under optimal conditions). Drawbacks include the fact that a line of sight between the camera and the tracker tools mounted on the system components has to be maintained at all times, the rather low data acquisition rate (up to 60 Hz under optimal conditions; 20–30 Hz in a realistic scenario) and the relatively high cost of the tracking system. Passive reflectors may lower the cost but will not improve accuracy. While the robotic components and the registration process can be improved, the movement of targets in vivo cannot be prevented by conventional means, even less so when breathing is a factor. Movement of targets within a gel matrix is difficult to predict; however, given a certain access path, additional US from a different point of view may provide some degree of control in case of needle deflection or target movement. Knowing the new position of a deflected target, our robot system can be moved manually to any desired position using a dedicated control board. While high precision could be achieved during the in vitro tests, for clinical application small movements of the patient could be registered with a tracker tool mounted on the patient's skin. In addition, an immobilization device (BodyFix, Medical Intelligence, Schwabmünchen, Germany) has been proven useful in reducing patient movement during an intervention [20].
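One possible reading of the two-angulated-scan proposal above is purely geometric: each angulated scan constrains the target to a line (or narrow band) in 3D, and the target center can then be estimated as the midpoint of the common perpendicular between the two lines. The sketch below illustrates this idea; it is our interpretation of the proposal, not an implemented feature of B-Rob I.

```python
# Sketch: estimate a target position from two angulated US scans, modeled as
# two 3D lines (point + direction) that each pass close to the target center.
# The estimate is the midpoint of their common perpendicular.
import numpy as np

def triangulate(p1, d1, p2, d2):
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    d1 = np.asarray(d1, float) / np.linalg.norm(d1)
    d2 = np.asarray(d2, float) / np.linalg.norm(d2)
    w0 = p1 - p2
    a, b, c = np.dot(d1, d1), np.dot(d1, d2), np.dot(d2, d2)
    d, e = np.dot(d1, w0), np.dot(d2, w0)
    denom = a * c - b * b                      # ~0 if the lines are (nearly) parallel
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    closest1 = p1 + s * d1                     # closest point on line 1
    closest2 = p2 + t * d2                     # closest point on line 2
    return (closest1 + closest2) / 2.0

print(triangulate([0.0, 0.0, 0.0], [1.0, 0.0, 0.1],
                  [0.0, 10.0, 0.0], [1.0, -1.0, 0.0]))
```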


In addition, registration techniques to compensate for organ movement during breathing are under development elsewhere [21], and further developments may consider the use of motion filters similar to those used by cardiac robots [16]. Another improvement will be an increase in the angulation range of the NPU for more steeply angulated trajectories. Nevertheless, the amount of time and effort currently necessary to calibrate the robot and prepare it for biopsy is still far from clinical usability. Even so, robot-assisted biopsy should be of great clinical value for several reasons: (1) it provides very stable needle guidance, even for angulated approaches; (2) it allows access to lesions when the presence of the US transducer would otherwise limit the access for the biopsy needle (e.g., an intercostal approach); and (3) it assists the radiologist while performing the coaxial biopsy or a US check from a different approach as the needle is advanced. Robot-assisted biopsy may further expand the time window available to explore a lesion when new ultrasound contrast agents are used, since more time can be spent targeting the lesion [22]. Thereafter, the robot may guide the needle into the most promising region of the lesion without the need for a second contrast injection.

Since percutaneous biopsy is faster, less invasive and less expensive than surgical biopsy, imaging-guided needle biopsy has increasingly become an alternative to surgical biopsy, in particular for the histologic assessment of small lesions such as breast tumors [23]. We have demonstrated that biopsy was feasible with high accuracy using a single US image. These phantom experiments indicate that our robotic system can be used for several percutaneous interventions. Compared to other systems, our prototype will be open to different imaging modalities such as CT and US. Further in-vivo tests will explore its usefulness during clinical applications.

Acknowledgments The development of the robotic system was co-funded by the Austrian Federal Ministry of Transport, Innovation and Technology (BMVIT). The authors wish to acknowledge the significant contributions of Ludwig Kleiser, Kurt Renauer and Gerald Nittmann (ARC Seibersdorf Research) and Johann Hummel (Department of Biomedical Engineering and Physics, Medical University of Vienna). This study was supported in part by the Ludwig Boltzmann Institute for Clinical and Experimental Radiology (director: C. Herold). We are grateful to Mary McAllister, Johns Hopkins University Hospital, Baltimore, MD, for manuscript assistance.

References

1. Fichtinger G, DeWeese TL, Patriciu A et al (2002) System for robotically assisted prostate biopsy and therapy with intraoperative CT guidance. Acad Radiol 9:60–74
2. Yanof J, Haaga J, Klahr P et al (2001) CT-integrated robot for interventional procedures: preliminary experiment and computer-human interfaces. Comput Aided Surg 6:352–359
3. Masamune K, Fichtinger G, Patriciu A et al (2001) System for robotically assisted percutaneous procedures with computed tomography guidance. Comput Aided Surg 6:370–383
4. Ng WS, Davies BL, Timoney AG, Hibberd RD (1993) The use of ultrasound in automated prostatectomy. Med Biol Eng Comput 31:349–354
5. Vilchis A, Masuda K, Troccaz J, Cinquin P (2003) Robot-based tele-echography: the TER system. Stud Health Technol Inform 95:212–217
6. Horn BKP (1987) Closed-form solution of absolute orientation using unit quaternions. J Opt Soc Am A 4:629–642
7. Fitzpatrick JM, West JB, Maurer CR Jr (1998) Predicting error in rigid-body point-based registration. IEEE Trans Med Imaging 17:694–702
8. Detmer PR, Bashein G, Hodges T et al (1994) 3D ultrasonic image feature localization based on magnetic scanhead tracking: in vitro calibration and validation. Ultrasound Med Biol 20:923–936

9. Hummel J, Figl M, Kollmann C, Bergmann H, Birkfellner W (2002) Evaluation of a miniature electromagnetic position tracker. Med Phys 29:2205–2212
10. Silver B, Metzger TS, Matalon TA (1990) A simple phantom for learning needle placement for sonographically guided biopsy. Am J Roentgenol 154:847–848
11. Howe RD, Matsuoka Y (1999) Robotics for surgery. Annu Rev Biomed Eng 1:211–240
12. Cleary K, Nguyen C (2001) State of the art in surgical robotics: clinical applications and technology challenges. Comput Aided Surg 6:312–328
13. Benabid AL, Hoffmann D, Le Bas JF, Lavallee S (1995) Value of image guided neurosurgery in neuro-oncology. Bull Cancer 82:573s–580s
14. Birke A, Reichel H, Hein W et al (2000) ROBODOC—a path into the future of hip endoprosthetics or an investment error? Z Orthop Ihre Grenzgeb 138:395–401
15. Rothbaum DL, Roy J, Stoianovici D et al (2002) Robot-assisted stapedotomy: micropick fenestration of the stapes footplate. Otolaryngol Head Neck Surg 127:417–426
16. Bodner J, Wykypiel H, Wetscher G, Schmid T (2004) First experiences with the da Vinci operating robot in thoracic surgery. Eur J Cardiothorac Surg 25:844–851

17. Tseng CS, Chung CW, Chen HH, Wang SS, Tseng HM (1999) Development of a robotic navigation system for neurosurgery. Stud Health Technol Inform 62:358–359
18. Kaiser WA, Fischer H, Vagner J, Selig M (2000) Robotic system for biopsy and therapy of breast lesions in a high-field whole-body magnetic resonance tomography unit. Invest Radiol 35:513–519
19. Pfleiderer SO, Reichenbach JR, Wurdinger S et al (2003) Interventional MR-mammography: manipulator-assisted large core biopsy and interstitial laser therapy of tumors of the female breast. Z Med Phys 13:198–202
20. Bale RJ, Lottersberger C, Vogele M et al (2002) A novel vacuum device for extremity immobilisation during digital angiography: preliminary clinical experiences. Eur Radiol 12:2890–2894
21. Clifford MA, Banovac F, Levy E, Cleary K (2002) Assessment of hepatic motion secondary to respiration for computer assisted interventions. Comput Aided Surg 7:291–299
22. Nilsson A, Krause J (2003) Targeted tumour biopsy under contrast-enhanced ultrasound guidance. Eur Radiol 13(Suppl 4):L239–L240
23. Helbich TH, Matzek W, Fuchsjager MH (2004) Stereotactic and ultrasound-guided breast biopsy. Eur Radiol 14:383–393
