Control Software Design of a Compact Laparoscopic Surgical Robot System

Ji Ma and Peter Berkelman
Mechanical Engineering Department, University of Hawaii at Manoa, 2540 Dole St., Holmes 302, Honolulu HI 96822, USA
[jima, peterb]@hawaii.edu

Abstract – We have developed a prototype teleoperated robotic surgical system which is modular, compact, and easy to use. In this paper, the control software design of the prototype is introduced. The main function of the control software is to realize master-slave control. The control software consists of three layers: hardware drivers, master-slave control, and the human-machine interface. Each layer comprises several software modules that are easy to maintain and upgrade while remaining stable and reliable. Preliminary motion control and experimental results are presented at the end.

Index Terms – surgical robotics, software, control

* This work is partially supported by the University of Hawaii-Manoa College of Engineering and Mechanical Engineering Department.
I. INTRODUCTION

In typical minimally invasive laparoscopic surgical procedures, the surgeon views the surgical area through an endoscope and display system and manipulates complex surgical instruments whose motions are constrained by the entry points of the instruments into the body of the patient. Medical robots are currently an active field of research and development. The survey in [1] describes medical robot systems in the categories of general surgery, orthopaedics, imaging, neurosurgery, oral/maxillofacial/ear-nose-throat surgery, urology, trauma, and radiological surgery, although only 23% of these systems have been tested on patients and 14% have been made commercially available. Adoption of robotic devices in standard medical procedures has so far remained rare, for reasons that may include the difficulty of integrating robots into the operating room, safety concerns, high costs, and the lack of proven quantitative benefits from robotically assisted procedures.

Robot-assisted minimally invasive surgery can eliminate manual tremor, introduce scaling factors between the hand motions of the surgeon and the robotic instruments, and provide additional articulated joints at the tips of the instruments. The resulting enhanced surgical dexterity may lead to improved patient outcomes and make more difficult procedures feasible. The use of current commercial robotic surgical systems is limited by their considerable size, complexity, and cost, and by their time-consuming setup, maintenance, and sterilization procedures. The development and testing of a simpler, compact, portable robotic surgery system with equivalent performance, greater ease of use, more versatility, and reduced setup time would increase the availability of robotic assistance in standard operating rooms and for more surgical procedures. A smaller robotic surgery system would also have greater potential for use in remote and hazardous areas.

Two robotic systems that have been developed to assist surgeons in performing minimally invasive surgical procedures are the da Vinci surgical system (Intuitive Surgical Inc.) [2] and, previously, the ZEUS system (Computer Motion Inc.) [3]. The two systems are compared in [4]. Each system has undergone various clinical trials [5] and obtained regulatory approvals in Europe and the United States. The ZEUS system was used in a demonstration of transatlantic teleoperated surgery [3], [6]. The da Vinci system has been used in clinical validation of difficult minimally invasive coronary procedures [7], [8], [9].

We have developed a compact laparoscopic surgical robot prototype system [10]. The manipulators are designed to be similar to the Light Endoscope Robot (LER) described in [11], [12]. The aim of this work is to reduce the size, weight, complexity, and cost of teleoperated robotic surgery systems and to simplify and shorten the setup, sterilization, and maintenance procedures required to use these systems. Fig. 1 shows a model of one endoscope manipulator and two instrument manipulators attached together in a modular system which can be placed over the abdomen of a patient. The three manipulators are to be attached to each other and to the sides of the operating table by thin metal arms rigidly clamped in the desired position. Fig. 2 shows a photograph of the two instrument manipulators.
Fig. 1. Modular Robotic Surgery System Model
Fig. 2. Two Surgery Instrument Manipulators
II. SYSTEM ARCHITECTURE

A. System Overview

Fig. 3 shows the schematic of the compact laparoscopic surgery robot system. The components of the system are the following:
1) Teleoperation Masters: The masters sense the surgeon's hand motions and convert them into position signals.
2) Instrument Manipulators: The instrument manipulators follow the master motion signals to realize the translational and rotational motions of the surgical instruments attached to them.
3) Endoscope Manipulator: The structure of the endoscope manipulator is the same as that of the instrument manipulators, except for its lower force/torque requirements for moving the endoscope and one fewer degree of freedom (DOF). The endoscope manipulator moves according to the surgeon's voice commands or moves semi-autonomously to track objects in the surgical site from the endoscope's video images, as described in [13].
4) Motor Controllers: Motor controllers drive the brushless DC motors in the manipulators and the linear actuator motors in the surgical instruments.
5) Surgical Instruments: According to the surgical task, different surgical instruments such as grippers and scissors are installed on the instrument manipulators.
6) Video Feedback: The video signal is acquired from the endoscope and displayed on a monitor so that the surgeon and nurses can observe the surgical site.
7) Voice Commands: Since the surgeon's hands are occupied by the two masters, voice commands are used to control the endoscope manipulator.
8) Control Software: The control software includes the hardware device drivers, master-slave control, the human-machine interface, and other tasks.

In the current prototype, the endoscope manipulator and voice commands are not included, the video feedback uses a web camera in place of an endoscope, and the surgical instruments are ordinary laparoscopic grippers and scissors. Instruments with articulated wrists are to be introduced in subsequent prototypes.
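To make the modular decomposition above concrete, the following is a brief C++ sketch of how these components might be expressed as software interfaces; the class and method names are illustrative assumptions and do not come from the paper.

```cpp
// Illustrative sketch only: these interface names are assumptions, not the authors' code.
#include <array>

using Vec3    = std::array<double, 3>;  // Cartesian position (mm)
using Joints4 = std::array<double, 4>;  // insertion, azimuth, inclination, roll

// Teleoperation master: reports the operator's hand motion as position signals.
struct TeleoperationMaster {
    virtual Vec3   handPosition() const = 0;  // 3-DOF translation
    virtual double handRoll()     const = 0;  // 1-DOF instrument roll
    virtual ~TeleoperationMaster() = default;
};

// Motor controller channel: exchanges command codes and sensor feedback over a serial link.
struct MotorController {
    virtual void   sendVelocityCode(int code) = 0;
    virtual double readJointPosition() const  = 0;  // decoded from Hall-sensor feedback
    virtual ~MotorController() = default;
};

// Instrument or endoscope manipulator: tracks joint-space targets via its motor controllers.
struct Manipulator {
    virtual void    setJointTargets(const Joints4& q) = 0;
    virtual Joints4 jointPositions() const            = 0;
    virtual ~Manipulator() = default;
};
```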
Fig. 3. Compact Laparoscopic Surgery Robot System Schematic (the surgeon's two teleoperation masters and microphone connect to a notebook PC running the teleoperation interface, voice recognition, and communication software over FireWire/IEEE-1394; motion commands and status/error feedback are exchanged with the endoscope, instrument, and articulated-instrument motor controllers over USB/RS-232 serial ports)
B. Master System

The master system consists of two PHANToM Omni haptic devices (Sensable Technologies) [14], which provide 3-DOF position information in Cartesian space, 3-DOF rotation information, and 3-DOF force output to the user. In the current prototype, force output is not provided due to the absence of force sensors in the manipulators, and we use only 4 DOF of input: 3-DOF translation in Cartesian space and 1-DOF roll rotation of the surgical instrument. In Fig. 4, frame {P} is the PHANToM Omni device reference frame, which is predefined by the device software interface and libraries (HDAPI). The operator's hand motions and position information are provided in frame {P} and can be read directly from the software interface libraries. Frame {M} is the master reference frame, which generates the given reference position for the slave system. For easy and comfortable operation, the operator can arbitrarily choose an operating area, which defines frame {M}. There is an offset transform between these two frames, described in Section II-D.
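As an illustration of how the operator's hand position in frame {P} can be read, the following is a minimal sketch using the publicly documented OpenHaptics HDAPI callback pattern; it is not the authors' code, and initialization details may differ between HDAPI versions.

```cpp
// Minimal HDAPI sketch; the calls follow the standard OpenHaptics pattern, but this is
// illustrative code, not the authors' implementation.
#include <HD/hd.h>
#include <cstdio>

static HDdouble devicePosition[3];  // latest stylus position in frame {P}, in mm

// Servo-loop callback (runs at about 1 kHz): sample the current device position.
HDCallbackCode HDCALLBACK samplePosition(void *)
{
    hdBeginFrame(hdGetCurrentDevice());
    hdGetDoublev(HD_CURRENT_POSITION, devicePosition);
    hdEndFrame(hdGetCurrentDevice());
    return HD_CALLBACK_CONTINUE;    // keep the callback scheduled every servo tick
}

int main()
{
    HDErrorInfo err;
    HHD hHD = hdInitDevice(HD_DEFAULT_DEVICE);
    if (HD_DEVICE_ERROR(err = hdGetError())) return 1;

    hdScheduleAsynchronous(samplePosition, nullptr, HD_DEFAULT_SCHEDULER_PRIORITY);
    hdStartScheduler();

    // The teleoperation thread reads devicePosition at its own rate; a synchronous
    // scheduler call would normally be used to copy the state safely.
    std::printf("x=%f y=%f z=%f\n",
                devicePosition[0], devicePosition[1], devicePosition[2]);

    hdStopScheduler();
    hdDisableDevice(hHD);
    return 0;
}
```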
Fig. 4. Master and PHANToM Omni Device Reference Frames (master frame {M} with origin $O_M$ and device frame {P} with origin $O_P$, related by the device position ${}^{M}P_{dev}$, the offset ${}^{M}P_{offset}$, and the reference position ${}^{M}P_{ref}$)
C. Slave System

The slave system consists of two instrument manipulators and their surgical instruments. The slave system has 4 DOF of joint-space motion: surgical instrument insertion, azimuth rotation, inclination rotation, and instrument roll rotation about its own axis. Through the forward and inverse kinematics, the coordinate transform between the joint space and the Cartesian space of the slave system is realized. The slave reference frame is shown in Fig. 5. The forward kinematics can be calculated from (1):

\[
\begin{bmatrix} p_x \\ p_y \\ p_z \\ p_\alpha \end{bmatrix}
=
\begin{bmatrix} \rho \sin\phi \cos\theta \\ \rho \sin\phi \sin\theta \\ \rho \cos\phi \\ \alpha \end{bmatrix}
\tag{1}
\]

and the inverse kinematics is given by (2):

\[
\begin{bmatrix} \theta \\ \phi \\ \rho \\ \alpha \end{bmatrix}
=
\begin{bmatrix}
\tan^{-1}\!\left( p_y / p_x \right) \\
\cos^{-1}\!\left( p_z \,\big/\, \sqrt{p_x^2 + p_y^2 + p_z^2} \right) \\
\sqrt{p_x^2 + p_y^2 + p_z^2} \\
p_\alpha
\end{bmatrix}
\tag{2}
\]

where $\rho$ is the instrument insertion depth, $\theta$ the azimuth angle, $\phi$ the inclination angle, and $\alpha$ the instrument roll angle.

Fig. 5. Slave Manipulator Reference Frame (axes $X_s$, $Y_s$, $Z_s$ with joint variables $\rho$, $\theta$, $\phi$, $\alpha$ and Cartesian coordinates $p_x$, $p_y$, $p_z$)
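Since (1) and (2) are closed-form, they translate directly into code. The following C++ sketch implements the forward and inverse kinematics as written above; the type and function names are illustrative assumptions, not the authors' implementation.

```cpp
// Illustrative C++ sketch of the slave kinematics in (1) and (2); names are assumptions.
#include <cmath>

struct CartesianPose { double px, py, pz, p_alpha; };   // mm, mm, mm, rad
struct JointPose     { double rho, theta, phi, alpha; }; // insertion (mm), azimuth,
                                                         // inclination, roll (rad)

// Forward kinematics, eq. (1): joint space -> Cartesian space.
CartesianPose forwardKinematics(const JointPose& q)
{
    return { q.rho * std::sin(q.phi) * std::cos(q.theta),
             q.rho * std::sin(q.phi) * std::sin(q.theta),
             q.rho * std::cos(q.phi),
             q.alpha };
}

// Inverse kinematics, eq. (2): Cartesian space -> joint space.
JointPose inverseKinematics(const CartesianPose& p)
{
    const double r = std::sqrt(p.px * p.px + p.py * p.py + p.pz * p.pz);
    return { r,
             std::atan2(p.py, p.px),   // tan^-1(py/px), quadrant-safe form
             std::acos(p.pz / r),      // assumes r > 0 (instrument inserted)
             p.p_alpha };
}
```

Here atan2 is used in place of a plain arctangent of $p_y / p_x$ so that the azimuth angle remains correct in all quadrants.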
D. Master-Slave Reference Frame Relations

The reference frame of the PHANToM Omni haptic device is predefined in the interface software, but during a surgical operation the surgeon may arbitrarily choose a comfortable position and master reference frame for operating the master device, which does not coincide with the PHANToM Omni device frame. Therefore, an offset transform between the master reference frame and the PHANToM Omni device reference frame must be included. The master position vector is given by (3):

\[
{}^{M}P_{ref} = {}^{M}P_{dev} - {}^{M}P_{offset}
\tag{3}
\]

In (3), ${}^{M}P_{ref}$ is the reference vector in the master reference frame, ${}^{M}P_{dev}$ is the vector read directly from the master device, and ${}^{M}P_{offset}$ is the offset vector between the master's reference value and the value read from the master device.

In Fig. 4 and Fig. 5, the definitions of the master and slave reference frames are different, so a transform between the position vector in the master reference frame and the position vector in the slave reference frame must be realized. In some situations it may also be necessary to magnify the master-slave motion ratio in order to realize more accurate operation, so a master-slave motion scale transform is necessary. The transform can be written as (4):

\[
{}^{S}P_{ref} = K_{scale} \cdot {}^{S}_{M}R \cdot {}^{M}P_{ref}
             = K_{scale} \cdot {}^{S}_{M}R \cdot \left( {}^{M}P_{dev} - {}^{M}P_{offset} \right)
\tag{4}
\]

In (4), ${}^{S}P_{ref}$ is the given position of the slave, $K_{scale}$ is the master-slave motion ratio, ${}^{M}P_{ref}$ is the given position of the master, and ${}^{S}_{M}R$ is the coordinate transform matrix from the master frame to the slave frame. The offset vector ${}^{M}P_{offset}$ in (3) and (4) is determined from (5) when the operator presses the foot switch and starts to work:

\[
{}^{M}P_{offset} = {}^{M}P_{dev} - {}^{S}_{M}R^{-1} \cdot K_{scale}^{-1} \cdot {}^{S}P
\tag{5}
\]

where ${}^{S}P$ is the slave position at the instant the foot switch is pressed.
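To illustrate (3)-(5) concretely, the sketch below captures the offset when the foot switch is pressed and then maps master device readings to slave reference positions; the names and the small vector/matrix helpers are assumptions rather than the authors' implementation, and the rotation matrix and scale values are placeholders supplied by the caller.

```cpp
// Sketch of the master-slave mapping in (3)-(5); matrix and scale values are placeholders.
#include <array>

using Vec3 = std::array<double, 3>;
using Mat3 = std::array<std::array<double, 3>, 3>;

static Vec3 mul(const Mat3& R, const Vec3& v) {
    Vec3 r{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j) r[i] += R[i][j] * v[j];
    return r;
}
static Vec3 sub(const Vec3& a, const Vec3& b) { return {a[0]-b[0], a[1]-b[1], a[2]-b[2]}; }
static Vec3 scale(double k, const Vec3& v)    { return {k*v[0], k*v[1], k*v[2]}; }

struct MasterSlaveMap {
    Mat3   R_sm;     // ^S_M R: master-frame to slave-frame rotation
    Mat3   R_ms;     // its inverse (the transpose for a pure rotation), set by the caller
    double k_scale;  // K_scale: master-slave motion ratio
    Vec3   offset{}; // ^M P_offset

    // Eq. (5): capture the offset when the foot switch is pressed,
    // so the slave does not jump away from its current position p_slave.
    void clutch(const Vec3& p_dev, const Vec3& p_slave) {
        offset = sub(p_dev, mul(R_ms, scale(1.0 / k_scale, p_slave)));
    }

    // Eqs. (3)-(4): map a master device reading to a slave reference position.
    Vec3 slaveReference(const Vec3& p_dev) const {
        return scale(k_scale, mul(R_sm, sub(p_dev, offset)));
    }
};
```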
III. SOFTWARE DESIGN
To meet the requirements of real-time operation, stability, ease of use, and upgradeability, the software is designed with multi-threaded, multi-priority, and modular concepts. Currently the PHANToM Omni haptic device works stably under Windows 2000 (Microsoft), so the control software of this prototype is developed on the Windows 2000 and Windows XP operating systems. In the future, the software can be ported to other real-time operating systems for more reliable operation. The software architecture is shown in Fig. 6. It consists of three layers: hardware device drivers, master-slave control, and the human-machine interface. The device drivers have the highest priorities in the software system to meet the requirements of real-time hardware control. The master-slave control software runs at a high priority to calculate the master-slave control variables, and the human-machine interface software runs at normal priority.

A. Device Drivers

The device drivers interface with the hardware by decoding the codes from each hardware device into values which can be used by the master-slave control software, and by encoding control values and commands into hardware codes which are then sent to the hardware devices. The device drivers include the motor drivers, which communicate with the motor driven units; the input/output drivers, which control the digital input and output signals; and the master haptic device drivers, which communicate with the PHANToM Omni haptic devices. These device drivers run at the highest priority in the system to assure real-time hardware control. The modular structure of the device drivers makes the software easy to extend and maintain: if a hardware device is changed, only the corresponding driver needs to be changed, without revising the rest of the code.

B. Master-Slave Control Software

The function of the master-slave control software is to realize master-slave teleoperation control. It includes the manipulator motion control module, haptic device control module, data logging module, error detection module, and so on. According to their task types, the control periods of these modules differ. For example, the control period of the haptic devices is 1 ms to keep the control response smooth for the user, while the error detection module runs the slowest. The control rate for the manipulator motion control is 30 Hz, limited by the maximum communication frequency (about 35 Hz at a 19200 bps RS-232 baud rate) between the computer and the motor driven units. The control frequencies of the different software modules are as follows:

Manipulator motion control: 30 Hz
Error detection module: 1 Hz
I/O control module: 10 Hz
Haptic device control: 1000 Hz
Surgical simulation: 200 Hz

At present, the haptic device control module is used mainly to acquire the operator's hand motion information; there is no force output to the operator. The haptic output function is reserved because of the absence of force sensors in the slave. The surgical simulation module is reserved for future development; it will provide virtual 3D training for surgeons and help them learn to operate the robot system skillfully and quickly.
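One possible realization of the multi-rate, multi-priority structure described above, using the Win32 thread-priority API, is sketched below; the loop bodies and priority assignments are placeholders chosen to mirror the frequencies listed in the text, not the authors' actual code, and a production controller would use a more accurate timer than Sleep.

```cpp
// Sketch of a multi-rate, multi-priority control loop layout on Windows (illustrative only).
#include <windows.h>
#include <atomic>
#include <thread>

std::atomic<bool> running{true};

// Run `step` every `period_ms` milliseconds at the given Win32 thread priority.
void runPeriodic(int period_ms, int win32_priority, void (*step)())
{
    SetThreadPriority(GetCurrentThread(), win32_priority);
    while (running) {
        step();
        Sleep(period_ms);   // placeholder timing; a real loop would compensate for drift
    }
}

void hapticStep()      { /* read master position via HDAPI        */ }
void manipulatorStep() { /* PD control + RS-232 motor commands    */ }
void ioStep()          { /* digital I/O, foot switch              */ }
void faultStep()       { /* error detection                       */ }

int main()
{
    std::thread haptic(runPeriodic, 1,    THREAD_PRIORITY_TIME_CRITICAL, hapticStep);      // 1000 Hz
    std::thread manip (runPeriodic, 33,   THREAD_PRIORITY_HIGHEST,       manipulatorStep); // ~30 Hz
    std::thread io    (runPeriodic, 100,  THREAD_PRIORITY_ABOVE_NORMAL,  ioStep);          // 10 Hz
    std::thread fault (runPeriodic, 1000, THREAD_PRIORITY_NORMAL,        faultStep);       // 1 Hz

    Sleep(10000);           // run for a while; the GUI would live on the main thread
    running = false;
    haptic.join(); manip.join(); io.join(); fault.join();
    return 0;
}
```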
Fig. 6. Control Software Architecture (human-machine interface, master-slave control, and device driver layers linked through shared memory; the data display runs at 10 Hz, the OpenGL graphical display at 30 Hz, and the working modes include master-slave, debug, and calibration modes)

C. Human-Machine Interface Software

The human-machine interface consists of data display, status and error display, working mode selection, manipulator operation, parameter setting, graphical display of the manipulator motion, surgical simulation for training, and so on. In the current prototype, the human-machine interface consists of modules such as an OpenGL display module, data display module, status and error display module, control button module, user help module, parameter setting module, and program debug module. The surgical simulation module is reserved for future development. According to the application, the software can be loaded with different modules to realize various display interfaces for either the user or the programmer.

IV. CONTROL RESULTS

The motions of the slave manipulators are realized by controlling the motors installed in the joints of the manipulators to move according to the target positions which come from the master devices through the surgeon's operation. Fig. 7 shows the diagram of the joint position closed-loop control for each joint. The input is the given joint position, which is calculated from the master's given position using the inverse kinematics. The feedback joint position is measured and decoded from the Hall sensor in the motor by the motor controller. The current position control law is a typical proportional-plus-derivative (PD) control algorithm. The input of the PD controller is the joint angle error and the output is a velocity control code sent to each motor controller. The motor driven unit adjusts the motor velocity according to the output control code.

Fig. 7. Joint Position Closed-Loop Control (given joint position → PD controller → control code → motor driven unit → motor; the Hall-sensor joint code is converted back to a joint position for feedback)
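The per-joint loop of Fig. 7 reduces to a PD law on the joint-angle error whose output is quantized into a velocity code for the motor driven unit. A minimal sketch follows; the gains, code scaling, and saturation limit are invented placeholders rather than values from the paper.

```cpp
// Illustrative per-joint PD position loop (Fig. 7); gains and code scaling are placeholders.
#include <algorithm>

class JointPDController {
public:
    JointPDController(double kp, double kd, double dt) : kp_(kp), kd_(kd), dt_(dt) {}

    // Returns an integer velocity control code for the motor driven unit.
    int update(double target, double measured)
    {
        const double error = target - measured;                 // joint angle error
        const double d_err = (error - prev_error_) / dt_;       // derivative term
        prev_error_ = error;

        const double velocity_cmd = kp_ * error + kd_ * d_err;  // PD control law
        const int code = static_cast<int>(velocity_cmd * CODE_PER_UNIT);
        return std::clamp(code, -MAX_CODE, MAX_CODE);            // saturate to the unit's range
    }

private:
    static constexpr double CODE_PER_UNIT = 100.0;  // placeholder conversion factor
    static constexpr int    MAX_CODE      = 1000;   // placeholder saturation limit
    double kp_, kd_, dt_;
    double prev_error_ = 0.0;
};

// At ~30 Hz: code = pd.update(q_target_from_IK, q_measured_from_hall_sensor);
// then send `code` to the motor driven unit over the RS-232 link.
```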
Fig. 8 shows the closed-loop position control step response of an instrument manipulator in the horizontal x direction; the dashed line represents the command and the solid line represents the response. Fig. 9 shows the results of a master-slave operation realizing grasp and release tasks, and Fig. 10 shows the corresponding joint control data. From the results, it can be seen that the system works well and the slave follows the master's operation closely, but there is a small delay (about 120 ms) between the master and slave trajectories. The delay is caused by the mechanical response time of the manipulator joints and by the serial communication frequency (30 Hz) between the computer and the motor driven units.

V. CONCLUSION AND FUTURE WORK

We have developed a simple and compact prototype system for teleoperated robotic minimally invasive surgery. The novel features of this system are its small size, ease of setup, and modular configuration. The next step in the development of our surgical robot system is to add articulated robotic wrists to the instruments to increase dexterity. The control system software for this robot consists of three layers, and each layer consists of several software modules that are easy to use, maintain, and upgrade while remaining stable and reliable. A preliminary controller has been designed and its control results have been presented. In the future, further testing, tuning, and optimization of the current prototype controllers remains to be done to improve the accuracy, response time, and other performance parameters.

ACKNOWLEDGMENT

The Light Endoscope Robot described in Section I and the preliminary design of the instrument robots were done at the TIMC-IMAG Laboratory of Grenoble, France. The instrument surgical manipulators described here were fabricated by Alpes Instruments Inc. of Meylan, France.
Fig. 8. Horizontal Motion Step Response (position in mm vs. time in s)

Fig. 9. Master-Slave Control Result (X, Y, Z position in mm and roll angle in deg vs. time in s)

Fig. 10. Master-Slave Control Joint Data (azimuth, inclination, and roll in deg and insertion in mm vs. time in s)

REFERENCES
[1] P. Pott, A. Kopfle, A. Wagner, E. Badreddin, R. Manner, P. Weiser, H.-P. Scharf, and M. Schwarz, "State of the art of surgical robotics," in Perspective in Image-Guided Surgery: Proceedings of the Scientific Workshop on Medical Robotics, Navigation and Visualization, Remagen, Germany, 2004, pp. 375–382.
[2] G. S. Guthart and J. K. Salisbury, "The Intuitive (TM) telesurgery system: Overview and application," in International Conference on Robotics and Automation. San Francisco: IEEE, April 2000, pp. 618–621.
[3] S. E. Butner, M. Ghoudoussi, and Y. Wang, "Robotic surgery – the transatlantic case," in International Conference on Robotics and Automation. Washington D.C.: IEEE, May 2002, pp. 1882–1888.
[4] G. Sung and I. Gill, "Robotic laparoscopic surgery: a comparison of the da Vinci and ZEUS systems," Urology, vol. 58, no. 6, pp. 893–898, 2001.
[5] H. Reichenspurner, R. Demaino, M. Mack, D. Boehm, H. Gulbins, C. Detter, B. Meiser, R. Ellgas, and B. Reichart, "Use of the voice controlled and computer-assisted surgical system ZEUS for endoscopic coronary artery surgery bypass grafting," Journal of Thoracic and Cardiovascular Surgery, vol. 118, pp. 11–16, 1999.
[6] J. Marescaux, J. Leroy, M. Gagner, F. Rubino, D. Mutter, M. Vix, S. Butner, and M. K. Smith, "Transatlantic robot-assisted telesurgery," Nature, vol. 413, pp. 379–380, 2001.
[7] U. Kappert, R. Cichon, J. Schneider, V. Gulielmos, S. M. Tugtekin, K. Matschke, I. Schramm, and S. Schueler, "Closed-chest coronary artery surgery on the beating heart with the use of a robotic system," J Thorac Cardiovasc Surg, vol. 120, no. 4, pp. 809–811, October 2000.
[8] D. Loulmet, A. Carpentier, N. d'Attellis, F. Mill, D. Rosa, G. Guthart, A. Berrebi, C. Cardon, O. Ponzio, and B. Aupecle, "First endoscopic coronary artery bypass grafting using computer assisted instruments," J Thoracic Cardiovasc Surg, vol. 118, no. 1, pp. 4–10, July 1999.
[9] V. Falk, A. Diegeler, T. Walther, J. Banusch, J. Brucerius, J. Raumans, R. Autschbach, and F. W. Mohr, "Total endoscopic computer enhanced coronary artery bypass grafting," Eur J Cardiothoracic Surg, vol. 17, no. 1, pp. 38–45, January 2000.
[10] P. Berkelman and J. Ma, "A compact, modular, teleoperated robotic minimally invasive surgery system," in International Conference on Biomedical Robots and Mechatronics. Pisa, Italy: IEEE/RAS-EMBS, February 2006.
[11] P. J. Berkelman, E. Boidard, P. Cinquin, and J. Troccaz, "LER: The light endoscope robot," in International Conference on Intelligent Robots and Systems. Las Vegas: IEEE/RSJ, October 2003, pp. 2835–2840.
[12] P. Berkelman, P. Cinquin, E. Boidard, J. Troccaz, and J.-A. Long, "Development and testing of a compact endoscope manipulator for minimally invasive surgery," Computer Aided Surgery, vol. 11, 2005.
[13] S. Voros, E. Orvain, P. Cinquin, and J.-A. Long, "Automatic detection of instruments in laparoscopic images: a first step towards high level command of robotized endoscopic holders," in International Conference on Biomedical Robots and Mechatronics. Pisa, Italy: IEEE/RAS-EMBS, February 2006.
[14] T. H. Massie and J. K. Salisbury, "The PHANToM haptic interface: A device for probing virtual objects," in Dynamic Systems and Control. Chicago: ASME, 1994, pp. 295–299.