
Haptics in computer-mediated simulation: Training in vertebroplasty surgery

Chee-Kong Chui, Jackson S. K. Ong, Zheng-Yi Lian, Zhenlan Wang, Jeremy Teo, Jing Zhang, Chye-Hwang Yan, Sim-Heng Ong, Shih-Chang Wang, Hee-Kit Wong, Chee-Leong Teo, and Swee-Hin Teoh
National University of Singapore

Surgical simulators and computer games share enabling technologies in the human-machine interface. With appropriate design and development, a computer-game-like medical training simulator could be used in surgical training. The authors describe a PC-based system for the simulation of the vertebroplasty procedure. In vertebroplasty, the surgeon or radiologist relies on sight and feel to properly insert the bone needle through various tissue types and densities and to monitor the injection and reflux of the polymethylmethacrylate (PMMA), or cement, into the vertebra. This article focuses on the provision of a near-realistic haptic feel in bone needle insertion and manual PMMA injection. This involves efficient biomechanical modeling of bone needle insertion and of PMMA flow in bone for haptic rendering, as well as reliable delivery of forces via haptic devices. The authors show that with virtual reality gaming technologies, the surgical simulator can become a virtual trainer for a potentially risky spinal interventional procedure.

KEYWORDS: biomechanical modeling; haptic device; haptic rendering; man-machine interfaces; surgical simulation; virtual reality gaming

According to Merril (1993, 1994), many studies show that doctors are most likely to make errors when performing their first few to several dozen new diagnostic and therapeutic surgical procedures; this has been termed the "learning curve" effect. Merril claims that operative risk could be substantially reduced by the development of a simulator that allows transfer of skills from the simulation to the actual point of patient contact.

AUTHORS' NOTE: The project is funded by a grant (R-265-000-184-305) from the Agency for Science, Technology and Research (A*STAR), Singapore.

SIMULATION & GAMING, Vol. 37 No. 4, December 2006, 438-451. DOI: 10.1177/1046878106291667. © 2006 Sage Publications

Recent advancements in surgical modeling and virtual reality (VR) technologies have resulted in simulators that can help train doctors to attain a higher degree of precision, reliability, safety, and cost efficiency (Greenleaf, 1995). Some of these surgical simulators, for example, SEP (Simsurgery, Oslo, Norway) and LapMentor (Simbionix, Lod, Israel), are commercially available. However, they are still a long way from reducing the dependence on animals and cadavers in surgical training. Many medical professionals perceive existing surgical trainers to be too computer-game-like, in particular if the interface device does not accurately mimic the sensorimotor tasks of the procedure.

Surgical simulators and computer games share the enabling technologies in human-computer interfacing, including computer graphics and haptic interfacing. Indeed, computer gaming is a cost-effective conduit to motivate and engage a game-savvy generation of surgeons and medical practitioners to constantly practice their skills on virtual patients. For example, a force feedback joystick can be used to teach dynamic systems in medical simulation (Okamura, Richard, & Cutkosky, 2002). The main issue is how to use or enhance computer-game-like medical training simulation so that the user obtains the right experience, in particular through the integration of haptics (the generation of touch and force feedback information). Integrating haptic interactions requires understanding the end users' sensory, perceptual, and cognitive abilities, as well as the limitations in the procedure.

The capability to recreate the experience of contact between a tool and an object is particularly useful in surgical simulators, because users experience an enhanced sense of realism when a haptic simulation is combined with a graphic simulation. The process of computing and generating forces in response to interaction with virtual objects is referred to as haptic rendering (Salisbury et al., 2004). The haptic interface is the medium that enables the user to touch, feel, and manipulate virtual objects using a haptic device. The problem with current VR or simulation interfaces is that they provide mainly visual and occasionally auditory feedback. This results in "inaccurate" or "incomplete" training of the user (the surgeon in this context) in surgical procedures. The user needs to experience the environment not only through a visual display but also by being able to touch, strike, and receive feedback from the environment.

We are developing a computer-game-like surgical simulator for training in the spinal cement vertebroplasty procedure. With an integrative approach to creating an immersive environment with both haptic and visual feedback, we demonstrate that the simulation system could be used as a virtual trainer.

Vertebroplasty and simulation

Percutaneous vertebroplasty (Murphy & Lin, 2001) is a minimally invasive procedure in which liquid bone cement, or polymethylmethacrylate (PMMA), mixed with radio-opaque barium sulphate or tantalum powder is injected into a fractured vertebral body under either fluoroscopic (X-ray) or CT guidance to relieve the patient's pain and restore mobility once the cement hardens. The hypothesis is that the cement acts to bind the fracture components together and provide increased mechanical strength to the adjacent bone. The injection is performed with one or two bone biopsy needles that penetrate the vertebral pedicle and enter the body of the vertebra. Correct bone needle placement is important to avoid damaging vital structures such as the nerve roots or the spinal cord. High-pressure injection of the PMMA while its viscosity remains moderately low is required and presents a significant risk of leakage (or extravasation) into the spinal canal and the veins within and surrounding it. This has been well documented in the medical literature and may lead to compression of the spinal cord and nerves, as well as pulmonary embolism and arterial occlusion.

There is a need for a VR-based, computer-game-like simulator that allows the user to experience haptic feedback during interaction with the virtual environment models for vertebroplasty training. To minimize the risk of simulated complications and, hence, maximize the training scores, the interventionist would control the placement and direction of bone needle insertion based on haptic sensing of the reaction force and visual tracking. The rate and amount of bone cement injection are now controlled by pressure-calibrated injection devices in modern, advanced hospitals, so it is more important to simulate the viscosity of the cement and model its flow based on time, pressure, volume, viscosity, and the trabecular microarchitecture of the injected bone. The haptic feedback simulated here applies to manual hand injection.

The potential of medical simulators to replace live animals in surgical training so as to minimize cost, inefficiency, and animal discomfort has been previously reported (Anderson et al., 2002; Yamashita et al., 1999). Simulators have been developed for catheter insertion (Anderson et al., 2002; Gobbetti, Tuveri, Zanetti, & Zorcolo, 2000; Wang, Chui, Lim, Cai, & Mak, 1999), epidural lumbar puncture (Brett, Parker, Harrison, Thomas, & Carr, 1997; Dang, Annaswamy, & Srinivasan, 2001; Gorman, Krummel, Webster, Smith, & Hutchens, 2000), spine biopsy (Kwon et al., 2001), breast biopsy (Azar, Metaxas, & Schnall, 2000), neurosurgical probe insertion (Shimoga & Khosla, 1994), and prostate needle biopsy (Zeng et al., 1998). Some of these simulators include haptic interaction. However, no clinically viable simulator has been developed or adapted for the vertebroplasty procedure. A possible reason is the need for specific, accurate, and computationally efficient models of bone needle insertion and bone cement injection that must be solved and updated in less than 20 ms and 2 ms, respectively, to sustain realistic visual and haptic force feedback. Although most studies assume a minimum refresh rate of 1000 Hz for haptic rendering (Cotin, Delingette, & Ayache, 2000), Burdea (1996) suggested a minimum rate of 300 Hz. In clinical manual hand injection, the force feedback available to the operator is very poor, and there is extremely limited control over the injection rate. An integrative approach is employed in our solution to provide an immersive environment with both haptic and visual feedback for vertebroplasty training.

FIGURE 1: Overview of Integrative Haptic and Visual Control for Computer-Based Training of Vertebroplasty Procedure

An immersive environment with haptics for vertebroplasty training

Figure 1 illustrates the control flow between the various components of this integrative solution. The inputs to the simulation are computational models reconstructed from patient medical images such as CT and MRI. An adaptive thresholding method (Yan, Ong, Ge, Teoh, & Chui, 2004) is used to segment the bone from the medical images, and the volumetric model is then reconstructed from the segmented bone images. The computational model for bone needle interaction, as well as PMMA injection, is embedded within the reconstructed patient bone model. Modeling the hard tissue (bone) and soft tissue (skin) is important in obtaining a realistic view of the skin and in computing force feedback. We have developed a fast finite element method, assuming that the trabecular bone network can be represented as a structure of beam elements, for the simulation of needle insertion (Ong et al., in press). Visual feedback can be updated at a rate of 20 Hz, whereas force feedback meets the minimum rate of 300 Hz. The amount of force feedback during PMMA injection is calculated from the rate, location, rheological parameters of PMMA, and volume of injection over time using a physics-based model (Lian, 2006). With the needle already inserted, we assume that the flow will follow a defined path in the vertebral body from the needle tip. The computational model and biomechanical simulation serve as the link between haptic and visual rendering.
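
To make the preprocessing step concrete, the following is a minimal sketch of threshold-based bone segmentation of a CT volume. The function name, the Hounsfield base value, and the per-slice adjustment are illustrative assumptions only; the adaptive thresholding method cited above (Yan et al., 2004) is more sophisticated than this stand-in.

```python
import numpy as np

def segment_bone(ct_volume, base_hu=250.0):
    """Label bone voxels in a CT volume (slices x rows x cols, values in HU).

    The threshold is nudged per slice by the deviation of the slice mean from
    the volume mean -- a crude stand-in for adaptive thresholding.
    """
    mask = np.zeros(ct_volume.shape, dtype=bool)
    volume_mean = ct_volume.mean()
    for z in range(ct_volume.shape[0]):
        slice_hu = ct_volume[z]
        threshold = base_hu + 0.1 * (slice_hu.mean() - volume_mean)
        mask[z] = slice_hu > threshold
    return mask

# The binary mask would then be meshed into the volumetric model used by the
# needle insertion and PMMA flow computations.
```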

FIGURE 2: Overview of Immersive Environment: (a) Virtual Workbench, Force Feedback Joystick, and Delta Haptic Device; (b) An Interactive Session With User Wearing CyberGrasp and CyberGlove; (c) Needle Holding Attachment Unit

Figure 2 shows an overview of the VR-based simulation system. The workbench was designed and built in the Control Lab, Department of Mechanical Engineering, National University of Singapore. It includes a display monitor, a mirror, and a pair of stereo glasses, together with sensors and force feedback actuators. The user looks into the mirror through the glasses and perceives the virtual image within the work volume where the hands lie, as shown in Figure 2(b). This produces a stereo view of a virtual volume in which the hands can move without obscuring the display. Alternatively, a fluoroscopic view can be displayed without stereoscopic vision. The movement of the user's right hand and fingers is captured by an Immersion CyberGlove data glove. The Immersion CyberGrasp is the force feedback actuator that provides variable resistance felt on the fingers by pulling them during interaction with the simulation. We show in a later section that it is possible to train the user on PMMA injection by providing resistance to the user's thumb. A force feedback joystick is used with the left hand to complement the CyberGrasp on the right hand. To provide force feedback to the arm, which is necessary for needle insertion training, a Delta Haptic Device from the Swiss Federal Institute of Technology (EPFL) is used. The device has six degrees of freedom and provides high strength, stiffness, and sensitivity. It is used in conjunction with a custom-built needle-holding attachment (Figure 2[c]).

FIGURE 3: 3D Graphic User Interface: (a) First-Person View; (b) Third-Person View With Spine-Only Display and (c) Rotating View Point Using Rolling Ball; (d) Feeling the Spine Using Force Feedback Data Glove

Force feedback to the hand allows the user to virtually hold surgical instruments or touch the patient. Figure 3 presents the 3D graphic user interface of the virtual surgical environment. The data glove is the main interface medium. There are options for first- and third-person views, transparent or spine-only displays, zooming functions, as well as a virtual trackball for whole-body manipulation. The user interface, with collision detection algorithms and haptic rendering, detects and provides force feedback for the interactions between the surgical instruments, the patient, and the patient's spine. The haptic and computer graphic rendering are processed in two separate computing threads within a unifying framework. This is shown in Figure 3(d), with the user performing a physical examination by feeling the spine.
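
The split between a fast haptic thread and a slower graphics thread can be sketched as follows. The class and function names, the use of Python threading, and the locking scheme are illustrative assumptions; the prototype's actual framework is not described at this level of detail in the article.

```python
import threading
import time

class SharedState:
    """State shared between the haptic and graphic rendering threads."""
    def __init__(self):
        self.lock = threading.Lock()
        self.needle_tip = (0.0, 0.0, 0.0)   # pose read from the haptic device
        self.feedback_force = 0.0           # last computed reaction force (N)
        self.running = True

def haptic_loop(state, compute_force, rate_hz=300):      # >= 300 Hz for haptics
    period = 1.0 / rate_hz
    while state.running:
        with state.lock:
            state.feedback_force = compute_force(state.needle_tip)
        # a real loop would also write the force command to the device here
        time.sleep(period)

def graphic_loop(state, redraw, rate_hz=20):              # ~20 Hz visual update
    period = 1.0 / rate_hz
    while state.running:
        with state.lock:
            tip, force = state.needle_tip, state.feedback_force
        redraw(tip, force)                                 # update 3D/fluoroscopic view
        time.sleep(period)

# Usage sketch: start both loops on a shared state object.
# state = SharedState()
# threading.Thread(target=haptic_loop, args=(state, model.tip_force)).start()
# threading.Thread(target=graphic_loop, args=(state, viewer.redraw)).start()
```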

Simulation of bone needle insertion

For interactive surgical simulation of bone needle insertion involving haptic rendering, the force at the needle tip has to be computed quickly.

FIGURE 4: Bone Needle Insertion Simulation: (a) Needle Insertion; (b) Syringe Positioning; (c) Needle and Syringe Positioned in First-Person View; (d) Needle Insertion Under Simulated Fluoroscopic Guidance

In our biomechanical model for bone needle insertion, the cortical bone is regarded as a dense form of cancellous bone that can be modeled using a linear elastic material. The porosity of the bone determines the resistance that is felt as the user inserts the needle into the bone. The bar element method that we proposed in Ong et al. (in press), in which each trabecula is represented as a finite element beam, is highly computationally efficient. With 1,000 elements, the computed force feedback was close to the insertion force measured during experiments.

Figure 4 shows snapshots of the interaction of the needle-holding attachment with the CyberGrasp and Delta. A bone needle is picked up using the CyberGrasp (Figure 4[a]). The user can feel the insertion of the needle into the bone more accurately via the Delta. This is followed by placing the syringe over the needle (Figure 4[c]). The fingers are constrained such that the user can feel the shape of the instrument. Figure 4(d) shows the insertion of a simulated bone needle into the reconstructed human spinal model under a simulated fluoroscopic view.

Figure 5 illustrates the simulated force feedback at the needle tip as a function of displacement. The simulation predicts the penetrating force at approximately 20 N for the cortical bone, which agrees with the experimental study presented in the literature (Matsumiya et al., 2003). With 5,000 finite element (FE) beam elements, a refresh rate of more than 1,000 Hz can be achieved for haptic rendering.

It should be noted that in many clinical patients, the insertion of the vertebroplasty needle requires a bone hammer to tap the needle into position under repeated live fluoroscopic guidance.

FIGURE 5: Needle Insertion Force (N) Versus Displacement (mm) From Simulation Using Finite Element Beam Elements

One exception is patients with extensive bone destruction caused by an infiltrative tumor, in whom manual insertion of the needle can be performed without the use of a hammer; haptic sensing is particularly important in this case. Typically, the needle is advanced gradually by light taps of the hammer after the bone cortex is reached by manual insertion. Frequent biplanar fluoroscopic checking of the needle position, and angling or steering of the needle by hand during tapping, is commonly required. This aspect of the needle insertion has not been simulated at this time. To simulate hammering, the user would hold a hammer and tap on the needle, with the force feedback felt via the hammer. This can be readily added to the system because of its modular architecture.
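
For illustration, a toy piecewise force-displacement profile of the kind plotted in Figure 5 is sketched below. The functional form and the puncture depth are assumptions chosen to reproduce the roughly 20 N cortical penetration force mentioned above; the simulator itself computes this force from the beam-element model, not from a closed-form curve.

```python
def needle_tip_force(d_mm, f_peak=20.0, d_puncture=5.0, f_cancellous=5.0):
    """Toy force (N) at the needle tip as a function of displacement (mm).

    The force rises roughly linearly while the tip loads the cortical shell,
    peaks near f_peak at puncture, then drops to a lower level that grows
    slowly as the needle advances through cancellous bone. This is only a
    qualitative stand-in for the beam-element computation described above.
    """
    if d_mm <= 0.0:
        return 0.0
    if d_mm < d_puncture:                      # elastic loading of the cortex
        return f_peak * d_mm / d_puncture
    # after puncture: residual cutting and friction force in cancellous bone
    return f_cancellous + 0.3 * (d_mm - d_puncture)

# Example: force ramps to ~20 N at 5 mm, then drops to ~5 N and creeps upward.
profile = [(d, needle_tip_force(d)) for d in range(0, 11)]
```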

Simulation of PMMA injection

An aim of this project is to develop a PMMA injection interface component that can provide realistic visual and haptic feedback using biomechanical models that are computationally efficient. A branching pipe network geometrical model is first extracted from micro-CT scans of a trabecular bone cube to represent porous cancellous bone. Bone cement flow through the cannula and trabecular network is modeled as Hagen-Poiseuille flow. A rheological model, developed from dynamic rheological testing of curing PMMA bone cement samples, supplements the physics-based model by predicting changes in cement viscosity with time and shear rate during an injection. A fluid-flow visualization algorithm (Lian, 2006) was developed and integrated with a fluoroscopic graphical interface so that the simulated PMMA injection into a lumbar vertebra can be visualized under a computer graphics-rendered fluoroscopic view. The rheological and physics-based models are implemented to compute injection pressures in real time, and the computed force feedback is delivered to the user via the CyberGrasp.
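
As a rough illustration of the flow model, the sketch below evaluates the Hagen-Poiseuille pressure drop for one cylindrical segment with a time-varying viscosity. The exponential viscosity curve, its constants, and the example cannula dimensions are assumptions for illustration only; the rheological model in this project is fitted to measured data and also depends on shear rate.

```python
import math

def poiseuille_pressure_drop(flow_rate_m3s, radius_m, length_m, viscosity_pa_s):
    """Pressure drop (Pa) for laminar flow in one cylindrical segment:
    dP = 8 * mu * L * Q / (pi * r^4)."""
    return 8.0 * viscosity_pa_s * length_m * flow_rate_m3s / (math.pi * radius_m ** 4)

def cement_viscosity(t_s, mu0=100.0, k=0.01):
    """Placeholder viscosity (Pa*s) of curing PMMA as a function of time.
    An exponential rise is assumed here purely for illustration."""
    return mu0 * math.exp(k * t_s)

# Illustrative numbers: 0.2 mL/s through a 10 cm cannula of ~1.2 mm inner radius,
# two minutes into curing. A network of such segments would model the trabeculae.
dp = poiseuille_pressure_drop(0.2e-6, 1.2e-3, 0.10, cement_viscosity(120.0))
```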

FIGURE 6: Polymethylmethacrylate (PMMA) Injection Simulation: (a) Position-Sensing Data Glove; (b) Force Feedback Actuator Fitted Onto Data Glove With Five Independently Controlled Fingers’ Resistance; (c) Virtual PMMA Injection Performed Using CyberGrasp and a Syringe With PMMA Infiltration Monitored on the Fluoroscopic Graphical Interface; (d) Phases of Bone Cement Injection With Cement Overflowing at Terminal Sections of the Trabecular Network Rendered as Red Spheres

The PMMA injection interface acquires the bone cement injection rate every 10 ms by multiplying a virtual injection speed by the cross-sectional area of a syringe used for cement delivery in vertebroplasty. The injection speed is computed as the average rate of change in displacement between the distal joints of the thumb and middle finger in each 10 ms time-step. (For the thumb, "distal" refers to the tip; for a finger, "distal" refers to the joint just below the tip.) The displacement between the joints is obtained from the difference of the individual displacements of each joint relative to the wrist, which is measured using the position-sensing haptic glove (CyberGlove), as shown in Figure 6(a). The injection pressure computed every 10 ms is multiplied by the syringe cross-sectional area to compute and update the reaction force during a virtual injection. This injection force is assumed to remain constant within each 10 ms time-step and can be accessed by the controller of the haptic actuator (CyberGrasp; Figure 6[b]) every 1 ms to sustain force feedback to the thumb during a virtual injection of bone cement. The prototype PMMA injection interface provides sufficiently realistic visual and haptic feedback of bone cement infiltration during virtual injection (Figure 6[c]).

It should be noted that PMMA changes its viscosity fairly rapidly over time during the hardening reaction. Normally, it is mixed to a honey-like consistency prior to injection and then allowed to harden to approximately the consistency of toothpaste during the injection. By that point, however, the viscosity is changing sharply, with the cement solidifying over about 5 minutes.

This has not been simulated at this time. Finally, the most recent systems for vertebroplasty have eschewed manual injection in favor of calibrated-volume and measured-pressure devices to reduce the variation in injection rate and volume between operators, which makes haptic simulation of this aspect less important.
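
A minimal sketch of the 10 ms injection update described above is given below, assuming the thumb-to-middle-finger gap has already been read from the glove. The helper names, the syringe cross-section, and the pressure-model callback are hypothetical; the actual interface works through the CyberGlove/CyberGrasp controller rather than these placeholders.

```python
SYRINGE_AREA_M2 = 2.0e-4   # assumed plunger cross-section (~16 mm diameter syringe)
DT_S = 0.010               # 10 ms injection-interface time-step
MAX_THUMB_FORCE_N = 12.0   # clamp to the actuator's per-finger force limit

def injection_step(prev_gap_m, gap_m, elapsed_s, pressure_model):
    """One 10 ms step: finger closure -> plunger speed -> flow rate -> thumb force.

    pressure_model(flow_rate_m3s, t_s) returns the injection pressure in Pa,
    for example from the Hagen-Poiseuille sketch shown earlier.
    """
    speed_mps = max(prev_gap_m - gap_m, 0.0) / DT_S      # virtual plunger speed
    flow_rate = speed_mps * SYRINGE_AREA_M2              # injected volume per second
    pressure_pa = pressure_model(flow_rate, elapsed_s)
    force_n = pressure_pa * SYRINGE_AREA_M2              # reaction force on the thumb
    # The haptic controller reads this value every 1 ms until the next update.
    return min(force_n, MAX_THUMB_FORCE_N)
```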

Discussion and conclusion

The prototype surgical simulator described in this article is a virtual trainer for vertebroplasty. With integrative haptic and visual feedback, the user can undergo more complete training in inserting the bone needle through various tissue types and densities and in monitoring the injection and reflux of the PMMA into the vertebra. We are planning to have the VR-based system deployed in a medical school for formal training, although there might be a need to include more risky spinal operations in the immersive environment.

Some technical issues should be resolved to achieve widespread usage of the simulator. For example, the oscillation of the probe with the Delta haptic device can be rather large at times. The force feedback joystick may be a more stable apparatus for the user to appreciate the force felt during needle insertion and is also a more cost-effective solution. The position-sensing data glove (CyberGlove) is small for some users. There may be no need for the "hand" to be rendered except for feeling the human spinal structure during physical examination. Furthermore, because the needle may be inserted not by hand but by using a bone hammer with incremental manual angling and visual checking, this aspect of the placement can be implemented and will be simulated in further work.

When the CyberGlove is used to perform virtual injection, a cement infiltration process is observed with an infiltration rate and amount realistically proportional to the rate of thumb motion. A scaling method has been used to reduce the injection rate by a suitable amount so that filling of the vertebra can be represented by filling of an extracted trabecular cube geometrical model. The realism of over-injection detection and visualization allows the user to determine when infiltrating bone cement has reached the posterior quarter of the vertebra being filled, which usually demands that injection be stopped to prevent cement extravasation.

Haptic interaction for PMMA injection appears to be a more difficult problem than flow visualization. To obtain realistic force feedback, computed injection pressures were scaled down by a factor of 10 to match the order of clinical measurements (Krebs et al., 2005). However, implementing haptic-driven force feedback accurately is difficult due to the compatibility and stability issues of the haptic actuator (CyberGrasp). Moreover, the computed injection force must be limited so as not to exceed the maximum 12 N force that the actuator can supply to the thumb. Nevertheless, this setup is efficient and effective because it focuses on the essential components: the haptic feedback is on the thumb, and computation and the resultant visual update occur around the needle path.

In the end, this problem of haptic simulation of the injection may become less important because of modern developments in cement delivery systems that remove all force feedback from the operator, reducing the problem to one of pressure, time, and volume. In such a situation, it is more important to accurately model the cement viscosity, the trabecular properties of the bone including defects and fracture lines (patient-specific bone data would be important here), and the flow and positioning of the cement during the injection. This is the subject of further work.

Although the stability and robustness of the hardware (haptic devices) appear to be an annoying problem from our experience with various users, the main research problem continues to be haptic rendering. Haptic rendering requires a much higher refresh rate than graphical rendering. Various methods have been proposed to simplify the computational model so that the desired high refresh rate can be achieved. On the other hand, it is important to have a realistic simulation so that the user does not experience an incorrect procedure. Our biomechanical model of bone needle interaction considers the underlying bone structure as well as the type of needle used in the simulation. Although the bar element method is the most computationally efficient, we are considering an extended bar element method that better accounts for the trabecular distribution. We are planning an experiment on bone needle insertion using cadaver specimens. Bone fracture and the resultant FE remeshing are important issues that will need to be addressed, and the experimental observations may provide important information for implementing adaptive meshing for bone cracking due to needle insertion.

Independent rheological testing has been performed by Lian (2006). The convergence and eventual coincidence of the test results and theoretical results within an initial curing phase of bone cements support the validity of the extracted rheological models and the proposed extraction methodology within that phase. The geometrical model, scaling methodology, and fluid-flow visualization algorithms developed are practical and feasible for real-time visualization of bone cement injection into cancellous bone. The implementation of the physics-based rheological models in the simulation is a viable simplification, compared with a finite element method, for obtaining injection pressures in real time. The PMMA injection interface has provided sufficiently realistic feedback of bone cement infiltration during virtual manual injection via the CyberGrasp.

The prototype simulator described in this article has provided a platform for further improvements, such as the accuracy of haptic-driven force feedback and visual rendering, as well as the inclusion of other risky spinal operations. For both tasks in vertebroplasty, bone needle insertion and manual PMMA injection, it appears that most users relied heavily on visual cues during simulation. An experienced user will be able to use the haptic senses to augment the visual cues for a better performance outcome. Hence, it is important to incorporate haptics in surgical simulators so that users can have complete and accurate training of their hand-eye coordination skills. We are also exploring the possibility of using lower cost, lower resolution gaming gloves to replace the CyberGrasp system for a less expensive solution. A cost-effective, computer-game-like solution will better encourage practitioners to practice needle insertion and manual PMMA injection when the need arises.

A potentially important application of the haptics and VR techniques described in this article is robotic-guided bone needle insertion and PMMA injection.

There is already emerging interest in the development and utilization of computer-aided surgical systems that use robots for percutaneous delivery of surgical devices and therapeutic agents to soft tissue and bone lesions (Masamune et al., 1996; Matsumiya et al., 2003; Schreiner et al., 1997). A force-sensing actuator can be easily installed at the end effector of these robots. In robotic-guided needle insertion, the associated haptic senses are lost, so it is necessary for the surgeon or radiologist to receive metrics at each step of organ deformation and compare the intra-operative information with the plan. Based on the hypothesis that a good surgeon has better haptic sensing during tool-tissue interaction than his or her peers, augmenting the haptic sensing via the surgical tool will be an interesting research topic.

References

Anderson, J. H., Chui, C. K., Cai, Y., Wang, P., Li, Z., Ma, X., Nowinski, W. L., Solaiyappan, M., Murphy, K., & Gailloud, P. (2002). Virtual reality training in interventional radiology: The Johns Hopkins and Kent Ridge Digital Laboratory experience. Seminars in Interventional Radiology, 19(2), 179-185.

Azar, F. S., Metaxas, D. N., & Schnall, M. D. (2000). A finite element model of the breast for predicting mechanical deformations during biopsy procedures. Proceedings of the IEEE Workshop on Mathematical Methods in Biomedical Image Analysis, pp. 38-45.

Brett, P. N., Parker, T. J., Harrison, A. J., Thomas, T. A., & Carr, A. (1997). Simulation of resistance forces acting on surgical needles. Journal of Engineering in Medicine, 211, 335-347.

Burdea, G. (1996). Force and touch feedback for virtual reality. New York: John Wiley & Sons.

Cotin, S., Delingette, H., & Ayache, N. (2000). A hybrid elastic model allowing real-time cutting, deformations and force-feedback for surgery training and simulation. Visual Computer, 16(8), 437-452.

Dang, T., Annaswamy, T. M., & Srinivasan, M. A. (2001). Development and evaluation of an epidural injection simulator with force feedback for medical training. Medicine Meets Virtual Reality, pp. 97-102.

Gobbetti, E., Tuveri, M., Zanetti, G., & Zorcolo, A. (2000). Catheter insertion simulation with co-registered direct volume rendering and haptic feedback. Medicine Meets Virtual Reality, pp. 96-98.

Gorman, P., Krummel, T., Webster, R., Smith, M., & Hutchens, D. (2000). A prototype haptic lumbar puncture simulator. Medicine Meets Virtual Reality, pp. 106-109.

Greenleaf, W. J. (1995). Medical applications of virtual reality technology. In J. D. Bronzino (Ed.), The biomedical engineering handbook (pp. 1165-1179). Boca Raton, FL: CRC Press.

Krebs, J., Ferguson, S. J., Bohner, M., Baroud, G., Steffen, T., & Heini, P. F. (2005). Clinical measurements of cement injection pressure during vertebroplasty. Spine, 30(5), E118-E122.

Kwon, D. S., Kyung, J. U., Kwon, S. M., Ra, J. B., Park, H. W., Kang, H. S., Zeng, J., & Cleary, K. R. (2001). Realistic force reflection in a spine biopsy simulator. Proceedings of the IEEE International Conference on Robotics and Automation, pp. 1358-1363.

Lian, Z. Y. (2006). Biomechanical model driven haptic interaction for PMMA injection simulation. Final year project thesis, National University of Singapore, Singapore.

Masamune, K., Sonderegger, M., Iseki, H., Takakura, K., Suzuki, M., & Dohi, T. (1996). Robots for stereotactic neurosurgery. Advanced Robotics, 10(4), 391-401.

Matsumiya, K., Momoi, Y., Kobayshi, E., Sugano, N., Yonenobu, K., Inada, H., Tsuji, T., & Sakuma, I. (2003). Forces and torques during robotic needle insertion to human vertebra. International Congress Series, 1256, 492-497.

Merril, J. R. (1993). Surgery on the cutting edge. Virtual Reality World, 1(3-4), 34-38.

Merril, J. R. (1994). Presentation material: Medicine meets virtual reality II (Abstract). In Medicine meets virtual reality II: Interactive technology and healthcare: Visionary applications for simulation visualization robotics (pp. 158-159). San Diego: Aligned Management Associates.

Murphy, J. K., & Lin, D. M. (2001). Vertebroplasty. Journal of Clinical Densitometry, 4(3), 189-197.

Okamura, A. M., Richard, C., & Cutkosky, M. R. (2002). Feeling is believing: Using a force-feedback joystick to teach dynamic systems. ASEE Journal of Engineering Education, 92(3), 345-349.

Ong, J. S. K., Chui, C. K., Wang, Z. L., Zhang, J., Teo, J. C. M., Yan, C. H., Ong, S. H., Teo, C. L., & Teoh, S. H. (in press). Biomechanical modeling of bone-needle interaction for haptic rendering in needle insertion simulation. Proceedings of the 9th International Conference on Control, Automation, Robotics and Vision (ICARCV 2006).

Salisbury, K., Conti, F., & Barbagli, F. (2004, March/April). Haptic rendering: Introductory concepts. IEEE Computer Graphics and Applications, pp. 24-32.

Schreiner, S., Anderson, J. H., Taylor, R. H., Funda, J., Bzostek, A., & Barnes, A. C. (1997). A system for percutaneous delivery of treatment with a fluoroscopic-guided robot. Proceedings of the Joint Conference of Computer Vision, Virtual Reality and Robotics in Medicine and Medical Robotics and Computer Surgery, Grenoble, France.

Shimoga, K. B., & Khosla, P. K. (1994). Visual and force feedback to aid neurosurgical probe insertion. Proceedings of the 16th International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 1051-1052.

Wang, Y., Chui, C. K., Lim, H. L., Cai, Y., & Mak, K. H. (1999). Real-time interactive simulator for percutaneous coronary revascularization procedures. Computer Aided Surgery, 3(5), 211-227.

Yamashita, J., Yamauchi, Y., Mochimaru, M., et al. (1999). Real-time 3-D model-based navigation system for endoscopic paranasal sinus surgery. IEEE Transactions on Biomedical Engineering, 46(1), 107-116.

Yan, C. H., Ong, S. H., Ge, Y., Teoh, S. H., & Chui, C. K. (2004, August 20-23). Accurate and precise 3D surface registration. Proceedings of the 6th IASTED International Conference on Signal and Image Processing (SIP 2004), Honolulu, HI, pp. 413-441.

Zeng, J., Kaplan, C., Bauer, J., Xuan, J., Sesterhenn, I. A., Lynch, J. H., Freedman, M. T., & Mun, S. K. (1998). Optimizing prostate needle biopsy through 3-D simulation. Proceedings of SPIE Medical Imaging 1998, pp. 488-497.

Chee-Kong Chui is currently with the Department of Mechanical Engineering, National University of Singapore. He was the principal investigator of the Biomedical Simulation & Device Design Project at the Institute of Bioengineering (IBE) prior to pursuing a PhD in the Biomedical Precision Engineering Lab, University of Tokyo, Japan. His research interests include computer-integrated and robot-assisted surgery, human-machine interfaces, medical device design, biomechanical modeling, and simulation.

Jackson S. K. Ong is a research engineer in the Department of Mechanical Engineering, National University of Singapore. He is currently working toward his M.Eng degree in biomechanical engineering.

Zheng-Yi Lian is a mechanical engineering student at the National University of Singapore. He is expecting a First Class Honors degree by the middle of this year.

Zhenlan Wang is a research fellow in the Department of Mechanical Engineering, National University of Singapore. He received his PhD from the School of Computing, National University of Singapore, in 2006.

Jeremy Teo is a research fellow in the Department of Mechanical Engineering, National University of Singapore. He is currently working toward his PhD in medicine.

Jing Zhang is a research engineer in the Department of Mechanical Engineering, National University of Singapore. He is currently working toward his PhD in electrical and computer engineering.

Chye-Hwang Yan is an adjunct assistant professor in the Department of Electrical and Computer Engineering, National University of Singapore. He heads the Signal Processing Lab in the Defense Science Organization of Singapore. He received his PhD from Stanford University in 1998.

Sim-Heng Ong is an associate professor and the head of the Biomedical Engineering Group in the Department of Electrical and Computer Engineering, National University of Singapore. He received his PhD from Sydney University, Australia.

Shih-Chang Wang is an associate professor and the head of the Department of Diagnostic Radiology, Yong Loo Lin School of Medicine, National University of Singapore.

Hee-Kit Wong is a professor and the head of the Department of Orthopaedic Surgery, Yong Loo Lin School of Medicine, National University of Singapore.

Chee-Leong Teo is an associate professor in the Department of Mechanical Engineering, National University of Singapore. He is also the director of NUS Overseas Colleges (NOC). He received his PhD from the University of California, Berkeley.

Swee-Hin Teoh is a professor in the Department of Mechanical Engineering, National University of Singapore. He is also the director of the Centre for Biomedical Materials Application & Technology (BIOMAT) and is well known for his leadership in interdisciplinary research and education. He received his PhD from Monash University, Australia.

ADDRESSES: CKC, JSKO, ZYL, ZW, JT, & SHT: Centre for Biomedical Materials Application & Technology (BIOMAT), National University of Singapore, E3-05-23, Engineering Drive 3, Singapore 119260; telephone: +65-6516-5104; fax: +65-6777-3537. JZ, CHY, & SHO: Department of Electrical and Computer Engineering, National University of Singapore, E4-05-48, Engineering Drive 3, Singapore 117576. SCW & HKW: National University Hospital, 5 Lower Kent Ridge Road, Singapore 119074. CLT: NUS Overseas Colleges, E3A, 10 Kent Ridge Crescent, Singapore 119260.
