Call for PhD applications

Interactive 3D Data Registration for Proton Therapy using Touch-Based Interfaces

Summary

We are looking for a PhD student who will investigate the design of fast and precise 3D data manipulation techniques for proton therapy patient placement in cancer treatment. Based on modern touch-based interaction techniques, we aim to reduce the time needed for X-ray imaging matching. The goal of this technique is to achieve a manual registration between radiologic 2D orthogonal images of the patient and digitally reconstructed 2D radiographs that result from 3D CT scans. The challenge in this project is that this manual match needs to satisfy a precision of 1.5 mm along all spatial directions and 1.5° around all rotation axes. Currently, the match is done using a traditional, mouse-based interface in which at most two spatial dimensions are manipulated at any given time. Consequently, this process requires several iterations of matching adjustment and subsequent verification, each of which requires a new X-ray of the patient. With novel interaction techniques we are confident that we can not only reduce the time needed per patient but, as a consequence, also treat more cancer patients in the same amount of time. We want to achieve this progress using touch-based interfaces that allow users to control 4–5 degrees of freedom (DOF) simultaneously, as opposed to 2 DOF for mouse input. Touch input thus facilitates much more integrated 3D data matching. Moreover, the simultaneous control of 4 or more DOF will likely result in a better match. The investigations in this project will be based on previous work by the PhD supervisors as well as other recent advances in interactive 3D data visualization. The PhD student will examine existing methods, design new techniques, and integrate them into a new interaction platform. Moreover, the PhD student will create the new techniques in such a way that they satisfy the high precision demands of the application domain. This project thus promises not only advances in 3D interaction for scientific data analysis but also very practical advances in cancer treatment.

1 Context and state of the art

Proton therapy is a discipline of radiotherapy that treats cancers by irradiating the tumor with proton beams [23]. Proton beams have several accuracy advantages compared to standard photon irradiation (e. g., [22, 24]) and are of much interest for pediatric cancers and for tumors in particularly difficult locations (close to organs at risk, as in head and neck cancers). One of the most important challenges of proton therapy is that proton beams are very precise tools and thus necessitate an extremely high level of accuracy when positioning the patient at the right three-dimensional position in front of the beam line [4, 5, 20]. The tolerances for positioning the patient in proton therapy are 1.5 mm along all three spatial directions and 1.5° around all three rotational axes [21, 27]. To position a patient with this accuracy, proton therapy has relied for decades on implanted radiopaque fiducial markers [6] and on X-ray imaging matching. Nowadays, the preparation process includes a manual registration of the patient's bony structures seen on real radiologic 2D orthogonal images (i. e., live X-ray images taken while the patient is placed in front of the beam) to digitally reconstructed 2D radiographs (DRRs) based on CT images which represent the patient's anatomy at the treatment position. This registration tells the therapist whether a patient is correctly positioned in space (within the 1.5 mm and 1.5° tolerances) or whether the position needs to be corrected. Currently, this registration is done manually by the therapist using a dedicated software system (Rotaplus, developed by ICPO) and a traditional mouse-based interface (see Figure 1). For each patient, the process takes approximately 15 minutes and typically requires 2–3 iterations of the matching. In each of these iterations, the placement is adjusted using several separate manipulations of the 7 DOF arrangement with a mouse, then a new X-ray image is taken, and the match quality is determined by comparing the two overlaid 2D images (the live X-ray and the 3D reconstruction) as shown in Figure 1.

Figure 1: Rotaplus software: image overlay matching in use.
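To make the clinical tolerance requirement concrete, here is a minimal Python sketch that checks whether a residual rigid-body correction (translation per axis in mm, rotation per axis in degrees) stays within the 1.5 mm / 1.5° limits stated above. The function name and the example numbers are hypothetical illustrations and are not part of the Rotaplus system.

TRANSLATION_TOL_MM = 1.5   # tolerance per spatial axis
ROTATION_TOL_DEG = 1.5     # tolerance per rotation axis

def within_tolerance(translation_mm, rotation_deg,
                     t_tol=TRANSLATION_TOL_MM, r_tol=ROTATION_TOL_DEG):
    # True only if every translational and every rotational axis is within tolerance.
    return (all(abs(t) <= t_tol for t in translation_mm) and
            all(abs(r) <= r_tol for r in rotation_deg))

# Hypothetical residual mismatch after one matching iteration:
# the translations are fine, but one rotation axis exceeds 1.5 degrees.
print(within_tolerance((0.8, 1.2, 0.4), (0.3, 1.6, 0.1)))  # -> False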

In general, interactive 3D scientific visualizations have made a significant impact in many different disciplines including medical data analysis and treatment planning. Yet, existing interactive visualization systems are not typically regarded as being easy to learn or use [16], including the Rotaplus system that is used for proton therapy patient placement. Touch-based interfaces, on the other hand, have the potential to improve this situation as users of touch-based systems commonly associate them with being ‘intuitive’ and ‘natural.’ Such interfaces have recently been popularized as a major style of interacting with, in particular, mobile devices. Part of this recent popularity is certainly due to the dedicated UI design and the novelty of touch as an interaction paradigm, but research has also shown that it can indeed be more effective than indirect forms of interaction. For example, touch interaction has been shown to outperform mouse input for the selection of targets on the screen [18], to facilitate awareness in collaborative settings [9], and to provide somesthetic information and feedback that is beneficial for effective interaction both in real and virtual environments [29]. This has led to several recent calls to action [11, 14, 15, 16, 17] which specifically emphasize the need for work that investigates new interaction paradigms in scientific visualization and related disciplines. Yet, only a few approaches for three-dimensional navigation and data manipulation exist to date [10, 12] that make use of the benefits of touch input. Past work has concentrated largely on manipulating specific 3D objects (rather than data) in 3D, without the need for precise control of the shapes (e. g., [2, 3, 7, 8, 13, 26, 28, 31]). Within a medical context, Lundström et al. [25] used a horizontal table to explore medical data, Coffey et al. [1] combined a stereoscopic vertical display with widget-based interaction on a touch table, and Song et al. [30] combined a large monoscopic touch wall with a mobile secondary touch and orientation input device. Our own previous work addressed general 3D navigation issues [33], the selection of sub-spaces [32], and the creation of a data exploration environment for fluid dynamics simulations [19].

2 Challenges and objectives

The goal of this project is thus to improve proton therapy cancer treatment by benefiting from modern touch-based 3D interaction technologies as they are currently being investigated in scientific visualization and virtual reality. In contrast to the current iterative manipulation of single degrees of freedom using mouse-based interfaces, we want to allow the therapist to grab a patient's 3D data and control its location, orientation, and scale with respect to the captured X-ray image, matching them manually yet accurately to their DRR data with the speed, intuitiveness, and directness facilitated by touch interaction. Our challenges in this context are thus manifold:

1. We need to establish adequate interaction techniques that permit the adjustment of the 3D DRR data. This requires either a newly developed interaction technique or the adjustment of existing ones. It is likely that we need to provide a combination of several interaction techniques, with effective ways to transition between them.

2. We need to establish mechanisms for the precise control of 3D data to overcome the inherent precision problems of touch input. We can benefit from direct manipulation but will also need to investigate adequate control-display gain ratios to overcome the imprecision of touch input (see the sketch after this list).

3. We need to develop representations and feedback mechanisms that facilitate the interaction in 2D space while manipulating 3D data, enabling people to intuitively grasp the relationship between the 3D and 2D data types.

4. We need to understand how many degrees of freedom can simultaneously be controlled by humans in 3D interaction. Even if we can provide input with many DOF, evidence suggests that people can mentally control only up to 4 or 5 at the same time.

5. We need to demonstrate that the integrated manipulation of 3D data by means of touch input can indeed lead to improvements in interaction speed, conceptual understanding of the data, precision of matching, and a reduction in the number of X-ray images needed.

Our goal in addressing these challenges is to speed up the matching process. This promises a number of implications that have the potential to drastically improve cancer treatment based on proton therapy. First, the total time spent per patient could be significantly reduced. This means that it would be possible to treat more patients within the same amount of time, making proton therapy available to more patients.
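As an illustration of challenge 2, here is a minimal Python sketch of one possible precision mechanism: a speed-dependent control-display (CD) gain that shrinks for slow, deliberate finger movements so that fine, sub-millimeter adjustments become possible while fast movements still cover large distances. The gain curve, thresholds, and function names are hypothetical assumptions used only for illustration; they are not part of the proposal.

def cd_gain(finger_speed_mm_s, slow=10.0, fast=200.0,
            precise_gain=0.2, coarse_gain=1.5):
    # Interpolate between a precise and a coarse gain regime based on finger speed.
    # All thresholds and gain values are made-up example numbers.
    if finger_speed_mm_s <= slow:
        return precise_gain
    if finger_speed_mm_s >= fast:
        return coarse_gain
    t = (finger_speed_mm_s - slow) / (fast - slow)
    return precise_gain + t * (coarse_gain - precise_gain)

def apply_gain(finger_delta_mm, finger_speed_mm_s):
    # Scale a 2D finger displacement on the touch surface into a data-space displacement.
    g = cd_gain(finger_speed_mm_s)
    return tuple(g * d for d in finger_delta_mm)

# A slow, deliberate drag: 5 mm of finger travel moves the data by only 1 mm.
print(apply_gain((5.0, 0.0), finger_speed_mm_s=8.0))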

3 Methods

We will start by thoroughly analyzing the state of the art. This means that we will first study the user interface design that is currently used in proton therapy. At the same time, we will survey the currently available interaction techniques for the manipulation of 3D data. For this purpose we will investigate both techniques used in a scientific visualization context and methods investigated by the general HCI and virtual reality communities. Based on the results of both studies, we will develop a catalog of requirements for a technique to be applied in proton therapy and rate the existing interaction techniques according to their suitability for the interaction goal.

It is likely that no existing technique completely satisfies the requirements of the application domain. We will therefore work on designing a new technique based on the existing ones. In particular, we will compare widget-based control with gestural interaction, exploring their suitability for the complex 7 DOF interaction task (combined with additional interaction goals in the process). We expect, therefore, that different interaction modalities will need to be combined to facilitate effective patient placement. We will thus investigate ways to switch between the different techniques in a fluent, user-controlled fashion.

As the proposed work depends heavily on human-computer interaction, we will generally use a participatory, user-centered design approach in which input from the domain experts is used at all stages of the process. This is an established methodology within HCI for creating complex interactive systems such as our 3D patient placement tool. The participatory, user-centered approach requires frequent evaluation of intermediate and final results. For all evaluation work in the project we will use mixed-method approaches. We will assess the suitability of specific interaction techniques by comparing them to existing approaches in controlled experiments, with completion time and accuracy as the dependent measures. Moreover, we will also use rigorous qualitative techniques, in particular observational studies, to determine constraints as well as to validate the acceptance of the proposed results by the target user community. Going beyond time and error metrics, observational studies are better suited than controlled experiments to realistic visualization environments because the complexity of human interactions, tasks, and cognition cannot be captured holistically by such metrics. For tests in clinical conditions, sessions will be organized in ICPO treatment rooms to evaluate the new techniques in the realistic imaging and robotic ICPO environment.
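To make the quantitative part of this evaluation plan concrete, the following minimal Python sketch summarizes the two dependent measures (completion time and residual matching error) per technique in a hypothetical controlled comparison. The trial data and technique labels are made-up placeholders that only illustrate the kind of comparison we have in mind, not measured results.

from statistics import mean, stdev

# Hypothetical trial log: (technique, completion time in s, residual error in mm).
trials = [
    ("mouse", 412.0, 1.1), ("mouse", 388.0, 1.3), ("mouse", 430.0, 0.9),
    ("touch", 265.0, 1.0), ("touch", 240.0, 1.2), ("touch", 251.0, 0.8),
]

def summarize(technique):
    # Aggregate the two dependent measures for one technique.
    times = [t for name, t, _ in trials if name == technique]
    errors = [e for name, _, e in trials if name == technique]
    return {
        "technique": technique,
        "time_mean_s": mean(times), "time_sd_s": stdev(times),
        "error_mean_mm": mean(errors), "error_sd_mm": stdev(errors),
    }

for tech in ("mouse", "touch"):
    print(summarize(tech))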

4 Expected results

We expect to create an interaction design for three-dimensional patient placement that allows a faster and more convenient preparation of proton therapy sessions (thus increasing the total number of patients that can be treated in a facility) and that requires fewer X-ray images, thus exposing the patients to less radiation. At the same time, we expect the results to be at least as precise as the current state of the art. These results will be validated by comparing them to the current patient placement process, for which detailed performance metrics are available (fiducial matching, manual registration). We will compare our new techniques based on these same metrics using the evaluation methodologies outlined above, including in a realistic clinical context.

5 Scientific impact

Scientifically, this project will have several types of impact. First, we will develop one of the first solutions for touch-based 3D data manipulation that has a truly practical significance in an important application domain. In the chosen clinical context, we will be able to demonstrate the power of touch-based data control and show that it can be far superior to traditional input devices. In this context, we will be able to show that, while requiring less time per session, we can provide the same or better accuracy despite the perceived inaccuracy of touch input. Second, we will specifically investigate the interaction of highly skilled expert users, thus exploring a target audience that is rarely considered in academic interaction research. That means we can include the impact of training in our investigation: for the current manual registration process, therapists need several weeks of training before they can control it reliably in clinical conditions. Third, because we are investigating 3D interaction, we expect the developed techniques to be applicable to other interactive data exploration domains in the physical sciences and beyond. Of course, this work will also have implications beyond interaction science and the practical advantages to cancer treatment with proton therapy. Within radiotherapy in general, for example, we expect the new approaches to be easily applicable, thus also improving treatment in those domains. In summary, by combining forces between interactive visualization and proton therapy, our project promises to advance the field of data science in Paris-Saclay.

6 Required skills for the PhD student

• highly motivated student
• degree (M.Sc., M.Eng., or equivalent) in computer science or closely related fields
• education background in one or more of the following fields: visualization, human-computer interaction, and computer graphics
• interest in applications in the health sector and in 3D interaction
• previous experience in these fields would be highly beneficial
• experience in modern computer graphics (GPU) programming
• fluent in written and spoken English (French language skills are not required but would be beneficial for living in France and interacting with people outside of the lab)
• previous experience in research and publication of research results is beneficial

7 Application package

• detailed CV (including education, degrees and dates, publications/scientific presentations, skills/experience in programming languages, project work, academic awards, . . . )
• motivation letter explaining why you apply specifically for this project and why you are the perfect candidate
• summary of the Master's thesis
• transcript of grades
• contact details for at least two academic references
• prepare all these application documents electronically and in English (except potentially the transcripts, which may be in your native language)
• send a link (e. g., through services such as DropBox, Box, OneDrive, or similar) to your complete application file (in one big PDF file named familyname_givenname.pdf) by e-mail to Tobias Isenberg

8 Dates and deadlines

• application deadline: applications are reviewed as they are received; however, for full consideration please submit your application by August 1, 2015
• starting date: between October 2015 and the end of 2015

9 Research lab and project partner

The project is a collaboration between Inria's Aviz team and the Centre de Protonthérapie d'Orsay (ICPO). The project will be hosted by Inria Saclay's AVIZ research group and will be supervised by Tobias Isenberg. The AVIZ group focuses on the analysis and visualization of large and complex datasets by combining analysis methods with interactive visualizations, and is thus an ideal environment for the proposed work. In the past, Tobias Isenberg has spearheaded international research into touch-based interaction with 3D visualizations, evident in fundamental developments in interaction techniques (e. g., [32, 33]), design studies in practical applications (e. g., [19]), and the publication of state-of-the-art reports and research agendas on the topic (e. g., [10, 11, 12, 17]). The PI and the Aviz group thus contribute an in-depth understanding of the field of interactive visualization and 3D interaction with scientific data. Being part of the Radiation Oncology Department of the Institut Curie, the ICPO has been treating patients since 1991 and was the first high-energy proton therapy facility in France. Currently, the ICPO treats approximately 40 patients per day, concentrating on head and neck and ophthalmic cancers. Across these patient groups, the ICPO treats approximately 500 patients per year. The Rotaplus software (Figure 1) that currently controls the patient registration process was developed at ICPO by the project partner Michel Auger, and thus the relevant expertise (> 15 years of development, > 1000 patients treated in > 70000 sessions) from the proton therapy application domain is readily available to the project. The Rotaplus software will serve as the implementation platform of our project.

contact/supervisor: Tobias Isenberg
contact e-mail: [email protected]
contact website: http://tobias.isenberg.cc/
research team/lab: AVIZ team, Inria-Saclay, France
team website: http://www.aviz.fr/
partner team: Institut Curie – Centre de Protonthérapie d'Orsay
partner website: http://protontherapie.curie.fr/
partner contact: Michel Auger
partner e-mail: [email protected]

References

[1] D. Coffey, N. Malbraaten, T. Le, I. Borazjani, F. Sotiropoulos, A. G. Erdman, and D. F. Keefe. Interactive slice WIM: Navigating and interrogating volume datasets using a multi-surface, multi-touch VR interface. IEEE Transactions on Visualization and Computer Graphics, 18(10):1614–1626, 2012. doi> 10.1109/TVCG.2011.283
[2] A. Cohé, F. Dècle, and M. Hachet. tBox: A 3D transformation widget designed for touch-screens. In Proc. CHI, pp. 3005–3008. ACM, New York, 2011. doi> 10.1145/1978942.1979387
[3] J. Edelmann, S. Fleck, and A. Schilling. The DabR – A multitouch system for intuitive 3D scene navigation. In Proc. 3DTV. IEEE, Piscataway, NJ, USA, 2009. doi> 10.1109/3DTV.2009.5069671
[4] K. P. Gall, L. J. Verhey, and M. Wagner. Computer-assisted positioning of radiotherapy patients using implanted radiopaque fiducials. Medical Physics, 20(4):1153–1159, July 1993. doi> 10.1118/1.596969
[5] M. Goitein. Calculation of the uncertainty in the dose delivered during radiation therapy. Medical Physics, 12(5):608–612, Sept. 1985. doi> 10.1118/1.595762
[6] D. Habermehl, K. Henkner, S. Ecker, O. Jäkel, J. Debus, and S. E. Combs. Evaluation of different fiducial markers for image-guided radiotherapy and particle therapy. Journal of Radiation Research, 54(Suppl. 1):i61–i68, July 2013. doi> 10.1093/jrr/rrt071
[7] M. Hancock, S. Carpendale, and A. Cockburn. Shallow-depth 3D interaction: Design and evaluation of one-, two- and three-touch techniques. In Proc. CHI, pp. 1147–1156. ACM, New York, 2007. doi> 10.1145/1240624.1240798
[8] M. Hancock, T. ten Cate, and S. Carpendale. Sticky tools: Full 6DOF force-based interaction for multi-touch tables. In Proc. ITS, pp. 145–152. ACM, New York, 2009. doi> 10.1145/1731903.1731930
[9] E. Hornecker, P. Marshall, N. S. Dalton, and Y. Rogers. Collaboration and interference: Awareness with mice or touch input. In Proc. CSCW, pp. 167–176. ACM, New York, 2008. doi> 10.1145/1460563.1460589
[10] P. Isenberg and T. Isenberg. Visualization on interactive surfaces: A research overview. i-com, 12(3):10–17, Nov. 2013. doi> 10.1524/icom.2013.0020
[11] P. Isenberg, T. Isenberg, T. Hesselmann, B. Lee, U. von Zadow, and A. Tang. Data visualization on interactive surfaces: A research agenda. IEEE Computer Graphics and Applications, 33(2):16–24, Mar./Apr. 2013. doi> 10.1109/MCG.2013.24
[12] T. Isenberg. Position paper: Touch interaction in scientific visualization. In Proc. Workshop on Data Exploration on Interactive Surfaces (DEXIS), pp. 24–27. Inria, France, 2011.
[13] B. Jackson, D. Schroeder, and D. F. Keefe. Nailing down multi-touch: Anchored above the surface interaction for 3D modeling and navigation. In Proc. GI, pp. 181–184. CIPS, Toronto, 2012.
[14] C. Johnson. Top scientific visualization research problems. IEEE Computer Graphics and Applications, 24(4):13–17, July/Aug. 2004. doi> 10.1109/MCG.2004.20
[15] C. Johnson, R. Moorhead, T. Munzner, H. Pfister, P. Rheingans, and T. S. Yoo. NIH/NSF Visualization Research Challenges Report. IEEE Press, Los Alamitos, 2006.
[16] D. F. Keefe. Integrating visualization and interaction research to improve scientific workflows. IEEE Computer Graphics and Applications, 30(2):8–13, 2010. doi> 10.1109/MCG.2010.30
[17] D. F. Keefe and T. Isenberg. Re-imagining the interaction paradigm for scientific visualization. IEEE Computer, 46(5):51–57, 2013. doi> 10.1109/MC.2013.178
[18] K. Kin, M. Agrawala, and T. DeRose. Determining the benefits of direct-touch, bimanual, and multifinger input on a multitouch workstation. In Proc. GI, pp. 119–124. CIPS, Toronto, 2009.
[19] T. Klein, F. Guéniat, L. Pastur, F. Vernier, and T. Isenberg. A design study of direct-touch interaction for exploratory 3D scientific visualization. Computer Graphics Forum, 31(3):1225–1234, 2012. doi> 10.1111/j.1467-8659.2012.03115.x
[20] A.-C. Knopf and A. Lomax. In vivo proton range verification: A review. Physics in Medicine and Biology, 58(15):R131–R160, Aug. 2013. doi> 10.1088/0031-9155/58/15/R131
[21] J. Liebl, H. Paganetti, M. Zhu, and B. A. Winey. The influence of patient positioning uncertainties in proton radiotherapy on proton range and dose distributions. Medical Physics, 41(9):091711:1–12, Sept. 2014. doi> 10.1118/1.4892601
[22] A. Lomax. Intensity modulation methods for proton radiotherapy. Physics in Medicine and Biology, 44(1):185–206, Jan. 1999. doi> 10.1088/0031-9155/44/1/014
[23] A. J. Lomax, T. Boehringer, A. Coray, E. Egger, G. Goitein, M. Grossmann, P. Juelke, S. Lin, E. Pedroni, B. Rohrer, W. Roser, B. Rossi, B. Siegenthaler, O. Stadelmann, H. Stauble, C. Vetter, and L. Wisser. Intensity modulated proton therapy: A clinical example. Medical Physics, 28(3):317–324, Mar. 2001. doi> 10.1118/1.1350587
[24] A. J. Lomax, M. Goitein, and J. Adams. Intensity modulation in radiotherapy: Photons versus protons in the paranasal sinus. Radiotherapy and Oncology, 66(1):11–18, Jan. 2003. doi> 10.1016/S0167-8140(02)00308-0
[25] C. Lundström, T. Rydell, C. Forsell, A. Persson, and A. Ynnerman. Multi-touch table system for medical visualization: Application to orthopedic surgery planning. IEEE Transactions on Visualization and Computer Graphics, 17(12):1775–1784, 2011. doi> 10.1109/TVCG.2011.224
[26] A. Martinet, G. Casiez, and L. Grisoni. 3D positioning techniques for multi-touch displays. In Proc. VRST, pp. 227–228. ACM, New York, 2009. doi> 10.1145/1643928.1643978
[27] H. Paganetti. Range uncertainties in proton therapy and the role of Monte Carlo simulations. Physics in Medicine and Biology, 57(11):R99–R117, June 2012. doi> 10.1088/0031-9155/57/11/R99
[28] J. L. Reisman, P. L. Davidson, and J. Y. Han. A screen-space formulation for 2D and 3D direct manipulation. In Proc. UIST, pp. 69–78. ACM, New York, 2009. doi> 10.1145/1622176.1622190
[29] G. Robles-De-La-Torre. The importance of the sense of touch in virtual and real environments. IEEE MultiMedia, 13(3):24–30, 2006. doi> 10.1109/MMUL.2006.69
[30] P. Song, W. B. Goh, C.-W. Fu, Q. Meng, and P.-A. Heng. WYSIWYF: Exploring and annotating volume data with a tangible handheld device. In Proc. CHI, pp. 1333–1342. ACM, New York, 2011. doi> 10.1145/1978942.1979140
[31] A. Wilson. Simulating grasping behavior on an imaging interactive surface. In Proc. ITS, pp. 137–144. ACM, New York, 2009. doi> 10.1145/1731903.1731929
[32] L. Yu, K. Efstathiou, P. Isenberg, and T. Isenberg. Efficient structure-aware selection techniques for 3D point cloud visualizations with 2DOF input. IEEE Transactions on Visualization and Computer Graphics, 18(12):2245–2254, 2012. doi> 10.1109/TVCG.2012.217
[33] L. Yu, P. Svetachov, P. Isenberg, M. H. Everts, and T. Isenberg. FI3D: Direct-touch interaction for the exploration of 3D scientific visualization spaces. IEEE Transactions on Visualization and Computer Graphics, 16(6):1613–1622, 2010. doi> 10.1109/TVCG.2010.157
