Implementing MR-based Interaction Techniques for Manipulating 2D Visualizations in 3D Information Space

Ralf Dörner, Leif Oppermann, Christian Geiger
Hochschule Harz, Wernigerode, Germany
{rdoerner, loppermann, cgeiger}@hs-harz.de

Abstract
A 3D information space is a three-dimensional visualization that contains 2D visualizations and puts them in a semantic context. In this poster, we sketch how Mixed Reality (MR) can be exploited to realize better interaction in a 3D information space and, as a result, to develop new interactive visualization techniques. Here, the real space serves as a metaphor for interacting with the 3D information space.
1. Mixed Reality for visualization

Visualization enables us to analyze, interpret, and comprehend complex data sets. Mixed Reality (MR), as a novel methodology for creating imagery, has the potential to improve visualization techniques – especially since it allows integrating 2D and 3D imagery in our 3D world. Because 2D visualizations are more commonplace, people are used to working with them. In addition, one major drawback of 3D so far has been that users find it difficult to interact with virtual 3D visualizations – in contrast to real 3D visualizations (like scale models). In this context, suitable MR-based techniques for interacting with 3D visualizations could mitigate these problems by combining the strengths of visualization in a virtual space with the advantages of visualization in a real space.

MR methodologies have already been applied for visualization purposes. An approach developed in Graz computes the volume data for liver resection planning in real time and features virtual resection using physical props for slicing [1]. An AR widget framework for augmented interaction is presented in [2], including specific physical widgets: a magnifying widget implementing a magic-lens interaction technique, a cylinder widget for viewing life-size objects, and a cube widget that allows capturing distant objects and enhancing them with supplementary data on the cube's sides. None of the previous work, however, has designed an MR set-up and interaction techniques that are dedicated to the visualization of a 3D information space. Thus, for this task, we need to identify and support suitable interaction techniques both for general tasks (like navigate, save, and restore) and for visualization-specific tasks (like zoom, filter, and compare).
Figure 1: A reference bounce box set-up
2. Interaction framework

For the design of MR-based interaction techniques it is important to consider the set-up that serves as a framework for interaction. We want to represent the 3D information space's frame of reference in the real world. This makes the frame of reference more permanent, independent of the user's viewpoint, and haptically accessible. Our reference bounce box (RBB) is a five-sided cube of 60 cm side length whose top side is made of diffuse Plexiglas and contains a lamp, while the remaining sides are painted white (see Figure 1). We designed three interaction devices that we call board, paddle, and gyrom (see Figure 2). For the gyrom, we use a cordless gyroscopic mouse with an ergonomic handle and attach a cardboard with a marker printed on it. This gives us a powerful low-cost interaction device. We can use it both as a paddle (e.g. in order to specify a 2D plane in the RBB) and as a mouse (e.g. in order to interact with a 2D visualization) by just waving the gyrom. We can also use the gyrom buttons for different semantic purposes (e.g. to toggle between paddle and mouse mode).
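The paper does not spell out how a tracked device pose is mapped into the information space. The following minimal Python sketch shows one plausible mapping, assuming the marker tracker reports device positions in the RBB's own coordinate frame in centimeters; all identifiers are illustrative, not taken from the actual ARINT implementation.

```python
# Sketch: mapping a tracked device position to normalized coordinates in the
# reference bounce box (RBB). Assumes the tracker reports positions in the
# RBB's coordinate frame, in centimeters; all names are illustrative.

RBB_SIDE_CM = 60.0  # side length of the five-sided cube

def rbb_to_info_space(position_cm):
    """Map a position inside the RBB (0..60 cm per axis) to the
    normalized [0, 1]^3 frame of reference of the 3D information space."""
    return tuple(min(max(c / RBB_SIDE_CM, 0.0), 1.0) for c in position_cm)

# The gyrom's buttons toggle between its two interaction modes:
MODE_PADDLE = "paddle"  # device pose specifies a 2D plane in the RBB
MODE_MOUSE = "mouse"    # device motion drives a 2D cursor on a visualization

def toggle_mode(mode):
    """Switch the gyrom between paddle and mouse mode on a button press."""
    return MODE_MOUSE if mode == MODE_PADDLE else MODE_PADDLE
```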
3. Interaction techniques and implementation

Based on our interaction framework we implemented elementary MR-based techniques for interacting with a 3D information space. The locate interaction technique aims at specifying a location in the 3D information space. The user might know the exact coordinates of this location, or might have an idea where it lies in the reference system given by the 3D information space. To locate it, the user simply takes an interaction device and holds it at the corresponding position in the RBB. Visual feedback in the form of the current coordinates is given. Depending on the task, not only the position of the interaction device but also its orientation can be important. The 2D visualization at the specified location in the 3D information space is then shown on the plane associated with the interaction device.
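A minimal sketch of the locate technique follows, reusing rbb_to_info_space from the sketch above. How the stored 2D visualizations are organized is not described in the paper; the grid-indexed dictionary below is purely an assumption for illustration.

```python
# Sketch of the locate technique: the device position inside the RBB selects
# a location in the 3D information space, and the 2D visualization stored
# nearest to that location is returned for display on the device's plane.
# The data layout (a dict keyed by grid indices) is an assumption.

def locate(position_cm, visualizations, grid_steps=10):
    """Return the info-space coordinates (shown as visual feedback) and the
    2D visualization stored nearest to the device position."""
    x, y, z = rbb_to_info_space(position_cm)  # mapping from the sketch above
    key = tuple(round(c * (grid_steps - 1)) for c in (x, y, z))
    return (x, y, z), visualizations.get(key)
```

The coordinates are displayed as feedback in any case; visualizations.get(key) simply returns None when nothing is stored at the selected grid cell.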
Figure 2: Manipulate, compare, and locate interaction techniques with gyrom, board, and paddle
By having the user change the location dynamically we obtain the explore technique: the user moves the interaction device in the RBB and sees an animation, since the 2D visualizations associated with the locations the device passes through are shown sequentially. Using the paddle, the user is able to slice the 3D information space. A corresponding visualization is shown on the plane specified by the interaction device.
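The paper does not detail how the slice image is computed. The sketch below shows one straightforward way to sample a volumetric data set along the plane specified by the paddle, assuming the volume is a NumPy array indexed over normalized [0, 1]^3 coordinates; names and the nearest-neighbor sampling are illustrative choices, not the ARINT implementation.

```python
import numpy as np

# Sketch: sampling a volumetric data set along the plane given by the paddle
# pose. 'origin' is the plane center and 'u_axis'/'v_axis' are in-plane unit
# vectors, all expressed in the normalized [0, 1]^3 information space.
# Nearest-neighbor sampling keeps the example short; a real implementation
# would interpolate.

def slice_volume(volume, origin, u_axis, v_axis, res=128):
    """Build a res x res slice image by sampling 'volume' on the plane."""
    shape = np.array(volume.shape) - 1
    img = np.zeros((res, res), dtype=volume.dtype)
    for i in range(res):
        for j in range(res):
            # Point on the plane, offset from the center by up to +/- 0.5.
            p = origin + (i / (res - 1) - 0.5) * u_axis \
                       + (j / (res - 1) - 0.5) * v_axis
            idx = np.clip(np.round(p * shape).astype(int), 0, shape)
            img[i, j] = volume[tuple(idx)]
    return img
```

With the RBB's z-axis mapped to time, rotating the paddle about the x-axis tilts u_axis and v_axis so that successive slices cut through different points in time, as described next.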
Here, the orientation of the paddle is important. For example, if the z-axis of the RBB is mapped to time and the user rotates the interaction device around the x-axis, the user can see values of different points in time. This interaction technique is especially suited for volumetric data where each of the three dimensions represents spatial values.

Compare is a two-handed interaction technique. With two interaction devices the user is able to view two 2D visualizations simultaneously (e.g. to compare houses). For this, the board interaction device is particularly useful. The user can view the exact measurement of the difference between the locations of the two interaction devices. With the compare interaction technique the user is able to find trends in data or to identify differences.

We also provide interaction techniques that allow the user to manipulate the 2D visualization interactively. For this, the 2D visualization shown on the interaction device can be thought of as a 2D window located in 3D in which an interactive application is running, e.g. an application that allows annotating the 2D visualization (see Figure 2). The mouse or the gyrom is used to perform these interactions. A 3D mouse pointer is provided as visual feedback. For navigation through the 3D information space we use a center-of-workspace metaphor combined with a 3D zooming interface. The freeze technique allows the user to save a particularly interesting position and orientation in the 3D information space. The restore operation allows the user to match the pose of the device with the stored pose.

We implemented our MR interaction techniques using our ARINT system, an ARToolKit-based application with dedicated 2D interaction capabilities in 3D space [3]. We chose the following simple approach to realize pixel-precise picking on a planar surface. For every 2D window a plane is registered but not rendered in the application. This object allows the system to determine the exact position of a 2D mouse cursor, independent of the plane's position and orientation, using color picking. This is realized by coloring the plane's vertices with red and green and interpolating between these colors. The plane closest to the viewer is selected and the pixel color under the cursor is retrieved. The color code is used to specify the exact position of the cursor on the plane.
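The decoding step of this color-picking scheme can be made concrete with a short sketch. We assume an 8-bit RGB framebuffer readback under the cursor (e.g. obtained via glReadPixels); the function below is illustrative, not the literal ARINT code.

```python
# Sketch of decoding the color-picking readback used for pixel-precise
# cursor placement. The hidden plane is rendered with vertex colors so that
# the red channel varies linearly with the horizontal plane coordinate u and
# the green channel with the vertical coordinate v; the rasterizer's color
# interpolation then encodes (u, v) at every pixel, regardless of the
# plane's 3D pose. An 8-bit framebuffer readback is assumed here.

def decode_plane_position(pixel_rgb, window_w_px, window_h_px):
    """Decode the (r, g, b) readback under the cursor into the pixel
    position of the 2D mouse cursor within the window shown on the plane."""
    r, g, _ = pixel_rgb          # 8-bit channel values in 0..255
    u = r / 255.0                # horizontal plane coordinate in [0, 1]
    v = g / 255.0                # vertical plane coordinate in [0, 1]
    return (round(u * (window_w_px - 1)),
            round(v * (window_h_px - 1)))
```

Note that 8-bit channels quantize each axis to 256 steps; for 2D windows wider than 256 pixels, a higher-precision buffer (or an additional channel) would be needed to keep the picking truly pixel-precise.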
4. References

[1] B. Reitinger, A. Bornik, R. Beichel, G. Werkgartner, E. Sorantin. Tools for augmented reality based liver resection planning. SPIE Medical Imaging '04, San Diego, February 2004.
[2] L. Brown, H. Hua, C. Gao. A Widget Framework for Augmented Interaction in SCAPE. UIST '03, Vancouver, BC, Canada, 2003.
[3] C. Geiger, L. Oppermann, C. Reimann. 3D-Registered Interaction-Surfaces in Augmented Reality Space. 2nd IEEE Int. Augmented Reality Toolkit Workshop, Tokyo, Japan, October 2003.