Interactive Textures as Spatial User Interfaces in X3D

Y. Jung, S. Webel, M. Olbrich, T. Drevensek, T. Franke, M. Roth, D. Fellner

Fraunhofer IGD / TU Darmstadt, Darmstadt, Germany

Abstract

3D applications, e.g. in the context of visualization or interactive design review, can require complex user interaction to manipulate certain elements, a typical task that calls for standard user interface elements. However, there are still no generalized methods for selecting and manipulating objects in 3D scenes, and 3D GUI elements often fail to gain support for reasons of simplicity, leaving developers to replicate interactive elements themselves. Therefore, we present a set of nodes that introduce different kinds of 2D user interfaces to X3D. We define a base type for these user interfaces called "InteractiveTexture", which is a 2D texture node implementing slots for input forwarding. From this node we derive several user interface representations to enable complex user interaction suitable for both desktop and immersive environments.

CR Categories: H.5.1 [Information Interfaces and Presentation]: Multimedia Information Systems—User Interfaces; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism—Color, shading, shadowing, and texture

Keywords: Interaction, Texturing, 3D UI, Virtual Reality, X3D

1 Introduction

3D applications with high interactivity beyond simple navigation tasks often require elements to fine-tune or configure the application in a certain way. In 2D space, this problem is usually solved through standard UI elements such as buttons and sliders. Attempts to replicate this functionality in 3D, especially for immersive applications via specially designed 3D widgets, have so far not had any convincing impact towards a standard framework. In computer gaming this problem likewise persists to this day and forces developers to reinvent user interfaces with each new release. Although several widget prototype libraries exist for use in X3D applications, they are neither generally available nor do they provide a standardized interface and behavior (see e.g. http://accad.osu.edu/pgerstma/protolib/BuildingGamesInVRML.pdf). Hence, desktop VR applications mostly utilize so-called head-up displays (HUDs), which usually consist of 3D elements that appear two-dimensional, as they are rendered in such a way that they always stay parallel to the viewing plane in the user's field of view. Furthermore, this kind of GUI has the disadvantage that widgets are often created from scratch, while at the same time their creation

requires experience and skills in GUI design to provide intuitive and user-friendly interfaces that allow users to achieve their goals without having to learn new interaction methods. Therefore, we present a set of new node types that enable the usage of 2D interfaces in X3D applications. This has the benefit that users are already familiar with the (well-designed) interface elements from standard 2D applications, and that additional textual information describing the interaction task can easily be provided. By introducing a base node that defines interactive behavior in texture space, we derive several exemplary UIs: for self-defined interfaces created with a widget toolkit designer tool, for web browsing, and for X11 applications. Supported by the X3D framework [Web3DConsortium 2008], we are able to transfer these user interfaces from desktop 3D applications to highly immersive environments like the CAVE (see Figure 4) or large distributed environments like the so-called HEyeWall, as shown in Figure 6.

2 Related Work

The idea of bringing UI elements to 3D scenes is not new [Bowman et al. 2004], but even today there is ongoing research concerning standardization of the design of such interfaces. Even in 2D UI systems, where devices have been standardized for years, there is still no real standard, and many UI libraries such as GTK+ or Qt [Nokia 2010] exist. The VEWL library, which provides an interface metaphor for building window-based interfaces in a virtual environment by utilizing Qt, is presented in [Larimer and Bowman 2003]. Another integration of a 2D UI framework into a given 3D application was presented by [Topol 2000]. The authors make use of X11 and its benefit as a network client-server protocol with an integrated feedback channel for user input devices. The visual readout of the X window server is mapped into texture memory and is therefore available inside the 3D scene. For implementing user feedback, the mouse position is projected onto the rendered texture and press/release events are transferred back to the X11 server. [Prentice 2008] recently presented a library to integrate the Mozilla engine into OpenGL applications for using web pages as textures.

In contrast to standard devices like mouse and keyboard, devices for immersive interaction like wand and head tracker have no established standard like WIMP. Therefore, [Bowman et al. 2006] presented several 3D interaction techniques by considering domain-, task-, and display-specificity. Furthermore, the usage of VR devices is mostly not intuitive and needs to be learned for achieving certain interaction tasks beyond simple object manipulation, which often comes along with unnecessary cognitive overhead. A discussion of these problems, including an overview and taxonomy of existing 3D menus, can be found in [Dachselt and Hübner 2007]. Concepts for integrating VR interaction techniques in X3D were given in [Figueroa et al. 2005], where the authors present a set of nodes for handling 3D interaction devices as well as nodes for encapsulating 3D widget behavior. Likewise, in [Dachselt et al. 2002] a component-based architecture for the development of 3D UIs was proposed by introducing the "Behavior3D" schema for abstracting away X3D behavior. Immersive interaction concepts in the context of X3D in general were proposed by [Behr et al. 2004]. The authors show that the X3D Pointing Device Sensor concept [Web3DConsortium 2008] can be used in immersive multi-user environments by introducing the UserBody extension. The UserBody node is a ray-intersect or collision trigger working in conjunction with pointing sensor nodes similar to a mouse pointer, and allows integrating interactive elements into immersive and desktop VR applications in a unified manner.

3 Extending the X3D Texture Concept

Because existing 3D interaction techniques are often not specific enough to provide sufficient usability for many application types, in this section we present our approach of integrating widely used 2D UI frameworks directly into X3D. We propose an extension to the X3D texturing concept by introducing interactive textures. All X3D node types specifying a single 2D texture map are defined in the Texturing component and include the ImageTexture, PixelTexture, and MovieTexture. Whereas the first two texture types denote still images, the latter can display a video, but none of them provides any interactivity. In contrast, our proposed node set allows embedding standard 2D UIs, web content, or full applications into 3D scenes. These nodes are used as textures, so they can be bound directly to geometries. User interaction is designed with keyboard and mouse input in mind, but is open enough to also allow more advanced input devices by employing the UserBody node presented in [Behr et al. 2004]. Corresponding examples are briefly discussed in section 4.

InteractiveTexture : DynamicTexture {
  ...
  SFBool   [in,out] enabled          TRUE
  SFString [in,out] updateMode       "onInteraction"
  SFFloat  [in,out] maxFps           10
  SFVec2f  [in]     pointer
  SFBool   [in]     button
  SFString [in]     keyPress
  SFString [in]     keyRelease
  SFInt32  [in]     actionKeyPress
  SFInt32  [in]     actionKeyRelease
  SFBool   [in]     altKey
  SFBool   [in]     controlKey
  SFBool   [in]     shiftKey
}

The base type of all textures that handle 2D GUIs is called InteractiveTexture; it defines basic slots for routing interaction such as keyboard and mouse input to it. Input via mouse or other pointing devices is realized with two fields: pointer, the 2D position on the texture, which internally is translated into the UI's native space via a simple window-viewport transformation, and button, which transfers clicking or similar binary actions. If the SFBool button inSlot receives a TRUE event, a mouse-press event is processed internally (usually treated as the left mouse button); otherwise a release event is triggered. For example, by using a standard X3D TouchSensor node, the pointer and button slots can be fed with valid values from the sensor's hitTexCoord_changed and isActive eventOut slots. Keyboard input has been implemented according to the interface specification of the standard KeySensor node, which in essence mirrors all of its outSlots as inSlots for keyboard interaction. The field updateMode defines the update policy. It can have the following values: "onInteraction" handles mouse-move-like events (the default), "conservative" only regards click-like events and widget updates, and "always" can additionally handle time-based content, but soon gets slow with many textures in the scene due to the GPU upload. Here, maxFps determines the maximum update frequency.
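As an illustration, the following minimal sketch of our own (the quad geometry and UI file name are placeholders, not taken from the paper) routes a TouchSensor to an InteractiveTexture-derived UITexture exactly as described above:

Group {
  children [
    Shape {
      appearance Appearance {
        texture DEF ui UITexture { url "dialog.ui" }  # placeholder UI file
      }
      geometry IndexedFaceSet {                       # simple textured quad
        coord Coordinate { point [ -1 -1 0, 1 -1 0, 1 1 0, -1 1 0 ] }
        texCoord TextureCoordinate { point [ 0 0, 1 0, 1 1, 0 1 ] }
        coordIndex [ 0 1 2 3 -1 ]
      }
    }
    # The sensor affects its sibling geometry inside this Group.
    DEF touch TouchSensor { }
  ]
}
ROUTE touch.hitTexCoord_changed TO ui.pointer
ROUTE touch.isActive TO ui.button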

Figure 1: Interactive color chooser realized with the UITexture (left) and BrowserTexture on an interactive 3D iPhone model (right).

Our proposed node inheritance hierarchy is shown next. All proposed texture nodes derive from the abstract X3DTexture2DNode and have been integrated into the Instant Reality framework [IR 2010]. As can be seen, we first introduce an abstract node type DynamicTexture that is the base type of all time- and interaction-dependent textures. The PlaybackTexture can only be used to play back 2D content without direct interaction, whereas an InteractiveTexture can also serve as an interactive GUI.

X3DTexture2DNode
|__ DynamicTexture
    |__ PlaybackTexture
    |   |__ MovieTexture
    |__ InteractiveTexture
        |__ X11Texture
        |__ WidgetTexture
            |__ BrowserTexture
            |__ UITexture

Whereas the X11Texture, which is explained below, is a rather specialized but nevertheless very useful node, the platform-independent WidgetTexture is the base type for all textures that represent more or less powerful GUI elements (i.e. widgets, which can be further specialized, e.g. for displaying a PDF file). If show is TRUE, the dialog form is additionally shown in a separate window, which is not only useful for debugging purposes, but also for rapid prototyping of interactive 3D desktop applications (cf. section 4).

WidgetTexture : InteractiveTexture {
  ...
  SFBool   [in,out] show FALSE
  MFString [in,out] url  []
}

The proposed UITexture node derives from WidgetTexture and can be used to draw a user interface defined with the Qt Designer tool [Nokia 2010]. To this end, the name of the Qt Designer based UI file is set in the MFString field url. This texture node is very specialized, as it only accepts Qt-based dialog forms. Alternatively, other frameworks and designer tools could be used, such as Glade. Analogously to the Script node, a language protocol could then denote the toolkit (e.g. "qt:color.ui"), but the functionalities are not generic enough for a unified interface. In the following we exemplarily discuss the usage of the proposed interactive textures using the color chooser example shown in Figure 1. In the code fragment shown in Figure 2, the UITexture is applied onto a 3D Shape the usual way. Whereas "color.ui" denotes the pre-designed Qt UI file, the texture additionally supports dynamic fields (similar to a Script node) that denote certain signals and slots (the Qt concept for event propagation [Nokia 2010]) as defined in the UI file. In this example, the exposedField reddial references the Qt-specific QDial widget element for changing the amount of red in the material of the colored box behind the UI. Because the widget can be used as input and output (slot and signal) respectively, in this case we use an input/output field instead of an

in- or outSlot. Finally, when a value has changed, the current state of all QDial widgets is routed to a Script node that assembles an SFColor value, which in turn is routed to the diffuseColor field of the box's Material. If the types match, the last step can be skipped.

DEF GUI Shape {
  appearance Appearance {
    texture DEF colorUi UITexture {
      url "color.ui"
      exposedField SFInt32 reddial 0
      ...
    }
  }
  ...
}
DEF Box Shape {
  appearance Appearance {
    material DEF mat Material {}
    ...
  }
  ...
}
...
ROUTE colorUi.reddial TO script.red
ROUTE script.color TO mat.diffuseColor

Figure 2: Code snippet in VRML encoding showing the usage of a UITexture node for changing field values (cp. Figure 1, left).
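The assembling Script referenced above is not shown in the paper; a minimal sketch of our own could look as follows, assuming additional greendial and bluedial fields on the UITexture and QDials configured with a 0 to 255 range in the Designer file:

DEF script Script {
  eventIn  SFInt32 red
  eventIn  SFInt32 green
  eventIn  SFInt32 blue
  field    SFColor rgb 0 0 0
  eventOut SFColor color
  url "javascript:
    // Assumption: each QDial was configured with a 0..255 range in the UI file.
    function red(val)   { rgb.r = val / 255; color = rgb; }
    function green(val) { rgb.g = val / 255; color = rgb; }
    function blue(val)  { rgb.b = val / 255; color = rgb; }
  "
}
ROUTE colorUi.reddial TO script.red
ROUTE script.color TO mat.diffuseColor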

A BrowserTexture can be used to display web pages through a texture node. The fields updateMode, url, etc. were already explained previously, with the exception that here url either denotes the location of the web page to be displayed or directly contains embedded (D)HTML code. If a link was clicked on the web page, the link_changed eventOut is updated with the new URL. This is useful in combination with delegateLinkHandling: if this field is TRUE, the texture's location is updated and it displays the web page of the link that was clicked; otherwise, the current page is kept and only the clicked link is reported in the link_changed eventOut. The eventIn back triggers loading the previous document in the history list. Likewise, forward loads the next document in the list, reload reloads the current document, and stop stops loading the document. Similar to the X3D LoadSensor, the isLoaded event is sent when the page has finished loading. This is further refined by the progress field, which sends values between 0 and 1. Figure 1 (right) shows an example scenario.

BrowserTexture : WidgetTexture {
  ...
  SFBool   [in,out] delegateLinkHandling FALSE
  SFBool   [in]     back
  SFBool   [in]     forward
  SFBool   [in]     reload
  SFBool   [in]     stop
  SFString [out]    link_changed
  SFBool   [out]    isLoaded
  SFFloat  [out]    progress
}
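For instance, a sketch of delegated link handling (the Script body and node names are our own, not from the paper) might log clicked links instead of following them:

DEF web BrowserTexture {
  url "http://www.instantreality.org/"
  delegateLinkHandling FALSE   # keep the current page, only report clicks
}
DEF linkLogger Script {
  eventIn SFString link
  url "javascript:
    function link(val) { print('link clicked: ' + val); }
  "
}
ROUTE web.link_changed TO linkLogger.link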

The most generalized approach to displaying UI elements is to map an entire desktop environment onto a texture.

Figure 3: An XFCE desktop environment mapped on a curved surface (left) and multiple desktops mapped on simple planes (right).

Figure 4: UITexture used in an immersive VR CAVE environment.

The X11Texture can display an X11 screen with the help of Xvfb, a virtual framebuffer exporting the X11 display into an XWD image format. For this reason, this node is only available on Unix platforms. Since X11 allows running single applications without a desktop manager, the X11Texture can be used to run anything ranging from games to full-blown desktop solutions (see Figure 3). Most of the parameters of the X11Texture are related to Xvfb itself: xserver specifies the location of the Xvfb server application to be executed in an environment defined by shell, where display defines the X11 display to be used. Finally, command is the command or application launched inside the server (e.g. "xterm -e top").

X11Texture : InteractiveTexture {
  ...
  SFString []       xserver "/usr/bin/Xvfb"
  SFString []       shell   "/bin/bash"
  SFInt32  []       display 3
  SFString [in,out] command ""
}

4 Applications

We have integrated interactive texture nodes into various X3D applications for different purposes. The UITexture is ideal for changing runtime configuration parameters, e.g. to visualize mathematical functions or to fine-tune anything inside a simulation. If the user has no default interface, as for instance inside a CAVE, manipulation from inside the virtual environment is beneficial. Especially parameters with no geometrical representation in the scene, like clipping planes or colors, can often be easily represented as standard UI elements.

Figure 4 shows the configuration interface of a 3D painting application, which runs in a CAVE. The UI has a window-manager-like bar on its top, which allows the user to change its position and orientation. Due to the lack of standard devices like mouse and keyboard, a pen-style 6-DOF tracking device is used for interaction. The transformation of this input device, represented as a so-called UserBody node [IR 2010], is used to cast a ray, whose intersection with the UI allows mouse-like interaction by internally triggering a standard X3D TouchSensor.

In Figure 5, an interactive texture is used to realize the configuration menu of an augmented-reality-based training application for assembling LEGO models. A common approach in AR-based training applications is to use a HUD as visualization device. Therefore, a menu is needed that is suitable for interaction within a HUD setup. A UITexture mapped onto a 2D plane fulfills these requirements. Since the menu is implemented as a child of the virtual camera (an Instant Reality node extension to ease setting up a HUD [IR 2010]), it is not affected by camera manipulations (which happen continuously in AR applications), and thereby the menu always remains at the same position on the screen. The menu also includes different kinds of buttons and tooltips. Such a menu is easy to handle and to understand even for users who are not familiar with 3D user interfaces. The user can intuitively interact with the menu by clicking buttons and check boxes. Button tooltips show how the currently selected configuration will affect the visualization and workflow in the training application.

Figure 5: A UITexture used in an Augmented Reality application. The configuration menu is implemented using an interactive UITexture node (left). The AR application itself is shown on the right.

The X11Texture additionally allows including applications that were never planned to be part of a 3D scenario into VR and AR scenes. Figure 6 shows a virtual representation of an arcade machine which runs a real, playable game inside an X11Texture. Such applications can be directly controlled via sensors from inside the X3D scene, since e.g. the keyboard inputs are represented as simple SFString events (a small routing sketch follows at the end of this section). By utilizing the data stream sensor concept presented in [Behr et al. 2004], events from devices like a joystick or a SpaceMouse are propagated to the X3D application.

Figure 6: 3D arcade machine model, which runs a real game inside an X11Texture node displayed in a cluster window environment.

Another interesting application is the use of multi-display installations to render simple desktop applications. Such installations, like the HEyeWall, consist of multiple clustered computers that are used as render servers within a distributed environment. With the X11Texture, applications like slideshows or remote desktop clients like rdesktop can be directly used on those screen clusters. Figure 6 shows an example, where the distribution of the 3D application is done on the X3D level following the approach of [Behr et al. 2004].
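Since keyboard input arrives as plain SFString events, forwarding desktop keystrokes to such an embedded game reduces to two routes. A minimal sketch, assuming the game is started via the hypothetical command below:

DEF arcade X11Texture {
  command "some-arcade-game"   # hypothetical game executable
}
DEF keys KeySensor { }
# The KeySensor's outSlots mirror the texture's inSlots (cf. section 3).
ROUTE keys.keyPress   TO arcade.keyPress
ROUTE keys.keyRelease TO arcade.keyRelease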

5 Conclusions

3D interaction is intuitive for accessing values that have a three-dimensional representation, like a position or orientation in space. 2D GUIs can access these parameters only in an abstract form, like a stack of sliders. Other value types like colors or scalars can be modified with proven and well-known 2D interfaces. Most users recognize these standard elements directly and are able to use them without further instruction, whereas the meaning and usage of corresponding 3D GUIs is often not intuitive to the user. More successful attempts are often replications of well-known 2D elements, but this approach forces the author to recreate their known behavior.

The presented UITexture simplifies this task via fast interface creation and easy integration with the X3D scene, and moreover scales from standard desktop 3D applications to fully immersive applications. The BrowserTexture allows authors to integrate and navigate through web information inside an X3D scene. Like the previously mentioned 2D GUI elements, web browsing is a task known to most users and can be used to display additional content, or even to control or modify the scene itself. Furthermore, the presented X11Texture can also be used as an interface between user and scene. The more interesting application is the integration of applications from the 2D world into a 3D scenario. The range of possible applications to run inside this node is almost unrestricted and includes nearly everything from simple tools and games to remote desktop clients or even full desktop environments.

Currently, the pointer position is not modified by texture transforms; handling this is left for future work. We would also like to further investigate whether the discussed protocol for specifying the respective 2D UI library can be generalized. Finally, as the X11Texture is only available on Unix platforms, it could be generalized to other operating systems if a similar protocol is available.

References

Behr, J., Dähne, P., and Roth, M. 2004. Utilizing X3D for immersive environments. In Web3D '04: Proc. of the ninth int. conf. on 3D Web technology, ACM Press, NY, USA, 71–78.

Bowman, D. A., Kruijff, E., LaViola, J. J., and Poupyrev, I. 2004. 3D User Interfaces: Theory and Practice. Addison-Wesley/Pearson Education.

Bowman, D. A., Chen, J., Wingrave, C. A., Lucas, J. F., Ray, A., Polys, N. F., Li, Q., Haciahmetoglu, Y., Kim, J.-S., Kim, S., Boehringer, R., and Ni, T. 2006. New directions in 3D user interfaces. IJVR 5, 2, 3–14.

Dachselt, R., and Hübner, A. 2007. Virtual environments: Three-dimensional menus: A survey and taxonomy. Comput. Graph. 31, 1, 53–65.

Dachselt, R., Hinz, M., and Meissner, K. 2002. Contigra: An XML-based architecture for component-oriented 3D applications. In Web3D '02: Proc. of the 7th int. conference on 3D Web technology, ACM, New York, USA, 155–163.

Figueroa, P., Medina, O., Jiménez, R., Martínez, J., and Albarracín, C. 2005. Extensions for interactivity and retargeting in X3D. In Web3D '05: Proc. of the 10th int. conference on 3D Web technology, ACM, New York, USA, 103–110.

IR, 2010. Instant Reality. http://www.instantreality.org/.

Larimer, D., and Bowman, D. A. 2003. VEWL: A framework for building a windowing interface in a virtual environment. In Proc. of IFIP TC13 Int. Conference on Human-Computer Interaction (Interact 2003, Zürich), IOS Press, 809–812.

Nokia, 2010. Qt – cross-platform application and UI framework. http://qt.nokia.com/.

Prentice, C., 2008. LLMozLib. http://www.ubrowser.com/.

Topol, A. 2000. Immersion of XWindow applications into a 3D workbench. In CHI '00: Conf. Hum. Fact. in Comp. Sys., 355–356.

Web3DConsortium, 2008. X3D. http://web3d.org/x3d/.
