A Proposed Hardware-Software Architecture for Virtual Reality in Industrial Applications

F. Chionna1, P. Cirillo1, V. Palmieri1, and M. Bellone2⋆

1 Consorzio CETMA, c/o Cittadella della Ricerca S.S. 7 - Km 706+030, I-72100 Brindisi, Italy
2 Department of Applied Mechanics, Chalmers University of Technology, SE-41296 Göteborg, Sweden

Abstract. Increasing the level of interaction with CAD engines constitutes a challenging problem for industries, such as aerospace and automotive, which require a high degree of detail inspection during the design process. A high level of interaction can be provided using immersive virtual environments. However, the use of virtual reality for industrial applications introduces a number of problems, since interaction requires tracking systems as well as the design of user-friendly 3D interfaces. Our work aims at developing a cost-effective hardware/software VR platform which increases the level of interaction while interfacing with the most common CAD engines. On the one hand, the realization of such a complex platform requires high-performance devices; on the other hand, industrial applications require reliable and cost-effective solutions. In this concern, the presented platform features a novel solution for accurate hand tracking which combines a depth camera and the Wii Remote.

Keywords: Computer Graphics, Virtual Reality, Software Engineering, Computer Aided Design

1 Introduction

Technologies in virtual reality and augmented reality keep growing, breaking new boundaries in the design review process. From old 2D models to modern 3D computer graphics, developers continuously look for new, and more communicative, ways to creatively express ideas. The natural consequence, after 3D computer graphics, is the development of novel immersive systems which bring users into the ideas, using virtual environments. In this concern, researchers have been investigating VR and computer graphics for years [1]. Since their advent, VR and immersive systems have been applied in several fields such as medicine [2], entertainment [3], design review [4], [5] and more. Specifically, with design review researchers generally refer to the specific process of digital examination

⋆ corresponding author, [email protected]

and inspection of products, mechanical parts or even construction projects, in order to take specific decisions before their physical realization. Studying past research in this field, it is evident that development has brought technologies from simple 3D visualization systems toward immersive VR systems with which the user can interact, and design review has emerged as a potential application attracting economic and industrial interest. Focusing on the latter, in [6] the authors explored the problem of handling CAD models in virtual environments, whereas in [7] the impact that immersive VR technologies can have on the visualization of a design review scenario was studied. More specifically, the authors investigated the construction of a disabled bathroom in a block of flats in an immersive environment, helping in the consideration of unexpected events during the design process. In support of recent developments of VR in the design review process for industrial applications, in [8] the authors describe a method for virtual prototype inspection through an immersive virtual environment. However, the method addressed in [8] requires the user to wear cyber gloves for hand localization and tracking. Different techniques exist to create immersive VR environments [9], classifiable by the type and number of screens. The single-surface display solution is widely used for its cost-effectiveness. In [10] the authors investigated the possibility of using wide-screen stereoscopic displays versus head-mounted devices, whereas the multiple-screen solution is mostly used to provide a group of users with an immersive sensation, such as in [11], in which the authors studied interaction with graphical menus in virtual environments using a multi-screen solution.
In spite of the great steps forward that immersive technologies have made in recent years, problems and open issues remain countless, resulting in a large number of research activities involving universities worldwide. Among others, the development of simple and user-friendly 3D interactive user interfaces, referred to as 3dUIs, constitutes a crucial line of development. A recent work copes with this problem using a smart-phone-based menu system interface for immersive virtual environments [12]. Although this research represents a step forward in the use of interaction interfaces in immersive reality, the use of a mobile device may result in a poor immersive sensation for the user due to possible distractions when the user looks at the mobile screen instead of the projectors' screen. Moreover, during a design review process it often happens that different aspects of the same product are discussed. For instance, in the automotive field, the review process covers mechanical, style and ergonomic issues, and the comparison of different models and solutions requires running different instances of different software packages. In such situations, the users may need to switch between visualizations, to discuss different issues, or even to interactively modify 3D models [13]. In light of all the cited criticisms, it is evident that new contributions and further research are required in order to increase the level of immersive sensation and usability for the user. As a contribution, our proposal includes a system for CAD visualization in an immersive environment in which

a movable 3dUI interface is projected onto the screen and can be simply controlled using a tracking system and a simple remote controller. Our software, referred to as Dune, aims to simplify the design review process through an immersive virtual reality system based on the combination of open-source platforms and cost-effective tracking devices. Specifically, the Dune platform features a module which allows loading common 3D CAD models such as 'step', 'iges' or 'stl' files, making it possible to present, interact with and manipulate them in the immersive environment through user movements. In summary, the main strengths of the proposed platform are:
– the interaction between user and virtual reality engine is mostly done using cost-effective tracking devices;
– a software architecture that is simple to maintain and expand, thanks to its modularity;
– the implementation of user-friendly 3dUIs.
The rest of the paper is organized as follows. Section 2 discusses the system overview, including a general hardware description. The software architecture is detailed in Section 3, in which its single modules are thoroughly exposed, including our custom solutions to handle tracking devices. As a further strength of our software architecture, Section 4 introduces the 3D user interfaces specifically conceived for this application. A discussion of users' opinions is given in Section 5. Finally, Section 6 draws some conclusions.


Fig. 1. Virtual Reality Center screens in stereographic rendering during an immersive session (a), and a general explanation of the master-slave configuration in the Virtual Reality Center (b).

2 System Overview

The aim of our custom platform is to create an immersive design environment that answers designers' requirements. It is evident that an immersive system requires the combination of hardware and software components. The

main considerations about software concern reliability, modularity, maintainability and expandability, but also the implementation of a user-friendly immersive interface, while preserving the cost-effectiveness of hardware components. In the remainder of this section, a general overview of the hardware of our system is given. Particular focus has been placed on the visualization architecture and tracking devices, since they are key to providing a high level of user interactivity in the virtual environment.

2.1 Visualization architecture

Our virtual reality center (VRC) features a projection room of about 144 square meters and a server room of about 25 square meters. The VRC is arranged in cinema mode with soundproof walls and an audience of 34 seats arranged in 3 steps. Figure 1(a) shows our reality center screens during a design review session. The display system is the BARCO MOVe system and features 3 movable screens. Each screen has a surface area of 3.3 m × 2.4 m, for a total of 9.6 m in length and 2.40 m in height; the side screens can be rotated from a CADWALL configuration to a CAVE configuration, passing through the intermediate configurations 0.0 deg, 22.5 deg, 45.0 deg, 90.0 deg. The choice of a multiple display is motivated by the necessity of increasing the user's mobility in space, and this solution offers a high level of immersion [14]. The screens' surfaces are made of a translucent material suitable for projection and stereo rendering. Moreover, a rear projection system has been employed in order to maximize user mobility and the immersive experience, avoiding any possible shadow in the scene. The set of described features, as much as the use of a 3D stereo projection system, allows users to fully enjoy different scenarios and simulations in immersive environments. Stereographic rendering requires two images to be generated and displayed at the same time, one computed for the right eye and one for the left. Our solution uses high refresh rate DLP projectors featuring up to 120 Hz. Using the inherent speed of DLP technology, the Optoma EX785 can output video and images at 120 Hz, allowing us to show full-screen, full-color, stereoscopic 3D scenes. The 3D effect is generated by splitting the signal into two standard video streams, one for each eye. As the visualization system requires high graphics performance, our computing system features 4 high-end workstations with Nvidia Quadro FX 4500 GPUs.
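The per-eye image generation described above amounts to rendering the scene from two camera positions offset along the camera's right axis by half the inter-pupillary distance (IPD). A minimal sketch follows; the vector type and the default IPD value are illustrative assumptions, not Dune's actual code, which relies on its rendering libraries.

```cpp
#include <cmath>

// Minimal 3-vector for camera math (a real implementation would use the
// vector type of the rendering library).
struct Vec3 {
    double x, y, z;
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator*(double s) const { return {x * s, y * s, z * s}; }
};

// Cross product, used to derive the camera's right vector from forward/up.
inline Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}

inline Vec3 normalize(const Vec3& v) {
    double n = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / n, v.y / n, v.z / n};
}

struct StereoPair { Vec3 left, right; };

// Given the mono camera position and its forward/up directions, return the
// left/right eye positions separated by the IPD (0.065 m is a typical value).
StereoPair stereoEyes(const Vec3& eye, const Vec3& forward, const Vec3& up,
                      double ipd = 0.065) {
    Vec3 right = normalize(cross(forward, up));
    return {eye - right * (ipd / 2.0), eye + right * (ipd / 2.0)};
}
```

Each of the two resulting camera positions is rendered into its own video stream, matching the signal-splitting scheme described above.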
The workstations are interconnected in a master/slave architecture, detailed in Figure 1(b). Specifically, our architecture uses a single workstation as a master server and 3 slaves, one for each screen. The master has the task of visualization synchronization and sensor data acquisition, whereas each slave is in charge of rendering a single portion of the scene. The connection between master and slaves uses a client-server structure.
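The master/slave forwarding just described can be sketched as follows. This is a hypothetical in-memory model, not Dune's real networking code: the Ethernet transport is replaced by plain callbacks, and the class and field names are assumptions.

```cpp
#include <functional>
#include <string>
#include <utility>
#include <vector>

// A message flowing from the master to the per-screen slaves, e.g. a camera
// update that keeps all three renderers synchronized.
struct Message {
    std::string topic;   // e.g. "camera", "tracker", "scene"
    std::string payload; // serialized content
};

class MasterServer {
public:
    using SlaveHandler = std::function<void(const Message&)>;

    // Register one handler per slave workstation (one per screen).
    void registerSlave(SlaveHandler h) { slaves_.push_back(std::move(h)); }

    // Forward an incoming message to every registered slave.
    void dispatch(const Message& m) {
        for (auto& s : slaves_) s(m);
    }

private:
    std::vector<SlaveHandler> slaves_;
};
```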

2.2 Trackers and input devices

The difference between the implementation of a simple immersive environment and a design review immersive environment lies in the increased level of interactivity required by the latter. Past immersive design review software could only visualize 3D parts, so only a low level of interactive functions was implemented: generally just basic functions for rotating and moving parts. Our challenge concerned the creation of a virtual environment dedicated to the visualization of CAD objects with which the user can interact in real time using his or her own movements. In this concern, tracking systems provide an effective way to interpret users' movements in the real world and convert them into actions in the virtual world. Nowadays, numerous technologies can achieve movement recognition, from visual skeleton tracking to electromagnetic motion tracking. The former provides high mobility and freedom for the user using visual information, whereas electromagnetic sensors are much more accurate in position estimation but may reduce user mobility since they are generally wearable devices. Bearing in mind the above considerations, Dune uses two different approaches to track the user's position: a Kinect device for visual skeletal information and a FASTRAK for accurate head and hand position. The Kinect sensor is well known and common in the computer graphics community. From the RGB camera and depth information, the software is able to calculate a skeleton model of the user with the corresponding nodal points, including their position with respect to the camera reference frame. The FASTRAK tracks the position (x, y, and z Cartesian coordinates) and orientation (roll ρ, pitch θ and yaw φ) of a small sensor as it moves in space. However, noise, vibration and the resulting jitter are sometimes unavoidable in tracking applications.
A numerical filter, the 1 € filter [15], has been applied to all joint positions in order to minimize jitter and lag. The 1 € filter is an adaptive first-order low-pass filter: it adapts the cutoff frequency of the low-pass filter for each new sample according to an estimate of the signal speed. Input devices aim to simplify the activation of functionalities by designers. Gamepads or space navigators help designers navigate the scene and control functionalities. However, if the user wants to be free to move, using one hand to activate functionalities and control objects, the use of a wireless remote control becomes mandatory. Our choice is the Nintendo Wii Remote device. The Wii Remote includes a set of buttons for the basic functions, an IR sensor for tracking up to four IR sources with a refresh rate of 100 Hz, and a three-axis linear accelerometer that provides motion sensing capabilities. Dune uses the Wii Remote capabilities as a pointer, for wrist motion recognition and for functionality activation. In its original form, the Wii Remote is used as a pointer in combination with an IR bar. While the internal sensors, such as the accelerometer and gyroscope, only allow obtaining the wrist orientation with reasonable accuracy, the use of an IR bar is required to obtain wrist translation data [16]. However, in an immersive environment the use of an IR LED bar may not be effective due to the screen size and position. As a matter of fact, the

Fig. 2. Dune detailed software architecture and its modules. Please refer to Section 3 for further details.

IR sensor in the Wii Remote must be able to see the IR LED bar, and this can be a problem when pointing toward a large screen. Interaction in VR applications using Wii Remote accelerometer data ranges from basic shake triggering, to tilt-and-balance control, to simple gesture recognition. Furthermore, the remote control is also used as a pointer. Using the Wii Remote, the Dune.Review immersive environment implements a large variety of functionalities, including mapping, selecting objects, moving objects in the VE while the user moves in the real world, changing and resetting the camera view, and much more.
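The 1 € filter applied above to the joint positions is an exponential low-pass whose cutoff frequency grows with the estimated signal speed, trading lag for jitter reduction adaptively. A minimal single-axis sketch follows; it mirrors the structure of the filter in [15], but the default parameter values are illustrative, not the ones tuned in Dune.

```cpp
#include <cmath>

// One-axis 1 € filter (Casiez et al. [15]): a first-order low-pass whose
// cutoff adapts per sample to the estimated signal speed.
class OneEuroFilter {
public:
    OneEuroFilter(double rate, double minCutoff = 1.0, double beta = 0.007,
                  double dCutoff = 1.0)
        : rate_(rate), minCutoff_(minCutoff), beta_(beta), dCutoff_(dCutoff) {}

    double filter(double x) {
        if (first_) {                 // initialize state on the first sample
            first_ = false;
            prevX_ = x;
            prevDx_ = 0.0;
            return x;
        }
        // Estimate the signal speed and low-pass it with a fixed cutoff.
        double dx = (x - prevX_) * rate_;
        prevDx_ = lowpass(dx, alpha(dCutoff_), prevDx_);
        // Speed-adaptive cutoff: fast motion -> higher cutoff -> less lag;
        // slow motion -> lower cutoff -> less jitter.
        double cutoff = minCutoff_ + beta_ * std::fabs(prevDx_);
        prevX_ = lowpass(x, alpha(cutoff), prevX_);
        return prevX_;
    }

private:
    // Smoothing factor for a given cutoff frequency at the sampling rate.
    double alpha(double cutoff) const {
        const double kPi = 3.14159265358979323846;
        double tau = 1.0 / (2.0 * kPi * cutoff);
        return 1.0 / (1.0 + tau * rate_);
    }
    static double lowpass(double x, double a, double prev) {
        return a * x + (1.0 - a) * prev;
    }

    double rate_, minCutoff_, beta_, dCutoff_;
    double prevX_ = 0.0, prevDx_ = 0.0;
    bool first_ = true;
};
```

In practice one instance is kept per coordinate of each tracked joint, fed at the sensor rate (e.g. 100 Hz for the Wii Remote).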

3 Software architecture

The complete Dune software architecture, schematically depicted in Figure 2, is fully coded in C++ and is based on popular open-source libraries in the 3D community, such as OpenSceneGraph, Delta3D and OpenCascade. At the current stage, our software is composed of several modules: Dune core, components, engines, IO and visualization. Dune core: the core is in charge of message parsing (among master and slaves), scene handling (light, virtual camera and viewpoint) and communication between modules. As mentioned in Section 2, the master workstation has the following tasks:
– system configuration (visualization, I/O devices, slaves, scene);
– I/O data and event handling;
– data communication;

Fig. 3. Component diagram showing the tracking policy and communication with 3dUIs using virtual device interfaces.

– synchronization among clients;
– editing functionalities of the scene (among which animations and other simulation features).
The communication architecture is based on the Delta3D engine, which foresees that any module sends a message to the master, which forwards the message to the appropriate module or device. Components and actors: as derived from Delta3D, Dune objects in the immersive environment can be distinguished as components and actors. A component is a module handling 3D objects that must always be in the scene, such as lights or the virtual camera, whereas all other objects dynamically loaded and projected into the scene are referred to as actors. Dune can handle several types of actors, such as CAD models or UIs. As an actor can be any object, it becomes simple for programmers to implement a new module which simply adds a new actor to the scene, providing a high level of flexibility to the software. Moreover, this modularity allows handling any actor using different manipulation functionalities, e.g. a CAD model can be modified in its shape,


Fig. 4. Illustration of the interaction system using an immersive keyboard in stereo visualization. The user interacts with the virtual model (a) using the numeric keypad (b) as the input method.

whereas the UIs can be used to select a new action, change the viewpoint of the virtual camera or modify the lighting in the scene. In this way, the data logic functionalities are transferred to the actors. Using such an approach, it is possible to populate the virtual world with multiple game actors simultaneously and to interact with each actor independently, applying different functionalities. External libraries: the set of libraries at the base of Dune which allows loading any actor into the virtual environment, as well as communicating with any device. The libraries have been chosen according to the principles of effectiveness, adaptability, portability, scalability and reliability. IO: in order to manage different input and tracking devices, Dune includes a set of I/O drivers and interfaces. Dune has a modular and flexible software architecture: devices can be integrated easily, through plug-in modules that translate sensory data into specific messages inside the platform. Such messages are sent to the master using network protocols over a cabled connection. At the current stage, Dune integrates several devices: GamePad, Space Navigator (3DConnexion), Fastrak Stylus, Microsoft Kinect and the Nintendo Wii Remote controller. As a further strength of its modularity, the cited devices are integrated using open-source libraries such as OpenNI and WiiYourself. Visualization: this module is in charge of visualization handling in Dune. This part is the most important in providing an effective immersive sensation during design review sessions. The visualization on each screen is delegated to one of the slaves, which uses an Ethernet connection to synchronize with the master. Following the principles of flexibility and portability, the visualization module allows visualization on a single channel, on multiple channels, and even stereo visualization on portable devices (see Figure 2).
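The component/actor split described above can be sketched as a small class hierarchy: components are always present in the scene, while actors (CAD models, UIs) are loaded dynamically and carry their own manipulation logic. The class and method names below are illustrative assumptions, not Dune's real API.

```cpp
#include <memory>
#include <string>
#include <utility>
#include <vector>

// Common base for everything that can live in the scene graph.
class SceneObject {
public:
    virtual ~SceneObject() = default;
    virtual std::string name() const = 0;
};

class Component : public SceneObject {};  // always in the scene (lights, camera)
class Actor : public SceneObject {};      // dynamically loaded (CAD models, UIs)

// Example actor wrapping a loaded CAD file ('step', 'iges' or 'stl').
class CadModelActor : public Actor {
public:
    explicit CadModelActor(std::string file) : file_(std::move(file)) {}
    std::string name() const override { return "CAD:" + file_; }
private:
    std::string file_;
};

// The scene owns a flat list of objects; adding a new actor type requires no
// change here, which is the source of the flexibility discussed above.
class Scene {
public:
    void add(std::unique_ptr<SceneObject> obj) { objects_.push_back(std::move(obj)); }
    std::size_t size() const { return objects_.size(); }
private:
    std::vector<std::unique_ptr<SceneObject>> objects_;
};
```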
As one of the key requirements of our software is the maximization of immersion perception, the user has to interact with the virtual scene by moving


Fig. 5. Representation of our custom pie menu featuring different functions for the user. In (a) its 3D rendering is shown, with emphasis on one of the slices, marked in blue, and a selected pie element, labeled in green. The fast reverse button is located in the center. The pie menu projected into a virtual environment is shown in (b).

in the real world. In this concern, Figure 3 shows how tracking device data are handled in Dune.Review. For the sake of preserving modularity, all sensor APIs communicate with a so-called Virtual Device Interface, which provides data for the tracking module. This gives much strength to our software, allowing a new device to be added simply by including a new API module (see Figure 3). The tracking module is in charge of elaborating pose data (e.g. joint Cartesian coordinates and orientations) independently of which sensors they come from, forwarding them to the tracking interface. Lastly, pose data can be used to control scene components, or they can be projected by the game engine as a new actor (e.g. skeleton visualization). This functionality, achieved using the tracking sensor installed at the top of the center screen, provides a strong sensation of immersion, since the user can change his viewpoint by interacting with the virtual camera, e.g. the user can see both sides of an object by moving his head left or right.
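The Virtual Device Interface idea of Figure 3 can be sketched as follows: each tracker plug-in converts its native data into a common pose sample, so the tracking module never depends on a specific sensor. This is a hypothetical sketch; the type and field names are assumptions, not Dune's real interface.

```cpp
#include <cmath>
#include <memory>
#include <string>
#include <utility>
#include <vector>

// Sensor-agnostic pose sample produced by every device plug-in.
struct PoseSample {
    std::string joint;        // e.g. "head", "right_wrist"
    double x, y, z;           // position, metres
    double roll, pitch, yaw;  // orientation, radians
};

// One implementation per tracker API (Kinect, FASTRAK, Wii Remote, ...).
class VirtualDevice {
public:
    virtual ~VirtualDevice() = default;
    virtual std::vector<PoseSample> poll() = 0;  // latest samples from hardware
};

// Example stub standing in for a real plug-in (e.g. the Kinect API wrapper).
class StubKinectDevice : public VirtualDevice {
public:
    std::vector<PoseSample> poll() override {
        return {{"head", 0.0, 1.7, 2.0, 0.0, 0.0, 0.0}};
    }
};

// The tracking module aggregates whatever devices are plugged in and forwards
// the merged pose stream to the tracking interface.
class TrackerModule {
public:
    void addDevice(std::unique_ptr<VirtualDevice> d) { devices_.push_back(std::move(d)); }

    std::vector<PoseSample> pollAll() {
        std::vector<PoseSample> all;
        for (auto& d : devices_) {
            auto s = d->poll();
            all.insert(all.end(), s.begin(), s.end());
        }
        return all;
    }
private:
    std::vector<std::unique_ptr<VirtualDevice>> devices_;
};
```

Adding a new sensor then amounts to writing one more `VirtualDevice` subclass, with no change to the tracking module itself.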

4 3D interaction

One of the main issues for a deep immersive experience is the simplicity of selecting software functionalities. As designers are totally involved in the visualization system and require total freedom of movement, an accurately designed graphical user interface must be embedded inside the virtual world. The 3D immersive GUI should be intuitive, immediate and user-friendly [13]. In the proposed software for design review, two different immersive GUIs have been implemented, offering different levels of interaction. The former is an immersive keyboard, see Figure 4, whereas the latter is a pie menu, shown in Figure 5. Using immersive keyboards or numerical pads, designers are able to assign controlled values to geometric, physical or mechanical magnitudes. This feature is

Fig. 6. Examples of Dune functionalities during a design review session, in which the user regulates the shape of an airplane model (a), uses a torch function to navigate inside the virtual environment (b) and studies the car interior ergonomics (c).

important when designers want to assign specific values to objects (e.g. position, scale, colors, material). A pointing system helps the user interact with the immersive UIs through arm movements. In order to provide realism during interaction [17], the virtual pointer in the virtual world must accurately follow the user's hand movements. Such pointing is realized using the Wii Remote device: the user points his hand toward the screen and a virtual ray is shown in the pointing direction, as shown in Figure 4. In particular, specific algorithms for six degree-of-freedom (6DoF) wrist pose estimation through Inertial Measurement Unit (IMU) information have been used. Clearly, such a measurement must be performed in real time to ensure a proper immersive sensation. The wrist orientation is calculated using three methods: (i) simple Simpson integration with hard thresholding and weighted gravity update; (ii) gradient descent; (iii) Mahony's

Fig. 7. Example of a design review session concerning a new mall, where the user can navigate inside the virtual environment.

DCM (Direction Cosine Matrix). Methods (ii) and (iii) are based on Madgwick's implementation [18] with minor modifications. In order to use the Wii Remote as an orientation tracker for short-term and highly dynamic motions, our choice focused on Madgwick's gradient descent approach, since it shows higher accuracy. As in this situation it is not possible to use the IR bar (see Section 2 for more details), the Wii Remote can only provide information about wrist orientation through its inertial sensors. Hence, the problem of wrist position estimation with reference to the camera frame still remains. In this regard, a tracking system based on Kinect information has also been used. Our solution of combining Kinect and Wii Remote information was inspired by [19], in which an effective approach solving this problem has been thoroughly investigated. In this way, it is possible to obtain an effective pointing system, which is required to easily manipulate 3D objects through an immersive interface. Classical design review software has a large number of functionalities, generally accessible through 2D interfaces which include windows, menus or tool boxes. On the contrary, during an interactive session it is not possible to use common solutions. A pie menu was designed for this purpose, depicted in Figure 5. Such a 3D pie menu represents a starting point to define the guidelines for the implementation of richer 3dUIs. Specifically, our pie menu has two reversible faces, each one including four widgets and four handles. Each handle allows changing the visible widgets, which are bound to specific software actions, whereas the circle in the middle allows its fast reverse. The green color marks the selected pie elements, see Figure 5-

(c). Using this approach, in a single pie space it is possible to collect up to 32 software functionalities easily actionable by user interaction, e.g. translations, rotations, scaling or wireframe visualization of a specific object. Increasing the number of handles on a hypothetical concentric circle around the pie menu, it is possible to increase the available functionalities. The user interacts with the menu using the virtual ray: the user simply points the ray toward a specific section of the 3D menu and pushes a Wii Remote button to activate the functionality.
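Hit-testing such a pie menu reduces to polar coordinates: the virtual ray's intersection with the menu plane, expressed relative to the menu centre, selects either the central fast-reverse button or one of the slices. The sketch below assumes illustrative radii and a generic slice count; it is not Dune's actual selection code.

```cpp
#include <cmath>

constexpr double kTwoPi = 6.283185307179586;

// Map a 2D hit point (relative to the menu centre, in menu-plane units) to a
// selection: -1 for the central fast-reverse button, -2 for a miss outside
// the menu, otherwise a slice index in [0, n).
int pieHit(double x, double y, int n = 8,
           double innerRadius = 0.1, double outerRadius = 1.0) {
    double r = std::sqrt(x * x + y * y);
    if (r < innerRadius) return -1;      // fast-reverse button in the centre
    if (r > outerRadius) return -2;      // ray missed the menu
    double angle = std::atan2(y, x);     // (-pi, pi]
    if (angle < 0) angle += kTwoPi;      // wrap to [0, 2*pi)
    return (int)(angle / (kTwoPi / n));  // equal angular slices
}
```

On a button press, the returned index is looked up in the table of actions bound to the currently visible widgets.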

5 Discussion

Throughout development, Dune.Review has been intensively tested by developers in order to implement functionalities of increasing quality. Moreover, during our design review implementation two types of users have been involved: designers and inexperienced users3. During the tests, functionalities perceived as necessary by both groups have been implemented and tested. Specifically, while designers have pointed out the necessity of a high level of interaction, inexperienced users have focused on navigation and visualization issues. As a first example, Figure 6-(a) shows a designer using the pie menu during a design review session. The designer is using CAD functionalities to regulate the shape of the plane while at the same time looking at its real shape in an immersive environment. The pie menu shows the CAD functionalities and, specifically, the user has selected the action to modify a NURBS (Non-Uniform Rational B-Spline) through control points. It is worth mentioning that this functionality allows selecting a set of control points, or even a single point, and manipulating them by freehand drag and drop. Moreover, the user can quickly display the result of the manipulation as soon as the control points are moved. One should note that, for this specific functionality, the accuracy of the pointing system must be high enough to ensure simple manipulation. As a further functionality, in Figure 6-(b) an inexperienced user is looking inside the 3D model using a virtual torch function, which illuminates the portion of the model where the virtual ray is pointing. This function has been reported as really helpful by users in order to focus on specific areas of a 3D model. The designers find this functionality helpful to thoroughly inspect the 3D models in order to detect possible unexpected defects.
Lastly, Figure 6-(c) shows an example of a design review session in the CAVE configuration, in which the user is interacting with a sports car, studying its ergonomics in an immersive way. One of the important aspects of this analysis is the study of the car's blind spots. It is worth noting that, when the user navigates the virtual environment, the pie menu may be located in front of the virtual camera, occluding some parts of objects. Alternatively, it may get lost in a huge virtual environment, such as in Figure 7.

3 Research activity conducted as part of the Italian research program PONREC PROGIMM Cod. DM28904

To help solve these issues, the pie menu has two specific functions that allow moving it out of the viewpoint, but also bringing it in front of the user when necessary. This functionality can be activated by pressing a shortcut available on the pointing device.

6 Conclusion

In this paper, a novel platform for design review was presented, with focus on 3D immersive interfaces. Specifically, the proposed 3dUIs are designed to be user-friendly while, at the same time, offering immersive CAD functionalities. Extensive tests performed by both designers and inexperienced users revealed that the introduction of the pie menu into the virtual environment may enhance access to immersive CAD functionalities. The pie menu was found helpful by designers for the high-level manipulation actions it makes available, as well as by inexperienced users for its navigation capabilities. However, more research is required to improve usability for beginners, since the literature offers no cost-effective solution binding CAD functionalities to user-friendly interfaces in immersive environments. As a future development line, collaborative features will allow networked users to share the same CAD project in real time using an editing token policy. Further tracking or haptic devices could also be integrated to provide increasing levels of interaction thanks to our modular architecture. Finally, the introduction of FEM analysis data, such as stress analysis or fluid dynamics, could bring these technologies toward new boundaries of immersive design review.

Acknowledgments. The research activities of this paper are partly funded by the research program PONREC VIS4Factory, Grant Cod. PON02 00634 3551288.

References
1. Czernuszenko, M., Pape, D., Sandin, D., DeFanti, T., Dawe, G.L., Brown, M.D.: The ImmersaDesk and Infinity Wall projection-based virtual reality displays. SIGGRAPH Comput. Graph. 31(2) (May 1997) 46–49
2. Lim, D., Ibrahim, H., Ngah, U.: Development of a virtual reality system for medical application using OpenGL. In: Innovative Technologies in Intelligent Systems and Industrial Applications (CITISIA 2008), IEEE Conference on (July 2008) 44–48
3. Zyda, M.: From visual simulation to virtual reality to games. Computer 38(9) (Sept 2005) 25–32
4. Jayaram, S., Jayaram, U., Wang, Y., Tirumali, H., Lyons, K., Hart, P.: VADE: a virtual assembly design environment. IEEE Computer Graphics and Applications 19(6) (Nov 1999) 44–50
5. Hughes, C.E., Zhang, L., Schulze, J.P., Edelstein, E., Macagno, E.: CaveCAD: Architectural design in the CAVE. In: 3D User Interfaces (3DUI), 2013 IEEE Symposium on (March 2013) 193–194
6. Weidlich, D., Cser, L., Polzin, T., Cristiano, D., Zickner, H.: Virtual reality approaches for immersive design. CIRP Annals - Manufacturing Technology 56(1) (2007) 139–142
7. Bassanino, M., Wu, K.C., Yao, J., Khosrowshahi, F., Fernando, T., Skjrbk, J.: The impact of immersive virtual reality on visualisation for a design review in construction. In: Information Visualisation (IV), 2010 14th International Conference (July 2010) 585–589
8. Fillatreau, P., Fourquet, J.Y., Bolloch, R.L., Cailhol, S., Datas, A., Puel, B.: Using virtual reality and 3D industrial numerical models for immersive interactive checklists. Computers in Industry 64(9) (2013) 1253–1262. Special Issue: 3D Imaging in Industry
9. Fuchs, P., Moreau, G., Guitton, P.: Virtual Reality: Concepts and Technologies. 1st edn. CRC Press, Inc., Boca Raton, FL, USA (2011)
10. Naceri, A., Chellali, R., Dionnet, F., Toma, S.: Depth perception within virtual environments: A comparative study between wide screen stereoscopic displays and head mounted devices. In: Future Computing, Service Computation, Cognitive, Adaptive, Content, Patterns (COMPUTATIONWORLD '09) (Nov 2009) 460–466
11. Dang, N., Perrot, V., Mestre, D.: Effects of sensory feedback while interacting with graphical menus in virtual environments. In: IEEE Virtual Reality Conference, VR 2011, Singapore, 19-23 March 2011 (2011) 199–200
12. Gebhardt, S., Pick, S., Oster, T., Hentschel, B., Kuhlen, T.: An evaluation of a smart-phone-based menu system for immersive virtual environments. In: 3D User Interfaces (3DUI), 2014 IEEE Symposium on (March 2014) 31–34
13. A. Martini, L. Colizzi, F.C.F.A.M.B.P.C., Palmieri, V.: A novel 3D user interface for the immersive design review. In: IEEE Symposium on 3D User Interfaces 2015, ISBN: 978-1-4673-6886-5 (2015) 175–176
14. Peternier, A., Cardin, S., Vexo, F., Thalmann, D.: Practical design and implementation of a CAVE environment. In: Proceedings 2nd International Conference on Computer Graphics Theory (2007) 129–136
15. Casiez, G., Roussel, N., Vogel, D.: 1 € Filter: A simple speed-based low-pass filter for noisy input in interactive systems. In: CHI'12, the 30th Conference on Human Factors in Computing Systems, Austin, United States, ACM (May 2012) 2527–2530
16. Lee, J.: Hacking the Nintendo Wii Remote. IEEE Pervasive Computing 7(3) (July 2008) 39–45
17. ISO: Ergonomics of human-system interaction: Human-centred design for interactive systems. ISO 9241-210 (2010)
18. Madgwick, S.: An efficient orientation filter for inertial and inertial/magnetic sensor arrays. Report, x-io and University of Bristol (UK) (2010)
19. Hald, K.: Low-cost 3DUI using hand tracking and controllers. In: 3D User Interfaces (3DUI), 2013 IEEE Symposium on (March 2013) 205–206
