Collaborative Gaming in a Mobile Augmented Reality Environment

Reiner Wichert
Dept. of Mobile Information Visualization, Computer Graphics Center
Fraunhoferstr. 5, 64283 Darmstadt, Germany
[email protected]

Abstract
This paper describes my work on constructing a mobile collaborative Augmented Reality (AR) environment using web technologies. Based on an existing AR system, it provides a specific collaborative application in the area of gaming. To solve the problems associated with certain collaborative aspects, an augmented reality 3D game in the style of Tetris was developed for multiple users. A group of users wearing head-mounted displays in the same room, or a remote user connected via the Internet, can interact with the same game in real time, in an individualized way and from any place in the world. The Tetris clone is used to identify problems and possible solutions for further implementation, specifically in industrial applications of collaborative AR. In this work, these applications have been expanded to include multiple user interfaces for heterogeneous distributed environments. This combination of Augmented Reality, Computer Supported Cooperative Work (CSCW) and multi-user interaction brings with it several advantages over common virtual systems.

Keywords
Augmented Reality, CSCW, Mobile Computer Game, Information Visualization, Multi-modal Interface

1. INTRODUCTION
Augmented Reality has become an important part of computer graphics. There are many situations in which we would like to interact with virtual objects in the surrounding real world. Augmented Reality can make this possible by presenting a virtual world that enriches the real world [Feiner93]. In recent years, many applications have been developed in industry, service and commerce, but there are only a few applications in augmented reality gaming, even though there is significant potential in this area. Computer Supported Cooperative Work (CSCW) has emerged as an identifiable interdisciplinary research field that studies the role of computers in group work. It investigates the combination of the enabling technologies of computer networking, systems and end-user applications with the intrinsic nature of group work [Marcos98]. Wilson describes CSCW as a generic term which combines the understanding of the way people work in groups with the enabling technologies of computer networking and the associated hardware, software, services and techniques [Wilson91]. By combining CSCW with Augmented Reality, an exciting new form of collaboration - collaborative AR - became possible. There are a couple of different definitions of collaborative AR. A collaborative AR system is one in which augmentation of the real environment of one user occurs through the actions of other users and no longer

relies on information pre-stored by the computer [Renevier01]. Reitmayr and Schmalstieg see collaborative AR as a setting where co-located users can experience a shared space that is filled with both real and virtual objects [Reitmayr01]. Collaborative AR integrates a number of novel interface technologies such as Augmented Reality, collaborative computing, physical interfaces, spatial 3D user interfaces, and computer vision tracking and registration [Billinghurst00]. Most of the existing AR systems do not support collaborative AR. The base AR system used for the Tetris application is the situation-oriented and user-centered system of ARVIKA [Arvika] - a German research project for Augmented Reality technologies sponsored by the German Federal Ministry of Education and Research (BMBF). Its primary aim is to test Augmented Reality in development, production and servicing, supported by applications on high-end servers and low-end wearable computers. The required information might be text, pictures, video sequences, diagrams or spoken instructions and is visualized on different devices such as pen computers or with the help of head-mounted displays (HMDs) worn by skilled workers and technicians. This AR system relieves skilled workers of the planning and monitoring of production and provides support in installation work by directly delivering information on the status of the respective job.

It currently supports individual users exclusively. So far, it is not possible for several users to interact with augmented objects at the same time. Thus, another component is needed to support cooperative work in the future. Collaborative AR enables such common interaction and supports cooperation within the AR environment by way of communication mechanisms.

2. RELATED WORK
Collaborative AR has been a research topic in a few previous projects. The Shared Space interface of the Human Interface Technology Laboratory (HITLab) demonstrates how Augmented Reality can radically enhance face-to-face and remote collaboration. For face-to-face collaboration, this approach allows users to see each other and the real world at the same time as three-dimensional virtual images between them, supporting natural communication between users and intuitive manipulation of the virtual objects. For remote collaboration, their system allows life-sized, live virtual video images of remote users to be overlaid on the local real environment [HITLab-ATR]. The “Studierstube” of the Vienna University of Technology proposes an architecture for multi-user Augmented Reality with applications in visualization, presentation and education. The system simultaneously presents three-dimensional stereoscopic graphics to a group of users wearing light-weight, see-through head-mounted displays [Schmalstieg96]. The TELEPORTAL system seeks to support a group of users fully immersed and engaged with a 3D task in a high information bandwidth environment. It allows multiple local and remote collaborators to simultaneously interact with virtual and real objects and models [Teleportal]. The CAMELOT project aims at the development of an interactive, task-oriented cooperative environment based on Augmented Reality technology - the Virtual Round Table. The visualization of a synthetic scene within a real-world work area is realized by using an individually adapted, stereo, see-through projection for each user and the multi-user virtual reality toolkit SmallTool [Camelot].

3. GENERAL APPROACH
A collaborative AR system would give support and assistance to a group of users in their common work. Multiple users can be at the same place, at the same time, interacting together directly in the augmented world. Furthermore, group discussions are supported within the field of Augmented Reality. As a result, a common central view is established for each participant in order to reach a shared understanding of the problem and to meet the requirements for this specific problem. In another aspect of assistance and collaboration, an AR system has to support remote service with the help of Augmented Reality over large distances. Furthermore, with a mobile aspect, the system gives multiple users access to information any time, anywhere.

As of now, only a couple of applications have been realized in a computer-based gaming environment. Most of them are only 3D board games played in two dimensions. To test the more demanding collaborative requirements of an AR system, a game with a high level of data traffic between clients and the server should be used. A game was needed that offers 3D group interaction and individual views for multiple users cooperating at the same place or connected over the Internet to remote users. As a result of these requirements, a 3D Tetris clone was implemented. The game explores how wearable computers can be used to support collaboration between multiple users wearing see-through, head-mounted displays and remote users playing on traditional desktop interfaces. It simulates the collaboration of skilled production workers or the connection of technicians to a remote expert over the Internet. In addition, it clarifies how to distribute the functionality in a collaboration. Exploring this research topic within a game presents a significant challenge. The Tetris game offers a good opportunity to realize the requirements of a collaborative AR system, because it can be played at the same place or from a remote location. It is possible to log in dynamically and share the AR Tetris game with other participants. Thus, collaboration within one game is possible, as is playing AR Tetris in parallel games like common multiplayer Tetris variants.

To specify the requirements of this collaborative component and the architecture to be implemented, the scenarios that occur most frequently in the area of collaborative work in service and maintenance have been analyzed. From these requirements, two standard situations could be derived:

1. cooperative interaction at the same place with skilled workers having different views of the augmented world

2. direct interaction with a remote expert using Augmented Reality technologies like interactive video and having the same view as the skilled worker

After careful consideration of these scenarios, different questions arose:

- How can I interact with other users?
- How can I represent the results?
- How can private and common data be viewed (how can I get a private view)?
- How can multiple users, computers, applications, documents and places be integrated in an augmented environment?
- Which information from the other system is used?
- How does the functionality have to be distributed?
- How do the systems communicate with each other?
- How do mutual exclusions have to be implemented (competing interaction)?

In response to these questions and in order to create the cooperative component, a 3D augmented reality Tetris clone has been developed. It is described in the next section.

4. SYSTEM ARCHITECTURE
The web-based client-server architecture presented in this paper enables the visual overlaying of computer-generated virtual objects over real objects via Augmented Reality technology over the Internet using the HTTP protocol. The advantage is that all users have access through firewalls over large distances, and a remote user can very easily be connected to the AR system from any place in the world. The rendering and visualization component is realized by the AR Browser as a plug-in in the web browser on the client side. A video server captures the picture of the camera and provides it to the AR Browser. The tracking component can easily be exchanged via the device interface. The Context manager is responsible for holding runtime information needed by client- and server-side components. The components must be notified of a value change by way of an event mechanism. A push mechanism was conceptualized to notify the components immediately after an event occurs. With the event mechanism, both client-side and server-side components can be informed in real time and can direct the further sequence of events depending on the incoming notifications. The server-side Context manager includes profiling in order to inform other clients when information has changed (see Figure 1). Collaborative AR applications like the Tetris game interchange information within a shared profile.
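To make the push idea concrete, the following minimal sketch illustrates it in JavaScript, the language in which the application logic of the AR system is scripted. All names (ContextManager, addListener, setProperty) are illustrative assumptions; the actual ARVIKA interfaces are not published in this paper.

// Minimal sketch of the push-style notification described above.
// All names are illustrative assumptions, not the real ARVIKA interfaces.
class ContextManager {
  constructor() {
    this.profile = {};    // shared profile: property name -> current value
    this.listeners = {};  // property name -> callbacks registered for it
  }
  addListener(property, callback) {
    (this.listeners[property] = this.listeners[property] || []).push(callback);
  }
  setProperty(property, value) {
    this.profile[property] = value;
    // Push mechanism: notify every listener immediately instead of letting
    // components poll for changes.
    (this.listeners[property] || []).forEach(cb => cb(value));
  }
}

// Example: a game client reacting to a new piece published by the master player.
const context = new ContextManager();
context.addListener("nextPiece", piece => console.log("update preview:", piece));
context.setProperty("nextPiece", 3);

The point of the push style is that clients are called back as soon as a property changes rather than having to poll the shared profile.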

Figure 1: System Architecture (components shown: Internet browser with the AR Browser plug-in, game client application, device interface, tracking components (marker-based, Intersense, hybrid, IDEAL), web server, game server application, video server, collaboration components and a Context manager on both the client and the server side, all running on Windows and connected over the Internet)

4.1 AR Browser
The AR Browser is the visualization component of our AR system. It is a complete VR system with special augmented reality features, realized as an ActiveX component in Internet Explorer. Consequently, the AR Browser can be very easily integrated into any web-based application.

The augmented reality core functionality is implemented in the AR Browser. A thin interface layer provides the scripting interface and the HTTP access. It enables the system to be configured in a rich variety of ways. Large parts of the application logic can be put into simple JavaScript functions embedded in the web page in which the AR Browser is also included. From the AR Browser, it is possible to access the renderer, the tracking and the interaction interface. The augmented reality core can be used on several platforms [Müller].
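As a purely hypothetical illustration of this scripting approach, the fragment below sketches how page-embedded JavaScript might drive the plug-in. The element id and both method names are invented for the sketch; the paper only states that the thin interface layer exposes the renderer, tracking and interaction to script.

// Hypothetical handler called for each tracking update; id and method names
// are assumptions made for illustration only.
function onTrackingUpdate(markerId, pose) {
  const arBrowser = document.getElementById("arBrowser"); // <object> element embedding the ActiveX control (assumed id)
  arBrowser.setObjectTransform("tetrisBoard", pose);      // assumed method: attach the virtual board to the marker pose
  arBrowser.render();                                     // assumed method: redraw the augmented view
}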

4.2 Tracking
To keep the augmented scene permanently synchronized, the viewing direction of each user is tracked continuously and in real time. The graphical overlays are generated using the tracking software library of the Fraunhofer IGD. The computer-vision-based tracking is connected to the AR Browser, and a camera mounted on the HMD or on a laptop detects a marker by its square arrangement of edges and the matrix on it [IGD]. Markers consist of 16 black or white squares on a white background and contain a grid used as a binary encoding area, which is transferred as a pattern for identification. The markers are used to recover the camera position and orientation. Of the 65536 possible marker IDs, some identifiers cannot be used: the combinations which lead to a symmetric code must of course be excluded. The corners provide four 3D points, so 3D tracking is already possible with only one marker.
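To illustrate the exclusion of symmetric codes, the following sketch counts the marker codes whose orientation could not be recovered from the camera image. It is only one plausible reading of the rule stated above, not the actual ID scheme of the IGD tracking library:

// A 4x4 grid of black/white squares gives 2^16 = 65536 codes, but a code that
// looks identical after a 90, 180 or 270 degree turn leaves the marker
// orientation ambiguous. A real ID scheme may exclude further codes, e.g. keep
// only one representative per rotation class.
function rotate90(code) {
  // Return the 16-bit code of the 4x4 grid rotated 90 degrees clockwise.
  let rotated = 0;
  for (let r = 0; r < 4; r++) {
    for (let c = 0; c < 4; c++) {
      const srcBit = (code >> ((3 - c) * 4 + r)) & 1; // target cell (r,c) comes from source cell (3-c, r)
      rotated |= srcBit << (r * 4 + c);
    }
  }
  return rotated;
}

function isRotationallySymmetric(code) {
  const r90 = rotate90(code);
  const r180 = rotate90(r90);
  const r270 = rotate90(r180);
  return code === r90 || code === r180 || code === r270;
}

let excluded = 0;
for (let code = 0; code < 65536; code++) {
  if (isRotationallySymmetric(code)) excluded++;
}
console.log(excluded + " of 65536 codes are rotationally symmetric");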

4.3 Event Handling
Data management is done by the Context manager component. It communicates with almost every component of the AR system. The Context manager stores user-, device- and flow-dependent information in profiles and transmits to other components the data that is necessary for their functionality. The different components of the AR system supply data to the context management, while other components want to have access to this data. The Context manager is integrated into the web-based approach and exists on the client and the server side. The profile data are synchronized by a push mechanism. In the case of radio interruptions, the events can be redirected into queues. Components can be added as listeners to a specific property. After a property changes, a notification is sent directly to all listeners by the push mechanism. Thus, client- and server-side components and applications are informed. In cooperative applications like the Tetris game, a profile can be shared between several users at the same time. A session-spanning event mechanism across different tables is possible as well.

4.4 Hardware Requirements
The Augmented Reality system is built from off-the-shelf hardware. We use a powerful notebook with a 900 MHz processor and an Nvidia GeForce2Go video chip. With this equipment, we get overlaid pictures at 25 frames per second in real time. The operating system is Windows 2000. To get access to the server for collaboration, we connected a wireless LAN adapter to the notebook. The Sony Glasstron PLM-A35 or the Cy-visor DH4500VP eyeglasses offer an excellent augmented view with an SVGA (800×600) quality display. For marker-based tracking, we use a Philips PCVC740K ToUcam Pro with up to 60 frames per second. The interface for interacting with the 3D Tetris is an optically tracked pen.

4.5 Tetris Application
In June 1985, Alexey Pajitnov created Tetris on an Electronica 60 at the Moscow Academy of Sciences' Computer Center. It was ported to the IBM PC by Vadim Gerasimov. In the first game, the player had to move the 4-square pieces (tetraminoes) around the screen using cursor keys. Pajitnov then had the idea to make the tetraminoes fall into a glass [Gerasimov01]. Tetris was born. It is one of the few games that has achieved ultimate popularity. It has been ported to every computer and game console known to man and has sold millions of cartridges, tapes and disks across the land [Atari98]. Since its inception, hundreds of clones have been created. Nearly all of them are only two-dimensional, with interaction by keyboard.

Figure 2: GameBoy Tetris

In order to research collaborative AR aspects, a 3D AR Tetris game was implemented, with which the Context manager could be tested against the needs of multi-user cooperation. In this game, a user can directly interact with the pieces of a 3D virtual Tetris clone which overlays the real world. The advantage of this game is to be a part of the game, to have the feeling of being directly inside it. The game can be held in the hand and, by rotating it, a user can get another angle of view. The pieces can be moved with a tracked pen. A collaboration can be initiated, and multiple users can play together or against each other. The Tetris game is split into the functionality, realized by the Tetris Application DLL component, and the visualization, realized by the AR Browser ActiveX component. Both are embedded in the web page and tied together by JavaScript. The Tetris main loop with its timer in this page addresses the DLL as well as the visualization component of the AR Browser. The frame-based web page displays the status, the preview of the next piece, and the augmented view of the game. The Tetris application can be started by any player, and other users can log in dynamically and share the AR Tetris game with these participants.

To realize this, the status of the game is transmitted into the context management. The first player who starts the game takes the role of the master. The other players do not start another game but receive this status at the beginning of their session and initialize their game with these settings. The data structure of the Tetris clone contains a vector storing the position of the center cube of the actual piece and another vector holding the number of free cubes in each level. Two 2D arrays represent the highest elevations and the orientation of the actual piece for visualization. Another array stores the structure of the actual piece. Once a piece has fallen, it splits into cubes. These fallen cubes are saved in a 3D array. For better recognition, the cubes are dyed a different color in each level (see Figure 3). The number and direction of the new piece are defined randomly. The number of the new piece and the 3D array are set in the Context manager, and the other clients are notified directly.
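A minimal sketch of this state in JavaScript, matching the scripted application logic; every name, the board footprint and the number of levels are assumptions rather than the actual layout of the Tetris Application DLL:

// Sketch of the game state described above (all names and sizes assumed).
function make2D(nx, ny, value) {
  return Array.from({ length: nx }, () => new Array(ny).fill(value));
}
function make3D(nx, ny, nz, value) {
  return Array.from({ length: nx }, () => make2D(ny, nz, value));
}

const SIZE = 4, LEVELS = 12;                          // assumed board footprint and height
const state = {
  centerCube: { x: 0, y: 0, z: LEVELS - 1 },          // position of the actual piece's center cube
  freeCubes: new Array(LEVELS).fill(SIZE * SIZE),     // number of free cubes in each level
  elevation: make2D(SIZE, SIZE, 0),                   // highest elevation per column
  pieceShape: make2D(SIZE, SIZE, 0),                  // structure/orientation of the actual piece
  fallenCubes: make3D(SIZE, SIZE, LEVELS, 0),         // fallen cubes; 0 = empty, otherwise a per-level color index
  nextPiece: Math.floor(Math.random() * 7)            // number of the next piece, chosen randomly
};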

Figure 3: Tetris Clone

After each movement of a piece, the boundary of the game has to be verified. If a player rotates a piece, the position array and the rotation array have to be recalculated, and the orientation of the actual piece is set in the Context manager. If a translation is done, the position array and the depth array have to be recalculated, and the new position of the center cube is set in the context management. If a level is full, it is deleted. If a player is a remote user, there are two possibilities. The first is that the remote user has the same model of the real world to be augmented and can interact in the same way as a collaborative user at the same site; in the case of this game, these are the pen and the board. The second possibility is to transmit the video image of one user to the remote user. The 3D world is then mapped onto the 2D video, and the interaction over an interactive graphic (see Figure 4) is sent through the context management as well. In the case of face-to-face collaboration, a pen with markers is used for 3D interaction. This allows a very natural interaction mechanism with the game.

In a later version, speech recognition will be integrated to enable a multi-modal interface.
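The following fragment sketches how such a movement could be propagated; it reuses the illustrative ContextManager and state objects from the earlier sketches, the property name is an assumption, and the collision test is only a placeholder for the boundary check described above:

// Sketch of propagating a rotation to the other clients via the shared profile.
function collides(shape, center, fallenCubes) {
  return false; // placeholder: the real check verifies the game boundary and the occupied cubes
}

function rotatePiece(state, context) {
  const n = state.pieceShape.length;
  // Rotate the piece structure 90 degrees (clockwise in the horizontal plane).
  const rotated = state.pieceShape.map((row, r) =>
    row.map((_, c) => state.pieceShape[n - 1 - c][r]));
  if (!collides(rotated, state.centerCube, state.fallenCubes)) {
    state.pieceShape = rotated;
    // Publish the new orientation; the push mechanism notifies all listeners.
    context.setProperty("pieceOrientation", rotated);
  }
}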

Figure 4: Remote Tetris

5. FIRST RESULTS
The Tetris game currently supports multiple users playing one game. This simulates group discussions and the connection to a remote teacher collaborating with a student while assembling a mechanical engine. It also handles the problem of common competing interaction.

Figure 5: Selection of another field of view

The next area of work will be the integration of the actual field of view of other players and their different games. In a next step, another view can be selected and the interaction can be switched to the selected game (see Figure 5). This represents the scenario of a master who supervises several trainees learning the AR system or being instructed in how to handle a machine. The master can select the view of a trainee and can give him advice and support in his augmented environment. As a result, a master can supervise a few learners. Another aspect is to display the partial occlusion of real objects and the view of common and private data. Different information, dependent on the expertise of engineers and experts, will be visualized in this scenario. This should support the inspection of power stations during the acceptance procedure and, again, the trainees/master scenario. The result is to customize the view to the needs of the skilled workers and technicians, to see different aspects of the same thing adapted to the circumstances. Figure 6 shows how these adaptations could be applied to the Tetris game. The users play the same game and get a private view of the game and their interactions. The common data are the next piece and the score. The users can add and customize visual aspects to their needs, like the deleted levels or the time played.

Figure 6: Private and common view of data

How to visualize cooperation will be a further step in this work. If many people play the same game remotely, a player wants to know why a piece moves to the left even though he pushed it to the right. Therefore, the representation of the results of the interaction is an important field. The best way to do this will be to use multiple indications like color, arrows and numbers (see Figure 7).

Figure 7: Visualization of cooperation

Another scenario that has to be worked out is the joint solution of tasks belonging together, for example the attention handling of two workflows. There are a lot of difficulties to solve in sharing virtual worlds. Mutual exclusions have to be implemented and collision detection is needed. Semaphores have to control the access to interaction mechanisms. Another direction to pursue is the visualization of problem spaces, as shown in Figure 8. In industry, it is important to recognize danger very quickly, for example during cooperative installations or when searching for errors in plants and machinery. Back to the Tetris game: this problem can be approached in a game for two players, where two pieces are falling down. Each person can interact with one piece, gets personalized information and has to coordinate his work with the other player while being informed when collisions are detected.
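As a minimal sketch of such mutual exclusion, the fragment below guards a piece with a lock held in the shared profile, reusing the illustrative ContextManager from Section 4. The property name is an assumption, and a real implementation would let the server-side Context manager arbitrate truly simultaneous requests:

// Competing interaction guarded by a lock stored in the shared profile.
function tryAcquirePiece(context, pieceId, userId) {
  const key = "lock:" + pieceId;
  if (context.profile[key] && context.profile[key] !== userId) {
    return false;                    // the piece is already owned by another player
  }
  context.setProperty(key, userId);  // announce ownership to all clients
  return true;
}

function releasePiece(context, pieceId) {
  context.setProperty("lock:" + pieceId, null);
}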

Figure 8: Collision detection and Semaphores

6. CONCLUSION
An AR Tetris game has been described as an example of a collaborative augmented reality system. It simulates the collaboration of skilled production workers or the connection of technicians to a remote teacher. The web-based architecture enables access via the Internet from every place in the world. The game clarifies how to distribute the functionality in a collaboration. It is possible to log in dynamically to the same group and share the augmented space remotely or face-to-face with other participants at any given time. In order to research collaborative AR aspects, the Context manager could be tested against the needs of multi-user cooperation. By using a push mechanism, the access time to time-critical data can be substantially shortened. It has been observed that the fast event mechanism can provide the information to other participants in real time. Several examples of collaborative work have been shown. Possible scenarios have been derived from the specified requirements and deployed on a 3D Tetris clone. Many further scenarios can be worked out and evaluated in the near future.

7. OUTLOOK
A further step is to evaluate the outcome of this cooperative game and to transfer these first results to industrial applications. But until then, a lot of experience can be gathered and further research work completed.

Another aspect is to move some functionality, or the whole game, to the server. Then no replication between the games is needed, and thin clients like PDAs equipped with a camera can be used for Augmented Reality. The video picture is streamed to the server via wireless LAN, where the marker detection and the augmentation are done. The overlaid picture is streamed back and displayed on the thin client.

8. ACKNOWLEDGEMENTS
This work was partially sponsored by the German Federal Ministry of Education and Research (BMBF) within the scope of the project ARVIKA (Grant No. 01 IL 903).

9. REFERENCES
[Arvika] ARVIKA - Augmented Reality for Development, Production and Servicing. German research project, started 1999. http://www.arvika.de/

[Feiner93] Feiner, S., MacIntyre, B., Seligmann, D. Knowledge-Based Augmented Reality. Communications of the ACM, 36(7), 1993, pp. 53-62.

[Marcos98] Fernandes-Marcos, Adérito. Modelling Cooperative Multimedia Support for Software Development and Stand-Alone Environments. Shaker Verlag, 1998.

[Wilson91] Wilson, P. Computer Supported Cooperative Work. Kluwer Academic Publishers, 1991.

[Renevier01] Renevier, Philippe, Nigay, Laurence. Mobile Collaborative Augmented Reality: the Augmented Stroll. Proceedings of EHCI'01, IFIP WG2.7 (13.2) Conference, Toronto, May 2001, LNCS 2254, Springer-Verlag, pp. 315-334.

[Reitmayr01] Reitmayr, Gerhard, Schmalstieg, Dieter. Mobile Collaborative Augmented Reality. ISAR 2001.


[Billinghurst00] Billinghurst, M., Poupyrev, I., Kato, H., May, R. Mixing Realities in Shared Space: An Augmented Reality Interface for Collaborative Computing. In Proceedings of the IEEE International Conference on Multimedia and Expo (ICME 2000), July 30 - August 2, 2000, New York.

[HITLab-ATR] The Human Interface Technology Laboratory, University of Washington; ATR Media Integration & Communication, Kyoto, Japan. http://www.hitl.washington.edu/research/shared_space/

[Schmalstieg96] Schmalstieg, D., Fuhrmann, A., Szalavari, Z., Gervautz, M. Studierstube - An Environment for Collaboration in Augmented Reality. Proc. of CVE '96 Workshop, 1996.

[Teleportal] Teleportal: Face-to-Face Networked Augmented Reality. Michigan State University. http://www.mindlab.msu.edu/mweb/research/publications/teleportal.pdf

[Camelot] Camelot: Collaborative Augmented Multi-User Environment with Live Object Tracking. German National Research Center for Information Technology (GMD-FIT), Sankt Augustin. http://fit.gmd.de/camelot

[Gerasimov01] Gerasimov, Vadim. The Story, March 2001. http://vadim.www.media.mit.edu/Tetris.htm

[Atari98] Atari Gaming Headquarters. The Tetris Saga, Dec. 1998. http://www.atarihq.com/tsr/special/tetrishist.html

[Müller] Müller, Stefan, Stricker, Didier, Weidenhausen, Jens. An Augmented-Reality System to Support Product Development, Production and Maintenance. March 2001. http://www.inigraphics.net/publications/topics/2001/issue3/3_01a07.pdf

[IGD] Institut Graphische Datenverarbeitung. Visualisierung und Virtuelle Realität. Darmstadt, Germany. http://www.igd.fhg.de/igd-a4/index.html