Tangible Interaction in Learning Astronomy through Augmented Reality Book-Based Educational Tool

Aw Kien Sin and Halimah Badioze Zaman

Faculty of Information Science and Technology, Universiti Kebangsaan Malaysia, 43600 Bangi, Selangor
[email protected], [email protected]

Abstract. Live Solar System (LSS) is an Augmented Reality book-based educational tool. Augmented Reality (AR) has potential in the education field because it can provide seamless interaction between real and virtual objects. LSS applies the Tangible Augmented Reality approach in the design of its user interface and interaction. Tangible Augmented Reality is an interface that combines the Tangible User Interface with the Augmented Reality interface; the two naturally complement each other. This paper highlights the tangible interaction in LSS. LSS adopts the 'cube' as its common physical input device and therefore does not use traditional computer input devices such as the mouse or keyboard. To give users a better exploration experience, the Visual Information Seeking Mantra principle was applied in the design of LSS. Hence, LSS gives users an effective, interactive and intuitive horizontal-surface learning environment.

Keywords: Tangible Augmented Reality, Tangible User Interface, Augmented Reality, virtual environment, tangible interaction.

1 Introduction

Augmented Reality (AR) superimposes virtual objects on the real environment, registering them in 3D and giving users real-time interactivity [1][2]. In 1965, Ivan Sutherland created the first AR system by building the first Head-Mounted Display (HMD), which could show a simple wireframe cube overlaid on the real world [3]. The term AR was coined in 1990 at Boeing, in a project intended to help workers assemble cables in aircraft [4]. AR is a variation of Virtual Reality (VR) [1]; both therefore share the same fundamental elements, namely virtual objects, real-time response and visual equipment [5]. However, they differ in a few aspects. AR only superimposes virtual objects on real objects, so users can still see the real environment, as shown in Figure 1. In VR, the real environment is totally replaced by the virtual environment. Hence, VR limits user activities to a room-sized area because users cannot see the obstacles around them, whereas outdoor activities can be carried out with AR because the real environment remains visible [6]. VR and AR also differ markedly in their required depth of immersion [7]. The main objective of VR is a totally immersive artificial environment, which requires a high level of realism in the virtual environment; this is not a necessary goal in AR [5].

H. Badioze Zaman et al. (Eds.): IVIC 2009, LNCS 5857, pp. 302–313, 2009. © Springer-Verlag Berlin Heidelberg 2009


Fig. 1. Example of Augmented Reality in LSS. Unlike VR, the user can still see the real environment

The rest of the paper is organised as follows. Section 2 reviews the role of AR in education and Tangible Augmented Reality as a user interface. Section 3 describes the implementation of LSS, including the system workflow and the interaction techniques applied, and Section 4 concludes the paper.

2 AR and Augmented Virtuality

2.1 AR in Education

The Reality-Virtuality Continuum was proposed by Milgram & Kishino [7] to explain the relationship between real and virtual environments. In the continuum (Figure 2), the real environment is located at the left end and the virtual environment at the right end. AR and Augmented Virtuality (AV) lie between them. AR is positioned closer to the real environment, meaning that AR contains a higher proportion of the real environment; AV is closer to the virtual environment because of its higher proportion of virtual content. Mixed Reality is the domain covering everything between the real and virtual environments. In recent years, AR applications have been widely applied in various domains, such as the military, entertainment, medicine and education. In storytelling, MagicBook [8] allows users to read a book and see virtual objects through handheld displays (HHD); without any AR display, users can still read the text, look at the pictures and flip the pages. Augmented Chemistry [9] is an example of AR in science learning: a workbench consisting of a table and a rear-projection screen. Augmented Chemistry helps users understand molecular and atomic structures by showing them in three dimensions; users interact with these models using booklets, a gripper and a cube. In children's education, 'The Book of Colours' [10] uses AR to explain the basic theory and concepts of colours through Head-Mounted Displays (HMD). Children can interact with and observe visual feedback from a 3D virtual character. AR is applied not only in formal education, but also in informal


Fig. 2. Reality-Virtuality Continuum (Source: Milgram & Kishino, 1994)

education. The Orlando Science Center in the US has used AR to show sea creatures: visitors navigate a rover through an ocean environment to explore reptiles and fish in the DinoDigs exhibition hall [11]. AR can enrich educational benefits by supporting seamless interaction with both real and virtual environments, using a tangible interface metaphor for object manipulation, and offering a smooth transition between reality and virtuality [12]. Shelton and Hedley [13] explored the use of AR in teaching undergraduates about the earth-sun relationship in terms of axial tilt and solstices, and found that AR is useful for teaching subjects that students cannot experience first hand in the real world. AR can also clearly show spatial concepts, temporal concepts and contextual relationships between real and virtual objects; all these factors enable AR to be a powerful educational tool [14].

2.2 Tangible Augmented Reality

The Tangible User Interface (TUI) was introduced by Ishii & Ullmer in 1997. They defined TUI as augmenting the real physical world by coupling digital information to everyday physical objects [15]. TUI provides physical interaction by turning physical objects into input and output devices for the computer interface [16]. Physical interaction gives users direct manipulability and intuitive understanding [17]. For instance, it is very natural for humans to pick up, rotate and place a physical object; hence, when physical objects are used as input or output devices, users need to learn little about how to manipulate them. TUI gives intuitive and seamless interaction with digital and physical objects, because it does not require any special-purpose input devices [18]. Fitzmaurice & Buxton [19] and Patten & Ishii [20], in their respective research, showed that TUIs outperform standard mouse- and keyboard-based interfaces [21].
However, TUI has limitations in dynamically changing an object's physical properties [22], so the physical object needs to be chosen carefully to avoid any spatial gap. Humans deduce the inherent functionality of objects from physical qualities such as shape, weight, size and colour [23], so physical objects can offer a clear spatial mapping to their functions. A spatial gap exists when the chosen object does not match the desired functionality in the computer interface. The previous sections discussed user interaction with virtual content through an HMD, HHD or projection screen in an AR interface. An interaction gap arises when users use special-purpose input devices in AR interaction, because this results in two separate workspaces (virtual and physical) [18].


TUI offers seamless interaction but suffers from a spatial gap; the AR interface supports a spatially seamless workspace but has an interaction gap. Owing to the complementary nature of AR and TUI, Billinghurst et al. [24] proposed a new interface known as Tangible Augmented Reality (TAR). TAR combines the Tangible User Interface (TUI) with the Augmented Reality interface. With TAR, each virtual object is registered to a physical object, and users interact with the virtual objects by manipulating the corresponding tangible objects [24]. The Tangible Augmented Street Map (TASM) is a cube designed using the TAR concept, tiled with a street map for cartographic visualization [25]. The digital graphic of the street map is superimposed on the cube; by rotating the cube (up, down, left, right), users can access the map in four different directions (north, south, east, west). The Tangible Augmented Reality Modeling (TARM) system provides users with a natural way to create 3D models [26]: users create models by manipulating physical blocks, and can also use a real pen to draw curves or surfaces directly on the models. Immersive Authoring of Tangible Augmented Reality (iaTAR) is an authoring system that allows users to develop and test their AR applications concurrently [27]. iaTAR provides a few built-in interactions for development: users can create or destroy a virtual object, browse components, or modify an object's properties by connecting it with an inspector pad and keypad. The physical objects involved are, in reality, papers and a cube printed with markers. Ulbricht & Schmalstieg [28] built a TAR tabletop game: users sit or stand around a glass table and shoot the opponent's 3D virtual balloons with a 3D virtual catapult, conducting the game by moving cardboard pieces with markers on them.
Augmented Chemistry [9], discussed in the previous section as a science-learning AR application, is also a TAR application; a booklet, a gripper with a button and a cube with markers are its physical interaction objects. In storytelling, children can use both hands to unfold the Magic Cube [29] and explore the contents of the "Noah's Ark" story. These examples show that TAR is already applied in various fields and that it offers intuitive, tangible interaction to users.

3 Implementation of LSS

3.1 Live Solar System (LSS)

Live Solar System (LSS) is an AR-based educational tool to help students learn about astronomy. Various multimedia elements such as video, graphics, text and 3D objects were included in LSS to achieve this purpose. Figure 3 shows that LSS spans both the physical world and the digital world. Its implementation uses a webcam to capture the real-world view, which is then sent for marker detection. Associated digital content, such as 3D objects, is loaded once the ID of the marker is determined. The captured real-world view is then combined with the digital content and sent to the user through either the HMD or the monitor. In LSS, a cube is used as the input device.
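The per-frame pipeline described above (capture, marker detection, content loading, compositing) can be sketched as follows. This is a minimal illustration, not the LSS source code: the marker registry, function names and the stand-in detection step are all hypothetical, and a real system would analyse the webcam image with a marker-tracking library rather than read IDs from a dictionary.

```python
# Illustrative marker registry: marker ID -> digital content to overlay.
CONTENT_BY_MARKER = {
    1: "3D model: Earth",
    2: "3D model: Sun",
}

def detect_markers(frame):
    """Stand-in for the marker-detection step. A real implementation
    would search the webcam image for fiducial markers; here the frame
    already carries the visible IDs for illustration."""
    return frame["visible_marker_ids"]

def composite(frame, overlays):
    """Combine the captured real-world view with the loaded content,
    ready to be sent to the HMD or monitor."""
    return {"real_view": frame["pixels"], "overlays": overlays}

def process_frame(frame):
    """One pass of the pipeline: detect -> load content -> composite."""
    ids = detect_markers(frame)
    overlays = [CONTENT_BY_MARKER[i] for i in ids if i in CONTENT_BY_MARKER]
    return composite(frame, overlays)

frame = {"pixels": "<webcam image>", "visible_marker_ids": [1]}
print(process_frame(frame))  # Earth model overlaid on the real view
```

The same loop runs once per captured frame, so content appears and disappears as markers enter and leave the camera's view.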


Fig. 3. Live Solar System workflow

LSS allows students to explore the AR solar system freely, so that they can understand in a more concrete manner the eight planets of our solar system. They can also learn about the characteristics of the planets and, for example, the sun and its phenomena, through the combination of physical and digital learning materials. To provide a better exploration experience, the Visual Information Seeking Mantra principle was used as a guide in designing LSS. The Visual Information Seeking Mantra, "overview first, zoom and filter, then details on demand", describes how data should be presented on screen so that it is most effective for users to understand [31]. In LSS, users first obtain an overview of the solar system through observation. They can then zoom and filter down to one of the planets by selecting their planet of interest, and acquire more information about the selected planet by performing interactions related to 'details on demand'. This process is discussed in detail in Section 3.2. The mouse and keyboard are not involved in these interactions, because LSS applies the TAR concept and thus offers intuitive, tangible interaction to users. Cubes were chosen as the physical objects used during these interactions: the cube is a familiar physical object in our daily lives, and everyone knows how to manipulate one. Sheridan et al. [30] studied the cube's affordances and the ways humans manipulate it; actions like 'place', 'pick up', 'throw', 'squeeze', 'press', 'rotate' and 'hold' are commonly performed. Some of these natural human behaviours are used in the LSS interaction (Figure 4). Figure 5 shows the cubes and some printed markers used in LSS.
These cubes were specifically designed at 3 x 3 x 3 cm, small enough to hold in a user's hand and easy to manipulate.


Fig. 4. Interaction techniques use familiar actions usually performed by humans on a cube

Fig. 5. LSS cubes used as physical input objects

3.2 Interaction Techniques in LSS

Figure 4 shows a few of the interaction techniques used in LSS. These techniques represent the natural, intuitive actions humans usually perform in daily life when handling physical objects, so users need not learn new interaction techniques to use LSS. When using LSS for the first time, users can observe the solar system (Figure 6): the surface and size of each planet, each planet's revolution speed along its orbit, and its rotation speed about its own axis. Users can also change the current mode (normal mode) to temperature mode or gravity mode; to do so, they simply hold and rotate the 'menu cube' in their hand (Figure 7). The menu cube has three unique markers printed on it, each representing the normal, temperature or gravity mode. The planets' surfaces change when users switch from normal mode to another mode by rotating the menu cube, and 3D text is shown on screen to tell users which mode they are currently viewing. In these modes, users can compare the different temperatures and gravitational forces of the planets with the help of the scale provided.
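The mode switch described above amounts to mapping whichever menu-cube face the camera sees to a display mode. The sketch below illustrates this idea under stated assumptions: the marker names and the fallback behaviour are illustrative, not taken from the LSS implementation.

```python
# Hypothetical mapping from menu-cube face markers to display modes.
MODE_BY_MARKER = {
    "menu_face_a": "normal",
    "menu_face_b": "temperature",
    "menu_face_c": "gravity",
}

def current_mode(visible_markers, fallback="normal"):
    """Return the mode for whichever menu-cube face is visible.
    If no menu face is detected (e.g. the cube is out of view or
    mid-rotation), keep the fallback mode rather than flicker."""
    for marker in visible_markers:
        if marker in MODE_BY_MARKER:
            return MODE_BY_MARKER[marker]
    return fallback

# Rotating the cube so face B points at the camera selects temperature mode.
print(current_mode(["planet_earth", "menu_face_b"]))
```

Keeping a fallback mode is a deliberate choice: marker tracking drops out briefly while the cube rotates, and without it the display would flash back to normal mode on every transition.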


Fig. 6. Observing specific planet’s surface, size, revolution and rotation speed

Fig. 7. Changing mode by rotating the menu cube in hand

By manipulating the 'pointer cube', users can pick any planet of interest for a closer and better view (Figure 8). A computer-generated red pointer is superimposed on the pointer cube's marker. To pick a planet from the solar system, users hold the cube and move it towards the planet of interest; once the red pointer touches the selected planet, the planet is copied onto the cube. A collision-detection method determines which planet the user has selected. The pseudo code for this method is as follows:

IF collisionDetection==True && 3D object on cubeMarker==3D pointer object
    READ collision object
    copy and rename collision object


    set scale on copied new object
    hide the 3D pointer object from cubeMarker
    load the new copied object on cubeMarker
END IF

Once a planet is on the cube, users can view its other side by rotating the cube, because the planet rotates according to the performed action. Similar actions are common in daily life: when shopping, for instance, people rotate a package to check its contents or its expiry date.

Fig. 8. Picking planet by manipulating the ‘pointer cube’
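The collision-detection picking step can be sketched as below. The paper does not specify the geometry test, so this sketch assumes simple bounding-sphere collision; all names, positions and radii are illustrative.

```python
import math

def collides(pointer_pos, planet_pos, pointer_r, planet_r):
    """Two spheres collide when the distance between their centres
    is less than the sum of their radii."""
    return math.dist(pointer_pos, planet_pos) < pointer_r + planet_r

def pick_planet(pointer_pos, planets, pointer_r=0.5):
    """Return a renamed, scaled copy of the first planet the red
    pointer touches (mirroring the pseudo code: the copy replaces
    the pointer object on the cube), or None if nothing is hit."""
    for name, (pos, radius) in planets.items():
        if collides(pointer_pos, pos, pointer_r, radius):
            return {"name": name + "_copy", "scale": 1.0, "on_cube": True}
    return None

# Illustrative scene: Mars at x=3 with radius 1; the pointer at x=2.2
# is within the combined radius (0.8 < 1.5), so Mars is picked.
planets = {"mars": ((3.0, 0.0, 0.0), 1.0)}
print(pick_planet((2.2, 0.0, 0.0), planets))
```

In practice the pointer position would come from the tracked pose of the cube's marker each frame, and the test would run against every planet in the orbiting solar-system model.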

LSS allows users to learn more about the planets in the solar system. First, users select the planet of interest with the 'pointer cube'. They can then place the cube on an 'information center': a paper printed with a marker and an 'I' icon (Figure 11). An approximation method detects the distance between the information center's marker and the pointer cube's marker; once the distance is close enough, information about the selected planet, such as its diameter, mass, average temperature, gravity, and total days of rotation and revolution, is shown on the HMD or monitor (Figure 9). To help users find the appropriate distance, an area is drawn on the information center's paper, so users simply place the cube on the designated area to perform the desired action (Figure 10). The 'trash bin' is another paper printed with a marker and a trash-bin icon. It allows users to delete the selected planet so that they can select another (Figure 11). It functions similarly to the information center: both use the approximation method. The pseudo code for this method is as follows:

WHILE CubeMarker==True && TrashBinMarker==True
    IF cubeMarker 3D object != 3D pointer object
        READ CubeMarker position


        READ TrashBinMarker position
        COMPUTE Distance between CubeMarker and TrashBinMarker
        IF Distance < DeleteDistance
            delete 3D object from CubeMarker
            load 3D Pointer object on CubeMarker
        END IF
    END IF
END WHILE
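The approximation (distance) test shared by the trash bin and the information center can be sketched as follows. The threshold value, positions and object names here are illustrative assumptions, not values from the LSS source.

```python
import math

DELETE_DISTANCE = 4.0  # hypothetical threshold, in the tracker's units

def near(marker_a, marker_b, threshold=DELETE_DISTANCE):
    """True when the two tracked marker positions are close enough
    to count as 'placed on' each other."""
    return math.dist(marker_a, marker_b) < threshold

def update_cube(cube, cube_pos, trash_pos):
    """Mirror the trash-bin pseudo code: if the cube carries a planet
    (not the pointer) and enters the trash-bin area, delete the planet
    and reload the 3D pointer object onto the cube."""
    if cube["object"] != "3D pointer" and near(cube_pos, trash_pos):
        cube["object"] = "3D pointer"
    return cube

cube = {"object": "mars_copy"}
print(update_cube(cube, (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)))  # within range
```

The information center uses the same test with a different consequence: instead of deleting the carried planet, it triggers the display of that planet's data panel.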

Fig. 9. Placing cube on the designated area to show details of selected planet

Fig. 10. Deleting the selected planet from the cube


Fig. 11. ‘Trash bin’ and ‘Information center’ in LSS

Video is one of the multimedia elements used in LSS, showing the phenomena of the sun. The everyday metaphor of switching a television or monitor on and off was applied in the LSS video-interaction approach. Figure 12 shows how users play and stop the video in LSS. Users press the 'play' button to start the video, and the 'play' button automatically changes to a 'stop' button; repeating the same action stops the video and changes the 'stop' button back to 'play'. A simple occlusion method determines whether the marker is occluded by the finger. The pseudo code of this method is as follows:

Initialise PressCount to zero
WHILE MovieMarker==True
    IF ButtonMarker==False
        add one to PressCount
        IF PressCount mod 2==1
            Start playing movie
            change play icon to stop icon
        ELSE
            Stop playing movie
            change stop icon to play icon
        END IF
    END IF
END WHILE
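The occlusion toggle can be sketched as below. One detail worth noting: a naive per-frame check would register one "press" on every frame the marker stays covered, so this sketch (an illustration, not the LSS source) counts only the visible-to-occluded transition and uses press-count parity to switch between play and stop.

```python
class VideoButton:
    """Toggle a video by occluding a button marker with a finger."""

    def __init__(self):
        self.press_count = 0
        self.was_visible = True  # for edge detection across frames

    def update(self, button_marker_visible):
        """Call once per frame with the marker's visibility.
        Returns the resulting video state: 'playing' or 'stopped'."""
        if self.was_visible and not button_marker_visible:
            self.press_count += 1  # marker just became occluded: one press
        self.was_visible = button_marker_visible
        # Odd press count -> playing (first press starts the video).
        return "playing" if self.press_count % 2 == 1 else "stopped"

btn = VideoButton()
btn.update(True)          # marker visible, nothing pressed yet
print(btn.update(False))  # finger covers the marker: first press -> playing
```

Swapping the play/stop icon would happen in the same branch that flips the state, exactly as in the pseudo code above.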

Fig. 12. Techniques to ‘play’ and ‘stop’ video in LSS


4 Conclusion

Augmented Reality, which overlays virtual objects on real objects, offers a new approach to interaction between users and the computer, and its characteristics make it a promising approach to teaching and learning. The combination of the AR interface and TUI provides seamless interaction to users, and much research has been conducted to support learning in various subject domains. In this research, the Live Solar System (LSS) is an AR book-based educational tool whose purpose is to help students learn astronomy. LSS applies the TAR approach in its design, which allows users to interact with the system naturally and intuitively; this gives learners a new and more engaging learning experience. The Visual Information Seeking Mantra embedded in the system allows users to obtain an overview of the whole solar system, select a planet of interest among the eight planets with the 'pointer cube', and then obtain details of the selected planet by placing it on the 'information center'. By manipulating the cube with natural actions such as rotating, picking up, placing and holding, users interact effectively while exploring LSS. Thus, traditional computer input devices are no longer needed with the introduction of LSS.

References

1. Azuma, R.: A Survey of Augmented Reality. Presence: Teleoperators and Virtual Environments 6(4), 355–385 (1997)
2. Azuma, R., Baillot, Y., Behringer, R., Feiner, S., Julier, S., MacIntyre, B.: Recent Advances in Augmented Reality. IEEE Computer Graphics and Applications (2001)
3. Sutherland, I.: The Ultimate Display. In: International Federation of Information Processing, vol. 2, pp. 506–508 (1965)
4. Barfield, W., Caudell, T.: Fundamentals of Wearable Computers and Augmented Reality, pp. 447–468. Lawrence Erlbaum Associates, Mahwah (2001)
5. Jong, S.P.: Augmented Reality Introduction. Lecture slides. Department of Computer Science and Engineering, University of Incheon (2005)
6. Jung, Y.M., Jong, S.C.: The Virtuality and Reality of Augmented Reality. Journal of Multimedia 2(1), 32–37 (2007)
7. Milgram, P., Takemura, H., Utsumi, A., Kishino, F.: Augmented Reality: A Class of Displays on the Reality-Virtuality Continuum. In: Proceedings of Telemanipulator and Telepresence Technologies (1994)
8. Billinghurst, M., Kato, H., Poupyrev, I.: The MagicBook: Moving Seamlessly between Reality and Virtuality. IEEE Computer Graphics and Applications 21(3), 6–8 (2001)
9. Fjeld, M., Voegtli, B.M.: Augmented Chemistry: An Interactive Educational Workbench. In: Proceedings of the 1st International Symposium on Mixed and Augmented Reality, p. 259. IEEE Computer Society, Los Alamitos (2002)
10. Ucelli, G., Conti, G., Amicis, R.D., Servidio, R.: Learning Using Augmented Reality Technology: Multiple Means of Interaction for Teaching Children the Theory of Colours. In: Maybury, M., Stock, O., Wahlster, W. (eds.) INTETAIN 2005. LNCS (LNAI), vol. 3814, pp. 193–202. Springer, Heidelberg (2005)
11. Hughes, C.E., Smith, E., Stapleton, C., Hughes, D.E.: Proceedings of KSCE (2004), http://www.mcl.ucf.edu/research/seacreatures/KSCE04HughesEtAl.pdf
12. Billinghurst, M.: Augmented Reality in Education, http://www.newhorizons.org/strategies/technology/billinghurst.htm


13. Shelton, B., Hedley, N.: Using Augmented Reality for Teaching Earth-Sun Relationships to Undergraduate Geography Students. In: The 1st IEEE International Augmented Reality Toolkit Workshop, Darmstadt, Germany (2002)
14. Woods, E., Billinghurst, M., Aldridge, G., Garrie, B.: Augmenting the Science Centre and Museum Experience. In: Proceedings of the 2nd International Conference on Computer Graphics and Interactive Techniques in Australasia and South East Asia, pp. 230–236 (2004)
15. Ishii, H., Ullmer, B.: Tangible Bits: Towards Seamless Interfaces between People, Bits, and Atoms. In: Conference on Human Factors in Computing Systems, CHI 1997, pp. 234–241 (1997)
16. Kim, M.J., Maher, M.L.: Comparison of Designers Using a Tangible User Interface and a Graphical User Interface and the Impact on Spatial Cognition. In: Proceedings of the International Workshop on Human Behaviour in Designing, pp. 81–94 (2005)
17. Wang, Q., Li, C., Huang, X., Tang, M.: Tangible Interface: Integration of the Real and Virtual. In: 7th International Conference on Computer Supported Cooperative Work in Design, pp. 408–412 (2002)
18. Poupyrev, I., Tan, D.S., Billinghurst, M., Kato, H., Regenbrecht, H., Tetsutani, N.: Developing a Generic Augmented-Reality Interface. Computer 35(3), 44–50 (2002)
19. Fitzmaurice, G., Buxton, W.: An Empirical Evaluation of Graspable User Interfaces: Towards Specialized, Space-Multiplexed Input. In: Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI 1997), pp. 43–50 (1997)
20. Patten, J., Ishii, H.: A Comparison of Spatial Organization Strategies in Graphical and Tangible User Interfaces. In: Proceedings of Designing Augmented Reality Environments, pp. 41–50 (2000)
21. Hoven, E.v.d., Frens, J., Aliakseyeu, D., Martens, J.B., Overbeeke, K., Peters, P.: Design Research & Tangible Interaction. In: 1st International Conference on Tangible and Embedded Interaction, TEI 2007, pp. 109–115 (2007)
22. Billinghurst, M.: Crossing the Chasm. In: Proceedings of the International Conference on Augmented Tele-Existence (ICAT 2001) (2001), http://www.hitl.washington.edu/publications/r-2002-62/r-2002-62.pdf
23. Sharlin, E., Watson, B., Kitamura, Y., Kishino, F., Itoh, Y.: On Tangible User Interfaces, Humans and Spatiality. Personal and Ubiquitous Computing 8(5), 338–346 (2004)
24. Billinghurst, M., Kato, H., Poupyrev, I.: Collaboration with Tangible Augmented Reality Interfaces. In: HCI International 2001, pp. 234–241 (2001)
25. Moore, A.: A Tangible Augmented Reality Interface to Tiled Street Maps and its Usability Testing. In: Progress in Spatial Data Handling, pp. 511–528. Springer, Heidelberg (2006)
26. Park, J.Y., Lee, J.W.: Tangible Augmented Reality. In: Rauterberg, M. (ed.) ICEC 2004. LNCS, vol. 3166, pp. 254–259. Springer, Heidelberg (2004)
27. Lee, G.A., Nelles, C., Billinghurst, M., Kim, G.J.: Immersive Authoring of Tangible Augmented Reality Applications. In: Proceedings of the 3rd IEEE/ACM International Symposium on Mixed and Augmented Reality, pp. 172–181 (2004)
28. Ulbricht, C., Schmalstieg, D.: Tangible Augmented Reality for Computer Games. In: Proceedings of the Third IASTED International Conference on Visualization, Imaging and Image Processing 2003, pp. 950–954 (2003)
29. Zhou, Z., Cheok, A.D., Li, Y., Kato, H.: Magic Cubes for Social and Physical Family Entertainment. In: CHI 2005 Extended Abstracts on Human Factors in Computing Systems, pp. 1156–1157 (2005)
30. Sheridan, J.G., Short, B.W., Van Laerhoven, K., Villar, N., Kortuem, G.: Exploring Cube Affordances: Towards a Classification of Non-verbal Dynamics of Physical Interfaces for Wearable Computing. In: IEE Eurowearable 2003, pp. 113–118 (2003)
31. Shneiderman, B.: The Eyes Have It: A Task by Data Type Taxonomy of Information Visualizations. In: Proceedings of IEEE Visual Languages 1996, pp. 336–343 (1996)
