Krauss, M., Riege, K., Pemberton, L. & Winter, M. (2009). Remote Hands-on Experience: Distributed Collaboration with Augmented Reality. In: Learning in the Synergy of Multiple Disciplines, Proceedings of the EC-TEL 2009, Vol. 5794, Berlin/Heidelberg: Springer.

Remote Hands-on Experience: Distributed Collaboration with Augmented Reality

Matthias Krauß1, Kai Riege1, Marcus Winter2, Lyn Pemberton2

1 Fraunhofer IAIS, Schloss Birlinghoven, 53754 Sankt Augustin, Germany
{matthias.krauss, kai.riege}@iais.fraunhofer.de
2 University of Brighton, School of Computing, Mathematical and Information Sciences, Lewes Rd, Brighton BN2 4GJ, East Sussex, UK
{Lyn.Pemberton, Marcus.Winter}@brighton.ac.uk

Abstract. One claim of Technology-Enhanced Learning (TEL) is to support and exploit the benefits of distance learning and remote collaboration. On the other hand, several approaches to learning emphasize the importance of hands-on experience. Unfortunately, these two goals do not go well together with traditional learning techniques. Even though TEL technologies can alleviate this problem, it is not yet sufficiently solved: remote collaboration usually comes at the cost of losing direct hands-on access. The ARiSE project aimed at bringing Augmented Reality (AR) into school environments, a technology that can potentially bridge the gap between the two goals mentioned. The project has designed, implemented and evaluated a pedagogical reference scenario in which students worked hands-on together over large distances. This paper describes the AR learning approach we followed and discusses its implementation and its future potential. It presents a simple and successful distributed AR learning approach and suggests features for improvement.

Keywords: Augmented Reality, Collaboration, Remote Presence, Virtual Reality, Technology-Enhanced Learning, Human Computer Interaction

1 Introduction

One major claim of the Technology-Enhanced Learning (TEL) domain is to foster collaborative learning processes. Thanks to electronically conveyed media and the Internet, collaboration is supposedly no longer limited to co-located work, but can be extended over long distances. Remote collaboration has been a lively and fruitful topic of the Computer-Supported Collaborative Learning (CSCL) research community. Several different types of communication and collaboration have been developed, evaluated and implemented in everyday learning practice. In most of these settings, collaboration is centred on the concept of shared spaces, i.e. places to work together. In traditional co-located collaboration, multiple contributors can work jointly on one physical object and talk directly to each other.

For remote collaboration, these means of communication are partially replicated: virtual shared spaces allow collaboration through multiple, technologically synchronised views on a common object. The choice of communication channels and synchronised aspects opens up new types of cooperation, but interaction limitations impose new problems and challenges. So far, the vast majority of remote collaboration tools for learning have been limited to desktop PC settings using the WIMP (Windows, Icons, Menus, Pointing) interaction paradigm, for example web-based learning applications. This type of interaction is simple to implement and its requirements are easy to meet. Due to its abstraction, tools can be designed to be generic and application-independent. However, this generic and limited interaction paradigm comes at the cost of limited graspability, with the learner's experience remaining indirect.

Recently, the TEL domain has increased its efforts to extend the e-learning experience from PCs to other platforms, allowing perceptually richer experiences and more direct interaction. Augmented Reality (AR) is one possible alternative. In Augmented Reality applications, a certain part of the real world is combined with a virtual one. User interaction within such an environment is usually characterised by a very direct means of manipulation: physical objects with optically tracked markers serve as references, and manipulating them adapts the corresponding virtual parts. This supports scenarios in which a user can interact directly with his or her hands, which resembles learning-by-doing far more than moving a pointer on a screen by moving a computer mouse on a table. Because of this directness of experience, AR is claimed to open up new ways of learning. AR comes in several technological varieties, ranging from severely limited implementations on mobile phones through head-mounted displays (HMDs) to fully featured stationary AR workspaces.
Our work, conducted within the EC research project ARiSE – Augmented Reality in School Environments [1], is based on the Spinnstube®, a fully featured, low-cost AR workspace specifically developed for educational applications [2]. Due to its mixing of physical and virtual aspects, collaboration in AR poses a new, specific challenge: physical objects cannot easily be shared over distance. Local AR collaboration can be accomplished through multiple virtual augmentations on shared physical objects, but remote collaboration requires either a solution for physically sharing objects or a way to circumvent this problem. During the ARiSE project, we developed an AR learning platform and a remote collaboration application. The result has been evaluated with respect to several different aspects (see [3]). This paper focuses on a qualitative evaluation of social interaction using the prototype. The main research questions of this evaluation were: Which communication problems does the chosen technology yield? Which shortcomings of communication and usability can be identified? How do learners deal with these shortcomings?

After a discussion of related approaches to distributed AR collaboration, the paper describes the pedagogical reference scenario we used to design the AR application. The following section illustrates its technical implementation. We then introduce our evaluation method and present the evaluation results. The paper concludes with a summary of results, a discussion of potential enhancements to the system, and future work.

2 Related Work

Collaboration is a central aspect of the TEL and CSCL domains. As a consequence, various remote collaboration scenarios, most based on traditional PC interfaces, have been developed and evaluated. Mixed and Augmented Reality technologies have recently entered the e-learning research domain. Earlier research in AR collaboration mostly focused on co-located collaboration, i.e. multiple users sharing the same physical space. Reitmayr and Schmalstieg [4], Ohshima et al. [5] as well as Regenbrecht and Wagner [6] describe typical set-ups. More recently, different approaches have been taken to solve or circumvent the problem of distributing partially physical realities. Müller and Erbe [7] discuss various challenges of distant collaboration within mixed physical-virtual labs. Bruns' Hyper-bonds approach [8] proposes the synchronisation of physical objects through remote force-feedback, using networked sensors and actuators. This approach can provide physical remote synchronisation, but it is inherently limited to specific physical set-ups and, due to its implementation requirements, only feasible for a low number of synchronised aspects. A different technique, which resembles the approach taken in our studies, is to share only virtual aspects while maintaining independent physical half-worlds for each participant, resulting in a largely virtual experience that retains the directness of augmented reality interaction. Among others, Chastine et al. [9] used this approach in their studies and highlight the problem of referencing in AR collaboration.

3 Pedagogical reference scenario

AR has a range of affordances that support learning, including the ability to present objects in 3D, which helps in the development of spatial abilities [10], and the ability to combine real and virtual objects in tangible user interfaces, which may be more suitable for certain kinds of learning activities [11]. In addition, AR has the ability to offer different views on the same object or situation, which aids cognitive development [12], promotes knowledge transfer [13], and facilitates extrapolation by helping learners to go beyond the information given [14]. While these learning affordances have been exploited in previous prototypes [15, 16], the pedagogical reference scenario described here focuses on the Spinnstube®'s support for remote collaboration in a shared workspace.

Collaborative learning is based on social constructivist ideas of learning that emphasise learning through active knowledge construction [12, 14], communication and social interaction [17, 18]. The exchange of ideas amongst peers engaged in the same activity helps learners to develop a deeper understanding of the subject [19] and to reflect upon and conceptualise their experiences as they explain findings, e.g. the meaning of words [20]. In addition, collaborative settings often lead to situations involving peer tutoring [21], which according to Pask's Conversation Theory [22] is a critical method of learning.

Based on these ideas, the reference scenario involved students selecting suitable topics from their local history and culture, preparing 3D digital artefacts, and then using these artefacts in a summer school project to anchor and illustrate one-to-one remote discussions with a peer from another country. The preparation phase involved local collaboration between students to select and discuss suitable topics and artefacts, and it gave them an opportunity to familiarize themselves with the AR learning platform by creating their own demonstration models. The AR application allows learners to sculpt 3D models with simple operations in free space using a light pen. At the summer school, pairs of students first communicated via a video link to get to know each other, and then they started their collaborative AR session. Besides a shared, interactive 3D workspace, the remote collaboration application provides an audio link for verbal communication. Students used this set-up to discuss their mutual local cultures, taking turns to explain customs and traditions and scaffolding their presentation with the prepared artefacts. After their presentation, students asked their counterpart questions about the presented content, both to test the partner’s understanding and to enquire about similarities or equivalents in their own local culture. This part also included an exercise where the presenting student erased part of a prepared artefact, using the aforementioned light pen, and asked their counterpart to reconstruct it in the shared workspace. Both students were able to observe the reconstruction process and they could comment on the progress and result. Collaborative sessions took approximately one hour with students switching roles at half time so that each side could present their content. The summer school was followed by whole-class discussions where students consolidated and conceptualised what they had learnt.

4 Technical implementation

As a base for our distributed Augmented Reality set-up, we used the Spinnstube® display system [2]. Simply attached to a desk, this projection-based display can be used as an extension to a conventional desktop work environment. The system consists of a stereo-capable video projector that throws an image onto a projection screen placed above the desktop. Through a half-silvered mirror, a learner can see the desktop, his or her hands and physical objects spatially augmented by virtual 3D content. The Spinnstube® hardware was designed to keep the workspace free from any technical parts in order to avoid obstacles to direct hands-on interaction. Fig. 1 shows a sketch of the display.

Fig. 1. Sketch of the Spinnstube® display system (left) and a learner working inside (right)

The Spinnstube® is equipped with two kinds of tracking systems to gather information about a user's current viewpoint as well as the interaction taking place on the desktop. Two infra-red (IR) cameras are mounted on the mirror, looking towards the user, to track his or her head as well as the movement of the mirror itself. Two conventional FireWire cameras observe the interaction area on and above the desktop. For the remote AR collaboration application, we developed a light pen with an LED tip that can change its colour when a button is pressed (Fig. 2). The two FireWire cameras track the colour and position of this pen, so the light pen can be used as a 3D cursor. The different colours are associated with functions such as adding or removing 3D material or colouring the surface. Fig. 1 shows a learner reviewing a 3D model created with the light pen.

Fig. 2. A handheld light pen as sculpting and interaction device.
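The paper does not detail how pen input maps onto sculpting operations, so the following is only an illustrative sketch: a colour-coded 3D cursor driving add, remove and paint operations on a voxel grid. The colour assignments, the spherical brush and all names are assumptions, not the ARiSE implementation.

```python
# Illustrative sketch (not the ARiSE source code): a colour-coded 3D
# cursor driving sculpting operations on a voxel grid. The colour
# assignments, brush shape and all names are assumptions.

voxels = set()    # occupied voxel coordinates
colours = {}      # voxel coordinate -> RGB tuple

# hypothetical mapping from the tracked LED colour to a tool function
TOOLS = {"red": "add", "green": "remove", "blue": "paint"}

def brush(centre, radius=2):
    """Yield all voxel coordinates inside a sphere around the pen tip."""
    cx, cy, cz = centre
    for dx in range(-radius, radius + 1):
        for dy in range(-radius, radius + 1):
            for dz in range(-radius, radius + 1):
                if dx * dx + dy * dy + dz * dz <= radius * radius:
                    yield (cx + dx, cy + dy, cz + dz)

def apply_pen(led_colour, position, paint=(200, 120, 40)):
    """Apply the tool selected by the pen's LED colour at a 3D position."""
    tool = TOOLS[led_colour]
    for v in brush(position):
        if tool == "add":
            voxels.add(v)
        elif tool == "remove":
            voxels.discard(v)
            colours.pop(v, None)
        elif tool == "paint" and v in voxels:
            colours[v] = paint

apply_pen("red", (32, 32, 32))    # add a blob of material
apply_pen("blue", (32, 32, 32))   # colour its voxels
apply_pen("green", (32, 32, 34))  # erase part of it again
print(len(voxels))                # material remaining after erasing
```

In the actual system the pen position would come from the stereo colour tracking of the two FireWire cameras; here it is simply passed in as voxel coordinates.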

The software driving the Spinnstube® is based on the Open Source VR/AR framework Avango® [24]. We enhanced this software with the modules necessary to communicate with the Spinnstube® hardware. Avango® provides a module for group-based communication between multiple Avango® applications on different machines via the Internet, using the Ensemble distributed communication system [25]. Each Avango® application can decide which parts of the content should be local and which distributed. In the remote collaboration application discussed here, the 3D workspace and the position of each user's cursor are synchronised, whereas the menu interaction, status information and each user's perspective onto the scene are local. As long as no connection to another machine is established, the user is provided with a fully functional standalone application.

The distribution is implemented via group management. A gossip server manages the state of the group, i.e. its current participants [25], and serves as the central instance with which members register. After joining the group, members communicate directly with each other via UDP, minimising latency. In the remote collaboration application, groups consist of two members. Shared content does not belong to a specific user; the group itself owns the content. If a member leaves the group, the member's contribution remains.

To support remote discussions, each Spinnstube® is equipped with headphones. When entering a collaboration session, a Skype [26] connection is established to the remote partner. The presence of a collaboration partner is indicated by their actions, the visibility of their 3D cursor, and their voice.
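The split between distributed and local state described above can be sketched as follows. This is a conceptual illustration, not Avango®/Ensemble code; all class and field names are assumptions made for the example.

```python
# Conceptual sketch (not Avango®/Ensemble code) of the paper's state
# split: only fields marked as distributed are replicated to the peer,
# while menu state and viewpoint stay local. All names are illustrative.

class Field:
    def __init__(self, value, distributed=False):
        self.value = value
        self.distributed = distributed

class AppState:
    def __init__(self):
        self.fields = {
            # synchronised across the two-member group
            "workspace_model": Field({}, distributed=True),
            "cursor_position": Field((0, 0, 0), distributed=True),
            # local-only state
            "menu_selection": Field(None),
            "view_transform": Field("identity"),
        }

    def snapshot(self):
        """Update sent to the remote peer: distributed fields only."""
        return {k: f.value for k, f in self.fields.items() if f.distributed}

    def merge(self, update):
        """Apply a peer's update without touching local-only fields."""
        for key, value in update.items():
            if key in self.fields and self.fields[key].distributed:
                self.fields[key].value = value

a, b = AppState(), AppState()
a.fields["cursor_position"].value = (1, 2, 3)
a.fields["menu_selection"].value = "load"   # must not propagate
b.merge(a.snapshot())
print(b.fields["cursor_position"].value)    # (1, 2, 3)
print(b.fields["menu_selection"].value)     # None
```

In the real system such updates travel over UDP between the two group members after registration with the gossip server; here `merge` is simply called directly to show which state crosses the network and which does not.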

5 Evaluation design

The remote collaboration application was evaluated in a distributed summer school project involving 13-14 year-old students in Siauliai, Lithuania, and Sankt Augustin, Germany. Three independent evaluations were carried out: summative pedagogical and usability evaluations involving questionnaires and interviews, and a formative evaluation involving synchronised video observation in the two locations. This discussion focuses on the formative evaluation, with a view to informing the future development of this and similar collaborative learning platforms.

No established methods or techniques are described in the literature for evaluating AR display systems in a remote collaboration scenario. Gutwin and Greenberg [23] distinguish between taskwork and teamwork: taskwork is no different for a group than for an individual, while teamwork is essentially the added effort of working together in a team. This distinction enables us to break the evaluation down into AR-related aspects on the one hand and collaboration-related aspects on the other. However, even for these partial aspects the literature offers no established evaluation approaches. An analysis of 266 AR-related publications [27] found that only 8% addressed some aspect of HCI and involved formal user-based experiments. On a similar note, a review of 45 groupware evaluations [28] found that only one quarter involved a real-world setting. Among the main barriers to groupware evaluation, and among the reasons for this "evaluation crisis facing distributed system development" [29], are logistical difficulties in collecting data and the number and complexity of variables to consider.

The formative evaluation described in this paper draws on the idea that traditional methods developed for the evaluation of single-user systems do not properly address collaborative aspects and are therefore inappropriate for producing design solutions for collaborative systems [29, 30]. With respect to quantitative versus qualitative methods, it has been argued that quantitative metrics have not only been elusive, but are also rarely good indicators on their own for improving collaborative systems [29]. By contrast, naturalistic user-based methods are seen as the most promising for formative evaluations [31], and are acknowledged to significantly improve evaluation results [32]. The formative evaluation described here was therefore based on a naturalistic approach involving in-depth observation of the remote collaboration between students by a panel of usability experts.

5.1 Data collection

Two researcher teams were deployed to video record the remote collaboration between students in the two summer school locations. In each location, a front camera recorded participants' gestures and facial expressions, and a rear camera recorded the projection surface of the AR display to capture what participants could see at any given time (Fig. 3). To facilitate editing and analysing the resulting material, the cameras were synchronised via a common time server on the Internet, and replicated each other's viewpoints and zoom settings.

Fig. 3. Synchronised video observation in two locations recording collaborative sessions.
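Because both sites stamped their recordings against a common time server, frames from the two locations can later be paired by wall-clock time. The following is a minimal sketch of such nearest-timestamp pairing; the tolerance value and all names are assumptions, not part of the project's tooling.

```python
# Sketch of pairing frames from two recordings by shared wall-clock
# timestamps. Timestamps are in seconds; the tolerance is an assumption.

def align(frames_a, frames_b, tolerance=0.02):
    """Pair each frame of stream A with the nearest-in-time frame of B.

    frames_*: time-sorted lists of (timestamp, frame_id) tuples.
    Only pairs closer than `tolerance` seconds are kept.
    """
    pairs, j = [], 0
    for t_a, frame_a in frames_a:
        # advance while the next B frame is at least as close to t_a
        while (j + 1 < len(frames_b)
               and abs(frames_b[j + 1][0] - t_a) <= abs(frames_b[j][0] - t_a)):
            j += 1
        t_b, frame_b = frames_b[j]
        if abs(t_b - t_a) <= tolerance:
            pairs.append((frame_a, frame_b))
    return pairs

site_a = [(0.00, "a0"), (0.04, "a1"), (0.08, "a2")]
site_b = [(0.01, "b0"), (0.05, "b1"), (0.20, "b2")]
print(align(site_a, site_b))  # [('a0', 'b0'), ('a1', 'b1')]
```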

Recording the remote collaboration sessions from both ends posed a number of challenges. The AR display is stereoscopic: with the help of shutter glasses, the AR display system produces separate images for each eye, which are merged by the user's brain into one 3D image. Although the video cameras picked up only the double image on the projection surface, not the 3D image seen by a user, the resulting material still gave a sufficiently accurate idea of what users actually saw. Another challenge was the semi-dark environment required by the specific sensing mechanism of the AR display. While this was not a problem for the rear view recording the projection surface, the front camera recording participants' gestures and facial expressions had to employ a special night-view mode in order to capture sufficient detail.

A total of eight collaborative sessions were recorded over two days, involving 16 students in each summer school location. Collaborative sessions were between 22 and 58 minutes in length.

5.2 Data analysis

To prepare the video material for analysis, a combined view was produced for each collaborative session, merging the front and rear views from both locations into a single screen. The resulting 4-in-1 overviews (Fig. 4) comprehensively document each collaborative session from both ends, offering more detail about communication and collaboration issues than traditional observation techniques, which document only one side, leaving evaluators to speculate about what happens at the remote end.

Fig. 4. The 4-in-1 overview created from the video material for each collaborative session.
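The 4-in-1 composition itself is a simple tiling of four time-aligned frames. The sketch below models frames as 2D lists of pixel values purely for illustration; a real pipeline would of course use video editing tools rather than Python lists.

```python
# Sketch of the 4-in-1 overview: tiling four equally sized, time-aligned
# frames (front/rear view from each site) into one 2x2 composite frame.
# Frames are modelled as 2D lists of pixel values for illustration.

def four_in_one(front_a, front_b, rear_a, rear_b):
    """Tile four frames into a 2x2 grid: front views on top, rear below."""
    top = [row_a + row_b for row_a, row_b in zip(front_a, front_b)]
    bottom = [row_a + row_b for row_a, row_b in zip(rear_a, rear_b)]
    return top + bottom

def frame(label, size=2):
    """A tiny uniform dummy frame labelled with its source."""
    return [[label] * size for _ in range(size)]

overview = four_in_one(frame("FA"), frame("FB"), frame("RA"), frame("RB"))
for row in overview:
    print(row)
```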

With a view to the hardware ergonomics of the AR display, two additional videos were produced showing, for each location, how students prepared for their collaborative session: taking their seat at the AR display, putting on headphones and shutter glasses, adjusting the equipment and checking the interaction devices.

The data analysis involved a panel of four usability experts from the Interactive Technologies Research Group at the University of Brighton watching the edited video material for each collaborative session. Critical scenes were reviewed and watched again as required to better understand the usability problems at hand. Notes were taken during the screening and then compared and discussed after each session. Guiding the analysis and discussion were three sets of usability heuristics, which the evaluators had discussed beforehand and had available in printed form, in order to provide a common reference frame. Each set of heuristics covered different aspects of the formative evaluation: remote collaboration in a shared workspace, augmented reality specific aspects, and general usability heuristics to complement the first two more specialised sets.

The heuristics relating to remote collaboration in a shared workspace were based on the assumption that small groups need to perform certain low-level actions and interactions in order to collaborate effectively: these include communication, planning, monitoring, assistance, coordination, and protection [23]. As insufficient support for these mechanics of collaboration causes usability problems, groupware usability can be defined as "the degree to which a groupware system supports the mechanics of collaboration for a particular set of users and a particular set of tasks" [23]. A detailed description can be found in [33].

Guidelines relating to AR-related aspects draw on the idea that the specific hardware and software required by AR displays present usability issues relating to hardware ergonomics, software robustness, display, and interaction quality. As currently no set of common design guidelines exists for the development of AR systems [34], the guidelines used in the evaluation draw on a range of sources including VR usability heuristics [35], a taxonomy of mixed reality visual displays [36], a previous usability evaluation of the Studierstube AR system [37], and previous project experience [15, 16]. The resulting five heuristics are shown in Table 1.

Table 1. Five design guidelines for AR displays synthesised from [34, 35, 36, 37, 15, 16]

1. Reproduction quality - a user should be unaware that overlaid objects are virtual. Reproduction of virtual objects should be in real-time, high-fidelity, 3D animation.
2. Registration - users should not perceive gaps or discrepancies between real objects and augmented content; virtual objects should be fully aligned with the real world.
3. Realistic feedback - the effect of users' actions on virtual objects should be instantly visible and conform to the laws of physics and the user's perceptual expectations.
4. Technical robustness - systems should be reliable and consistent, i.e. avoid freezes, crashes, and frequent need to re-calibrate.
5. Hardware ergonomics - display components should not create physical discomfort for users, e.g. accommodation problems, badly fitting helmets or headphones, eyestrain, cyber sickness.

The third set of heuristics, relating to general issues, is Nielsen's well-known list of ten usability heuristics [38]. These guidelines complement the heuristics focusing on collaboration and the design guidelines for AR systems. They cover usability aspects of taskwork [23], equally applicable to group and individual work, together with more traditional concepts and GUI components used in the AR application. A detailed description can be found in [38].

The evaluation took place over two days and was based on six videos of scheduled collaborations plus two videos of students getting ready for their collaborative session in the two locations.

6 Evaluation results

Analogous to the three sets of heuristics informing the expert evaluation of the video material, usability problems are presented in three sections relating to remote collaboration issues, AR-related issues, and general usability issues. The presentation of results is rounded off by a general discussion covering all three aspects.

6.1 Remote collaboration

The AR remote collaboration prototype provides an audio channel but no video channel for explicit communication, which exacerbated language problems between collaborating students from different countries. It also led to monitoring and awareness problems: at the start of collaborative sessions, for example, students had to repeatedly ask (without getting a response) whether their partner was present in the remote AR display. The additional provision of a video channel would enable audiovisual communication and thereby improve support for explicit communication, monitoring and awareness.

The prototype offered no functionality to synchronise perspectives between collaborating students, which led to a whole range of problems relating to consequential communication, coordination of action, monitoring and assistance. The video evidence suggests that some students were not aware of the independent perspectives, which further exacerbated the problem. Others, however, seemed aware of their decoupled perspectives, and one pair of students even managed to work around the problem by synchronising their views manually, using the audio channel and their mutually visible pointing devices as a common reference. An operation mode allowing synchronisation of perspectives, and, in addition, resetting the perspectives to a common default when objects are loaded, would significantly improve support for remote collaboration.

Students using the prototype were not aware of each other's control and menu actions, resulting in problems regarding coordination of action and monitoring, e.g. student A inspects an object in the shared workspace while student B loads a new object into the shared workspace. Making remote control and menu actions visible to collaborators and enabling a veto on certain operations (e.g. loading a new object into the shared workspace) would improve the support for coordination of action, monitoring and awareness.

6.2 AR display

The video material did not allow a direct evaluation of the criteria of reproduction quality, 3D registration and realistic feedback, as it showed only a 2D representation of the 3D image seen by the user. The complete absence of participants' comments relating to these issues suggests, however, that these aspects are implemented in a satisfactory way. Similarly, there was no evidence of any system freezes or crashes, and neither system had to be re-calibrated during operation, suggesting an overall high technical robustness of the prototype.

A wide range of issues relating to hardware ergonomics was observed in the video material. Some of these relate to specific products used in the AR display (e.g. accommodation problems with shutter glasses, headphones), while others relate more generally to the design and technology of the AR display (e.g. the semi-transparent mirror perceived as obstructing the line of view).

6.3 General issues

The AR remote collaboration prototype offers no undo/redo functionality, which reduced user control and freedom, and led some students to accept unsatisfactory sculpting results rather than undoing their actions and trying an alternative approach. Another issue is that the prototype offers no preview of the currently selected object when loading 3D objects into the workspace, reducing the visibility of system status and in some cases leaving students unable to find previously saved 3D objects. Finally, it was observed that students preferred to request assistance from each other, from their supervising teacher or from researchers present in the room, suggesting that the prototype's inbuilt help screen does not fulfil its purpose.

6.4 Discussion

The description of usability problems in the previous sections aims to inform the future development of the ARiSE platform and emphasises aspects that could be improved. However, this must not distract from the fact that the overall impression of the prototype was positive: students showed high acceptance of the technology and engaged in lively discussions. Over large sections of the collaboration process, the video observation showed that learners were fully immersed in discussions, without significant distractions caused by the technology surrounding them. Overall, the prototype seems well suited for remote collaboration, with some weaknesses being balanced by strong points of the platform. The combination of a 3D sculpting tool, shared interactive workspace and additional audio channel supports most of the mechanics of collaboration [23], with particularly strong support for intentional communication (verbal, remote cursor gestures) and consequential communication based on the manipulation of shared artefacts (artefact feed-through [33]). The video analysis consistently showed that collaborative sessions became more animated, communicative and interactive when students used the sculpting tool to explain issues and complete collaborative tasks.

AR-specific usability problems overwhelmingly concern hardware ergonomics. These problems suggest that the system would benefit from more user involvement in the design process, and from exploring emerging lightweight technologies as alternatives to the current display design.

There is a substantial overlap between the general usability heuristics [38] used in the evaluation and the more specialised guidelines for remote collaboration and AR-specific aspects. The general usability issues identified relate mainly to control and support aspects of the prototype that interface with the underlying operating system and are therefore based on standard GUI concepts.
It can be expected that in the future these metaphors will be replaced by concepts more appropriate to the AR context.

While many of the described problems were observed consistently across all sessions during the video analysis, it was also evident that their impact on the collaboration was ultimately limited: participants naturally worked around these issues in order to get on with their session. This confirms similar observations in the literature [23] about users' resilience in adapting their interactions to overcome usability issues and succeed with their task.

7 Conclusions and further work We have developed an Augmented Reality system that supports remote collaboration of learners through a shared virtual space. Based on a pedagogically driven reference scenario of a learning unit, we have implemented a simple prototypical AR application for using AR in schools and evaluated it in a field test under classroom–similar conditions. While summative evaluations [3] found a high acceptance rate among students and teachers and confirmed the pedagogical effectiveness of the prototype AR application, the formative evaluation resulted in a number of recommendations informing future development: sharing viewpoints could simplify referencing problems in communication (see also [9]), video-conferencing features could avoid uncertainties related to remote presence and traditional HumanComputer Interaction features such as Undo could improve ease of use. Overall however, the formative evaluation found that the prototype is well suited for hands-on remote collaboration, and that minor implementation issues are more than compensated for by students' resilience and motivation to complete their collaborative tasks in the shared AR space. As additional features increase system complexity, reducing the charm of directness and simplicity seen in the current implementation, additional research is needed to solve the trade-off between usefulness of features and directness of AR interaction. The Spinnstube® AR system has been shown to be a useful tool and test-bed for AR applications in school environments. However, our evaluation has shown a number of usability issues of the workplace setup. These shortcomings will be solved in an upcoming design revision of the hardware infrastructure. The evaluation shows that AR technology can be a beneficial and learnermotivating addition to classroom learning. 
The results also give examples of how learners can develop their own problem-solving strategies to work around existing communication shortcomings using only the conceptually simple tools at hand.

Acknowledgements

We thank the other members of the ARiSE project; the design, development and evaluation of the work described in this paper were a joint effort of all project partners. Furthermore, we wish to thank the participating students of the Rabanus-Maurus-Gymnasium Mainz, Germany, and the Juventa Basic School, Šiauliai, Lithuania, for their enthusiastic participation and willingness to do extra work in their free time. The ARiSE project was co-funded by the European Commission within the Sixth Framework Programme (contract number IST-027039). Last but not least, we thank Jürgen Wind, formerly with the Fraunhofer-Gesellschaft, who managed the ARiSE project between 2006 and 2008.

References

1. ARiSE Project home page, http://www.arise-project.org
2. Wind, J., Riege, K. and Bogen, M.: Spinnstube: A Seated Augmented Reality Display System. Proceedings of the 13th Eurographics Symposium on Virtual Environments / 10th Immersive Projection Technology Workshop, July 15-18, Weimar, Germany. Aire-la-Ville: Eurographics Association, (2007)
3. Lamanauskas, V., Pribeanu, C. and Pemberton, L.: Deliverable D5.1: Report on the Methods and the Usability Including Report on All Summer Schools. Public deliverable of the ARiSE project. [online] http://www.arise-project.org – downloads section (2009)
4. Reitmayr, G. and Schmalstieg, D.: Mobile Collaborative Augmented Reality. Proceedings of the IEEE and ACM International Symposium on Augmented Reality (ISAR 2001), October 29-30, New York, NY, USA, (2001)
5. Ohshima, T., Satoh, K., Yamamoto, H. and Tamura, H.: AR2Hockey: A Case Study of Collaborative Augmented Reality. Proceedings of the IEEE Virtual Reality Annual International Symposium, March 14-18, Atlanta, Georgia, USA, (1998)
6. Regenbrecht, H. T. and Wagner, M. T.: Interaction in a Collaborative Augmented Reality Environment. In: CHI '02 Extended Abstracts on Human Factors in Computing Systems, April 20-25, Minneapolis, Minnesota, USA, (2002)
7. Müller, D. and Erbe, H.-H.: Collaborative Remote Laboratories in Engineering Education: Challenges and Visions. In: Gomes, L., Garcia-Zubia, J. (eds.): Advances on Remote Laboratories and E-Learning Experiences. Bilbao, Spain: University of Deusto, (2007)
8. Bruns, W.: Hyper-Bonds – Distributed Collaboration in Mixed Reality. Annual Reviews in Control, Oxford: Elsevier, (2005)
9. Chastine, J. W., Nagel, K., Zhu, Y. and Yearsovich, L.: Understanding the Design Space of Referencing in Collaborative Augmented Reality Environments. Proceedings of Graphics Interface 2007, May 28-30, Montreal, Canada. New York, NY: ACM, (2007)
10. Seichter, H.: Augmented Reality and Tangible Interfaces in Collaborative Urban Design. Proceedings of the 12th International CAAD Futures Conference: Integrating Technologies for Computer-Aided Design, July 11-13, University of Sydney, Sydney, Australia, (2007)
11. Billinghurst, M.: Augmented Reality in Education. New Horizons for Learning. [online] http://it.civil.aau.dk/it/education/reports/ar_edu.pdf (2002)
12. Piaget, J.: The Science of Education and the Psychology of the Child. New York: Grossman, (1970)
13. Spiro, R.J., Coulson, R.L., Feltovich, P.J. and Anderson, D.K.: Cognitive Flexibility Theory: Advanced Knowledge Acquisition in Ill-Structured Domains. In: Patel, V. (ed.): Proceedings of the 10th Annual Conference of the Cognitive Science Society. Hillsdale, NJ: Erlbaum, (1988)
14. Bruner, J.: Going Beyond the Information Given. New York: Norton, (1973)
15. Lamanauskas, V., Vilkonis, R. and Bilbokaite, R.: Pedagogical Evaluation of the Augmented Reality Platform. Internal Report on Task 5.3 in the ARiSE project. Šiauliai University, Lithuania, (2008)
16. Pribeanu, C., Balog, A. and Iordache, D.: Usability Evaluation Summer School 2007. Unpublished Report on Task 5.1 in the ARiSE project. National Institute for Research and Development in Informatics, Bucharest, Romania, (2008)
17. Bandura, A.: Social Learning Theory. New York: General Learning Press, (1977)
18. Vygotsky, L.S.: Mind in Society. Cambridge, MA: Harvard University Press, (1978)
19. Salomon, G. (ed.): Distributed Cognitions: Psychological and Educational Considerations. Cambridge: Cambridge University Press, (1993)
20. Roschelle, J., Rosas, R. and Nussbaum, M.: Towards a Design Framework for Mobile Computer-Supported Collaborative Learning. In: Proceedings of the 2005 Conference on Computer Supported Collaborative Learning, Taipei, Taiwan, pp. 520-524, (2005)
21. Ryokai, K., Vaucelle, C. and Cassell, J.: Virtual Peers as Partners in Storytelling and Literacy Learning. Journal of Computer Assisted Learning, 19(2), pp. 195-208, (2003)
22. Pask, G.: Conversation, Cognition, and Learning. New York: Elsevier, (1975)
23. Gutwin, C. and Greenberg, S.: The Mechanics of Collaboration: Developing Low Cost Usability Evaluation Methods for Shared Workspaces. Proceedings of the 9th International Workshops on Enabling Technologies: Infrastructure for Collaborative Enterprises (WET ICE'00), (2000)
24. Kuck, R., Wind, J., Riege, K. and Bogen, M.: Improving the AVANGO VR/AR Framework: Lessons Learned. In: Schumann, M. (ed.) et al.: Virtuelle und Erweiterte Realität: 5. Workshop der GI-Fachgruppe VR/AR, Aachen: Shaker (Berichte aus der Informatik), pp. 209-220, (2008)
25. The Ensemble Distributed Communication System – a group communication toolkit developed at Cornell University and the Hebrew University of Jerusalem. [online] http://www.cs.technion.ac.il/dsl/projects/Ensemble/
26. Skype home page, http://www.skype.com
27. Swan, J.E. and Gabbard, J.L.: Survey of User-Based Experimentation in Augmented Reality. Proceedings of the 1st International Conference on Virtual Reality, July 22-27, Las Vegas, Nevada, (2005)
28. Pinelle, D. and Gutwin, C.: A Review of Groupware Evaluations. Proceedings of WET ICE 2000, Workshops on Enabling Technologies: Infrastructure for Collaborative Enterprises, IEEE Computer Society, pp. 86-91, (2000)
29. Neale, D.C., Carroll, J.M. and Rosson, M.B.: Evaluating Computer-Supported Cooperative Work: Models and Frameworks. In: Proceedings of CSCW 2004: Conference on Computer-Supported Cooperative Work. ACM Press, New York, pp. 368-377, (2004)
30. Baker, K., Greenberg, S. and Gutwin, C.: Empirical Development of a Heuristic Evaluation Methodology for Shared Workspace Groupware. In: Proceedings of the 2002 ACM Conference on Computer Supported Cooperative Work, New Orleans, Nov., ACM Press, pp. 96-105, (2002)
31. Steves, M., Morse, E., Gutwin, C. and Greenberg, S.: A Comparison of Usage Evaluation and Inspection Methods for Assessing Groupware Usability. In: Proceedings of the 2001 International ACM SIGGROUP Conference on Supporting Group Work. ACM Press, pp. 125-134, (2001)
32. Pinelle, D. and Gutwin, C.: Groupware Walkthrough: Adding Context to Groupware Usability Evaluation. In: Proceedings of the 2002 SIGCHI Conference on Human Factors in Computing Systems. ACM Press, pp. 455-462, (2002)
33. Baker, K., Greenberg, S. and Gutwin, C.: Heuristic Evaluation of Groupware Based on the Mechanics of Collaboration. In: Little, M. and Nigay, L. (eds.): Engineering for Human-Computer Interaction, LNCS Vol. 2254, pp. 123-139, Springer, (2001)
34. Dünser, A., Grasset, R., Seichter, H. and Billinghurst, M.: Applying HCI Principles to AR Systems Design. MRUI'07: Second International Workshop at the IEEE Virtual Reality Conference, Charlotte, North Carolina, USA, (2007)
35. Sutcliffe, A. and Gault, B.: Heuristic Evaluation of Virtual Reality Applications. Interacting with Computers, 16, pp. 831-849, (2004)
36. Milgram, P. and Kishino, F.: A Taxonomy of Mixed Reality Visual Displays. IEICE Transactions on Information and Systems (Special Issue on Networked Reality), vol. E77-D, no. 12, pp. 1321-1329, (1994)
37. Kaufmann, H. and Dünser, A.: Summary of Usability Evaluations of an Educational Augmented Reality Application. In: Shumaker, R. (ed.): HCI International Conference, Beijing, China, pp. 660-669, (2007)
38. Nielsen, J.: Heuristic Evaluation. In: Nielsen, J. and Mack, R.L. (eds.): Usability Inspection Methods, John Wiley & Sons, New York, NY, (1994)
