Haptic Interaction in Virtual Reality Environments through Android-based Handheld Terminals
Antonella Arca, Juan Luis Villalar, Jose Antonio Diaz-Nicolas, and Maria T. Arredondo
Life Supporting Technologies, Technical University of Madrid, ETSI Telecomunicacion, Ciudad Universitaria, 28040 Madrid, Spain
{aarca, jlvillal, jadiaz, mta}@lst.tfo.upm.es
Abstract. The integration of intuitive haptic interaction for navigating within virtual spaces may be a key factor in spreading 3D-related technologies in Ambient Assisted Living environments. This paper presents a case study that uses an Android-based portable terminal to evaluate low-cost solutions, derived from virtual and mixed reality, for everyday life applications.
Keywords: haptics, virtual reality, mixed reality, handheld terminals, Android.
1 Introduction
After a couple of decades of frustrated expectations about an incipient boom [1], Virtual and Mixed Reality (VMR) finally seems to be receiving its definitive expansion impulse in a wide variety of domains. Formerly, VMR settings relied on expensive, non-standard, dedicated infrastructure (customised 3D caves, autostereoscopic screens, high-performance graphics clusters, head-mounted displays, wired gloves, motion trackers, ad hoc haptic gadgets, specialised CAD software, etc.), which was affordable only for a select number of stakeholders, principally large private companies and research institutes with substantial investment capacity. Nowadays, standardisation is turning out to be essential for VMR, converging mostly on the ISO standard X3D, the successor to VRML [2]. At the same time, personal computers have considerably increased their processing, graphics and connectivity capabilities and are now prepared for basic 3D management. Furthermore, the open-source community is taking advantage of the Internet to boost the design, reuse and combination of 3D applications and scenarios [3]. These facts are bringing VMR technologies closer to the general population, even in their habitual environments. However, some research effort is still required to provide feasible VMR interface devices that are truly usable, interoperable, intuitive and inexpensive. This paper presents a case study exploring the adoption of existing multimodal equipment for haptic interaction with virtual spaces. The underlying hypothesis
of this work states that the incorporation of convenient interaction devices into the mass market will pave the way for designing, implementing, evaluating and deploying low-cost VMR-based solutions for a wide range of everyday life applications (e.g. training and education, therapeutic recreation, info-accessibility, telepresence).
2 Evaluation scenario
The research group in which this work is being carried out has long experience investigating the usability and accessibility of both hardware and software interfaces in several application domains, such as education, healthcare and domotics [4]. Following this line, the group participates in the VAALID initiative, which aims to provide VMR-based tools that facilitate the design of accessible solutions for ambient intelligence environments addressing the everyday problems of elderly people [5]. From the VMR perspective, two pillars are essential in VAALID: software infrastructure and user interaction devices. On the one hand, the software infrastructure is based on the InstantReality framework [6], developed at Fraunhofer IGD in close cooperation with industry, in compliance with the VRML/X3D standards. On the other hand, since pre-validation results are to be extrapolated, to some extent, from the virtual environment, intuitive interaction techniques are needed so as not to distort the users' feeling of immersion. Taking advantage of the simplicity and flexibility of InstantReality, research is being carried out to leave behind the traditional VMR interaction gadgets in favour of original multimodal approaches derived as much as possible from existing widespread devices. In this sense, the market penetration of mobile phones and PDAs is undoubtedly notable worldwide, although with diverse impact depending on the population group. In particular, the most recent generation of smartphones adds three-dimensional movement detection (through 3-axis accelerometers and compasses) to traditional touchscreens, trackballs, vibrators and voice recognition systems. By using just one general-purpose device, which is portable, wireless and, above all, integrated, the interaction possibilities with 3D scenarios increase considerably. Furthermore, the introduction of Android, the mobile operating system maintained by the Open Handset Alliance [7], has significantly reduced the development effort for new smartphone applications, providing high-level Java libraries for managing most hardware and software components without deep programming knowledge. As a result, a special setting was prepared to perform some technical and usability tests with a small group of users:
– Hardware: The core platform ran on a personal computer of average specifications. A panoramic 40" flat-screen TV was used for visualisation; this 2D display was preferred over dedicated VMR devices because of its relatively low cost and its widespread presence in homes. An Android-based smartphone (HTC Magic) was selected for multimodal interaction, linked wirelessly to the computer through a Wi-Fi connection.
– Software: A complete 3D scene of an interactive house was provided by Fraunhofer IGD, comprising several ambiences, rooms and appliances. The scene was deployed in VRML and included the static representation of the scenario, the scripts defining the behaviour of dynamic objects and a virtual pointer to assist users in performing tasks. InstantReality was used to play the scene, supporting the integration of peripherals. An Android application was developed to manage multimodal interaction with users through the smartphone.
– Testing plan: Users were required to follow some guidelines autonomously, composed of an initial survey, a short familiarisation phase, four travelling tasks, one manipulation task and an overall questionnaire. Apart from the questionnaires, additional feedback came from time measurements and external supervision.
After considering different approaches, multimodal user interaction with the handheld device was defined as follows, focusing on haptic interfaces (an illustrative sketch of the handset-side logic is given after the list):
– Device rotation (i.e. forwards, backwards, clockwise and counter-clockwise): performs 3D movements within the virtual environment (respectively: advance, retreat, turn right and turn left).
– Finger dragging over the touchscreen: performs horizontal movements of the virtual pointer.
– Trackball rotation: performs vertical movements of the virtual pointer.
– Trackball click: alternately picks up/releases a particular virtual object.
– Vibrator: provides vibration feedback to the user when the virtual pointer collides with a virtual object.
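As an illustration of how this mapping could be realised on the handset, the following sketch shows a minimal Android activity (in Java, the language exposed by the platform at the time) that reads the orientation sensor, the touchscreen and the trackball, and forwards plain-text commands to the computer over UDP. The message format, host address, port, tilt thresholds and the class name InteractionClient are illustrative assumptions; the actual protocol between the smartphone application and InstantReality is not detailed in this paper.

```java
import java.io.IOException;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.view.MotionEvent;

/** Hypothetical Android client: maps device gestures to plain-text
 *  commands sent over Wi-Fi to the PC hosting the 3D scene. */
public class InteractionClient extends Activity implements SensorEventListener {

    // Assumed address/port of the PC-side listener (not specified in the paper).
    private static final String HOST = "192.168.1.10";
    private static final int PORT = 9000;

    private DatagramSocket socket;
    private InetAddress host;
    private SensorManager sensorManager;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        try {
            socket = new DatagramSocket();
            host = InetAddress.getByName(HOST);
        } catch (IOException e) {
            finish(); // no network connection, nothing to do
            return;
        }
        sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        sensorManager.registerListener(this,
                sensorManager.getDefaultSensor(Sensor.TYPE_ORIENTATION),
                SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    protected void onPause() {
        super.onPause();
        sensorManager.unregisterListener(this);
    }

    /** Device rotation: tilt and roll are mapped to advance/retreat/turn commands,
     *  emitted continuously while the device is held beyond the threshold. */
    public void onSensorChanged(SensorEvent event) {
        float pitch = event.values[1]; // forwards/backwards tilt
        float roll = event.values[2];  // clockwise/counter-clockwise rotation
        if (pitch < -25) send("MOVE FORWARD");
        else if (pitch > 25) send("MOVE BACKWARD");
        if (roll > 25) send("TURN RIGHT");
        else if (roll < -25) send("TURN LEFT");
    }

    public void onAccuracyChanged(Sensor sensor, int accuracy) { /* unused */ }

    /** Finger dragging: horizontal movement of the virtual pointer. */
    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_MOVE) {
            send("POINTER_H " + event.getX() + " " + event.getY());
        }
        return true;
    }

    /** Trackball rotation: vertical pointer movement; click: pick/release toggle. */
    @Override
    public boolean onTrackballEvent(MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_MOVE) {
            send("POINTER_V " + event.getY());
        } else if (event.getAction() == MotionEvent.ACTION_DOWN) {
            send("PICK_TOGGLE");
        }
        return true;
    }

    /** Fire-and-forget UDP datagram carrying a one-line command.
     *  Note: on modern Android versions this would have to run off the main thread. */
    private void send(String command) {
        byte[] data = command.getBytes();
        try {
            socket.send(new DatagramPacket(data, data.length, host, PORT));
        } catch (IOException e) {
            // transient network errors are ignored in this sketch
        }
    }
}
```

The vibration feedback on pointer-object collision would travel in the opposite direction: a small listener thread on the handset would receive collision notifications from the scene and call Vibrator.vibrate(); that return channel is omitted here for brevity.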
3 Results
As this preliminary trial addressed technical and usability issues only from a general perspective, no specific restrictions were imposed on the population sample. Tests were performed by 10 people, 7 males and 3 females, between 24 and 33 years old, all of them engineers without significant impairments. While all users had advanced computer skills, only 30% had experience with virtual environments (2 out of 3 related to videogames). Finally, most of them had little experience in testing products or software (less than once per year). It was observed that people who went through the familiarisation phase faster (1.36 min) spent more time completing the subsequent tasks; on the contrary, those who took their time in this warm-up (6 min) paid more attention to learning and could finish the tasks quicker. Regarding interaction intuitiveness, the users' first instinct for 3D travelling was to bring the smartphone closer to or move it away from their body, instead of rotating the device as explained in the guidelines. People were inclined to start the tests rapidly rather than read the instructions and try out each procedure. Turning left/right was the least intuitive movement for most users, mainly due to imprecise collisions with walls, doors, etc. One frequent observation was users' tendency to take the virtual pointer as a marker of their own location
in the 3D scene, so sometimes they tried to move the pointer, not the device, to change place. When exploring the environment, users got nervous because the pointer could go through walls and disappear temporarily, giving the sensation of being trapped in the house. To avoid these inconveniences, it was suggested to add a compass or a map in order to give a better idea of orientation within the house. Both the trackball and the vibrator were highly appreciated, although one of the users did not perceive any vibration. Whereas there was some confusion between the functionality of the touchscreen and the trackball (rotation), the trackball was widely preferred because of its faster control response. Of the five tasks defined, the one that usually took longest was manipulation, which consisted of grabbing a book from the floor of the virtual living room and releasing it on an adjacent table. Despite all the support provided, users had difficulties judging the distance between the virtual pointer and the other objects, which probably has much to do with the use of a 2D screen.
4 Conclusions
Considering the collected data, users assessed the procedure for travelling through the 3D environment as intuitive, even taking into account some initial disorientation. The tests highlighted that it would be more convenient to combine movement and viewing direction, as users often confused these controls. To avoid this phenomenon, the first step would be to correct any imprecise design in the virtual world, such as objects improperly passing through one another. Personalisation could also be helpful, for example in the sensitivity and progressiveness of the simulation controls. In order to reinforce immersiveness as well as manipulation feedback, some additional multimodal outputs could be beneficial, either visual (e.g. a colour change when the user holds an object) or acoustic (including realistic sounds like footsteps or door movements). To conclude, an innovative interaction method for VMR environments based on widespread handheld terminals has been implemented and evaluated in a preliminary stage. The proposed solution, focused on haptic interfaces, shows promising results from a two-fold perspective: the low effort required for multimodal application development and the positive assessment received from potential users regarding usability and intuitiveness. These outcomes seem to provide a good starting point for further research in this direction, although future steps should definitely involve elderly people, the target population of the VAALID project.
5 Acknowledgements
This research work has been partially funded by the European Union Seventh Framework Programme (FP7) in the context of the VAALID project (ICT-2007-224309), coordinated by SIEMENS S.A.
References
1. Wedde, H.F.: How and where do we live in virtual reality and cyberspace - Threats and Potentials. Rendiconti del Seminario Matematico e Fisico di Milano 67, 103-137 (1997)
2. Web3D Consortium: X3D/VRML standards, http://www.web3d.org/x3d/ (last access: Oct'09)
3. Carlsson, C., Hagsand, O.: DIVE - a platform for multi-user virtual environments. Computers & Graphics (Pergamon) 17(6), 663-669 (1993)
4. Jimenez-Mixco, V., De Las Heras, R., Villalar, J.L., Arredondo, M.T.: A new approach for accessible interaction within smart homes through virtual reality. In: Universal Access in Human-Computer Interaction. Intelligent and Ubiquitous Interaction Environments. LNCS, vol. 5615, pp. 75-81. Springer (2009)
5. VAALID Project: http://www.vaalid-project.org/ (last access: Oct'09)
6. Fraunhofer IGD: InstantReality 1.0 story - what is it. Technical report, http://www.instantreality.org/ (last access: Oct'09)
7. Open Handset Alliance: http://www.openhandsetalliance.com/ (last access: Oct'09)