EXMAR: EXpanded view of Mobile Augmented Reality

Sungjae Hwang, Hyungeun Jo, and Jung-hee Ryu
Graduate School of Culture Technology, KAIST

Figure 1. EXMAR supports exploration of the off-screen view. (a) The user's view when using EXMAR. (b) A magnification of (a). The user can explore off-screen points of interest, together with environmental contextual information, using a simple dragging gesture. Green dots represent augmented tags (view the sequence of images from left to right).

ABSTRACT

Many studies have tried to minimize the increase in psychological and physical load caused by mobile augmented reality systems. In this paper, we propose a new technique called "EXMAR", which enables the user to explore his or her surroundings with an expanded field of view, reducing physical movement. Through this novel interaction technique, the user can explore off-screen points of interest, together with environmental contextual information, using simple dragging gestures. To evaluate this initial approach, we conducted a proof-of-concept usability test under a set of scenarios: "Exploring objects behind the user", "Avoiding the invasion of personal space", and "Walk and type with front view." Through this initial examination, we found that users can explore off-screen points of interest and grasp spatial relations without an increase in mental effort. We believe this preliminary study gives a meaningful indication that an interactive field of view can be a useful way to decrease physical load without additional mental effort in mixed and augmented reality environments.

KEYWORDS: Augmented Reality, Mixed Reality, Distortion Correction, Expanded Field Of View (EFOV), Fisheye lens, Interaction

INDEX TERMS: H.5.1 [Information Systems]: Multimedia Information Systems — Artificial, augmented, and virtual realities

1 INTRODUCTION

Augmented Reality (AR) is a system that combines real and computer-generated information in the real world [5]. Over the past few years, as mobile devices have advanced rapidly, browsing information in mobile augmented reality has become an increasingly common practice. However, browsing for points of interest with handheld devices has a number of drawbacks. A major problem is the limited field of view and the lack of stereo support, which reduces depth cues [1]. For instance, the iPhone 3GS has a much smaller field of view (38.7 degrees horizontally and 50.1 degrees vertically) than the naked eye (nearly 180 degrees). This disadvantage also increases physical movement. To overcome it, many studies have focused on showing spatial cues for off-screen points of interest. However, these methods cannot convey the environmental relations between computer-generated tags and real objects, and occlusion by the overview occurs. Another problem is the invasion of personal space when using mobile devices to obtain someone's information in an AR environment. Recently, face-recognition-based prototypes for obtaining an augmented ID were introduced. However, this scenario depends on direct, gaze-based scanning of the face at personal distance, and we believe that pointing a mobile device directly at unknown people is not socially acceptable in real life. To overcome these two key issues, we propose a new interaction method, EXMAR, which consists of two novel techniques: the first is an expanded, undistorted field of view obtained from a fisheye lens; the second is an interaction method to dynamically change the field of view shown on the screen.
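The paper does not give code for the second technique, but the dragging interaction can be understood as panning a fixed-size viewport across the wider undistorted frame. The following C++ sketch illustrates that idea under our own assumptions; the function name, the sign convention, and the clamping behavior are ours, not the authors'.

```cpp
#include <algorithm>
#include <opencv2/core.hpp>

// Hypothetical sketch (not the authors' code): map a drag gesture to a
// viewport that pans across the wide undistorted frame.
cv::Rect panViewport(const cv::Size& frame,  // undistorted frame size
                     const cv::Size& view,   // on-screen viewport size
                     cv::Point origin,       // viewport position at touch-down
                     cv::Point drag)         // finger displacement in pixels
{
    // Dragging right reveals content on the left, hence the minus sign.
    int x = std::clamp(origin.x - drag.x, 0, frame.width  - view.width);
    int y = std::clamp(origin.y - drag.y, 0, frame.height - view.height);
    return cv::Rect(x, y, view.width, view.height);
}
```

Cropping the undistorted frame with the returned rectangle and scaling it to the screen would produce views like those in Figure 1; releasing the finger could, for example, snap the viewport back to the center to restore the ordinary forward-looking view.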



2 RELATED WORK

It is challenging to display off-screen or occluded points of interest in augmented reality [3]. To address this issue, a number of studies have focused on visualizing spatial cues for off-screen objects. Examples include a 2D radar shown from a top view and 3D arrows [4] pointing at off-screen objects. However, these approaches give no clue about the environmental relations between computer-generated tags and real objects, and occlusion by the overview occurs. Another approach to displaying off-screen areas is the World in Miniature (WIM) in augmented reality by Bell et al. [2]. This technique provides the user with a 3D miniature of the full-scale surrounding environment. Unlike 2D radar and 3D arrows, it allows users to grasp the spatial relations between computer-generated information and a model of the surrounding space. However, occlusion also occurs in the miniature view, and the method requires an additional mental transformation from the miniature to the real environment. Sandor et al. [3] displayed off-screen points of interest using an egocentric visualization. This type of egocentric display offers perceptual advantages that let users grasp spatial relations faster than an exocentric map display. However, it distorts space, which might cause cognitive dissonance.

Figure 2. The sequence of correcting the distorted image from the fisheye camera.

3 DESIGN OF EXMAR

3.1 Design Considerations

We designed EXMAR with physical and mental load in mind. To reduce the physical and mental effort caused by an augmented reality system, we introduced the concept of a dynamically expanded field of view.

3.2 Hardware and Software Design

Our prototype was implemented in Objective-C on an iPhone 3GS. To deliver the expanded field of view, we used a 180-degree fisheye lens (Digital King K180). Figure 2 shows the sequence of processing the image from the camera. First, we grab the preview frame from the fisheye lens; to increase performance, we process a lightweight preview frame. Next, we correct the distorted image using OpenCV. Finally, we show the center of the undistorted image on the screen at the same size as the original image. In Figure 3, (a) is an example of a distorted raw image from the fisheye lens, (b) is the undistorted image produced with OpenCV, and (c) is the final rectangular image shown on the screen of the mobile device. We made the final field of view the same as the original one.
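The paper does not include the pipeline code. Below is a minimal C++ sketch of the three-step pipeline using the modern OpenCV fisheye module; the file name, the intrinsic matrix K, and the distortion coefficients D are hypothetical placeholders (a real deployment would calibrate the K180 lens once and reuse the results), and the center crop is one plausible reading of "show the center of the undistorted image."

```cpp
#include <opencv2/opencv.hpp>

int main() {
    // Step 1: grab a preview frame. The file name stands in for the
    // live camera feed used in the paper's prototype.
    cv::Mat frame = cv::imread("fisheye_frame.png");
    if (frame.empty()) return 1;

    // Hypothetical intrinsics K and fisheye distortion coefficients D;
    // in practice these come from a one-time calibration of the lens.
    cv::Matx33d K(300.0, 0.0, frame.cols / 2.0,
                  0.0, 300.0, frame.rows / 2.0,
                  0.0, 0.0, 1.0);
    cv::Vec4d D(0.1, -0.05, 0.01, 0.0);

    // Step 2: correct the distorted image.
    cv::Mat undistorted;
    cv::fisheye::undistortImage(frame, undistorted, K, D, K);

    // Step 3: show the center of the undistorted image at the same
    // size as the original frame (center crop chosen for illustration).
    cv::Rect center(undistorted.cols / 4, undistorted.rows / 4,
                    undistorted.cols / 2, undistorted.rows / 2);
    cv::Mat view;
    cv::resize(undistorted(center), view, frame.size());

    cv::imshow("EXMAR view", view);
    cv::waitKey(0);
    return 0;
}
```

Combined with the viewport-panning sketch in the introduction, dragging would simply move the crop rectangle within the undistorted frame instead of always taking its center.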

4 USER TESTS

We conducted a quick-and-dirty test with seven participants (two women, five men; average age 26.6). During the test, participants interacted freely with EXMAR under three types of scenarios: "Exploring objects behind the user while sitting", "Avoiding the invasion of personal space", and "Walk and type with front view while holding the device horizontally." After the initial trials, we held a short interview with the participants. All of them said the technique was useful and comfortable. Two found it most useful in the "Walk and type" scenario; three found it most useful in the "Avoiding the invasion of personal space" scenario. Based on this feedback, we identified two important enhancements for our initial prototype. First, visual guidance is needed: users were sometimes confused about the initial position of the field while changing the field of view with dragging gestures. Second, the interaction method should be improved: some users complained about the repetitive gestures needed for exploration, which could be addressed by other interaction techniques such as tilting or accelerating the device.

5 CONCLUSION

We have presented our initial concept of an expanded view in mobile augmented reality and conducted a proof-of-concept usability test. While the EXMAR technique is at an early stage, we can conclude that users can explore off-screen points of interest and grasp spatial relations without an increase in mental effort. Furthermore, we found that users can explore their surroundings with less physical movement and avoid invading others' personal space. For future work, we plan to extend our ideas to other interaction techniques, such as tilting or accelerating the device rather than dragging on it. We also plan to compare this technique with additional related work and conduct a formal evaluation.

ACKNOWLEDGEMENTS

The authors wish to thank Comogard from Ben Gurion University for providing source code to access live video on the iPhone 3GS.

REFERENCES

[1] A. Henrysson, "Bringing Augmented Reality to Mobile Phones", Linköping Studies in Science and Technology Dissertations, No. 1145, 2007.
[2] B. Bell, T. Höllerer, and S. Feiner, "An Annotated Situation-Awareness Aid for Augmented Reality", Proc. UIST 2002 (Proceedings of the 15th Annual ACM Symposium on User Interface Software and Technology), pp. 213-216, 2002.
[3] C. Sandor, A. Cunningham, U. Eck, D. Urquhart, G. Jarvis, A. Dey, S. Barbier, M. R. Marner, and S. Rhee, "Egocentric Space-Distorting Visualizations for Rapid Environment Exploration in Mobile Mixed Reality", Proc. ISMAR 2009, pp. 211-212, 2009.
[4] M. Tönnis and G. Klinker, "Effective Control of a Car Driver's Attention for Visual and Acoustic Guidance towards the Direction of Imminent Dangers", Proc. ISMAR '06, 2006.
[5] R. T. Azuma, "A Survey of Augmented Reality", Presence: Teleoperators and Virtual Environments, vol. 6, no. 4, pp. 355-385, 1997.