MobileSurface: Interaction in the Air for Mobile Computing

Ji Zhao1, Hujia Liu1, Chunhui Zhang1, and Zhengyou Zhang2

1 Microsoft Research Asia, No.49, Zhichun Road, Hai Dian District, Beijing 100190, China
2 Microsoft Research, 1 Microsoft Way, Redmond, WA, USA
{jzha, v-hujliu, chhzhang, zhang}@microsoft.com

ABSTRACT

We describe a virtual interactive surface technology based on a projector-camera system connected to a mobile device. The system, named MobileSurface, can project images onto any free surface and enables interaction in the air within the projection area. The projector scans a laser beam rapidly across the projection area to produce a stable image at 60 fps. Camera-projector synchronization is used to capture the image of a designated scanning line. The system can therefore project what is perceived as a stable image onto the display surface while simultaneously working as a structured-light 3D scanning system.

ACM Classification: H.5.2 [Information interfaces and presentation]: User interfaces - input devices and strategies; Graphical user interfaces.

General terms: Design, Human Factors, Algorithms.

Keywords: Anywhere interaction, mobile, pico-projector, projector-camera system

INTRODUCTION

With the increasing use of pico-projector systems combined with cameras, we can realize a new mobile interaction technology that manipulates digital content without a screen, beyond what is possible with conventional computing devices such as desktop computers or mobile phones. There are many compelling applications of such systems. One is to manipulate digital objects directly with our fingers and hands, analogous to our interaction in the real world. Another is to enable further interactions in an augmented reality environment, bridging the digital and real worlds. 3D sensing technology can be applied to overcome the flat-surface limitation; it is also useful for gesture detection and tracking, and for user interaction in general. Notably, from a structural point of view, a projector-camera system based on a scanned laser beam is quite similar to a structured light system, a well-known 3D scanning technique in surface measurement. We therefore exploit this similarity in our implementation.

Figure 1: System hardware setup

MAIN FEATURES AND IMPLEMENTATIONS

3D scanning system during normal projection

The laser projector scans a color laser beam across the projection area line by line. The projector outputs a vertical synchronization signal that indicates when it starts to scan a new frame. Let the first scanning line start at T0 and the last scanning line finish at T1. Assuming there are N scanning lines in one frame, the start and end times of any scanning line can be calculated. If we start the camera exposure at the beginning of one scanning line and stop it at the end, we obtain the image of exactly that scan line. The dynamic scanning process is shown in Figure 2. 3D information can then be computed as described in [1].

Copyright is held by the owner/author(s). UIST '10, October 3–6, 2010, New York, NY, USA.

Figure 2: The dynamic scan process
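As an illustrative sketch of this per-line synchronization and of the structured-light triangulation it enables (not the authors' implementation: uniform line timing, a pinhole camera at the origin, and all names below are our assumptions):

```python
from dataclasses import dataclass

@dataclass
class FrameTiming:
    """Per-frame scan timing derived from the projector's vertical sync.
    Assumes the N scanning lines evenly divide the interval [t0, t1]."""
    t0: float      # time the first scanning line starts (seconds)
    t1: float      # time the last scanning line finishes (seconds)
    n_lines: int   # N scanning lines per frame

    def line_window(self, k):
        """Camera exposure window (start, end) for 0-based line k:
        open the shutter at the start of line k, close at its end."""
        dt = (self.t1 - self.t0) / self.n_lines
        return self.t0 + k * dt, self.t0 + (k + 1) * dt

def intersect_ray_plane(ray_dir, plane_n, plane_d):
    """Triangulate a point on the laser light plane (n . x = d) observed
    along a camera ray through the origin with direction ray_dir."""
    denom = sum(n * r for n, r in zip(plane_n, ray_dir))
    if abs(denom) < 1e-12:
        return None  # ray parallel to the light plane
    t = plane_d / denom
    return tuple(t * r for r in ray_dir)
```

With camera and projector calibrated against each other, exposing only during one line's window isolates that line, and every lit pixel then yields one 3D point on that line's light plane.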


EXPERIMENTS

Projection on a curved surface

In this experiment, we use a piece of paper printed with a grid to show the effect of adaptive distortion correction (ADC). The system models the paper with 6 scan lines, creating a surface mesh of almost 400 triangles. The surface model is updated 8.6 times per second.

Virtual pinching

To realize virtual pinching, we make some changes to the system: the camera alternates between a normal exposure time and a short exposure time. During normal exposure the system works as an image capture device; during short exposure it works as a line structured light 3D scanning system. We apply a robust, real-time computer vision algorithm [2] to detect users' pinch and release gestures. The algorithm converts the image to binary and uses a simple connected components analysis to identify the gesture. When the user's thumb and index finger form a U shape with similar orientation, the gesture is identified as a pinch.

After detecting the U-shaped pinch gesture, we dynamically place a scan line at the tip positions of the two fingers. We can then obtain the 3D positions of the thumb and index fingertips using the method introduced in the sections above.
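For illustration only: the paper's criterion is the U shape and orientation of the finger blobs, while the sketch below uses the related hole test from [2], in which a closed pinch encloses a background region; the function names and the 4-connected flood fill are our assumptions, not the authors' code.

```python
from collections import deque

def count_enclosed_holes(img):
    """Count background (0) regions fully enclosed by the hand mask (1).
    A pinch closes thumb and index into a loop, creating one such hole;
    an open hand produces none."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    holes = 0
    for y in range(h):
        for x in range(w):
            if img[y][x] == 0 and not seen[y][x]:
                # Flood-fill one background component, noting border contact.
                queue = deque([(y, x)])
                seen[y][x] = True
                touches_border = False
                while queue:
                    cy, cx = queue.popleft()
                    if cy in (0, h - 1) or cx in (0, w - 1):
                        touches_border = True
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and img[ny][nx] == 0 and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if not touches_border:
                    holes += 1
    return holes

def is_pinch(binary_img):
    # At least one enclosed background hole means the fingers have closed.
    return count_enclosed_holes(binary_img) >= 1
```

The hole test is attractive because it needs only one pass of connected components analysis, which is what makes real-time detection on a mobile-class processor plausible.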

Figure 3: Comparison of ADC effect

Figure 3 shows the effect of ADC. Without ADC, the content is distorted on a freely curved paper; with ADC, the content remains aligned and undistorted.

Writing in the air

As mentioned above, the system works as a dynamic structured light scanner: it detects objects that intersect the laser light plane, and the 3D positions of the intersection points can be calculated. With this feature, fingertips can be continuously detected and tracked.
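A minimal sketch of this per-line fingertip detection (our illustrative code, not the authors'; the threshold and minimum run length are assumed parameters): threshold the imaged scan line and take the center of each contiguous bright run as one tracked point, which naturally supports several simultaneous fingers.

```python
def detect_touch_points(row, thresh=128, min_run=2):
    """Given one row of pixel intensities imaged along the scan line,
    return the sub-pixel centers of contiguous bright runs. Each run
    corresponds to one object (e.g. a fingertip) crossing the light plane."""
    points, start = [], None
    for i, v in enumerate(row + [0]):  # trailing sentinel closes a final run
        bright = v >= thresh
        if bright and start is None:
            start = i                   # run begins
        elif not bright and start is not None:
            if i - start >= min_run:    # ignore single-pixel noise
                points.append((start + i - 1) / 2.0)
            start = None                # run ends
    return points
```

Running this on every short-exposure frame gives one set of point positions per frame, consistent with the 60 updates per second reported for the fixed-scan-line setup.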

Figure 4: Writing and multi-touch in the air

In Figure 4, the left image shows writing in the air with MobileSurface, and the right image shows multi-touch in the air. In this experiment, only one scan line is used, set at a fixed position. The system monitors any object that intersects the light plane defined by this scan line. With this setup, the system reports the positions of tracked points 60 times per second, enabling a smooth user experience.

Figure 5: With virtual pinching, the user can lift a virtual box and place it onto another.

As shown in Figure 5, by tracking the pinching gesture continuously, the system enables users to pinch a virtual object, move it, and lift it up or down in the 3D scene. When tracking of the pinch gesture is interrupted, the object is released and drops back to the virtual ground.

CONCLUSION

In this paper, we have introduced MobileSurface, a new type of virtual interactive surface technology for mobile computing. The main contributions of our work are as follows:
- 3D scanning while projecting
- Distortion-corrected projection on a free-shaped surface
- Interaction in the air

REFERENCES

1. Blais, F. Review of 20 years of range sensor development. J. Electron. Imag. 13, 1 (2004), 231–240.
2. Wilson, A.D. Robust computer vision-based detection of pinching for one and two-handed gesture input. In Proc. ACM UIST '06, Montreux, Switzerland, ACM Press (2006), 255–258.
