Projection Stabilizing Method for Palm-top Display with Wearable Projector

Teppei Konishi, Keisuke Tajimi, Nobuchika Sakata, Shogo Nishida
Division of Systems Science and Applied Informatics, Graduate School of Engineering Science, Osaka University, 1-3 Machikaneyama-cho, Toyonaka-city, Osaka 560-8531, Japan
{konishi,tajimi,sakata,nishida}@nishilab.sys.es.osaka-u.ac.jp
Abstract. Wearable projectors have been extensively studied with a focus on miniaturization. However, stabilization of the projected images is critical, for otherwise the information cannot be read when the user walks or runs. This paper introduces two methods of stabilization for images projected on the user’s hand: optical stabilization and gyroscopic stabilization. We conducted an experiment to quantitatively evaluate the stability achieved using both methods. The results show that both achieve sufficient stabilization while the user is walking. However, gyroscopic stabilization is more effective against large-impulse vibrations caused by footfalls. Finally, we propose an algorithm that switches between these two methods depending on the degree of vibration as measured by an accelerometer.

Key words: Mobile AR, Mobile Interface, Mobile Projector, Procams, Stabilization, Palm-top Display
1 Introduction
Advances in flat-panel LCD and organic EL (OEL) displays have made larger screens possible than ever before, and many researchers are developing HMDs and clip-on displays as means of viewing information anywhere and at any time. However, embedded LCDs have limited visible areas, while HMDs and clip-on displays tend to cause fatigue and discomfort when worn for long periods. As an alternative, this study examines the suitability of small projection devices, such as mobile LED projectors, for mobile display. Blasko [6] proposed a wrist-worn projector that projects information on a wall to increase the visible area. Yamamoto [3][4] proposed shoulder-worn projectors capable of projecting virtual buttons onto a user’s palm, allowing the palm to be used as a remote control device. ’SIXth Sense’ [7][8] bridges the gap between the digital and physical worlds by projecting digital information onto any nearby surface and allowing users to interact with it through natural hand gestures, arm movements, or by manipulating the surface itself. As stated above, embedding a projector in a mobile device allows information to be projected onto a real-world surface wider than the device itself. Furthermore, the user can interact with the projected information in the real world, and the need for bulky devices is eliminated. However, most studies have evaluated the efficacy of such systems only when users are stationary, ignoring complicated movements such as walking and running. Of course, users do not have to be stationary to view their cell phones; we often view information such as an SMS, a map, or a railway timetable on a cell phone while walking or running. Therefore, the purpose of this paper is to examine the efficacy of a mobile projector when viewed while walking and running.
2 Palm-top display
We propose a palm-top display that projects 2-D information onto the user’s palm via a mobile device (Fig. 1). We attach the device, which is composed of a camera and a projector, to the user’s shoulder, and place an ARToolKit [5] marker on the user’s wrist. The system measures the position of the ARToolKit marker, draws the information using OpenGL, changes the position of the image in accordance with the position of the marker, and then projects the image onto the user’s palm. We call this projection method ”normal”. In this paper, a palm with information projected onto it is called a palm-top display. Mayol [1] conducted a simulation to measure the level of decoupling between camera movement and the wearer’s posture and motions through a combination of feedback from inertial and visual sensors, as well as active controls; to maximize the field of view of the handling space and minimize camera movement while walking, a wearable visual robot with a two-axis actuator, several sensors, and a camera was placed on the shoulder. Kurata [2] likewise asserted, based on a MATLAB simulation, that the shoulder is a suitable place for mounting a pro-cam for real-world task support. Therefore, our device is strapped around the breast and mounted on the shoulder. However, it is difficult to find a flat projection area when the user is moving, so this system projects information onto the user’s palm. Although there is a slight occlusion problem caused by intervening objects and by the parallax between the camera and the palm, the palm is easy to use as a flat surface. Most studies [4][8] have focused on palm-top displays while users are in a stationary, upright position; this study focuses on the use of a palm-top display while the user is walking and running.
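The per-frame behavior of the ”normal” method described above (capture an image, locate the wrist marker, reposition the OpenGL image, project) can be sketched as follows. This is a minimal sketch: the camera, detector, and renderer objects and their method names are hypothetical placeholders standing in for the real camera driver, ARToolKit marker tracking, and the OpenGL/projector pipeline, not actual API calls.

```python
def normal_projection_step(camera, detector, renderer):
    """One frame of the 'normal' palm-top projection loop.

    camera, detector, renderer are placeholder objects standing in for
    the real camera driver, ARToolKit marker tracking, and the
    OpenGL/projector pipeline (the method names here are hypothetical).
    """
    frame = camera.capture()                   # grab a camera image
    marker_pose = detector.find_marker(frame)  # locate the wrist marker
    if marker_pose is None:
        return  # palm not visible; keep the last projected image
    # Move the OpenGL image so it lands on the palm, offset from the wrist marker.
    renderer.set_image_position(marker_pose.x, marker_pose.y)
    renderer.project()
```

Each camera frame thus drives one reposition-and-project step; without further filtering, any vibration of the device feeds directly into the drawn position, which motivates the stabilization methods in Section 4.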
3 The problem of stabilizing the projected image using sensors while moving
Projected information on the palm-top display may not be stable because the device bounces when the user moves. Even if the wearable projector is firmly fastened, the user’s footfalls when walking or running make the device vibrate. When the user is not moving, however, the device can successfully project information. We measured the vibration of the device in three conditions (stationary, walking in place, and running in place) using a gyroscope, an accelerometer, and an optical sensor on the user’s shoulder (Fig. 2). Each axis is defined in Fig. 1. The vibration from footfalls appears to be the most significant impediment to the successful projection of information on the user’s palm. Therefore, the intent of this study is to stabilize the projection of information while the user is walking and running.

Fig. 1. Palm-top display and definition of each axis (the mobile device, composed of a camera and a small projector, is worn on the breast; the palm position is recognized by ARToolKit; the X, Y, Z axes and roll, pitch, yaw rotations are defined relative to the device, with Y the relative distance between the marker and the camera)
Fig. 2. Sensor data in the three conditions (stationary, walking in place, and running in place). Top: gyroscope pitch, yaw, and roll angles (degrees) over 20 s, with footfalls visible as spikes. Middle: accelerometer readings in Y and Z (G); the sensor recognizes footfalls. Bottom: vision sensor, relative distance between the camera and the marker in Y (cm), showing tracking delay and intervals where the marker cannot be tracked; a close-up of the domain bounded by a dotted line is included.
Stabilizing Projection Method for Palm-top Display with Wearable Projector
相対距離(cm) Relative distance (cm)
4
55 50 45 40 35 30
Stationary
0
2000
Walking
4000
Time (ms) 時間(ms)
6000
8000 Normal LPF
Fig. 3. Sensor data in the two conditions (stationary and walking in place) relating to the relative distance between the marker and the camera (normal and LPF, Y-axis)
4 Projection stabilization method

In this section, we introduce two projection stabilization methods.

4.1 Optical stabilization using an LPF
Even with a 60-fps camera and a high-spec computer that can track in real time, the system cannot successfully track the marker, as shown in Fig. 2. The image is not projected at the correct position on the user’s palm because of the unavoidable delay introduced by capturing the image, recognizing the marker, and projecting the information while the user is moving. Furthermore, when high-frequency vibration occurs due to footfalls, this delay produces a kind of resonance between the projected image and the projector itself, which makes the projected image unstable. It is also unrealistic to assume that a mobile device will soon carry a higher-spec camera and computer than those used in this study, and in our experience the projection latency is not negligibly short. To solve this problem, we remove only the high-frequency vibration; if the system can remove this vibration, it can reduce the resonance caused by the delay. We therefore apply an LPF (low-pass filter) to the measured distance between the marker and the camera (Y direction). The position of the information is changed by OpenGL in conjunction with the filtered marker position, and the system then projects the information onto the user’s palm. The LPF’s cutoff frequency was set to about 5.5 Hz. As shown in Fig. 3, the LPF removes the high-frequency vibration. To investigate the effect of the filter on the system as a whole, we compared the LPF with the normal method, using the camera and projector shown in Fig. 4. We measured the spatial displacement between the correct position and the projected position in both cases (walking in place and running in place), as shown in Fig. 5. If the system could track in real time, it would project a red mark exactly on the real red mark; thus, the real red mark can be considered a representation of the ideally projected information.
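The filtering step described above can be sketched as a first-order IIR low-pass filter applied to each new marker-distance measurement. This is a minimal sketch assuming the paper’s 5.5 Hz cutoff and 60 fps camera; the class and parameter names are ours, not from the original system:

```python
import math

class LowPassFilter:
    """First-order IIR low-pass filter for the marker-camera distance."""

    def __init__(self, cutoff_hz=5.5, sample_rate_hz=60.0):
        # Standard RC-filter smoothing factor for the given cutoff frequency.
        rc = 1.0 / (2.0 * math.pi * cutoff_hz)
        dt = 1.0 / sample_rate_hz
        self.alpha = dt / (rc + dt)
        self.state = None

    def update(self, measured_distance):
        # The first sample initializes the filter state; afterwards the
        # state moves a fraction alpha toward each new measurement,
        # attenuating high-frequency footfall vibration.
        if self.state is None:
            self.state = measured_distance
        else:
            self.state += self.alpha * (measured_distance - self.state)
        return self.state
```

Each ARToolKit distance measurement is passed through `update()`, and the filtered value drives the OpenGL image position, so high-frequency footfall vibration is suppressed while slow, intentional hand movement passes through.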
We recorded the subject’s hand on video and measured the displacement from the video frames (50 frames; time: 4 s; the subject walked 9 steps and ran 16 steps during this time). To make this measurement easier, the subject wore a white glove to increase the visibility of the projected information. When the subject was walking (Fig. 7), the LPF reduced the average displacement of the normal method by half, and the LPF’s variance was smaller than normal’s. Similarly, when the subject was running, the LPF reduced the average displacement by about 6 mm compared to normal. These results show that the LPF projects information more stably than the normal method. However, the LPF cannot sufficiently stabilize the projected information when footfalls cause the device to vibrate.

Fig. 4. Experimental instrument:
- Gyroscope (InertiaCube3): 3 axes, data update rate 180 Hz; 26.2 (W) x 39.2 (D) x 14.8 (H) mm, 17 g
- Camera (Point Grey Inc. FireFly): 60 fps, shutter speed 6 ms; 50 (W) x 30 (D) x 30 (H) mm, 27 g
- Computer (Dell XPS): OS Windows Vista (32 bit); CPU Intel Core 2 Duo T8100 2.1 GHz; memory 2.00 GB; video card GeForce 8400 GS
- Accelerometer (Wii Remote): 3 axes, data update rate 0.5 Hz to 1600 Hz (X and Z), 0.5 Hz to 550 Hz (Y); 148 (W) x 36.2 (D) x 30.8 (H) mm, 87 g; Bluetooth
- Projector (ADTEC Inc. Bit): SVGA, 15 lm; 27 (W) x 58 (D) x 90 (H) mm, 147 g
- The device is fixed with a backpack and hook-and-loop fastener
Fig. 5. Experimental conditions (white gloves are worn for the experiments; a projected red mark is compared with a real red mark placed near the ARToolKit marker, with layout dimensions of 6.5 cm x 6.5 cm and a 2 cm offset, and the displacement between the projected mark and the real mark is measured)
4.2 Stabilization using a gyroscope
Optical stabilization comes with high computational costs, and it is difficult to stabilize the projected image with it when the user is walking or running. In addition, compared to an inertial sensor, the optical stabilization method suffers large errors from occlusion and poor lighting conditions. However, in good lighting conditions, optical tracking can precisely measure the position of the marker; when the user is moving, lighting conditions may change frequently. To solve this problem, we use both optical tracking and an inertial sensor. Regardless of occlusions, lighting conditions, or motion blur, a gyroscope and an accelerometer can accurately measure orientation and acceleration. However, gyroscopes and accelerometers have two major problems: drift and integral error. For this reason, we use optical tracking in good conditions and the gyroscope and accelerometer in poorer ones; we treat the moment when the user’s foot hits the ground as a poor condition for optical tracking. We use an InertiaCube3 as the accelerometer and gyroscope. The InertiaCube3 outputs attitudes and accelerations compensated by reference to a magnetic direction sensor, gravitational acceleration, and a proprietary algorithm; we use this compensated data as the actual attitudes and accelerations. Our proposed method applies the gyroscope to cancel the rotation caused by impulse vibration and thereby stabilize the projected information. As shown in Fig. 2, the pitch angle is larger than the roll and yaw angles; in this paper, to confirm the efficacy of the method, we stabilize the projected information using the pitch angle alone. The relative distance between the marker and the camera (Dis) is measured by ARToolKit in advance. The correct position is then calculated by Eq. 1 (Fig. 6). After that, the position of the image is changed by OpenGL according to the correct position, and the system projects the information onto the user’s palm. To confirm the efficacy of this method, we conducted an experiment under the same conditions as those presented in Section 4.1.
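The pitch-compensation step described above can be sketched as a single function applied to the drawing position each frame. This is a minimal sketch under our reading of the geometry in Fig. 6 (pitch angle in radians from the gyroscope, Dis pre-measured by ARToolKit); the function name is ours:

```python
import math

def stabilized_y(y, dis, pitch):
    """Shift the y drawing position to cancel the pro-cam's pitch rotation.

    y     : current y drawing position
    dis   : relative marker-camera distance, measured by ARToolKit in advance
    pitch : pitch angle of the device (radians), from the gyroscope
    """
    # Chord of the rotation arc (2 * Dis * sin(theta/2)), resolved along
    # the y axis via the chord's inclination (pi/2 - theta/2).
    return y - dis * 2.0 * math.sin(pitch / 2.0) * math.cos(math.pi / 2.0 - pitch / 2.0)
```

Note that by the identity 2 sin(θ/2) cos(π/2 − θ/2) = 2 sin²(θ/2) = 1 − cos θ, the correction as coded equals Dis(1 − cos θ), which vanishes when the device is not rotated (θ = 0).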
We found that when the user is walking, this method reduces the average displacement of the normal method by half (Fig. 7), and its variance is smaller than normal’s. When the user is running, this method reduces the average displacement by approximately 12 mm compared to normal and approximately 7 mm compared to the LPF. These results show that the method projects information more stably than the normal method; moreover, when the user is running, it projects a more stable image than the LPF method. However, this result holds only over short durations. In longer experiments, the gyroscope readings accumulate large drift and integral errors, so the image can no longer be projected correctly on the user’s palm; these errors cannot be corrected using only the gyroscope and the accelerometer.

y′ = y − Dis × 2 sin(θ/2) × cos(π/2 − θ/2)    (1)

4.3 Discussion
Fig. 6. Derivation of the stabilization equation (the pro-cam, at distance Dis from the palm along the Y-Z plane, is rotated by pitch angle θ; the projected position (y, z) of the vibrated pro-cam is shifted to the correct position (y′, z′) of the ideal pro-cam)

Fig. 7. Spatial displacement (walking in place and running in place). Average displacement while walking in place: normal 6.12 mm, LPF 3.57 mm, gyroscope 3.02 mm; while running in place: normal 37.9 mm, LPF 32.1 mm, gyroscope 25.2 mm.

The above results show that the gyroscopic and optical methods produce more stable images than the normal method. Comparing the optical stabilization method (with the LPF) to the gyroscopic method, both are about equally effective. The optical stabilization method, however, delays the projection when the user is in a stationary, upright position and intentionally moves his palm. As noted in Section 4.2, the optical sensor cannot track the palm in poor or changing lighting conditions; feature-point tracking and probabilistic distributions can only partly solve these problems. On the other hand, according to the above results, the optical and gyroscopic stabilization methods produce the same effects when the user walks for a short distance, and the gyroscopic method produces a more stable image than the optical method when the user runs for a short distance. Thus, we assert that the gyroscopic stabilization method is suitable for compensating for the vibrations that occur when the user runs for a short distance. Combining the gyroscopic and optical stabilization methods compensates for the weaknesses of the optical sensor (tracking errors and occlusion). Therefore, we propose an algorithm that changes the stabilization method depending on the degree of vibration and the lighting conditions: in the case of small impulse vibrations, such as when the user is stationary or moving slowly, the system applies the optical stabilization method; in the case of large impulse vibrations produced by running, it applies the gyroscopic stabilization method. We use an accelerometer to measure the vibration because, as shown in Fig. 2, the accelerometer samples at a high rate and can recognize the user’s footfalls. Essentially, the accelerometer is the trigger that switches the stabilization method.
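The accelerometer-triggered switching described above can be sketched as a simple selector that runs once per accelerometer sample. This is a minimal sketch in which the footfall threshold and the hold period after a footfall are our own illustrative assumptions, not values from the paper:

```python
class StabilizationSelector:
    """Choose a stabilization method from accelerometer readings.

    A large acceleration spike is treated as a footfall; for a short
    hold period afterwards the gyroscopic method is used, otherwise
    the optical (LPF) method is used.
    """

    OPTICAL = "optical_lpf"
    GYRO = "gyroscope"

    def __init__(self, footfall_threshold_g=1.5, hold_samples=30):
        # Threshold (in G) and hold duration are illustrative assumptions.
        self.threshold = footfall_threshold_g
        self.hold_samples = hold_samples
        self.remaining = 0

    def select(self, acceleration_g):
        # Re-arm the hold period on each detected footfall spike.
        if abs(acceleration_g) > self.threshold:
            self.remaining = self.hold_samples
        if self.remaining > 0:
            self.remaining -= 1
            return self.GYRO
        return self.OPTICAL
```

Calling `select()` on every accelerometer sample yields the method to apply for that frame, so the system falls back to optical stabilization shortly after the footfall impulse decays.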
5 Conclusion and future work
In this paper, we proposed a palm-top display and two projection stabilization methods. Both the optical and the gyroscopic stabilization methods were observed to effectively stabilize information projected on the user’s palm; in the case of impulse vibrations caused by running short distances, the gyroscopic method was more effective than the optical one. We therefore developed an algorithm that changes the stabilization method depending on the degree of vibration, using an accelerometer. In the future, we will conduct further experiments with this algorithm, combining qualitative and quantitative analyses, to evaluate the proposed stabilization method. We also plan to use the direction of the user’s eyes: we hypothesize that the user’s gaze stays fixed on the projected information as he moves, so the projected information could be placed along the gaze direction by modeling it. In addition, we must consider how much stabilization accuracy the projected content requires; the required accuracy should be proportional to the complexity of the projected information (i.e., small text, complex pictures, and movies). Moreover, we are considering having the user wear a glove that serves as a suitable screen to make the projected information easier to recognize, and we must therefore also research the relationship between user fatigue and the quality of the projected information. Furthermore, the color of the projected information must be changed depending on the user’s skin color.
References

1. Mayol, W.W., Tordoff, B., Murray, D.W.: Designing a miniature wearable visual robot. In: ICRA, pp. 3725-3730 (2002)
2. Kurata, T., Sakata, N., Kourogi, M., Okuma, T.: Toward the Realization of Interaction Using Nearby and Far Projection Surfaces with the BOWL ProCam. IPSJ SIG Notes, CVIM, pp. 1-8 (2006) (in Japanese)
3. Yamamoto, G., Xu, H., Sato, K.: PALMbit-Silhouette. In: Interaction 2008, pp. 109-116 (2008) (in Japanese)
4. Yamamoto, G., Nanbu, S., Xu, H., Sato, K.: PALMbit-Shadow: Accessing by Virtual Shadow. In: The 6th IEEE and ACM International Symposium on Mixed and Augmented Reality, Nara, Japan (2007)
5. ARToolKit Website, http://www.hitl.washington.edu/artoolkit/
6. Blasko, G., Feiner, S., Coriand, F.: Exploring Interaction with a Simulated Wrist-Worn Projection Display. In: Proceedings of the Ninth IEEE International Symposium on Wearable Computers, pp. 2-9, October 18-21 (2005)
7. Maes, P., Mistry, P.: The Sixth Sense. TED talk in Reframe session. In: TED 2009, Long Beach, CA, USA (2009)
8. Mistry, P., Maes, P., Chang, L.: WUW - Wear Ur World - A Wearable Gestural Interface. In: CHI ’09 Extended Abstracts on Human Factors in Computing Systems, Boston, USA (2009)