IdWristbands: IR-based User Identification on Multi-touch Surfaces

Tobias Meyer
University of Lübeck
Lübeck, DE
[email protected]

Dominik Schmidt
Computing Department, Lancaster University
Lancaster, UK
[email protected]

ABSTRACT

Multi-touch surfaces lend themselves to simultaneous use by multiple users. However, most systems are not able to distinguish touches of different users. This paper presents ongoing work on using wristbands, equipped with infrared LEDs, to provide touches with identities. The LEDs transmit coded light pulses as an identifier. We use a specific blinking pattern to determine the orientation of the wristband in order to reliably associate corresponding touches.

ACM Classification: H.5.2 Information interfaces and presentation (e.g., HCI): User Interfaces–Input devices and strategies (e.g., mouse, touchscreen)

General terms: Design, Human Factors, Algorithms

Keywords: Interactive tabletops, surface computing, multi-touch interaction, user identification

INTRODUCTION

Interactive surfaces are a compelling platform for co-located collaboration. Hence, many use cases in this context suggest applications that are controlled simultaneously by multiple users. At the same time, only few approaches allow distinguishing between touches of different users. Without this information, all input looks alike: it is impossible to tell apart interactions by different users. However, user identification enables a compelling set of interactions, such as multi-user-aware interfaces [7] or access control [5, 9].

In our approach, every user wears a wristband (Figure 1) with several attached infrared LEDs which transmit an identification code. This approach leverages the fact that many interactive surfaces use infrared cameras to detect touches (e.g., Microsoft Surface [2]). These cameras can see the LEDs' beams and allow the system to detect them in parallel to finger touches. The transmitted code is used to allocate a unique identifier to every wristband, and thereby to every finger touch registered in an area near it. To narrow down this area, the system uses multiple LEDs to determine the wristband's orientation and derive the likely location of the user's hand.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. ITS'10, November 7-10, 2010, Saarbrücken, Germany. Copyright 2010 ACM 978-1-4503-0399-6/10/11...$10.00.

Figure 1: User wearing an IdWristband: a Lilypad Arduino board with an ATmega 328 controls the two LEDs to send identification codes.
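The caption mentions that the microcontroller sends identification codes over the LEDs. As a rough illustration of such a scheme, the sketch below encodes a two-bit identifier as light pulses of different lengths; the concrete pulse and gap durations are our own assumptions, not the authors' values.

```python
# Illustrative sketch of encoding a wristband ID as light pulses of
# different lengths. Pulse/gap durations are assumptions for illustration.
SHORT_MS, LONG_MS, GAP_MS = 8, 24, 16

def encode_id(ident, bits=2):
    """Return a list of (led_on, duration_ms) steps for a small identifier."""
    steps = []
    for i in range(bits - 1, -1, -1):  # most significant bit first
        bit = (ident >> i) & 1
        steps.append((True, LONG_MS if bit else SHORT_MS))  # long = 1, short = 0
        steps.append((False, GAP_MS))  # dark gap separates pulses
    return steps
```

On the actual hardware, each step would translate into switching the LED pin and waiting for the given duration.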

In related work, Dietz and Leigh [3] propose an approach for determining the location and originator of a touch on an interactive surface by using a conductive grid. Up to four users are constantly wired to a receiver, and touches are identified and located by capacitive coupling. This system allows different users to be identified, but it restricts the users' ability to move around the table. Dohse et al. [4] suggest using hand tracking to differentiate touches from different hands. Their system uses a top-mounted camera to track the users' hands and associates the touches registered underneath a hand with it. Schmidt et al. [9] extend this approach by introducing a biometric hand-contour-based system. It allows users to identify themselves by laying their hands flat on the surface. While this does not require the users to be augmented, the system needs them to re-identify every time they move a hand off the table. While our approach requires users to wear wristbands, they can move around freely without restrictions. They are not required to stay in one position, and every touch can be identified. In addition, IdWristbands are designed to be used with existing camera-based systems. Roth et al. [6] follow a path similar to ours: instead of a wristband, their users wear a ring, equipped with a circuit board, on one of their fingers. Their approach mainly targets the authentication of users, while we concentrate on differentiating touches originating from different hands. Since we use two LEDs instead of one, we are additionally able to determine the hand's orientation.

SYSTEM

Hardware

Based on Schmidt [8], we use an FTIR table with a Point Grey Grasshopper camera for touch and LED detection. The camera is equipped with an infrared filter and allows querying frames at a frequency of 120 Hz. The wristband is a regular sports wristband equipped with Lilypad Arduino circuits [1] and a small battery. The Lilypad Arduino incorporates an ATmega 328, which we use to control the LEDs. The LEDs have a wavelength of 875 nm and are sanded to diffuse the light, making them visible from broader angles.

Identification

The wristbands use a simple code consisting of light pulses of different lengths to transmit an identification number. At first, LEDs are not easy to tell apart from finger touches in the camera's image. The system monitors every light dot of a specified size. If a dot is visible for longer than the maximal pulse time (24 ms in our current system), it is categorised as a finger; otherwise, its history is recorded. As soon as the history contains enough information to decode the identity, the dot is categorised as a wristband with the received identity. With the current setup, the transmission of a two-bit word (allowing up to four simultaneous users to be identified) takes between 0.1 and 0.2 seconds.

Allocation

At the moment, we are exploring different strategies to associate detected LEDs with corresponding touches. The first method, shown in Figure 2(a), is a simple nearest-neighbour allocation. Every finger dot is allocated to the wristband next to it, with the distance restricted to a maximum (similar to the approach taken in [6]). This allows every touch in a circle around the wristband to be identified by it. This method has a good rate of true positive allocations, but it also accepts touches in a large area which the hand wearing the wristband could not reach, thereby increasing the potential for false positive allocations.

The second method, shown in Figure 2(b), counters this problem by using two alternating LEDs. The system knows which bit of the transmitted code should be received from which LED and uses this information, together with the prediction of movement, to determine the wristband's orientation (Figure 3). Hence, the approximate position of the hand is known, and corresponding touches are searched for in a smaller region, decreasing the possibility of false positive allocations.

Figure 2: Allocation methods: (a) Method 1, basic allocation through nearest neighbour; (b) Method 2, advanced allocation using the wristband's orientation.

Figure 3: Steps to determine orientation: (1) left LED on; (2) right LED on; (3) calculate perpendicular cone. It is known which of the two LEDs is supposed to transmit which bit of a code; thus, the orientation can be derived.
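The two allocation strategies can be sketched as follows. This is a minimal illustration under our own assumptions: the maximum radius, the cone angle, and the convention that the perpendicular to the LED axis points towards the hand are all hypothetical parameters, not the authors' implementation.

```python
import math

MAX_RADIUS = 150.0  # px; assumed maximum wristband-to-touch distance

def nearest_neighbour(touch, wristbands):
    """Method 1: assign the touch to the closest wristband within MAX_RADIUS.

    wristbands maps id -> (x, y) position of the wristband's light dot.
    Returns the wristband id, or None if no wristband is close enough.
    """
    best, best_d = None, MAX_RADIUS
    for wb_id, pos in wristbands.items():
        d = math.dist(touch, pos)
        if d < best_d:
            best, best_d = wb_id, d
    return best

def oriented_allocation(touch, wristbands):
    """Method 2: additionally require the touch to lie within a cone on the
    hand's side of the wristband, derived from the two LED positions.

    wristbands maps id -> (left_led_pos, right_led_pos).
    """
    best, best_d = None, MAX_RADIUS
    for wb_id, (left_led, right_led) in wristbands.items():
        cx = (left_led[0] + right_led[0]) / 2
        cy = (left_led[1] + right_led[1]) / 2
        # Perpendicular to the LED axis, assumed to point towards the hand.
        axis = (right_led[0] - left_led[0], right_led[1] - left_led[1])
        normal = (-axis[1], axis[0])
        to_touch = (touch[0] - cx, touch[1] - cy)
        d = math.hypot(*to_touch)
        if d == 0 or d >= best_d:
            continue
        # Accept touches within ~60 degrees of the hand direction (assumed).
        cos_angle = (normal[0] * to_touch[0] + normal[1] * to_touch[1]) / (
            math.hypot(*normal) * d)
        if cos_angle > math.cos(math.radians(60)):
            best, best_d = wb_id, d
    return best
```

The cone test is what shrinks the search region: a touch behind the wristband, which the hand could not produce, is rejected even when it is the nearest dot.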

DISCUSSION & FUTURE WORK

The boundaries of our system are mainly determined by the camera's speed: the speed of code transmission has to match the camera's frame rate, which in turn defines how long it takes to distinguish between touches and LEDs. Currently, we are working on improving the recognition of orientation and the detection of fast movements. In order to achieve this, we are looking into changing the code used for transmission. At the same time, we are building additional wristbands and preparing a collaborative drawing application to test the system in a real-life scenario. We also plan to combine our approach with the hand-contour recognition described in [9]. Users would identify themselves by providing their hand contour once; after that, their wristband is mapped to their identity and used to identify touches throughout the session.

REFERENCES

1. L. Buechley, M. Eisenberg, and J. Catchen. The LilyPad Arduino: using computational textiles to investigate engagement, aesthetics, and diversity in computer science education. In Proc. CHI, pages 423–432, 2008.
2. Microsoft Corp. Surface. http://www.microsoft.com/surface, May 2010.
3. P. H. Dietz and D. Leigh. DiamondTouch: A multi-user touch technology. In Proc. UIST, pages 219–226, 2001.
4. K. C. Dohse, T. Dohse, J. D. Still, and D. J. Parkhurst. Enhancing multi-user interaction with multi-touch tabletop displays using hand tracking. In Proc. ACHI, pages 297–302, 2008.
5. M. Ringel, K. Ryall, C. Shen, C. Forlines, and F. Vernier. Release, relocate, reorient, resize: fluid techniques for document sharing on multi-user interactive tables. In Proc. CHI, pages 1441–1444, 2004.
6. V. Roth, P. Schmidt, and B. Güldenring. The IR ring: Authenticating users' touches on a multi-touch display. In Proc. UIST, 2010.
7. K. Ryall, A. Esenther, K. Everitt, C. Forlines, M. R. Morris, C. Shen, S. Shipman, and F. Vernier. iDwidgets: Parameterizing widgets by user identity. In Proc. INTERACT, pages 1124–1128, 2005.
8. D. Schmidt. Design and realization of an interactive multi-touch table. Technical report, Lancaster University, 2009.
9. D. Schmidt, M. K. Chong, and H. Gellersen. HandsDown: Hand-contour-based user identification for interactive surfaces. In Proc. NordiCHI, 2010.
