Advances in Pervasive Computing, A Collection of Contributions Presented at PERVASIVE 2004, April 18-23 2004, pp. 397-400, Linz/Vienna, Austria.

SPOTLIGHT NAVIGATION: INTERACTION WITH A HANDHELD PROJECTION DEVICE

S. Rapp, G. Michelitsch, M. Osen, J. Williams, M. Barbisch, R. Bohan, Z. Valsan, M. Emele∗

Abstract

This paper introduces a novel interaction paradigm for handheld devices using projection technology that leverages users' real-world experience. Users can navigate naturally in unlimited virtual information spaces with simple hand gestures, similar to using a flashlight. We describe interaction techniques specifically designed for such devices that are intuitive, efficient and, as informal experimentation with users has shown, fun to use.

1. Introduction

A key effect of technological progress has been the ability to reduce the size of technological devices while, at the same time, increasing their power. One advantage is that applications which could previously run only on stationary workstations can now run on small portable devices. While this is undoubtedly a positive development, an accompanying disadvantage is that such applications are often ill-suited to a small screen (e.g. internet browsing) and are difficult to view. In addition, as devices continue to shrink, interacting with them through small keys or other input means becomes increasingly problematic. The current form factor of the majority of small devices offers no easy answer to these problems.

In this paper, we describe Spotlight Navigation, which aims to break out of the constraints of the mobile form factor by coupling image projection to gesture interaction and combining input and output mechanisms in one mobile device. The use of image projection means that a device relying on this principle can employ a screen size far greater than the size of the device itself (see also [1]). We follow the approach that the projector is moved as a whole, just as when operating a flashlight; the movement of the device itself therefore acts as an input mechanism (see also [2, 3]).

We envisage Spotlight Navigation as a key device for realizing truly pervasive computing in a number of ways. Firstly, the simple and intuitive interaction mechanism combined with the large display area allows it to overcome many of the aforementioned problems of small devices, potentially leading to adoption by a wider spectrum of users. Secondly, the close coupling between the Spotlight Navigation device and the environment in which it is used makes such a device particularly amenable to an unobtrusive form of augmented reality, which can usefully complement the many forms of environmental ambient intelligence that are rapidly becoming commonplace.

∗ Hedelfinger Str. 61, 70327 Stuttgart, Germany, [email protected]

Figure 1. Hardware prototype and application layout used in our experiments

2. The flashlight metaphor

Spotlight Navigation uses the same operating principle as a hand-held torch or flashlight. The information the device can access is located at particular positions on a wall, and users steer the projected light beam onto the parts they are interested in. The software we developed creates the illusion that the projected content stays fixed on the wall, as if the light beam were revealing hidden ink written there. The user's spatial memory completes the full picture, just as one builds up a mental map of a dark room when exploring it with a flashlight. Thus, the whole wall can be used as a workspace, not just the fragment currently illuminated. Put differently, Spotlight Navigation works like a virtual desktop in which the display itself is moved to the area of the desktop one wants to see, instead of the mouse being pushed against the screen borders. Because the operating principle is so simple and intuitive, users quickly learn to handle such a device; it requires little more expertise than a flashlight.
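To make the underlying idea concrete, the following sketch shows one way the "fixed on the wall" illusion could be computed: the device's pan and tilt angles, read from a hypothetical orientation sensor, select which window of an unbounded virtual workspace is rendered, so the content appears anchored to the wall while the beam moves. The paper does not describe its implementation; all names, distances and resolutions here are assumptions made for illustration.

```python
import math

# Assumed parameters of the setup (not taken from the paper).
WALL_DISTANCE_M = 2.0      # distance from device to wall, in metres
PIXELS_PER_METRE = 400.0   # scale of the virtual workspace
VIEW_W, VIEW_H = 640, 480  # size of the projected image in workspace pixels

def viewport(yaw_rad: float, pitch_rad: float):
    """Return the workspace rectangle currently lit by the beam."""
    # Intersect the beam axis with the wall plane to find the beam centre,
    # then centre the projected image around that point.
    cx = WALL_DISTANCE_M * math.tan(yaw_rad) * PIXELS_PER_METRE
    cy = WALL_DISTANCE_M * math.tan(pitch_rad) * PIXELS_PER_METRE
    return (cx - VIEW_W / 2, cy - VIEW_H / 2, VIEW_W, VIEW_H)

# Panning the device 10 degrees to the right shifts the rendered window,
# while the underlying content stays stationary in wall coordinates.
print(viewport(0.0, 0.0))
print(viewport(math.radians(10), 0.0))
```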

3. Zooming and clutching

We have extended the metaphor in two respects. In the real world, if we are interested in some detail of an object that our flashlight illuminates, we move closer to investigate. With Spotlight Navigation, instead of forcing the user to walk towards or away from the wall, zooming in and out is performed with a scroll wheel. The fundamental property that the virtual content stays fixed in position on the real wall remains unaffected, so detail at any position on the information wall can be accessed conveniently.

The ability to situate information on a large screen, however, means that information located at the extremities of the screen might in reality be too far from the user to be read easily. The problem is exactly the same as when standing in front of a large notice-board in the real world: the only solution there is to walk to the area of the board that holds the information of interest and then bend down or stretch upwards to read it. To solve this problem conveniently, the Spotlight Navigation device incorporates a "clutching" function that allows the interesting parts of the board or wall to be pulled towards the user, where they can then be "released" and viewed in the usual manner.

This increase in information "real estate" offers interesting possibilities. The panning and zooming organization effectively allows for an unlimited working area that can be traversed quickly and easily. Instead of relying on overlapping windows, deeply nested menu structures and so on, the unlimited space can be used to lay out items side by side. Another important aspect we have identified is semantic zooming, which works on the principle of making important pieces of information more visually salient than unimportant ones. Combined with zooming, this means that all information is visible, but the more important pieces become visible earlier when zooming in. The information context is thus maintained around the item of interest, and it becomes easy to move accurately from one item to the next. This possibility does, however, require that the user receives some guidance when navigating from one interesting piece of information to the next; our design achieves this with visual cues (see Fig. 1).
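As a rough illustration of the semantic-zooming idea, the sketch below tags every item with an importance value and draws an item only once the current zoom level, multiplied by that importance, crosses a threshold, so more important items appear sooner while zooming in. The data structure and threshold are assumptions for illustration, not the paper's implementation.

```python
from dataclasses import dataclass

@dataclass
class Item:
    label: str
    x: float
    y: float
    importance: float  # 1.0 = always visible, smaller = needs more zoom

def visible_items(items, zoom, view):
    """Return items inside the viewport whose detail level matches the zoom."""
    vx, vy, vw, vh = view
    shown = []
    for it in items:
        in_view = vx <= it.x <= vx + vw and vy <= it.y <= vy + vh
        if in_view and zoom * it.importance >= 1.0:
            shown.append(it)
    return shown

# Example: at zoom 1.0 only the headline (importance 1.0) is drawn;
# zooming in to 5.0 also reveals the less important body text.
items = [Item("News", 10, 10, 1.0), Item("article body", 12, 14, 0.2)]
print([it.label for it in visible_items(items, 1.0, (0, 0, 100, 100))])
print([it.label for it in visible_items(items, 5.0, (0, 0, 100, 100))])
```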

4. Manipulation and input of information

So far we have considered access to information; what about manipulating and entering it? Spotlight Navigation uses two key methods in this respect: drag and drop, and scribbling input. Direct-manipulation user interfaces draw heavily on the drag-and-drop interaction technique, which can also be used with Spotlight Navigation. The center of the projected image serves as the reference point for selecting an object, just like a mouse pointer. Pressing a button on the device picks up the object, which is dropped when the user releases the button over the chosen destination. In our prototype the button for the drag-and-drop function was placed so that it could easily be operated with the middle or ring finger, leaving the thumb free to zoom in and out during drag-and-drop operations.

Secondly, to input new information we rely on scribbling, which allows letters, symbols or simple drawings to be sketched on the wall. Again, the center of the projected image serves as the reference point for the virtual ink. Pen up and down is controlled with a button designed to sit under the index finger, the finger most readily associated with writing. As writing without haptic feedback could be difficult, the quality of the scribbling is improved by algorithmic tremor cancellation that removes natural hand tremor. Experience so far has shown that novice users can write effectively after a short period of practice. Subsequent experimentation will show whether standard handwriting recognizers can be used or whether specialized recognizers will need to be trained for this purpose.
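The paper does not specify how the tremor cancellation works; one plausible, minimal scheme is a low-pass filter over the ink trace, sketched below as an exponential moving average whose smoothing factor is an assumed value. It damps the high-frequency hand tremor while preserving the slower writing motion.

```python
def smooth_stroke(points, alpha=0.3):
    """Low-pass filter a stroke given as a list of (x, y) samples.

    alpha is an assumed smoothing factor: smaller values suppress more
    tremor but make the virtual ink lag further behind the hand.
    """
    if not points:
        return []
    smoothed = [points[0]]
    for x, y in points[1:]:
        px, py = smoothed[-1]
        # Move only a fraction of the way towards the new raw sample.
        smoothed.append((px + alpha * (x - px), py + alpha * (y - py)))
    return smoothed

# Raw samples jitter around a horizontal line; the filtered trace is steadier.
raw = [(0, 0.0), (1, 0.4), (2, -0.3), (3, 0.5), (4, -0.2)]
print(smooth_stroke(raw))
```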

5. Conclusions

Spotlight Navigation devices have great potential for a variety of pervasive applications. Handheld devices which use projection can in principle be made very small and yet, with appropriate user interaction technology, allow vast information spaces to be explored. The range of applications that might make use of this unlimited information space is itself virtually unlimited. All current standard PDA and desktop applications are potential candidates; moreover, the fact that the device relies on real space in order to function means that it can also interact with the real environment in ways which have not been practical until now. As one example, notices can be scribbled on a wall in the real world and be read by other users of similar devices; a wide range of further possibilities for interfacing the device with the environment remain to be explored. In short, all augmented reality scenarios developed for head-mounted displays could equally well be realized with Spotlight Navigation.

This paper proposes a new user interface paradigm for small form-factor, portable devices that allows the user to project information held within the device onto arbitrary surfaces. By doing so, we anticipate gaining three fundamental advantages over current display solutions for such devices: (1) a virtually unlimited workspace that keeps information fixed in relation to the real world, allowing for an effective use of spatial memory; (2) a very natural way of navigating this virtual information space with hand gestures that require no learning effort at all; (3) an obvious way of revealing information associated with real-world objects, simply by having this information appear whenever the user illuminates such an object with the spotlight device.

References

[1] Symbol Technologies web site: http://www.symbol.com/products/oem/lpd.html
[2] G. Fitzmaurice. Situated information spaces and spatially aware palmtop computers. Communications of the ACM, Special Issue on Augmented Reality and UbiComp, 36(7), pp. 38-49, July 1993.
[3] Ka-Ping Yee. Peephole Displays: Handheld Devices as Virtual Windows. In Proceedings of CHI 2003, April 5-10, ACM Press, pp. 1-8, 2003.
