Magic Bowl: a Tangible User Interface for Configuration of Interactive Environments

Maria CRONÉ¹, Erik ELIASSON², Johan MATTSSON¹, Paweł WIATR²

¹ Department of Computer and Systems Science, Stockholm University / Royal Institute of Technology, Stockholm, Sweden
{mariac, johanm}@dsv.su.se

² Department of Microelectronics and Information Technology, Royal Institute of Technology, Stockholm, Sweden
{eliasson, pawel}@imit.kth.se
Abstract
This paper describes our work with the design and deployment of a tangible user interface for controlling an interactive workspace. Our intention was to design a user interface that makes it possible to use the room without focusing on the technology behind it. We found that the problems arise especially at the beginning of a work session, when a group enters a completely shut-down workspace and has to spend valuable time starting and configuring every device before being able to start working. The Magic Bowl makes it possible to quickly start the interactive workspace with a personalized configuration.
Introduction

Interactive environments are designed to support creative collaborative work, e.g. project work, meetings and workshops. They are technology-rich environments for displaying and manipulating information of many forms on inter-linked shared and personal devices and computers. Interactive spaces are often scarce resources that are shared by many users with different needs and preferences.
Figure 1: iLounge at Royal Institute of Technology, KTH in distributed and co-located work scenarios
In our research we focus on distributed and co-located work, and the setting is the iLounge [Fig. 1], an interactive environment at KTH, designed and built during the spring of 2002. We were inspired by Mark Weiser's vision of ubiquitous computing and calm technology [1]. The equipment of the iLounge includes touch-sensitive wall-mounted
displays, a wall-mounted plasma display, a table with a built-in touch screen, and a wireless network that enables the use of laptops and PDAs as an integral part of the collaboration. In this environment, all co-workers are able to interact with the large shared displays, either using the available shared wireless mouse and keyboard or by redirecting mouse and keyboard input from a personal wireless networked device or laptop. The users can also easily share and fetch documents and data between their personal devices and the shared devices. Many other environmental features, such as lighting, curtains and background sound, are also interactively customizable, which makes it possible to change and distinguish the atmosphere between different groups and work sessions.

Since the room is a shared resource, a group that enters the room cannot expect to find it in the same state as it was in when they left it after their last session. They will have to spend some time re-configuring the room at the beginning of each session. To offer an easy way of re-configuring the environment, we suggest a tangible user interface where each user group is represented by a physical icon, a phicon. The phicon is a small physical object with an RFID tag, and when placed in the Magic Bowl [Fig. 2] it causes the environment to be reconfigured according to the group's preferences. The Magic Bowl holds an RFID reader, which can read one or more RFID tags simultaneously.
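As a minimal sketch of this idea (illustrative only, not the deployed iLounge implementation), the mapping from a phicon's RFID tag to a group's stored preferences can be thought of as a lookup that is triggered when a tag is read in the bowl. The tag identifiers and preference fields below are hypothetical.

    # Minimal sketch of the phicon-to-configuration idea (hypothetical tag ids
    # and preference fields, not the deployed iLounge implementation).

    GROUP_PREFERENCES = {
        "04:A2:19:7F": {"lights": "project", "displays": ["wall-1", "table"],
                        "documents": "/projects/spaceman"},
        "04:B7:3C:11": {"lights": "full", "displays": ["plasma"],
                        "documents": "/projects/demo"},
    }

    def on_tag_detected(tag_id: str) -> None:
        """Called when the RFID reader in the bowl detects a newly placed phicon."""
        prefs = GROUP_PREFERENCES.get(tag_id)
        if prefs is None:
            print(f"Unknown phicon {tag_id}; ignoring")
            return
        # In the real system this would translate into events that start displays,
        # set the lighting and open the group's documents; here we only print it.
        print(f"Applying configuration for tag {tag_id}: {prefs}")

    on_tag_detected("04:A2:19:7F")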
Figure 2: Magic Bowl and Spaceman phicon
The motivation for a tangible user interface is that all displays in the room are turned off when a group enters the iLounge. We also think that the users’ attitude towards the user interface will be positively influenced by the fact that they can choose any object to represent their group (as long as it fits in the bowl and can be fitted with an RFID tag).
Problem description

A group of users that enters the interactive environment spends valuable time at the beginning of each work session configuring the room. An easy and flexible way to control the state of a complex and technology-rich environment such as the iLounge is needed, in particular in the startup and shutdown phases, when all graphical displays and feedback sources are turned off. Based on our experience from earlier user studies [2] of our environment, the problem can be divided into two parts:

• Devices: controlling the physical aspects of the environment, like turning on the lights, computers, projectors, etc.
• Applications: managing files and other work material related to a certain group of users.

In our suggested design, computers, displays and projectors are started when preconfigured phicons are placed in the bowl. The lights turn on, and on one of the large wall displays the group will, for example, get a list of all files and applications that were running at the end of their last meeting.
Wireless buttons have been used to control various settings in the iLounge. The mapping between a button and its effect in the iLounge has been found to be non-intuitive and is an issue that needs resolving. It is desirable to have interfaces that are easy to understand and that make it clear what they are controlling.
Motivation

The research on a new user interface for the iLounge and other interactive workspaces was initiated for two purposes: we needed a suitable interface that would make the room more usable, and we wanted to explore the benefits of tangible interfaces. The following list of items is based on experiences from research and project work in the iLounge. It describes areas where improvements are needed in order to make the iLounge an efficient and productive work environment.

• The user interface needs to be flexible and extendable. Ideally, users can turn any physical object into a phicon that can control any setting of any controllable device in the room.
• Users should be able to see how the room is configured, and the phicons in the bowl allow users to quickly see what the active configurations are.
• Usage of the room has shown that there is a large, unavoidable latency when controlling some of the devices. We expect that users will be more tolerant of the latencies, and will less frequently overload the system by repeating commands several times, when using the proposed tangible user interface.
• We want to explore whether a reliable tangible interface for controlling the room makes it possible to move the issue of feedback from a problem of response time in the computers and software over to the real world. If the users trust the physical representation of the phicons currently present in the bowl, and trust that the configurations represented by the phicons will be applied within reasonable time, then the real-world action of placing the object in the bowl would be the actual feedback.
Hypothesis

A tangible user interface for unified control of an interaction-rich environment will be easier to use than the per-device configuration that is in use today.
Related work

Computing is shifting from the desktop environment to the physical environment that we inhabit. The motivation behind tangible user interfaces is to support a natural way of interacting with the environment. Human capabilities and skills, as well as the natural affordances of everyday objects, are taken advantage of in the interaction [3]. Manipulating physical, graspable things is a natural way of interacting with the physical world around us, and people can manipulate digital information by handling physical objects: the user interacts by direct manipulation of physical representations of digital data. There are many research examples of physical embodiment of digital information. One is the Marble Answering Machine [4], where incoming voice messages are physically instantiated as marbles. The user can pick up the message (marble) and drop it into a notch in the machine to play the message. The user can also place the marble onto an augmented telephone, thus dialing the caller automatically. In an interactive environment with many computers there is a need for interaction techniques that easily let users move digital information between computers. By using physical objects that are linked to the digital information, people can move information between computers without having to know more than the relationships between the physical objects. Since the beginning of the 1990s the Tangible Media Group at the MIT Media Laboratory has been doing research in this area [5].
Another interesting way of moving digital objects between computers, or within one computer, is the Pick-and-Drop technique [6]. Pick-and-Drop, inspired by Japanese chopsticks, is an extended concept of the commonly used drag-and-drop technique. With this technique, a user picks up an object on one computer display with a stylus and then drops it on a (possibly different) computer display. A similar technique for moving digital objects between displays by manipulating physical objects is Passage, developed at the Fraunhofer IPSI research institute in Darmstadt, Germany. Passage is part of the Beach software infrastructure for interactive environments [7].

We have considered several technologies that could be used to detect which physical item is put into the bowl, such as barcodes and a barcode scanner, weight and size measurements, or even video recognition. We have chosen RFID tags as the most appropriate for our setup, mostly because of the simplicity of deployment, both in the software development and in the creation of the physical phicons. We have also been inspired by several already deployed applications [8] in which RFID tags have been used as a technique for supporting tangible computation.
Interactive Room Operating System

As backend software we use the iROS (interactive Room Operating System) [9] middleware developed by our partners in the iSpaces project at Stanford University [10]. Within iROS the communication is based on events launched and handled by various devices and coordinated by the centralized Event Heap. An entity firing an event does not decide who the receiver is; instead it sets any number of parameters within the event. Receivers register with the Event Heap using a set of matching criteria for receiving events. The sender and receiver are thus decoupled from each other. This enables easy development of applications that use several input devices and allows one device to control, or be used by, many applications in the room.

When many innovative interaction elements such as iStuff [11] are introduced as decoupled components, the Patch Panel [12] provides a flexible application to perform translations and redirections of events. This makes it possible to map and create meaning from an abstract event, and is a flexible way to configure the interactions that are carried out. As an example, the Patch Panel is typically set up to configure translations between low-level device events and higher-level application events.

The WorkspaceNavigator [13] is a meeting capture application for storing, reusing and recapitulating previous project work content. Working documents, browsed internet URLs and screenshots are captured from the numerous shared, and optionally personal, screens and computers. Overview pictures and other environment-specific settings are captured too. The captured files are stored in shared folders, accessible to members of different project groups. The content is structured as 'time slice' information in a SQL database. The WorkspaceNavigator browsing application supports searching and reuse of earlier captured project work.
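To make the decoupling concrete, the following minimal sketch illustrates the kind of field-matching publish/subscribe coordination described above. The class and method names are illustrative only and do not reflect the actual iROS API.

    # Conceptual sketch of Event Heap style coordination: receivers register
    # matching criteria, senders post events without naming a receiver.
    # Illustrative names only, not the real iROS API.
    from typing import Callable

    class EventHeap:
        def __init__(self) -> None:
            # Each subscription is a (matching criteria, handler) pair.
            self._subscriptions: list[tuple[dict, Callable[[dict], None]]] = []

        def register(self, criteria: dict, handler: Callable[[dict], None]) -> None:
            # Receivers register a template of fields they want to match.
            self._subscriptions.append((criteria, handler))

        def post(self, event: dict) -> None:
            # The heap routes the event to every subscription whose criteria
            # are a subset of the event's fields.
            for criteria, handler in self._subscriptions:
                if all(event.get(key) == value for key, value in criteria.items()):
                    handler(event)

    # Example: a light controller subscribes to lighting events, and the Magic Bowl
    # posts one when a phicon is placed, without knowing who will handle it.
    heap = EventHeap()
    heap.register({"type": "lighting"}, lambda e: print("set lights to", e["level"]))
    heap.post({"type": "lighting", "level": "presentation", "source": "MagicBowl"})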
Design and Implementation

Design choices

Our solution needs to manage the most crucial settings for working in the room. Our design goal has been to create a simple and intuitive user interface that does not require any computer or screen in the room to be powered on. We have also experienced that a safe and simple way to shut down the room is needed. We wanted flexibility in how the room is started up, and the ability to add or change parts of the setup during a work session. Examples of different settings and add-on functionalities for using the room include:
• Providing an initial, extendable setup of the environment, including the starting of computers, shared displays and complex lighting capabilities, and being able to turn off the whole room easily.
• Temporary changes to a specific usage mode or function, such as videoconferencing, watching a movie etc.
• Restoring a specific project setup, including documents, selected shared work surfaces and room environment.

Various functions in the iLounge can be controlled via wireless push buttons. Earlier experiments and everyday usage of that system indicated problems in the following areas:

• Latency: the communication between components and the hardware devices introduces latencies that sometimes become noticeable or even unacceptable to the end users. They tend to press the buttons repeatedly until the system responds. This behavior results in features being toggled on and off several times, further increased latencies and sometimes even failure to perform the task at hand.
• Feedback: users need immediate feedback to know that the proper action has been performed by the system. Feedback is important to make the users feel confident in the functionality of the room and to avoid the problems described regarding latency. Some suggested solutions to meet the demand for feedback include direct feedback in some form, such as audio cues.
• Symbolism: it was hard to know which push button belonged to which function and project, and to be aware of whether it was active or not. We have designed a system that allows the user to use different symbolic objects for different functions, projects and settings. We have also introduced the notion of putting an object into an active mode by placing it in the Magic Bowl; when the object is removed the state is disabled.
Implementation

Figure 3 shows the Magic Bowl and how it is connected to the iLounge infrastructure. A more detailed description of the parts is found below.
Figure 3: Schematic overview describing the connections of the hardware and software components in The Magic Bowl
• Phicons are various symbolic objects with an RFID tag attached, used as representations of different setups of the interactive room as well as for controlling separate services. Currently we use the following phicons in the iLounge, representing three different uses:
  o The Spaceman phicon is configured as the project group phicon. It sets the light in the room to a predefined level, starts the projectors and the needed workstations, and gives access to the Spaceman project documents.
  o The Obelix phicon is configured to initialize a specific activity such as a demo or presentation. In the iLounge it launches a PowerPoint presentation that is part of the Magic Bowl demo.
  o The Lamp bulb phicon is configured to control the iLounge environment. When the bulb is put into the Magic Bowl, the large number of computer-controllable light sources is put into a default configuration, and when it is removed the room lighting turns off.
• The Magic Bowl is a physical container for a number of phicons. Tasks or configurations are made active by placing the associated phicon in the container and inactivated when it is removed.
• The RFID antenna is attached to the bottom of the Magic Bowl and provides an approximately 15 cm reading range, which is enough to detect all the RFID-tagged objects in the bowl.
• The RFID-reader -> Event Heap interface sits between the RFID tag reader and the Event Heap; it maps the RFID tags to events and then sends them to the Event Heap (see the sketch after this list).
• The Event Heap is the interactive room communication infrastructure developed in the iSpaces research project.
• Event translation: the Patch Panel is used to translate one event into others. The "Obelix event" can, for example, be mapped to the PowerPoint presentation via this functionality.
• Controllable appliances are curtains, lights, the 3D sound system, the video conference system, computers, projectors and others.
• The interface for controllable hardware translates events from the Event Heap to hardware interfaces such as serial device drivers etc.
• The Event2Command proxy: on each computer device there is an event-to-command daemon running that maps the parameters of certain trusted events to commands executed on the local machine.
• Shared screens and computer resources are used to display information and support collaborative interaction.
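As referenced in the list above, a minimal sketch of how the RFID-reader -> Event Heap interface could post activation and inactivation events is given below. The polling approach, function names and event fields are illustrative assumptions, not the deployed implementation.

    # Sketch of the RFID-reader -> Event Heap bridge: each polling cycle compares
    # the tags currently in the bowl with the previous scan and posts activation
    # or inactivation events for the difference. Hypothetical reader interface.
    from typing import Callable, Iterable, Set

    def poll_bowl(read_tags: Callable[[], Iterable[str]],
                  post_event: Callable[[dict], None],
                  previously_present: Set[str]) -> Set[str]:
        present = set(read_tags())
        for tag in present - previously_present:      # phicon placed in bowl
            post_event({"type": "phicon", "state": "active", "tag": tag})
        for tag in previously_present - present:      # phicon removed from bowl
            post_event({"type": "phicon", "state": "inactive", "tag": tag})
        return present

    # Example with a fake reader: the first scan finds one tag, the second is empty.
    scans = iter([["tag-spaceman"], []])
    present: Set[str] = set()
    for _ in range(2):
        present = poll_bowl(lambda: next(scans), print, present)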
Configuring an interactive workspace using the Magic Bowl

Configuring the behavior associated with a new phicon means changing the Event Translation settings. A user is able to assign new phicons to desired configurations by using the Patch Panel Manager, an application that provides a graphical user interface for mapping the events of placing and removing tagged phicons in the bowl to other events that cause execution of certain commands. We are now working on extensions of the Magic Bowl to make project phicons able to record the current state of the work environment. When the phicon is removed from the bowl the current state of the room is recorded, and putting the phicon back in the bowl re-creates that setting. This will provide an easier way of creating and configuring new phicons.

The system is built to support and respond to numerous phicons being placed in the bowl and being active simultaneously. This provides an opportunity for the bowl to support a flexible setup by combining phicons and changing the functionality of the environment. Changes to the environment that are only relevant for a shorter sub-session of a project, such as connecting a remote co-worker in a video conference, can be temporarily added to the bowl and then removed when that session is over. The physical content of the bowl also reflects the current configuration and acts as a physical store of the current room configuration; if some service in the room fails and has to be restarted, the bowl content provides a map of the current configuration and can serve as a source from which to retrieve the wanted configuration.

This flexibility, in combination with the event-based coordination infrastructure, does however cause some disadvantages and makes it hard to maintain a coherent model of the room's current state. The flexibility also makes it hard to resolve the resulting state when combining functions that conflict with or override each other's settings. The activation of an event cannot be undone. Placing a phicon in the bowl generates an activation event and removing it an inactivation event. In our first prototype the environment reacts to the events generated by the bowl (activating and inactivating), but the Event Heap infrastructure does not keep accumulated information regarding previous events.
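One conceivable way to derive a coherent room state from the set of phicons currently in the bowl is sketched below. The per-phicon settings and the "most recently placed wins" conflict rule are illustrative assumptions and not part of the current prototype.

    # Sketch: combine the settings of all active phicons in placement order;
    # when two phicons set the same key, the most recently placed one wins.
    # Settings and the conflict rule are assumptions, not deployed behaviour.
    PHICON_SETTINGS = {
        "spaceman":  {"lights": "project", "projectors": "on"},
        "lamp-bulb": {"lights": "default"},
        "obelix":    {"presentation": "magic-bowl-demo.ppt"},
    }

    def resolve_room_state(phicons_in_bowl: list[str]) -> dict:
        state: dict = {}
        for name in phicons_in_bowl:
            state.update(PHICON_SETTINGS.get(name, {}))
        return state

    # Lamp bulb placed first, then the Spaceman project phicon overrides the lights.
    print(resolve_room_state(["lamp-bulb", "spaceman"]))
    # -> {'lights': 'project', 'projectors': 'on'}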
Conclusions

The conclusions are based on our own experiences of having the Magic Bowl deployed and used in the iLounge. The prototype will be further developed and evaluated during the spring of 2004, when student groups will use the iLounge for group project work on a regular basis. We have created a user interface that gives the user the experience of interacting with the room as a whole, instead of with the separate devices and services in the room. This is especially important at the beginning of a work session, when a group arrives at a completely shut-down room and would otherwise have to spend valuable meeting time on starting and configuring devices. The Magic Bowl makes it possible to do all that with one command: placing a phicon in the Magic Bowl. The complex, unavoidable delays of the system, e.g. the start-up time of machines, projectors, etc., can cause users to be unsure of whether they executed a command, and therefore to repeat the command over and over again. We experienced that when using wireless buttons the system often got overloaded. With the Magic Bowl the users can see that a command was given to the system by looking at the bowl.
Future work

We have started work on deploying an easy interface for end users that facilitates changing the actions that can be activated and deactivated by phicons. It will require more integration with services such as the Patch Panel, the iROS Manager and the Event2Command utility. For the current prototype of the Magic Bowl we would like a more distinct way of both activating and disabling room functionality, so that the bowl contains a set of enabled functions that reflects the state of the room. The iROS infrastructure coordinates interactions through events, which represent a series of cued actions rather than the room's accumulated state.

The Magic Bowl phicons will represent the projects in a more dynamic way, where we will provide an opportunity to record the current state of project activities as well as environment settings. We will also support the recovery and reuse of work that was performed in earlier sessions. We consider the project phicon to represent the last relevant 'time slice' annotation in the WorkspaceNavigator database. Placing the project phicon into the bowl would correspond to reactivating the stored 'time slice' from the database, so that the room more consistently reflects the content of the bowl. The 'time slice' includes the files and software services involved in the project work as well as the physical settings of the environment. Removing the project phicon will create a new entry in the WorkspaceNavigator database and will cause the environment to come down to a state represented by the remaining phicons in the bowl.
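A hedged sketch of the planned coupling between project phicons and WorkspaceNavigator 'time slices' is given below: placing a project phicon fetches that project's most recent time slice, and removing it records a new one. The table layout, column names and in-memory database are assumptions made for illustration; the actual WorkspaceNavigator schema may differ.

    # Sketch of phicon <-> time-slice coupling. Table and column names are
    # assumptions, not the actual WorkspaceNavigator schema.
    import json, sqlite3, time
    from typing import Optional

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE time_slice (project TEXT, created REAL, state TEXT)")

    def record_time_slice(project: str, state: dict) -> None:
        """Called when the project phicon is removed from the bowl."""
        con.execute("INSERT INTO time_slice VALUES (?, ?, ?)",
                    (project, time.time(), json.dumps(state)))

    def reactivate_latest(project: str) -> Optional[dict]:
        """Called when the project phicon is placed in the bowl."""
        row = con.execute("SELECT state FROM time_slice WHERE project = ? "
                          "ORDER BY created DESC LIMIT 1", (project,)).fetchone()
        return json.loads(row[0]) if row else None

    record_time_slice("spaceman", {"open_files": ["report.doc"], "lights": "project"})
    print(reactivate_latest("spaceman"))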
References

[1] M. Weiser. The Computer for the 21st Century. Scientific American, vol. 265, no. 9, pp. 66-75, 1991.
[2] M. Croné, H. Sundholm. Evaluating Ubiquitous Service Environment for Collaborative Work. In Supplement Proceedings of ECSCW 2003, Helsinki, Finland.
[3] D. A. Norman. The Psychology of Everyday Things. 1988. ISBN 0465067093.
[4] G. Crampton Smith. The Hand That Rocks the Cradle. I.D., May/June, pp. 60-65.
[5] H. Ishii, B. Ullmer. Tangible Bits: Towards Seamless Interfaces between People, Bits and Atoms. Proceedings of CHI '97.
[6] J. Rekimoto. Pick-and-Drop: A Direct Manipulation Technique for Multiple Computer Environments. Proceedings of UIST '97, pp. 31-39.
[7] N. A. Streitz, J. Geißler, T. Holmer, S. Konomi, C. Müller-Tomfelde, W. Reischl, P. Rexroth, P. Seitz, R. Steinmetz. i-LAND: An Interactive Landscape for Creativity and Innovation. Proceedings of CHI '99.
[8] R. Want. Bridging Physical and Virtual Worlds with Electronic Tags. Proceedings of CHI '99.
[9] B. Johanson, A. Fox. The Event Heap: A Coordination Infrastructure for Interactive Workspaces. Proceedings of WMCSA 2002.
[10] B. Johanson, A. Fox, T. Winograd. The Interactive Workspaces Project: Experiences with Ubiquitous Computing Rooms. IEEE Pervasive Computing 1(2), April-June 2002.
[11] R. Ballagas, M. Ringel, M. Stone, J. Borchers. iStuff: A Physical User Interface Toolkit for Ubiquitous Computing Environments. Proceedings of ACM CHI 2003.
[12] R. Ballagas, A. Szybalski, A. Fox. Patch Panel: Enabling Control-Flow Interoperability in Ubicomp Environments. Submitted to PerCom 2004.
[13] A. Ionescu, M. Stone, T. Winograd. WorkspaceNavigator: Capture, Recall and Reuse using Spatial Cues in an Interactive Workspace. Stanford Technical Report TR2002-04.