P. Tandler, N.A. Streitz, and Th. Prante, "Roomware—Moving Toward Ubiquitous Computers," IEEE Micro, Nov./Dec. 2002, pp. 36-47.
Roomware—Moving Toward Ubiquitous Computers
Collaborative work approaches, facilitated by ubiquitous computing, will shape future working environments. Roomware components integrate everyday office furniture with computer-based information devices.
Peter Tandler, Norbert Streitz, and Thorsten Prante, Fraunhofer Integrated Publication and Information Systems Institute
In the past, a central mainframe computer provided terminals for many users. In the current age of the personal desktop computer, each person has one or more computers, but only rarely do these computers work together to provide functions jointly. In the future, computational power will be ubiquitous. In such an environment, devices that support people interacting with information at work and in everyday activities will complement the desktop computer. These devices will be closely interconnected and integrated with the environment and context in which people use them, and they will work together synchronously to support fluent collaboration. Collaboration between users and environments with multiple interconnected devices will determine, to a large degree, approaches to work and everyday activities.
An example of this type of device is roomware, or computer-augmented objects resulting from the integration of room elements, such as walls, doors, and furniture, with computer-based information devices. The roomware components that we have developed at Fraunhofer IPSI support the vision of a future where our surroundings act as an information interface and the computer as a device disappears from our perception. Three main observations influenced the creation of roomware components:1
• the growing importance of information technology,
• the need to integrate information technology with the environment in which it is used, and
• the recognition that new work practices will emerge to cope with the increasing rate of innovation.
Growing importance of information technology
Only a few remaining domains of modern social and economic life are independent of information technology. Within the working context, it's almost a requirement to use digital information as a major part of everyday work: the desktop computer is ubiquitously present in every office. Mark Weiser pursued this idea further and coined the term ubiquitous computing.2 We interpret this view as the design goal of a two-way augmentation and smooth transition between real and virtual worlds. Combining real and virtual worlds in a computer-augmented environment—resulting in hybrid worlds—lets us design interfaces that bridge everyday reality and virtuality, drawing on the best aspects of both worlds. Our goal is to transform and transcend human-computer interaction, resulting in rather direct human-information interaction
and human-human cooperation, making computers disappear. In our approach, we distinguish between two types of disappearance (http://www.ambient-agoras.org). Physical disappearance of computer devices comes about by making the computer-based parts small enough to fit in the hand, interweave with clothing, or attach to the body. Mental disappearance of computers occurs when they become invisible to the user's mind's eye: humans no longer perceive the devices as computers, but as embedded elements of augmented artifacts in the environment. In our actual designs, we combine these two types of disappearance.
Integration with the environment
When the computer disappears in the sense defined previously, developers will need to widen their concept of computers and information technology. Their focus must begin to include the environment and context in which the computer is used. Architecture, as one major part of the surrounding environment, strongly influences our living and working habits. (Here, we use architecture to refer to the architecture of buildings, not software.) In addition, interacting with physical objects, both as sources of information (such as books, magazines, or drawings) and as tools or furniture, is natural and normal for the way people live and work. This is in sharp contrast to the way people work with digital information today, which reduces interaction with information to a very limited form of interaction with computers.
The computer has no information about where it is located. It has no clue about the physical distance to other devices connected over the network; it's only aware of directly connected devices, such as its monitor, keyboard, mouse, and, perhaps, a printer or scanner. A computer cannot determine whether a large public display is close by that could show information used in a discussion among several people. It normally would not even sense that several people are trying to look at its small screen, because most current operating systems support only a single user per application. Besides awareness of the physical context in which a device is used, it's also helpful to make other types of context available.3,4 Examples include the number of people present, their roles (in terms of team members or visitors), and their current tasks. This type of context information is closely related to the third observation.
New work practices
Work is increasingly characterized by a high degree of dynamics, flexibility, and mobility. Initial examples include such new work practices as the on-demand and ad hoc formation of teams. Contents and participants, as well as contexts, tasks, processes, and structures of collaboration, will change frequently to cope with the increasing rate of innovation. There is empirical evidence that teams work more effectively if they develop balanced proportions of individual work, subgroup activities, and full-group work.5 The same study found that the range and combination of available information devices determined, to a significant extent, a team's flexibility in working in different modes. This stresses how important it is to provide methods and tools that support the different work phases of teams. Besides the need for new software tools, this variety of work phases influences other parts of the work environment. In the past, architecture and office buildings addressed, to a large extent, the needs of individual work. In the future, work environments will increasingly address the needs of collaboration and communication. This evolution raises the need for appropriate furnishings and equipment. Roomware components address these needs, providing physical artifacts that meet the requirements of flexible reconfigurability.
Software developer challenges
Obviously, roomware components differ from desktop PCs in several ways. Interaction in a multiuser, multidevice environment supported by roomware components requires new concepts for efficient user interaction. Roomware components developed as part of the i-Land project—an interactive landscape for creativity and innovation—base interaction on a pen or finger instead of a mouse and keyboard.1 However, all these features pose a challenge for the developer of roomware applications.
Figure 1. Second generation of roomware: DynaWall, CommChairs, InteracTable, and ConnecTables.
Whereas several well-established models, frameworks, and tools exist to aid application design for a traditional PC,6 roomware application developers cannot draw on such tools. They are in the same situation as the pioneering computer scientists were when PC use first began to spread.
Roomware components Our approach to roomware component design is to meet, in parallel, the requirements of flexible configuration and dynamic resource allocation in physical and information environments. The term roomware was originally coined by Streitz and his Ambiente team5 and is now a registered Fraunhofer trademark. However, it also applies to a general characterization of this approach and to products in this area. The general goal of developing roomware is to design integrated physical and virtual information spaces. In the context of supporting teamwork, roomware components should let users tailor and compose them to form cooperation landscapes, serving multiple purposes such as project team rooms, presentation suites, learning environments, information foyers, and so on. These goals have in common the requirement of developing software that enables new forms of multiuser, multiple-display, human-computer interaction and cooperation.
In 1997, we began work on a testbed for experimenting with roomware components. We call this environment i-Land. In 1999, as part of the research and development consortium Future Office Dynamics (http://www.future-office.de), we developed—together with industry partners—the second generation of roomware. We redesigned existing components and developed an additional component called the ConnecTable. Figure 1 shows this and other roomware devices: an interactive electronic wall (DynaWall), an interactive electronic table (InteracTable), and mobile, networked chairs with attached interactive displays (CommChairs). We developed the Basic Environment for Active Collaboration with Hypermedia (Beach) software framework as infrastructure to support synchronous collaboration with roomware components.7 Beach offers a user interface that addresses the needs of devices that have no mouse or keyboard and require new forms of human-computer and team-computer interaction. To allow synchronous collaboration, Beach builds on shared documents concurrently accessible via multiple interaction devices.
Large group activities: DynaWall The DynaWall provides a large whiteboard-like display that serves the needs of teams working in project and meeting rooms. It is the electronic equivalent of the large areas of assembled sheets of paper that often cover the walls for creating and organizing information. The current realization consists of three segments with back projections and large touch-sensitive display surfaces (http://www.smarttech.com). The total display size of 4.50 × 1.10 meters can cover one side of a room (see Figure 1). Although the DynaWall is driven by three computers, Beach provides one large, homogeneous workspace with no interaction boundaries between the segments. Two or more people can either work
individually in parallel or share the entire display space. To acknowledge the large visual surface, Beach supports spatial hypertext documents; workspaces can be positioned freely on the surface. Because the touch-sensitive displays enable direct interaction with documents using a pen or finger, users draw strokes rather than clicking (that is, tapping) on the surface. Beach therefore allows drawing informal scribbles. It also allows pen gestures to invoke commands; users can gesture to create, for example, new workspaces or hyperlinks, or to remove objects.
In addition to the general design of a pen-based user interface, DynaWall's size creates challenges for human-computer interaction. For example, it would be cumbersome for a user to drag an object or a window over a distance of more than 4 meters by having to touch the DynaWall the entire time (similar to holding down the mouse button). We developed two mechanisms to address these types of problems. Similar to physically picking up an object and placing it somewhere else, our take-and-put feature lets users take information objects at one position, walk over to a different place, and put them somewhere else on the display. For bridging the distance between several closely cooperating people (during group brainstorming, for example), the shuffle feature lets users throw an object from one side of the display to the other, where another team member can catch it (by touching it).
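Beach's implementation is not shown in this article. Purely as an illustration, the following Python sketch (all names hypothetical) models the behavior that the take-and-put and shuffle features imply: an object leaves the wall when taken, reappears where the user next touches, and a thrown object travels along the wall until a teammate catches it.

```python
from dataclasses import dataclass

@dataclass
class InfoObject:
    name: str
    x: float  # horizontal position on the wall, in meters
    y: float

class WallWorkspace:
    """Hypothetical workspace spanning all DynaWall segments."""

    def __init__(self, width_m=4.5):
        self.width_m = width_m
        self.objects = []   # objects currently placed on the wall
        self.in_hand = {}   # user name -> object the user has "taken"

    def take(self, user, obj):
        # Take-and-put, step 1: pick the object up; it leaves the display.
        self.objects.remove(obj)
        self.in_hand[user] = obj

    def put(self, user, x, y):
        # Take-and-put, step 2: put the carried object down where the user touches next.
        obj = self.in_hand.pop(user)
        obj.x, obj.y = x, y
        self.objects.append(obj)

    def shuffle(self, obj, direction, distance):
        # Shuffle: "throw" the object along the wall, clamped to the wall's extent,
        # so a teammate on the far side can catch it by touching it there.
        obj.x = min(max(obj.x + direction * distance, 0.0), self.width_m)

if __name__ == "__main__":
    wall = WallWorkspace()
    note = InfoObject("idea-1", x=0.3, y=1.0)
    wall.objects.append(note)
    wall.take("alice", note)
    wall.put("alice", x=3.8, y=0.9)                   # carried across the room
    wall.shuffle(note, direction=-1.0, distance=3.0)  # thrown back toward the left
    print(note)
```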
Individual work in a group context: CommChair A CommChair, shown in Figure 2, is a mobile chair with a built-in computer. These chairs represent a new type of furniture, combining the mobility and comfort of armchairs with information technology. CommChairs let people communicate and share information with others sitting in CommChairs, standing in front of the DynaWall, or using other roomware components. Users can make personal notes in a private space but also interact remotely on shared (public) workspaces, for example, making remote annotations at the DynaWall. Beach software provides the cooperative sharing functionality. To offer flexibility and mobility, each chair has a wireless network connection (we currently use an IEEE 802.11 WaveLAN) and an independent power supply.
Figure 2. Tight collaboration using a CommChair in front of a DynaWall. The user in the chair can remotely annotate the wall by simply writing on the display attached to the chair.
Small-group collaboration: InteracTable The InteracTable interactive table provides a space for creating, displaying, discussing, and annotating information objects. It's suitable for use by groups of up to six people standing around it. It consists of a large plasma display panel with a touch-sensitive surface (http://www.smarttech.com) embedded in the top of a stand-up table. People can write and draw on it with a pen and interact with information objects via finger or pen gestures. Users who need extensive text input can use a wireless keyboard. The InteracTable is an example of how designers need to adapt the user interface to different form factors. Its horizontal display results in an interaction area with no predefined orientation. In contrast, vertical displays such as desktop computer monitors have a dedicated top and bottom, left and right. Thus, horizontal displays require new forms of human-computer interaction. To this end, Beach provides gestures for rotating individual information objects or groups of objects, accommodating the need for easy viewing from all perspectives. Furthermore, a user can create a second
view of an object; this view stays synchronized with the original view.7 A user can shuffle this view to a colleague standing on the other side of the table, so this team member will always have a personal view of the same object in the preferred orientation. Now, both users can view, edit, and annotate the same object in parallel, as Figure 3 shows.
Figure 3. To support horizontal displays, Beach lets different users rotate documents to a preferred orientation.
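The second-view mechanism can be pictured as several views, each with its own orientation, observing one shared object. The sketch below is a rough Python illustration with hypothetical names, not Beach's actual design.

```python
class Document:
    """Shared information object; views observe it for changes."""

    def __init__(self, text):
        self.text = text
        self._views = []

    def attach(self, view):
        self._views.append(view)

    def edit(self, new_text):
        self.text = new_text
        for view in self._views:  # keep every view synchronized
            view.refresh()

class RotatedView:
    """One user's view of the shared document, with a private orientation."""

    def __init__(self, doc, user, angle_deg=0):
        self.doc, self.user, self.angle_deg = doc, user, angle_deg
        doc.attach(self)

    def refresh(self):
        print(f"{self.user} sees {self.doc.text!r} rotated {self.angle_deg} degrees")

if __name__ == "__main__":
    doc = Document("meeting agenda")
    RotatedView(doc, "alice", angle_deg=0)
    RotatedView(doc, "bob", angle_deg=180)  # a "second view" shuffled across the table
    doc.edit("meeting agenda, revised")     # both users see the change, each in a preferred orientation
```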
Transition from individual to group work: ConnecTable The ConnecTable component is part of the second generation of roomware. It eases the transition between individual work and small-group cooperation. Users can adapt the display height to accommodate different working situations, such as standing up or sitting. They can tilt the display to different angles, providing an optimal view. In this stand-alone mode, it is similar to a CommChair. To initiate tight collaboration, two users can move their ConnecTables together and arrange them to form a large display area, as shown in Figure 1. This makes the ConnecTables suitable for small-group work, just like the InteracTable.
To detect adjacent tables, integrated sensors measure the distance between the ConnecTables and initiate the automatic coupling of the displays once they are close enough. The current implementation uses a radio-frequency-based sensor, shown in Figure 4a. A coil integrated with the tabletop can detect radio frequency identification (RFID) tags, shown in Figure 4b, when a tag's distance falls within a certain threshold. Although this implementation is simple to realize and reliably detects other tables, it can connect only two ConnecTables. Adding coils and tags on each side of the tabletop, however, would make it possible to connect more tables.
Figure 4. Sensor technology integrated into the ConnecTable (a). A coil and tag at the top of the display detect other tables (b).
Beach software lets workgroups use the resulting large display area as a common workspace,8 employing the same technology as that for coupling the DynaWall's segments. Several people can then concurrently work and seamlessly move information objects across the physical borders of the individual displays. As with the InteracTable, people can create second views, shuffle them from one ConnecTable to another, rotate them there, and work on them in parallel with correct perspectives.
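The coupling logic itself is not given in the article. A minimal Python sketch, assuming a hypothetical distance threshold and invented names, illustrates the idea of initiating and releasing display coupling from the RFID proximity readings:

```python
from typing import Optional

COUPLING_THRESHOLD_CM = 5.0  # assumed value; the text only mentions "a certain threshold"

class ConnecTable:
    def __init__(self, table_id: str):
        self.table_id = table_id
        self.coupled_with: Optional[str] = None

    def on_tag_reading(self, neighbor_id: str, distance_cm: float) -> None:
        """Called whenever the tabletop coil senses a neighboring table's RFID tag."""
        if distance_cm <= COUPLING_THRESHOLD_CM and self.coupled_with is None:
            self.couple(neighbor_id)
        elif distance_cm > COUPLING_THRESHOLD_CM and self.coupled_with == neighbor_id:
            self.decouple()

    def couple(self, neighbor_id: str) -> None:
        self.coupled_with = neighbor_id
        print(f"{self.table_id}: joining displays with {neighbor_id} into one workspace")

    def decouple(self) -> None:
        print(f"{self.table_id}: tables moved apart, back to stand-alone mode")
        self.coupled_with = None

if __name__ == "__main__":
    table = ConnecTable("table-A")
    table.on_tag_reading("table-B", distance_cm=12.0)  # still too far apart
    table.on_tag_reading("table-B", distance_cm=3.5)   # close enough: couple
    table.on_tag_reading("table-B", distance_cm=20.0)  # separated again: decouple
```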
Software application model Traditional application models don’t provide enough guidance for developers to create software systems, such as Beach, for roomware environments. Software developers must consider further aspects not relevant for software running on a desktop PC with monitor, mouse, and keyboard as standardized interaction devices. Our proposed application model, shown in Figure 5, accounts for the properties of roomware and ubiquitous computing environments. It’s based on three design dimensions, describing orthogonal aspects of roomware applications. The first design dimension entails five models that separate
the basic concerns of roomware applications and make up the application model's structure. This structure provides reusable components and hooks for adapting to different devices. The second dimension entails the degree of coupling and the aspects of sharing information between devices. The third dimension entails the four levels that organize the application model; these levels define different degrees of abstraction for software functionality.
Figure 5. Application model organized into four levels of abstraction (core, model, generic, and task) and five models that separate basic concerns (data, application, user interface, environment, and interaction). Crucial for synchronous collaboration is the sharing dimension, responsible for distributed access to shared objects.
Separating basic concerns
Clearly separating different responsibilities within the software helps provide the flexibility that different devices need. Therefore, we distinguish models for data, application, user interface, environment, and interaction, as Figure 5 shows. The data model specifies the type of data that users can create and interact with. The application model provides the functionality necessary to work with that data. These two models are independent of the currently used or supported hardware device. In contrast, the environment model describes available devices and other relevant parts of the environment. The user interface model defines the framework for how the software can present its functionality to the user, taking the environment model's properties into account. These models are applicable to applications beyond ubiquitous computing and roomware. Yet, because of the heterogeneous environments ubiquitous-computing applications operate in, such applications have a strong need for a structure that is clear yet flexible enough to adapt components independently for different situations.
Data model: Information objects. A common approach in application modeling is to separate the application model from the data, domain, or business object model.9 The data model relates to the information dimension identified by Jacobson et al.,10 while the application model represents the behavior dimension.
This way, software developers can independently reuse both data and application models: they can specify and implement different applications for an existing data model. This reuse can save time if the current application domain has complex data structures or algorithms. Conversely, developers can reuse application models for different types of data if they carefully define the interface between the application and the data at an appropriate level of abstraction.
Application model: Application behavior. Application models describe all platform- and interface-independent application aspects, such as manipulation of data objects. As application models define the application's behavior, they specify control objects as defined by Jacobson et al.10 For a text object, the data model includes the string describing the text and text attributes such as font or size. The application model adds the text's editing state, such as cursor position or selection. Applications can provide awareness, for example, by displaying the cursors of other users, if they have access to a shared editing state.11 To use different application models for the same data model, the data model must remain unaware of any application model and only represent the document state. Researchers and software developers have found it helpful to choose a fine granularity for some application models.
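To make the text example concrete, here is a minimal Python sketch (hypothetical class names, not Beach code) of the split the paragraph describes: the data model holds only document state, the application model adds editing state and behavior, and the data model remains unaware of any application model.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TextData:
    """Data model: document state only; knows nothing about editors or devices."""
    text: str
    font: str = "Helvetica"
    size: int = 12

@dataclass
class TextEditor:
    """Application model: adds editing state and behavior on top of the data model."""
    data: TextData
    cursor: int = 0
    selection: Optional[Tuple[int, int]] = None

    def insert(self, s: str) -> None:
        # Behavior manipulates the data model; the editing state stays up here.
        self.data.text = self.data.text[: self.cursor] + s + self.data.text[self.cursor:]
        self.cursor += len(s)

if __name__ == "__main__":
    note = TextData("Roomware")
    editor_a = TextEditor(note, cursor=8)  # two application models can
    editor_b = TextEditor(note)            # reuse the same data model
    editor_a.insert(" components")
    print(note.text, "| editor B's cursor is still", editor_b.cursor)
```

Because the data model carries no editing state, several application models can operate on the same data at once, which is also what later makes it possible to share the editing state for awareness.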
This way, developers can aggregate low-level application models with a well-defined functionality (for example, to edit simple text) into more complex models at a higher level of abstraction (for example, an editor that can manage complete workspaces). Usually, a whole hierarchy of application models composed of generic, reusable, and custom parts constitutes an application.11 The application model then often forms a hierarchy that is isomorphic to the containment hierarchy of its associated data model.9
Using small application models fostered a new conception of what developers regarded as an application. We view the application model as a description of additional semantics for a data model, instead of the conventional view of data as a supplement that applications edit. This change in viewpoint leads to an information-centric perspective of application models. In the context of roomware environments, it's essential that developers do not include user interface and environment aspects in the application model. Enforcing a strict separation between the application model and device-dependent aspects makes it possible to reuse application models with different user interfaces and within a different environment.
Environment model: Context awareness. One major property of ubiquitous computing environments is the heterogeneity of the available devices. To provide a coherent user experience, "the system must have a deeper understanding of the physical space."12 This raises the need for an adequate model of the application's physical environment. The environment model is therefore a representation of relevant parts of the real world. It includes a description of the devices themselves, their configuration, and their capabilities. This is the direct hardware environment, which the user interface model can employ in adapting to different devices. In addition, the environment model can include other aspects if they influence the software's behavior. Depending on detected changes in the physical environment, the software can trigger further actions to reflect the current situation. An example is the way ConnecTables establish a common workspace when placed next to each other.
Besides the physical environment, other contextual information, such as the current task, project, or coworker presence, could influence the software's behavior—insofar as this information is available to the application. We refer to this type of contextual information as the logical context of the application.3 Currently, it's difficult for software applications to grasp the physical environment and logical context. Further work must establish how to capture sufficient information about the current environment and how to define appropriate models (for example, as in Sousa and Garlan4).
User interface model: Interface objects. Because traditional operating and window management systems are designed for a traditional desktop PC, their user interfaces have drawbacks when used with devices that lack a mouse and keyboard or that have different forms and sizes. For instance, if a menu bar were always at the top of the screen on a wall-sized display such as the DynaWall, shown in Figure 1, users would find it difficult to reach. Similarly, a toolbar takes up a lot of precious screen space on a small device, such as a personal digital assistant. Accordingly, the user interface model could define alternative user interface concepts suitable for different interaction devices, for example, rotation of user interface elements on horizontal displays. To choose an appropriate user interface, the user interface model can draw on information provided by the environment model. An explicit model of an appropriate user interface addresses all issues related to the hardware and the physical environment, making applications and documents device and environment independent. Still, the user interface model must not enforce a dedicated presentation or interaction style; this is the responsibility of the interaction model. Rather, the user interface model concentrates on the elements offered for interaction. These elements can be a device-independent representation of user interface widgets or interactors. Figure 6 illustrates the dependencies between data, application, environment, and user interface models.
Figure 6. Dependencies between data, application, environment, user interface, and interaction models. The user interface model can draw on information available in the environment model to define an application's interface.
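As a rough illustration of the dependency shown in Figure 6, the following Python sketch (all names and values hypothetical) shows a user interface model consulting the environment model to choose device-independent interface elements:

```python
from dataclasses import dataclass

@dataclass
class EnvironmentModel:
    """Hypothetical description of the device a user interface runs on."""
    display_width_cm: float
    orientation: str        # "vertical" or "horizontal"
    has_keyboard: bool

def choose_ui_elements(env: EnvironmentModel) -> dict:
    """User interface model: pick interface elements based on what the environment model reports."""
    return {
        "command_access": "context menu" if env.display_width_cm > 200 else "toolbar",
        "rotatable_items": env.orientation == "horizontal",
        "text_entry": "physical keyboard" if env.has_keyboard else "pen gestures",
    }

if __name__ == "__main__":
    dynawall = EnvironmentModel(display_width_cm=450, orientation="vertical", has_keyboard=False)
    interactable = EnvironmentModel(display_width_cm=100, orientation="horizontal", has_keyboard=False)
    print("DynaWall UI:", choose_ui_elements(dynawall))
    print("InteracTable UI:", choose_ui_elements(interactable))
```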
Interaction model: Presentation and interaction. To support different styles of interaction, it's crucial to separate interaction issues from all other aspects of an application. The interaction model is the only place that specifies presentation aspects or interaction style. This way, a software system can adapt its presentation to different contexts, for example, by using a pop-up menu instead of a list box. It's also possible to choose a different representation entirely: when no display is available, voice-based interaction might still be possible. Hence, the interaction model defines a way to interact with all other basic models, as shown in Figure 6. An appropriate interaction style depends on the available interaction devices and the associated user interface, so software can choose a suitable interaction model depending on the environment and user interface models.
When designing an interaction model, the software developer has to choose an architectural style that is appropriate for the supported interaction style. For visual interaction, researchers have successfully used an adapted version of the model-view-controller style, which separates input and output explicitly.13 Views render a visual representation of their model; watching for model changes, a view updates its representation whenever the model is modified. Controller objects receive input events and modify their associated model accordingly. This way, the model needs no information about how it is visualized or how users can trigger functionality.
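A minimal model-view-controller sketch in Python (hypothetical names, not Beach's code) makes the separation concrete: the model knows nothing about how it is visualized or how input reaches it.

```python
class Model:
    """Knows nothing about visualization or input handling."""

    def __init__(self, value=""):
        self.value = value
        self._observers = []

    def attach(self, observer):
        self._observers.append(observer)

    def set_value(self, value):
        self.value = value
        for observer in self._observers:
            observer.model_changed(self)  # notify views (output side)

class View:
    """Output: renders a representation of its model and watches for changes."""

    def __init__(self, model, name):
        self.name = name
        model.attach(self)

    def model_changed(self, model):
        print(f"[{self.name}] rendering: {model.value!r}")

class Controller:
    """Input: turns input events into model updates."""

    def __init__(self, model):
        self.model = model

    def on_pen_stroke(self, text):
        self.model.set_value(text)

if __name__ == "__main__":
    doc = Model("hello")
    View(doc, "wall segment 1")
    View(doc, "chair display")
    Controller(doc).on_pen_stroke("hello, roomware")  # both views refresh
```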
Coupling and sharing
Aiming at synchronous collaboration, traditional computer-supported cooperative work (CSCW) or groupware systems14 have two crucial aspects: access to shared data and the ability to couple the applications of collaborating users. Obviously, this coupling must apply to both data and application models for software running in a distributed environment.11 In the context of ubiquitous-computing environments, we must extend this view. In addition to data and application, different devices and applications must exchange information about the physical environment, such as the presence of nearby users or other available interaction devices. The user interface can be distributed among several machines or among complementary devices. Beach software explores these additional issues.
Sharing the data model: Collaborative data access. To access and work with common documents, researchers widely agree that a shared model for documents reduces the complexity of dealing with distributed applications.13 In the example of a team member sitting in a CommChair and working with another member at the DynaWall, both users have access to the same information objects and can modify them simultaneously.
Sharing the application model: Workspace awareness. As an easy way of sharing information about the editing state of other users, researchers have proposed sharing the application model as well as the data model.11 Sharing the editing state allows accessing
information about who is working on which document, providing awareness information to collaborating team members. For example, activity indicators15 can provide visual feedback for actions performed at a DynaWall by a user sitting in a CommChair. By changing the state of the application model, the software can control possible work modes such as the degree of coupling. When two users in Beach share the same workspace browser, they couple their navigation; when one user switches to another workspace, all users sharing the same application model will follow.7
Sharing the environment model: Environmental awareness. When several people and devices physically share a common environment, it's obvious that the applications used in such situations should also have a shared model of how their environment looks. In ubiquitous-computing environments, many devices have sensors that grasp some aspects of the physical environment. By combining all available information and making it accessible to other applications, each application draws on context information that it can use to adapt its behavior. Thus, a shared environment model can serve as the basis for environment or context awareness. When someone places roomware components, such as two ConnecTables, next to each other, the ConnecTables update their shared environment model using the information detected by sensors. As soon as Beach observes this change, it triggers functionality to connect the two displays to form a homogeneous interaction area.8 Currently, the involved sensors are attached to computers built into the ConnecTables; future work could replace or augment this setup. A sophisticated object tracking system, for example, as described in Brummit et al.,12 involves computers integrated into the environment. Here, a shared environment model enables arbitrary computers to update the information.
Figure 7. When placed next to each other, two ConnecTables allow seamless movement of user interface elements from one device to another. This is realized by using a shared user interface model.
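The environment-sharing mechanism just described might look, in rough outline, like the following Python sketch (hypothetical names): any client that observes the shared environment model can react to an adjacency reported by whichever computer owns the sensor.

```python
class SharedEnvironmentModel:
    """Shared among all devices; any client may update it, and all observers react."""

    def __init__(self):
        self.adjacent_pairs = set()
        self._observers = []

    def attach(self, observer):
        self._observers.append(observer)

    def report_adjacency(self, device_a, device_b):
        # Called by whichever computer owns the sensor that made the observation.
        pair = frozenset({device_a, device_b})
        if pair not in self.adjacent_pairs:
            self.adjacent_pairs.add(pair)
            for observer in self._observers:
                observer.environment_changed(device_a, device_b)

class DisplayCoupler:
    """Application-side reaction: couple the displays into one interaction area."""

    def environment_changed(self, device_a, device_b):
        print(f"coupling displays of {device_a} and {device_b} into one workspace")

if __name__ == "__main__":
    env = SharedEnvironmentModel()
    env.attach(DisplayCoupler())
    env.report_adjacency("connectable-1", "connectable-2")
```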
Sharing the user interface model: Distributed user interface. We want a visual interaction area that crosses the borders between adjacent displays connected to different machines (as in our realization of the DynaWall, or with ConnecTables). To do so, the user interface elements must move freely between the different displays, as Figure 7 shows. In this case, different machines must share user interface elements. Furthermore, if one user interacts with several devices at the same time, it's useful to coordinate the devices' user interfaces. This is only possible if all involved devices can access information about the current user interface elements. For instance, a user sitting in a CommChair in front of a large DynaWall can view all information at the chair and on the wall at the same time. Consequently, the user would benefit from being able to modify the information visible on the wall and remotely control the entire user interface. Depending on how much state collaborating users share, applications can control the degree of coupling: sharing all involved user interfaces and application states produces a tightly coupled collaboration mode; sharing only the same data model creates a loosely coupled environment.11
Linking the interaction and shared models. Implementing data, tool, user interface, and environment models as shared objects gives several users or devices simultaneous access to these objects. However, objects for the interaction model must exist locally on each machine. This is necessary because the interaction model's objects must communicate with the locally available interaction devices. Moreover, a local interaction model lets each
client adapt the interaction style according to its local context, especially to its physical environment and interaction capabilities. Tandler et al. give an extensive example of how local interaction objects can adapt to their local context.8 Although the interaction model is local to every machine, for synchronous collaboration, the generated presentation must be consistent with the underlying models. Therefore, Beach uses a dependency mechanism, similar to the one provided by Amulet (http://www2.cs.cmu.edu/~amulet), to link the output of the interaction model to the shared models it presents. Beach ensures that refresh and recomputation begin as soon as an observed model changes. Sharing environment, user interface, and application models lets all clients access the information encapsulated in the models. This can provide awareness information to the user as part of the interaction model. Typical for CSCW applications is the provision of workspace or activity awareness.15 This can easily be realized by sharing the application model, including all editing state.11 A shared user interface model can be used to implement tightly coupled user interfaces. However, an always tightly coupled user interface can be inconvenient to use. Therefore, shared user interface information can instead supply additional awareness hints to remote users. Beyond the provision of awareness in traditional CSCW systems, sharing the environment model enables a new type of awareness—environmental awareness—for ubiquitous computing environments.
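How the degree of coupling follows from what is shared can be sketched as follows (hypothetical Python, not Beach's implementation): clients that observe a shared application model, here a workspace browser state, follow each other's navigation, while clients that share only the data remain loosely coupled.

```python
class SharedBrowserState:
    """Shared application model: the workspace currently shown in a browser."""

    def __init__(self, workspace):
        self.workspace = workspace
        self._observers = []

    def attach(self, observer):
        self._observers.append(observer)

    def switch_to(self, workspace):
        self.workspace = workspace
        for observer in self._observers:
            observer.state_changed(self)

class BrowserClient:
    """Local interaction side; the coupling depends on whether the shared state is observed."""

    def __init__(self, name, shared, coupled):
        self.name = name
        self.workspace = shared.workspace
        if coupled:              # tight coupling: follow shared navigation
            shared.attach(self)

    def state_changed(self, shared):
        self.workspace = shared.workspace
        print(f"{self.name} follows navigation to {shared.workspace!r}")

if __name__ == "__main__":
    state = SharedBrowserState("brainstorming")
    wall = BrowserClient("DynaWall", state, coupled=True)
    chair = BrowserClient("CommChair", state, coupled=False)  # loosely coupled: keeps its own view
    state.switch_to("project plan")
    print("CommChair still shows", chair.workspace)
```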
Conceptual levels of abstraction
The third dimension of the application model is the abstraction level. Separating software into levels of abstraction is a common software engineering technique; it reduces the complexity of each level16 and ensures interoperability.17 For example, core functionality of the interaction model, such as handling physical interaction devices, belongs to a very low level. Based on this functionality, higher levels define abstractions, such as widgets or logical-device handlers. High-level interaction components use these abstractions to define the user's access and interaction possibilities for some other model at the same level of abstraction. The application model presented here proposes four levels of abstraction.
Core level: Platform-dependent infrastructure. The core level provides functionality that makes higher-level development convenient by abstracting from the underlying hardware platform. Roomware applications require additional functionality that is unavailable from off-the-shelf libraries or toolkits, including support for multiuser event handling and low-level device and sensor management. For Beach, this level includes the implementation of the shared-object space and the dependency mechanism.
Model level: Basic separation of concerns. The model level provides basic abstractions that serve as the basis for the definition of higher-level abstractions. Here, for example, Beach implements the model-view-controller style for the interaction model.
Generic level: Reusable functionality. One important goal of every software system is to provide generic components suited for many different situations and tasks. Therefore, software developers should group models and concepts that apply to a whole application domain at a generic level. This forces the developer to think about generic concepts, which leads to the implementation of reusable elements. At the generic level, Beach defines generic document elements, such as workspaces, text, scribbles, and hyperlinks.
Task level: Tailored support. When a conceptual application model defines only generic elements, this restricts the application's usability to some degree; some tasks require specialized support. Therefore, our conceptual model has a task level that groups all high-level abstractions unique to small application areas. For example, we have implemented support for creative sessions on top of Beach.18
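The four levels might be pictured, very schematically, as the following Python sketch (all classes hypothetical): each level builds only on the levels below it, from a core shared-object mechanism up to task-specific support.

```python
# Core level (assumed): a minimal shared-object mechanism abstracting the platform.
class SharedObject:
    def __init__(self):
        self._observers = []

    def attach(self, observer):
        self._observers.append(observer)

    def notify(self):
        for observer in self._observers:
            observer.changed(self)

# Model level: basic separation of concerns built on the core.
class DataModel(SharedObject):
    pass

# Generic level: reusable document elements for the whole roomware domain.
class Scribble(DataModel):
    def __init__(self):
        super().__init__()
        self.strokes = []

    def add_stroke(self, stroke):
        self.strokes.append(stroke)
        self.notify()

# Task level: tailored support composed from generic elements,
# for example a (hypothetical) workspace for creative sessions.
class BrainstormingWorkspace:
    def __init__(self):
        self.ideas = []

    def new_idea(self):
        idea = Scribble()
        self.ideas.append(idea)
        return idea

if __name__ == "__main__":
    session = BrainstormingWorkspace()
    idea = session.new_idea()
    idea.add_stroke("first sketch")  # the task level uses the lower levels transparently
    print(len(session.ideas), "idea(s); the first has", len(idea.strokes), "stroke(s)")
```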
Our initial experiences are quite promising. In addition to our Ambiente Lab in Darmstadt, we had two external installations of i-Land: at the Deutsche Arbeitsschutzausstellung (DASA), or German Occupational
Safety and Health Exhibition, in Dortmund, which is ongoing, and at Wilkhahn in Bad Münder, which lasted for five months. Both installations were part of registered projects of the world exhibition EXPO 2000. In 2000, second-generation roomware components won the International Design Award of the state of Baden-Württemberg in Germany.
Given the structure suggested by the application model, we have restructured Beach to consist of layered frameworks. With these frameworks, we are currently developing further applications tailored for roomware components (http://ipsi.fhg.de/ambiente). We consider architectural space a guiding metaphor for designing environments that support cooperation between humans and their interaction with information; this is an important perspective for us. Innovative forms of interaction, such as throwing information objects on large interactive walls, provide intuitive forms of cooperation and communication. Nevertheless, it remains to be seen how far the use of these concepts and metaphors will actually carry. We are continuing this research in the Ambient-Agoras project, which is part of the European-Union-funded Disappearing Computer initiative.
Acknowledgments
We thank Jörg Geißler, Torsten Holmer, and Christian Müller-Tomfelde as well as many of our students for their valuable contributions to various parts of the i-Land project and the Ambiente division. Likewise, we thank the IEEE Computer Society's reviewers for their extensive and helpful comments. Furthermore, we appreciate the cooperation with Heinrich Iglseder, Burkhard Remmers, Frank Sonder, and Jürgen Thode from the German office furniture manufacturer Wilkhahn, and Michael Englisch from their design company, WIEGE. They worked with us in the context of the Future Office Dynamics consortium (http://www.future-office.de), which sponsored part of this work.
References
1. N.A. Streitz et al., "i-LAND: An Interactive Landscape for Creativity and Innovation," Proc. Conf. Human Factors in Computing Systems (CHI), ACM Press, New York, 1999, pp. 120-127; http://ipsi.fhg.de/ambiente/publications.
2. M. Weiser, "The Computer for the 21st Century," Scientific American, Sept. 1991, pp. 94-104; http://www.ubiq.com/hypertext/weiser/SciAmDraft3.html.
3. A. Schmidt, M. Beigl, and H. Gellersen, "There is More to Context than Location," Computers & Graphics, vol. 23, no. 6, Dec. 1999, pp. 893-902; http://www.elsevier.com.
4. J.P. Sousa and D. Garlan, "Aura: An Architectural Framework for User Mobility in Ubiquitous Computing Environments," Proc. 3rd Working IEEE/IFIP Conf. Software Architecture, Kluwer Academic, Boston, 2002, pp. 29-43; http://www.cs.cmu.edu/~aura/.
5. N.A. Streitz, P. Rexroth, and T. Holmer, "Does 'Roomware' Matter? Investigating the Role of Personal and Public Information Devices and their Combination in Meeting Room Collaboration," Proc. European Conf. Computer-Supported Cooperative Work (ECSCW), Kluwer Academic, Amsterdam, 1997, pp. 297-312; http://ipsi.fhg.de/ambiente/publications.
6. B.A. Myers, "User Interface Software Tools," ACM Trans. Computer-Human Interaction, vol. 2, no. 1, Mar. 1995, pp. 64-103; http://doi.acm.org/10.1145/200968.200971.
7. P. Tandler, "Software Infrastructure for Ubiquitous Computing Environments: Supporting Synchronous Collaboration with Heterogeneous Devices," Proc. Ubiquitous Computing (UbiComp), LNCS vol. 2201, Springer, New York, 2001, pp. 96-115.
8. P. Tandler et al., "ConnecTables: Dynamic Coupling of Displays for the Flexible Creation of Shared Workspaces," Proc. 14th Ann. ACM Symp. User Interface Software and Technology (UIST), CHI Letters, vol. 3, no. 2, ACM Press, New York, 2001, pp. 11-20; http://ipsi.fhg.de/ambiente/publications.
9. VisualWorks User's Guide, rev. 2.0 (software release 2.5), ParcPlace-Digitalk, Palo Alto, Calif., 1995.
10. I. Jacobson et al., Object-Oriented Software Engineering: A Use Case Driven Approach, Addison-Wesley Professional, Boston, 1992.
11. C. Schuckmann, J. Schümmer, and P. Seitz, "Modeling Collaboration Using Shared Objects," Proc. Int'l ACM SIGGROUP Conf. Supporting Group Work, ACM Press, New York, 1999, pp. 189-198; http://www.opencoast.org.
12. B. Brummit et al., "EasyLiving: Technologies for Intelligent Environments," Proc. 2nd Int'l Symp. Handheld and Ubiquitous Computing (HUC), LNCS vol. 1927, Springer-Verlag, Heidelberg, Germany, 2000, pp. 12-29.
13. W.G. Phillips, Architectures for Synchronous Groupware, tech. report 1999-425, Dept. Computing and Information Science, Queen's University, Kingston, Ontario, Canada, 1999; http://phillips.rmc.ca/greg/pub.
14. C.A. Ellis, S.J. Gibbs, and G.L. Rein, "Groupware: Some Issues and Experiences," Comm. ACM, vol. 34, no. 1, Jan. 1991, pp. 38-58.
15. C. Gutwin and S. Greenberg, "Design for Individuals, Design for Groups: Tradeoffs between Power and Workspace Awareness," Proc. ACM Conf. Computer Supported Cooperative Work, ACM Press, New York, 1998, pp. 207-216; http://doi.acm.org/10.1145/289444.289495.
16. L. Nigay and J. Coutaz, "Building User Interfaces: Organizing Software Agents," Esprit '91 Conf. Proc., ACM Press, New York, 1991, pp. 707-719; http://iihm.imag.fr/publs/1991/.
17. J.I. Hong and J.A. Landay, "An Infrastructure Approach to Context-Aware Computing," Human-Computer Interaction, vol. 16, nos. 2-4, Dec. 2001, pp. 287-303.
18. T. Prante, C. Magerkurth, and N.A. Streitz, "Developing CSCW Tools for Idea Finding: Empirical Results and Implications for Design," Proc. ACM 2002 Conf. Computer Supported Cooperative Work (CSCW), ACM Press, New York, 2002; http://ipsi.fhg.de/ambiente/publications.
Peter Tandler is a scientific staff member of the Ambiente division of the Fraunhofer Integrated Publication and Information Systems Institute (IPSI). He leads software development within the Beach and i-Land projects. His research interests include synchronous groupware, integration of virtual and physical environments, new forms of human-computer and team-computer interaction for roomware, software architecture, programming languages, object-oriented frameworks,
and object-oriented design and programming. Tandler is working on a PhD in the context of application models and software infrastructure for roomware environments at Fraunhofer IPSI. He has a Dipl.-Inform. in computer science from the Darmstadt University of Technology, Germany. Norbert Streitz is the head of the research division at Ambiente—Workspaces of the Future, which he founded to initiate work on roomware and cooperative buildings at Fraunhofer IPSI. He also teaches in the computer science department of the Technical University Darmstadt. He is the chair of the steering group of the European research initiative The Disappearing Computer (DC) and manager of the DC project Ambient Agoras. His research interests include human-computer interaction, hypermedia, computer-supported cooperative work, ubiquitous computing, user-centered design of smart artifacts, and the relationship between real and virtual worlds. Streitz has an MSc and PhD in physics and a second PhD in psychology. He is a member of the German Society of Computer Science (GI) and the German Society of Psychology (DGP). Thorsten Prante is a scientific staff member of the Ambiente division at Fraunhofer IPSI. He also coordinates activities of the Future Office Dynamics consortium, and he teaches at the Darmstadt University of Technology in the departments of computer science and architecture. His research interests include humancomputer interaction and computer-supported cooperative work, focusing on user interfaces for cooperative software in ubiquitous computing. Prante has a Dipl.-Inform. in computer science with minors in architecture and software ergonomics. Direct questions and comments about this article to Peter Tandler, Fraunhofer IPSI, Ambiente division, Dolivostr. 15, 64293, Darmstadt, Germany;
[email protected].