A Tangible User Interface for Peripheral Task Management

Darren Edge
University of Cambridge Computer Laboratory
[email protected] Abstract Project plans fail to reflect the real work situation if they represent infrequently updated, high level objectives. We propose a tangible interface that allows team members to update and share their progress in real time through peripheral task management.
1. Introduction

Tangible User Interfaces (TUIs) give physical form to digital information, allowing its direct physical control [1]. The tangibility of the resulting objects gives them spatiotemporal characteristics quite distinct from those of textual or visual languages displayed on a screen – they are both persistent over time and freely configurable in physical space. These qualities give interaction designers an opportunity to better exploit the human capabilities of tactile perception, bimanual skill, peripheral awareness and spatial memory, and to augment human memory and cognition through the externalisation of information.

The aim of this project is to evaluate the degree to which these hypothesised benefits of TUIs can be realised in the context of team-based project work. The initial focus has been on providing team-members with a means of updating and communicating their work situation to one another. We believe that the directness of tangible interaction, in terms of both input and feedback, can minimise the loss of time and attention caused by switching to the secondary activity of task management. This should encourage a higher update frequency, and hence information that more accurately reflects the evolving work situation.
2. Application Design

Our design of the task management application was influenced by preliminary fieldwork within a multinational technology company. The main problems identified in interviews with two project managers and a group discussion with a team of five engineers were:

1. Reporting. Plans were shared and revised in weekly project meetings. Timesheets were also kept, ‘accurate’ to half an hour, but frequently a best guess.
2. Estimating. Estimates of task duration were made by the project manager, and estimates of percentage complete and finish date were made weekly by team-members. Most estimates ultimately proved optimistic.

3. Sharing. To-do lists were stored in many different forms; these often impacted on other team-members, but the effort required for publication deterred sharing.

Addressing these problems formed the basis of the application design. The domain concepts were chosen in accordance with the representational epistemology approach to interface design, which integrates alternative perspectives and levels of abstraction within a single representation, shown to provide “substantial support to learning and problem solving” [2].

Tasks are team-members’ unit of work, with hours of work remaining and completion date attributes controlled by the user. In addition, users should be able to indicate the active task they are currently working on, both to automate timesheet data collection and to dynamically count down from the current estimate of work remaining. This is intended to encourage revision of estimates sooner rather than later, by confronting users with the reality of the time spent so far on a task. Re-estimation provides immediate feedback to the user: their estimate of work remaining, coupled with the time they have already spent on the task, combine to give an up-to-date estimate of the total task duration and the true percentage complete.

Over time, users will build up a history of task estimation profiles against which to make new estimates. This slower feedback loop of linking past experience to future expectations has been shown to reduce optimistic bias in cases where the estimate is coupled with a plausible scenario of how that estimate will be achieved [3]. Such scenarios are supported by an action plan attribute – actions being the things that people do to make progress towards the completion of a task.
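The task model and re-estimation feedback described above amount to simple arithmetic over two attributes. The following is a hypothetical sketch, not the system’s actual implementation; all names are illustrative:

```python
from dataclasses import dataclass

@dataclass
class Task:
    """Illustrative task model: time spent accrues while the task is active;
    the user controls only the estimate of work remaining."""
    name: str
    hours_spent: float      # accumulated automatically for the active task
    hours_remaining: float  # the user's current estimate of work remaining

    def total_duration(self) -> float:
        # Up-to-date estimate of the total task duration
        return self.hours_spent + self.hours_remaining

    def percent_complete(self) -> float:
        # The 'true' percentage complete implied by the latest estimate
        return 100.0 * self.hours_spent / self.total_duration()

# Revising the estimate immediately changes both derived figures:
task = Task("debug parser", hours_spent=6.0, hours_remaining=2.0)
print(task.total_duration(), task.percent_complete())  # 8.0 hours, 75.0%
task.hours_remaining = 6.0  # a less optimistic re-estimate
print(task.total_duration(), task.percent_complete())  # 12.0 hours, 50.0%
```

The point of the feedback loop is visible in the example: increasing the estimate of work remaining simultaneously lengthens the projected total duration and lowers the apparent percentage complete, confronting the user with the cost of their earlier optimism.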
Visual Languages and Human-Centric Computing (VL-HCC'06) 0-7695-2586-5/06 $20.00 © 2006

Users should be able to view actions by task or by action type, according to their activity. The application should also help users create better schedules through exploratory design, by showing them how overlapping tasks propagate the necessity of work closer to the present. This should also prove more realistic than planning tasks in a calendar, which was tried and abandoned in the organisation in question because it didn’t reflect how work was actually carried out: schedule, start, stop, reschedule, restart, and so on. The user should therefore be shown a latest start time for each task, taking into account estimates of working hours per day and the distribution of overlapping tasks.
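One plausible reading of this latest-start computation is a backward-scheduling pass over a shared daily capacity: each task is packed as late as possible before its deadline, so overlapping tasks push one another’s work towards the present. The sketch below is our own day-granular illustration of that idea, not the paper’s implementation; names and simplifications are ours:

```python
def latest_starts(tasks, hours_per_day=6.0):
    """Backward-schedule each task as late as possible before its deadline.

    tasks: dict name -> (hours_remaining, deadline_day), with days counted
    as integers from today (day 0). Returns dict name -> latest start day.
    """
    free = {}    # day -> unused working hours remaining on that day
    starts = {}  # name -> earliest day the task's work occupies
    # Tasks with later deadlines claim the late slots first, so earlier
    # deadlines inherit whatever capacity is left and start sooner.
    for name, (hours, deadline) in sorted(
            tasks.items(), key=lambda kv: kv[1][1], reverse=True):
        day = deadline
        while hours > 1e-9:
            avail = free.setdefault(day, hours_per_day)
            used = min(avail, hours)
            free[day] -= used
            hours -= used
            starts[name] = day
            if hours > 1e-9:
                day -= 1  # this day is full: work propagates closer to today
    return starts

# With 6 working hours/day: a 12-hour task due day 5 must start by day 4,
# and a 9-hour task due day 3 must start by day 2.
print(latest_starts({"A": (12, 5), "B": (9, 3)}))
```

Even this toy version exhibits the behaviour the design calls for: adding an overlapping task visibly moves another task’s latest start time earlier, which is exactly the feedback intended to support exploratory schedule design.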
3. Interaction Design

In order to design a TUI and resulting style of interaction that would complement the traditional monitor, mouse and keyboard setup, and support the kind of task-management application outlined in the previous section, further fieldwork was conducted in the form of a video study of an engineer at work. A 30-minute period was selected from the footage and subjected to a per-second analysis of his actions whilst performing a typical software debugging task. Coding the data showed that the engineer’s left hand* displayed idling behaviour much more frequently than his right hand (23% vs. 10%), and that he repeatedly used his right hand to select between multiple objects in an eyes-free manner. Our TUI design therefore aims to engage the left hand, whilst simultaneously taking advantage of the right hand’s ability to move between prearranged objects with visual attention elsewhere. The structural design of the TUI (Figure 1) reflects this observed asymmetry of hand usage, and also a consideration of the Tangible Correlates of the Cognitive Dimensions [4].
Figure 1. The structural design of the TUI.

Items of interest are represented by physical ‘tokens’ arranged on the left-hand side of the workspace, whilst their attributes – visualised on an interactive surface beneath the objects – are manipulated by a single physical ‘tool’ (a hybrid knob/button) located on the right-hand side. The interface thus complements the existing workstation: the right hand can switch between the mouse and the control tool in an eyes-free way, allowing the user to visually attend either to their monitor or to the tokens, which they can now manipulate with their left hand. This interaction scheme combines the advantages of both space-multiplexing (multiple tokens) and time-multiplexing (single tool). The division of labour between the two hands can be seen as an application of Guiard’s theory of bimanual skill [5]: the left hand leads by nudging a token in one of four axis-aligned directions, selecting both the token and the attribute associated with that nudge direction, and mapping control of that attribute to the single tool. This coarse motion is followed by the fine motion of the right hand operating the tool, manipulating the selected attribute.

5. Evaluation

The utility of a task-management TUI for individual users derives from having a persistent representation of their work context on the periphery of their visual field, quickly and efficiently manipulable through bimanual actions. In this initial version, the problem of information sharing is addressed by publishing online the changing states of all users’ TUIs. The first evaluation step will be to deploy the TUI for field trials in a real project team over a number of months. During that time, the server will collect usage statistics to complement on-site observations and interviews. As well as supporting evaluation, this will form a contextual, iterative redesign process [6] in which the TUI is extended to incorporate further aspects of awareness and coordination between team-members. Together, these methods will be used to answer research questions regarding the benefits of tangible, peripheral interfaces for managing secondary work activities.

6. References
[1] Ullmer, B. & Ishii, H. (2000). Emerging frameworks for tangible user interfaces. IBM Systems Journal, 39(3&4).

[2] Barone, R. & Cheng, P. C-H. (2004). Representations for problem solving: on the benefits of integrated structure. 8th Int. Conf. on Information Visualisation.

[3] Buehler, R., Griffin, D. & Ross, M. (2002). Inside the Planning Fallacy: The Causes and Consequences of Optimistic Time Prediction. In T. Gilovich, D. Griffin & D. Kahneman (Eds.), Heuristics and Biases, CUP.

[4] Edge, D. & Blackwell, A. (in press). Correlates of the Cognitive Dimensions for Tangible User Interfaces. Journal of Visual Languages and Computing, 17(4).

[5] Guiard, Y. (1987). Asymmetric Division of Labor in Human Skilled Bimanual Action: The Kinematic Chain as a Model. Journal of Motor Behavior, 19(4).

[6] Jones, R., Milic-Frayling, N., Rodden, K. & Blackwell, A. (accepted for publication). Contextual method for the re-design of existing software products. To appear in International Journal of HCI.
* For left-handed users, reverse ‘left’ and ‘right’ throughout.