ITS 2010: Poster

November 7-10, 2010, Saarbrücken, Germany

Experiences in Conceiving and Prototyping a Commercial Business Application Using Multi-Touch Technology

Claudia Nass1, Kerstin Klöckner1, Rudolf Klein2, Hartmut Schmitt2, and Sarah Diefenbach3

1 Fraunhofer Institute for Experimental Software Engineering, Kaiserslautern, Germany, {nass; kloeckner}@iese.fhg.de
2 a3 Systems GmbH, Saarbrücken, Germany, {rudolf.klein; hartmut.schmitt}@a3systems.com
3 Folkwang University, Essen, Germany, sarah.diefenbach@folkwang-uni.de

ABSTRACT

In this paper, we describe methods and challenges regarding the conception and prototypical implementation of a business application that uses multi-touch technology in its input device. We illustrate the use of a new conception and specification approach called DESIGNi and our experiences using the new Microsoft Expression Studio, which is specifically intended to support the development of commercial multi-touch software. We focused on the redesign of an application called "Graphical Knowledge Editor" (GKE), which is used to model the processes and workflows of a call center. This system was evaluated in a study that showed how users perceived the quality of the new software.

CASE STUDY

The case study involves a so-called GKE, used for modeling the business processes of a call center. A typical, recurring task when using the GKE is to build new processes by combining already existing process building blocks.

Conception Phase

The conception phase was supported by an interaction design workbench called DESIGNi (Designing Interaction), which supports the conception and specification of interaction behavior. With this approach, designers are able to explore different forms of interaction and to systematically describe the transition or interplay between human and system [2]. Table 1 shows the specification of an interaction with DESIGNi. The first part consists of the elements that describe the interaction: elementary step, situation, and attributes. Concrete interaction design requires the specification of actions on the part of the human and corresponding (re)actions on the part of the machine. The (re)actions embedded in an elementary step can be described by the actual action, its manner, and specific attributes, as presented in Table 1.
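To make the structure of such a specification concrete, its parts could be captured in code roughly as follows. This is a hypothetical sketch in Python: DESIGNi is a workbench, not a code library, and all class names, field names, and attribute ratings here are our own illustrative assumptions.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a DESIGNi-style specification record.
# Names and example ratings are illustrative, not part of DESIGNi itself.

@dataclass
class Manner:
    description: str                                # e.g. "am1. Person holds first element"
    attributes: dict = field(default_factory=dict)  # rated 1 (little) .. 7 (extremely)

@dataclass
class ElementaryStep:
    step_id: str                                    # e.g. "els34"
    description: str
    situation: str                                  # context in which the interaction occurs
    human_action: str
    human_manners: list = field(default_factory=list)
    system_reactions: list = field(default_factory=list)

# The interaction from Table 1, with made-up attribute ratings:
connect = ElementaryStep(
    step_id="els34",
    description='Connect two elements of type "process building block"',
    situation="Person is at the desktop in front of the computer",
    human_action="Person connects elements",
    human_manners=[
        Manner("am1. Person holds first element", {"precision": 7}),
        Manner("am2. Person taps element to connect to", {"speed": 5}),
    ],
    system_reactions=[
        Manner('sm1. Element "process building block" pulses when selected',
               {"attention": 7}),
    ],
)
```

Filling such a record cell by cell mirrors how the designer filled the DESIGNi table, one elementary step at a time.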

ACM Classification: H.5.2 [Information Interfaces and Presentation]: User Interfaces - User-centered design.
General terms: Design, Documentation
Keywords: Interaction design, multi-touch technology

INTRODUCTION

Recently, we have experienced a flood of electronic devices that employ a variety of novel, mainly gesture- and touch-based ways of interaction. As a consequence, when the new operating system Windows 7 was introduced in October 2009, it presented multi-touch support as a core capability. This allowed small and medium enterprises (SMEs) to develop such applications in a commercial and profitable way. Although the possibilities offered by this technology are tremendous and call for exploration in terms of productive work in the business domain, most of the applications available today come from the entertainment domain (moving photos around the screen, zooming them in and out, etc.).

For the conception and specification of the new interaction concepts, the designer had to enter the information specified in an earlier requirements phase into DESIGNi (elementary steps and situation). With these data, the designer began to piece together what an interaction would look like by filling in the cells of the table. He built the interaction parts in the order in which the ideas occurred to him, compared individual interaction steps with the entire sequence, and reflected on whether the interaction was meaningful and appropriate.

In this work, we explored methods and procedures regarding the conception and prototypical implementation of a business application that uses multi-touch technology in its input device. For this, we developed a highly interactive prototype of a system called "Graphical Knowledge Editor" (GKE) using the workbench DESIGNi [2] and Microsoft Expression Studio. We also conducted a field study with end users of the GKE, which aimed at verifying the users' perception of the software's quality. Finally, we report on our experiences, the challenges, and the lessons learned during this project.

Implementation Phase

Based on the DESIGNi specification, the system was implemented as a high-fidelity, fully interactive prototype. The development of the prototype was realized with Microsoft Expression Studio. This suite promises ideal collaboration between designers and developers and also enables the development of multi-touch applications that run on Windows 7 [1]. The actual implementation of the system was also supported by the Silverlight Software Development Kit (SDK) due to its suitability for developing rich Internet business applications.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. ITS’10, November 7–10, 2010, Saarbrücken, Germany. Copyright 2010 ACM 978-1-4503-0399-6/10/11...$10.00.


The first stage in the implementation of our prototype was realized in Microsoft Expression Design, a raster and vector graphics editor in which the layout and visual design of the GKE (multi-touch version) were developed. This vector data was exported as XAML and later imported into an Expression Blend project in order to create a Silverlight application. Figure 1 shows a screenshot of the prototype designed for the interaction specified in Table 1.

USER EVALUATION

A first evaluation of the prototype with 12 participants who use the GKE as part of their daily work revealed promising results. Compared with the "old" GKE, the new system was perceived as more attractive, more natural, and especially more intuitive. Several participants remarked that even though it was their first time working with the system, they "immediately knew what to do and felt right at home". This high degree of intuitiveness was also reflected in the performance measures: in spite of the participants' long-term experience with the old system, task accuracy and performance times were only slightly better for the old GKE. Altogether, performing the usual working tasks became real fun, and participants did not even want to stop working with the new prototype; "What a pity that it's over already" was a common statement (for more detailed results, see [2]).

DISCUSSION AND CONCLUSION

DESIGNi is a workbench that helps designers to envision and describe interaction behavior in a systematic way. We experienced this structure as helpful due to the inclusion of the human action in the design space. It made the design of new human action manners explicit and thus led the designers to define new forms of system reaction unknown in the traditional desktop metaphor.

Figure 1: Sample screen of GKE prototype

Using Microsoft Expression Studio, it was possible to transfer the user interface and visual design completely from the vector graphics tool to the implementation tool in an easy and practicable way. This helped reduce the gap between visual design specification and implementation. However, the available documentation of the Silverlight SDK appeared to be inadequate for supporting the development of business applications. The majority of the examples in the documentation come from the game domain and do not provide a higher-level view of the programming concepts that could explain the relationships of individual components in the context of business applications.

Existing Silverlight controls, such as container elements and the canvas, were used in the implementation. Moreover, additional controls had to be developed in C#, such as lists, tabulators, scroll bars, and specific elements of the GKE, called knowledge objects. Event handler routines and functions were implemented behind these new elements in order to manipulate their size, position, and appearance. The prototype was to be manipulated mainly by touch, with the exception of the text fields, which could be operated using the normal keyboard. Thus, we implemented single-touch gestures, such as tap (to select or to activate), drag (to move an object or to connect objects), slide and fling (to scroll), and rub (to delete), as well as multi-touch gestures such as pinch (to shrink), spread (to enlarge), slide with two fingers (to scroll), and hold and tap (to connect). In order to enable correct recognition of the implemented gestures, a certain number of touch points and events were recorded in a queue. An interpretation function ran over this queue in order to generate the desired behavior of the controls based on the number of touches and the history of touches; e.g., if an object was tapped, it would be possible to move or to connect this object.
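The queue-plus-interpretation mechanism described above can be sketched as follows. This is a simplified, hypothetical illustration in Python rather than the actual C#/Silverlight code; the event model, thresholds, and names below are our own assumptions.

```python
from collections import deque
from dataclasses import dataclass

# Simplified sketch of queue-based gesture interpretation. The prototype itself
# was implemented in C# on Silverlight; the event model, thresholds, and names
# below are illustrative assumptions, not the original code.

@dataclass
class TouchEvent:
    touch_id: int   # which finger produced the event
    kind: str       # "down", "move", or "up"
    x: float
    y: float

def _dist(p, q):
    return ((p.x - q.x) ** 2 + (p.y - q.y) ** 2) ** 0.5

class GestureInterpreter:
    """Records touch events in a bounded queue and classifies the history."""

    def __init__(self, maxlen=64):
        self.queue = deque(maxlen=maxlen)

    def record(self, event):
        self.queue.append(event)

    def interpret(self):
        """Map the recorded touch history to a gesture name."""
        by_finger = {}
        for e in self.queue:
            by_finger.setdefault(e.touch_id, []).append(e)
        if not by_finger:
            return "none"
        if len(by_finger) >= 2:
            # Two fingers: compare the distance between fingers at start and end.
            a, b = list(by_finger.values())[:2]
            d0, d1 = _dist(a[0], b[0]), _dist(a[-1], b[-1])
            if d1 < 0.8 * d0:
                return "pinch"          # fingers moved together -> shrink object
            if d1 > 1.2 * d0:
                return "spread"         # fingers moved apart -> enlarge object
            return "two-finger slide"   # roughly constant distance -> scroll
        # One finger: overall movement distance separates tap from drag.
        events = next(iter(by_finger.values()))
        return "drag" if _dist(events[0], events[-1]) > 10 else "tap"
```

For example, a single finger pressed at (0, 0), moved to (30, 0), and lifted would be classified as a drag, while two fingers moving toward each other would be classified as a pinch.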

ACKNOWLEDGMENTS

This work was supported by the projects FUN-NI (Grant: 01 IS 09007) and Emergent (Grant: 01 IC 10S01A). Both projects were funded by the German Federal Ministry of Education and Research (BMBF). We thank HanseNet Telekommunikation GmbH and its employees for their participation in this study.

REFERENCES

1. Microsoft. Microsoft Brings Developers and Designers Closer Together With Expression Studio 4 Release. Press release, June 2010. Retrieved August 24, 2010 from http://www.microsoft.com/presspass/press/2010/jun10/06-07Expression4PR.mspx
2. Nass, C., Klöckner, K., Diefenbach, S., and Hassenzahl, M. DESIGNi - A Workbench for Supporting Interaction Design. In Proc. 5th Nordic Conference on Human-Computer Interaction: Extending Boundaries. (in press)


Interaction
  Elementary step: els34. Connect two elements of type "process building block"
  Situation: Person is at the desktop in front of the computer
  Attributes: directness, speed, purpose, proximity, evidence

(Re)action - Human
  Action: Person connects elements
  Manner: am1. Person holds first element. am2. Person taps element with which the first element has to connect.
  Attributes: speed, continuity, precision, power

(Re)action - System
  Action: sa1. Recognition of 1st element of connection. sa2. Recognition of selected element. s3. Connection is built.
  Manner: sm1. Element "process building block" pulses when selected. sm2.1. Element "process building block" pulses once when tapped. sm2.2. Haptic vibration on element "process building block". sm3.1. Arrow flight from first to the next element. sm3.2. Sound effect.
  Attributes: attention, constancy, delay, precision, power, continuity, speed

Table 1: Screenshot of DESIGNi (scale of attributes: 1 = little to 7 = extremely)

