Scenario Networks: Specifying User Interfaces with Extended Use Cases

Demosthenes Akoumianakis and Ioannis Pachoulakis

Department of Applied Informatics and Multimedia, Technological Educational Institution of Crete, P.O. Box 1939, Heraklion, Crete, Greece, GR 71500
{da, ip}@epp.teicrete.gr

Abstract. In this paper, we present the rationale and the baseline of a notation which can be used on its own or as an extension to standard UML to facilitate specification of an interactive system’s global execution context (GEC). The GEC graph is a visual construction consisting of (a) nodes, which represent interaction scenarios, and (b) directed links, which represent scenario relationships designating alternate execution, concurrency, ordering, and set-oriented relationships between two scenario nodes. The technique is particularly useful for specifying adaptable and adaptive behaviours across interaction platforms, contexts of use and target user communities. In the paper, we demonstrate the application of the technique using a file-exchange application which runs on a portable device such as a PDA and implements a lightweight ftp process to connect to a server wirelessly and offer standard ftp functionality (get/put/delete).

1 Introduction

Modelling user interfaces is a complex engineering task and a popular research topic. In the past three decades it has been studied from a variety of perspectives (e.g., psychological, sociological, knowledge engineering) by Human-Computer Interaction (HCI) researchers, yielding substantial results and a broad collection of methods for describing interaction tasks, object classes, lower-level dialogues, as well as mental activities of users interacting with application software. In the majority of these efforts the underlying assumption has been the loose separation of the user interface modelling activity from the software engineering tasks typically involved in addressing the functional requirements of interactive software systems. As a result, user interface modelling methods and tools do not couple easily with application modelling, and vice versa. This status quo brings about several drawbacks for the software design community, as frequently the integration effort required to bridge across the user interface and software engineering tasks is far from trivial. Recent attempts towards establishing effective methods for closing the gap between the two engineering communities [see 17-20] have explored concepts such as scenarios [21], use cases [1,2], dedicated dialogue description techniques such as event modelling and, more recently, the Unified Modelling Language (UML) [3]. UML departs from the commitment of model-based user interface development tools, which focus on domain modelling to describe the
data over which the interface acts, to provide a more encompassing frame of reference for modelling the functionality of the application for which the interface is being constructed. This, however, is widely acknowledged as insufficient support for user interface modelling [4-7]. As a result, extending UML to provide additional expressive power for user interface modelling is an active track of ongoing research. In this paper, we review current work in this area and describe the rationale and baseline of a notation, which can be used on its own or as an extension to UML, to facilitate specification of an interactive system's global execution context (GEC). The extended facilities can be used to compile a task's GEC graph as a visual construct, which consists of (a) nodes, representing interaction scenarios, and (b) directed links, representing scenario relationships such as alternate execution, concurrency, ordering, and set-oriented relationships between two scenario nodes.

2 Related Work

We are aware of no current research effort that seeks to extend a particular user interface modelling technique to facilitate application modelling. Activities in this area pursue (some of) the following targets: use specific UML notations to model user interfaces; extend UML so as to facilitate user interface modelling; or couple UML components with user interface modelling techniques.

A representative effort in the direction of using specific UML notations to model user interfaces is presented in [21]. The authors suggest an approach to requirements engineering that links scenarios with user interface prototypes. Scenarios are acquired in the form of UML collaboration diagrams and are enriched with user interface information. These diagrams are automatically transformed into UML Statechart specifications of all objects involved. These specifications are, in turn, used to generate a UI prototype, which is embedded in a UI builder environment for further refinement. The approach has been demonstrated to work for relatively simple user interfaces (such as those of ATMs) using conventional user interface objects. As a result, it deals only with interactions at the lexical level; syntactic and/or semantic interaction levels are ignored and no account of HCI knowledge is exploited. These shortcomings have motivated research proposals for either explicit coupling between UML and user interface modelling notations, or UML extensions.

Several researchers have investigated integrating interface modelling techniques with UML. The rationale for such coupling derives on the one hand from the insufficiency of UML for modelling user interface aspects, and on the other from the maturity of some well-established user interface modelling notations. For example, one approach assesses UML models for use in interface modelling, comparing them with a collection of specialist interface modelling notations [4]. Another suggests how one can use several UML models (particularly class diagrams and use case diagrams) along with task models for user interface modelling [5]. The latter work proposes coupling the ConcurTaskTrees notation with the available UML diagramming techniques, while creating semantic mappings of the ConcurTaskTrees concepts into the UML meta-model [6]. UML extension mechanisms could then be used to support smooth coupling. The primary benefit resulting from coupling UML to existing user
interface modelling techniques is that each component remains atomic, in the sense that no extension is required to accommodate new expressive or representational power. Moreover, since the coupling is typically undertaken through semantic mappings across the notations, it needs to be established only once. In this manner, application and user interface designers continue to work on their own perspectives using their own tools, while relying on these mappings to undertake the required integration.

A different line of research includes efforts which seek to extend UML to support user interface modelling, without reference to a particular user interface technique or task model. These efforts typically add notations to the already rich set of UML diagrams. A representative of this category is UMLi [7], which attempts to use the UML extensibility mechanisms to support user interface modelling. An alternative research track concentrates on building UML profiles for specific tasks of user interface design, such as GUI layout. A representative effort is reported in [8], where the author describes a profile intended to provide an easily comprehensible abstract representation of an actual screen based on designers' sketches. A GUI Layout Diagram is proposed, consisting of a Screen, which contains multiple ScreenAreas, each of which may be decorated with one or more Stereotypes representing performed functionalities like text, image or link. By nesting and arranging properly stereotyped ScreenAreas within each other, the developer is able to create an abstract version of a user interface. The Navigational Diagram provides UML-based support for common design artefacts like storyboards and sitemaps. The diagrams created can be linked to Use Case modelling using existing UML to specify the requirements and context of a particular screen.

One issue of significance to the present work is the sufficiency of the proposals described above, and of other similar efforts reported in the literature, to cope with the emerging requirements of nomadic applications and ubiquitous access. Specifically, our concern relates to the suitability of existing notations for capturing the novel requirements prevailing in nomadic and ubiquitous applications (such as scalability, object mobility/migration, security, platform independence, location awareness, personalization/individualization), commonly referred to as non-functional requirements (NFRs) or quality attributes [9]. The established UML notations offer no obvious mechanism that allows designers to explicitly model and address such software quality attributes in the course of analysis and design. The work described in [22] is a partial, ongoing effort towards such an end, which deals only with class diagrams. This realization has been the driving concern in recent efforts aiming to extend UML so as to facilitate support for modelling global applications [10] and their respective properties and quality attributes (e.g., mobility, performance, object migration, security). On the other hand, the model-based user interface development paradigm seems to address some of these concerns partially and in an ad hoc manner. In particular, model-based development has indeed delivered a number of possible solutions for generating user interfaces for multiple platforms (for instance [11]), but these do not result from an explicit account of designated NFRs. In fact, model-based development practitioners need not be aware of such constructs as NFRs.
Instead, in the majority of cases, platform-aware user interfaces are generated by manipulating, automatically or semi-automatically, an abstract task model that is incrementally transformed to a concrete instance. It should be noted that our premise is not that model-based development cannot cope with such challenges. On the contrary, we strongly believe that it is perhaps the
only engineering paradigm that can be refined to cope with the emerging requirements. However, this would require a substantial shift from task-level to goal-oriented and activity modelling, and linking with recent advances in goal-oriented requirements engineering and requirements-driven system development, such as the Tropos project [12]. To this end, it is claimed that in modern applications the designer is increasingly required to articulate the GEC of the system's tasks, rather than being solely concerned with the development of an abstract task model from which, incrementally, either through mappings or transformations, a platform-aware user interface is generated. The remainder of this paper develops a proposal for a specifications-based technique to facilitate insight into the GEC of a system's tasks.

3 Specifying the Global Execution Context of Tasks

The premise of the present work is that HCI design lacks a coherent and detailed macro-level method (in the sense defined in [13]) for including change management in the user interface development process. In other words, although there is a plethora of micro-level techniques useful for studying change in the context of HCI, there is no integrated frame of reference for identifying and propagating change across stages of the design and development processes. It is also argued that formalizing change management in relation to designated NFRs, which are explicitly accounted for in the context of human-centred development, can provide a useful frame of reference.

3.1 Definition of Terms

Change in interactive software is inherently linked with the context in which a task is executed. Typical context parameters include the target user, the platform providing the computational host for the task, and the physical or social context in which the task is executed. Traditionally, interactive software was designed to cope with minimal and isolated changes, related primarily to the user, since no other part of the system's execution context (i.e., platform or context of use) was liable to change. As more and more software and information systems adopt Internet technologies and protocols for greater openness and interoperability, the changes that can take place are far more complex and demanding, as the closed computing environment is replaced by the open, dynamic and almost unbounded nature of the Internet. In the new distributed and networked information-processing paradigm, change management entails a thorough understanding of the GEC of tasks.

In our previous work [14] we presented a methodology and a suite of tools for designing user interfaces as a composition of task contexts. A task context was defined as an interactive manifestation (or a style) of an abstract task (i.e., a task comprising abstract interaction object classes). For example, a selection is an abstract task, which can be interactively manifested as an explicit choice from a list box, a choice from a check box, or a selection from a panel of options. Each alternative was designated as a distinct task context with explicit rationale and interaction style, describing conditions for style activation/deactivation, object classes, attributes of object classes, preference and indifference expressions, default conditions, etc.
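By way of illustration, the following minimal sketch renders this notion of a task context in Python. All names are hypothetical illustrations of the idea; the tool suite of [14] is not shown in the paper.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Style:
    """One interactive manifestation (task context) of an abstract task."""
    name: str
    object_class: str          # concrete interaction object class used
    activation_condition: str  # rationale/condition for activating this style

@dataclass
class AbstractTask:
    """A task expressed over abstract interaction object classes."""
    name: str
    styles: List[Style] = field(default_factory=list)

# The 'selection' example: one abstract task, three task contexts.
selection = AbstractTask("selection", styles=[
    Style("explicit-choice", "ListBox",  "default desktop presentation"),
    Style("toggle-choice",   "CheckBox", "few options, binary emphasis"),
    Style("panel-choice",    "Panel",    "large targets, e.g. touch input"),
])
```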


Building on this early conception of a task context, we define the GEC of an abstract task T as a five-tuple ⟨T, g, S, f, C⟩, where g is the task's goal, achievable by alternative scenarios si ∈ S, and f is a function which, given a designated set of constraints C, selects the maximally preferred option of S. Such a definition allows us to model change in interactive software in terms of conditions or constraints which propagate alternative interactive behaviours to achieve task-oriented goals. Each constraint in C is a predicate of the form c(type, parameter, value). Three types of constraints are relevant, namely user constraints, platform constraints and context constraints. In light of the above, it is now argued that a change δ in the execution context of a software system occurs if and only if there exists at least one constraint in C whose parameter value has been modified. The result of recognizing δ and putting it into effect is the deactivation of an si ∈ S, which was the status prior to recognizing δ, and the activation of a new sj ∈ S, which becomes the new status. In the context of user interface design, it is therefore of paramount importance to identify the changes which may reasonably take place in a system's execution context, model them in terms of constraints relevant to the user interface, and detail the interactive scenarios through which the user interface can cope with the designated changes. To this effect, the study of NFRs early in the design process can help populate and structure the required design space.

3.2 Goal Modelling of Designated NFRs

In the context of detailed user interface modelling work, two aspects of NFRs become prominent: the first is explicitly modelling designated NFRs, while the second amounts to modelling how the NFRs intertwine. One possible approach is to use the concepts and notation of the NFR Framework [15] to develop the softgoal interdependency graph for adaptability, identifying and decomposing NFRs in conceptual models. Figure 1 presents the hierarchical decomposition of the adaptability NFR, summarizing how it may be related to other NFRs such as scalability, individualization and platform independence. Softgoals are represented as clouds and can be decomposed into other softgoals through AND- or OR-decompositions, represented by single and double arcs respectively. An AND-decomposition refines an overall goal into a set of subgoals, which should all be achieved. An OR-decomposition refines a softgoal into a set of alternative refinements, of which one suffices to satisfy the softgoal. The softgoal hierarchy of Figure 1 is typically read from the top. Thus, adaptability requires that a system is capable of detecting the need for change, deciding what change is needed, and putting the required change into effect. Deciding on the changes needed is considered an internal system function, responsible for implementing context-sensitive processing; thus, it does not usually have an interactive manifestation. Consequently, it suffices to mention it explicitly, and no further decomposition is attempted. Detecting the need for change, as pointed out in the previous section, entails detecting changes of scale, changes in interaction platform, changes in users and context of use, or any combination of these. In all cases detection can be automatic or manual.
Figure 1 depicts that change of interaction platform and changes in users and contexts of use constitute AND-decompositions of the ubiquity softgoal, thus linking explicitly ubiquity and adaptability. On the other hand, putting a change into effect entails an OR-decomposition denoting that changes may be propagated either
on the user interface or on the content to be presented or on both. User interface change is OR-decomposed into physical changes (i.e., lexical aspects of interaction), syntactic changes (i.e., modifications in the dialogue, such as a confirmation box or alternate feedback) and semantic changes (i.e., an alternative overall interactive embodiment, such as symbolic versus pictographic presentation of interaction objects, for example [16]). In all cases, the system can effect the changes either automatically or manually (see the following section).

[Fig. 1. Softgoal hierarchy for adaptability. Adaptability AND-decomposes into need-for-change detection (auto-detect or manual-detect), deciding on change, and effecting change (auto-effect or manual-effect) on the user interface (physical, syntactic, semantic) and/or the content; ubiquity AND-decomposes into scalability [change of scale], platform independence [change of platform] and individualization [change in user].]
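To make the preceding definitions concrete, the following minimal sketch models constraints as c(type, parameter, value) predicates, detects a change δ as a modified constraint value, and implements f as selection of the maximally preferred admissible scenario. All names are illustrative assumptions, not part of the paper's notation.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional, Set, Tuple

@dataclass(frozen=True)
class Constraint:
    """A predicate c(type, parameter, value); type is 'user',
    'platform' or 'context' (Section 3.1)."""
    ctype: str
    parameter: str
    value: str

@dataclass(frozen=True)
class Scenario:
    name: str
    requires: Tuple[Constraint, ...] = ()  # constraints under which s_i is admissible

def detect_delta(old: Set[Constraint], new: Set[Constraint]) -> Set[Constraint]:
    """A change delta occurs iff at least one constraint's parameter
    value has been modified between the old and new contexts."""
    old_values = {(c.ctype, c.parameter): c.value for c in old}
    return {c for c in new
            if (c.ctype, c.parameter) in old_values
            and old_values[(c.ctype, c.parameter)] != c.value}

def f(scenarios: List[Scenario], context: Set[Constraint],
      rank: Callable[[Scenario], int]) -> Optional[Scenario]:
    """The function f: select the maximally preferred scenario in S
    whose required constraints all hold in the current context C."""
    admissible = [s for s in scenarios if set(s.requires) <= context]
    return max(admissible, key=rank, default=None)
```

On this reading, recognizing δ amounts to re-running f over the new context: the scenario active prior to δ is deactivated and the newly selected one becomes the new status.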

3.3 Scenario Crafting

A first step in translating a softgoal hierarchy into concrete user interface requirements is to capture softgoals in terms of a small set of reference scenarios, which provide a concrete resource for reflection and/or envisioning. Typically, the mapping of goals to scenarios follows the structure of the softgoal hierarchy; thus it is common to start with a top-level scenario describing an abstract task to be achieved and to progressively refine the scenario as subordinate goals are considered. It is worth pointing out that at this stage of the analysis it makes no difference whether a system already exists or is yet to be designed. The only concern of the analyst is to compile narrative descriptions of desirable interactions. This is illustrated by means of an example in Figure 2, which depicts at the top an abstract scenario that is incrementally articulated (i.e., made concrete) by navigating the hierarchy. The dashed lines indicate a link between a goal node in the hierarchy and the corresponding draft scenario. The example is motivated by a file exchange application in which a user carries out standard ftp functionality. This is a simple application, which allows authorised users to connect to a server and subsequently manipulate files (i.e., transfer, delete, view).

[Fig. 2. Exemplar scenarios, linked to the softgoal hierarchy of Fig. 1. Reference scenario: development of a lightweight ftp process to connect to a server using wireless LAN and offer standard ftp functionality (get/put/delete). Platform scenarios: ftp functionality using a PDA; ftp functionality using a desktop terminal. Individualization scenarios: the user has gross temporal control (scanning on); the user possesses fine temporal control.]
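A sketch of the resulting goal-to-scenario mapping, using the draft scenario texts recoverable from Figure 2; the dictionary structure, path naming and helper are illustrative assumptions only.

```python
# Softgoal node -> draft narrative scenario (texts from Fig. 2).
scenario_map = {
    "adaptability": ("Develop a lightweight ftp process to connect to a "
                     "server using wireless LAN and offer standard ftp "
                     "functionality (get/put/delete)."),
    "adaptability/platform/pda": "ftp functionality using a PDA.",
    "adaptability/platform/desktop": "ftp functionality using a desktop terminal.",
    "adaptability/individualization/gross": ("The user has gross temporal "
                                             "control; scanning is on."),
    "adaptability/individualization/fine": ("The user possesses fine "
                                            "temporal control."),
}

def draft_scenario(goal_path: str) -> str:
    """Top-down refinement: the deepest softgoal node that prefixes the
    requested path contributes the most concrete draft scenario."""
    matches = [k for k in scenario_map if goal_path.startswith(k)]
    return scenario_map[max(matches, key=len)] if matches \
        else scenario_map["adaptability"]
```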

In subsequent sections of this paper, we will assume a tentative prototype, such as that depicted in Figure 3, and we will be concerned with populating the GEC of the respective tasks, taking into account representative adaptability scenarios as described in Figure 2.

3.4 Scenario Relationships

Scenario relationships are the primary mechanism for specifying the GEC of a reference scenario. To demonstrate their scope, we will assume a base (or reference) scenario, which executes the tasks of our example case study on a desktop through a style SB (see Fig. 3). There may be several styles serving the same goal through different interaction elements, resulting in different interaction scenarios. Moreover, any two styles may be aggregated into an abstract style, while an abstract style can be segregated into one or more concrete styles. Style aggregation and segregation form two semantic relationships, which are further elaborated through a set of operators, the most important of which are briefly described below in the context of our reference case study.

[Fig. 3. FTP on the desktop]

[Fig. 4. FTP on a PDA]

Alternative_to: Two styles are related by the alternative_to relationship when each serves exactly the same goals and one and only one can be active at any time. For instance, Figures 3 and 4 describe two alternative styles for carrying out the same tasks (on a desktop and a PDA, respectively). This is typically expressed as: ∀ g ∈ Goal, g(Si) alternative_to g(Sj). Alternative_to is the main operator for specifying adaptability of a system with regard to a designated quality attribute (e.g., platform independence). Two alternative scenarios are considered indifferent with regard to all quality attributes except the ones designated in the alternative_to declaration. This leads to the preference operator described below.

Preference: This relationship extends the alternative_to relationship for cases where, given a goal g, g(Si) alternative_to g(Sj) with i ≠ j and i, j > 1. In such cases, there should be a preference order for Si and Sj specified by a preference condition or rule. When executed, the preference condition places the candidate scenarios in a preference ranking (indifference classes), and the most preferred scenario (first indifference class) is the one to be activated. Typically, the scenarios in a preference class may be used to augment one another under certain conditions.

Augmentation: Augmentation captures the situation where one scenario in an indifference class is used to support or facilitate the most preferred (active) scenario within the same indifference class. In general, two scenarios related by an augments relationship serve precisely the same goal through different interaction means. An example illustrating the relevance of the augments relationship is depicted in Fig. 5. In the example, the scanner is explicitly activated upon button press. There are cases, however, which are better served by an implicit activation of the scanning function (i.e., the case of a user with gross temporal control).

Parallelism: Parallelism in the execution of interaction scenarios (or concurrent activation) is a common feature in a variety of interactive applications and one which, when properly supported, can serve a number of desirable features such as adaptivity, multimodality and increased usability. For the purpose of the present work, two scenarios are related by a parallel_to relationship when they serve the same goal and are active concurrently. Figure 6 shows an example of parallel scenarios for selecting multiple files to download, where selection from the taskbar by file type (e.g., .ppt) results in automatic checking of all files (in the current selection list) with the appropriate extension. In general, such a feature, which is provided in parallel with selecting through browsing the list and manually checking options, makes it easy to select all files in a category.


Specifically, it avoids: (a) scrolling the list up/down to view the file extension of each file separately, and (b) having to increase the width of the "Name" column by hand for long filenames, which is awkward on a PDA.

[Fig. 5. Example of augmentation]

[Fig. 6. Parallel scenarios for selection. Example: When the focus is on the "Remote Files" list, the user can press the PowerPoint icon on the taskbar to select all .ppt files in the list. This multi-selection task adds checkboxes to the left of all items in the list, and all files with the appropriate extension (.ppt) are automatically checked.]
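The four relationships can be read as typed edges over scenario nodes. The following minimal sketch anticipates the GEC graph of Section 3.5; the class and relation names are hypothetical, not the authors' tooling.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

RELATIONS = ("alternative_to", "preferred_to", "augments", "parallel_to")

@dataclass
class ScenarioNode:
    name: str
    goal: str  # the relationships below only hold between same-goal nodes

@dataclass
class GECGraph:
    nodes: Dict[str, ScenarioNode] = field(default_factory=dict)
    # Directed, typed edges: (source, relation, target).
    edges: List[Tuple[str, str, str]] = field(default_factory=list)

    def add(self, node: ScenarioNode) -> None:
        self.nodes[node.name] = node

    def relate(self, src: str, relation: str, dst: str) -> None:
        assert relation in RELATIONS, f"unknown relation: {relation}"
        # All four operators relate scenarios serving the same goal.
        assert self.nodes[src].goal == self.nodes[dst].goal
        self.edges.append((src, relation, dst))
```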


3.5 Building the Global Execution Context Graph

Having presented the basic scenario relationships, we can now illustrate the GEC graph of our reference example as an extended use case model (see Figure 7). The diagram elaborates a base scenario, "Select file with desktop style", characterizing file selection as a polymorphic task accomplished in parallel through two concurrent scenarios, namely selection-by-type and selection-by-checking. The base scenario has two alternatives, representing PDA- and HTML-like styles respectively. All alternatives can be augmented by scanning, which is specialized into two alternative versions, namely one-button auto-scanning and two-button auto-scanning, with the former being the preferred option.

[Fig. 7. The global execution context graph]
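Continuing the GECGraph sketch from Section 3.4, the network of Figure 7 could be assembled as follows; the node names paraphrase the figure and are assumptions, not the paper's notation.

```python
g = GECGraph()
goal = "select file"
for name in ("desktop-style", "pda-style", "html-style",
             "selection-by-type", "selection-by-checking",
             "one-button-auto-scanning", "two-button-auto-scanning"):
    g.add(ScenarioNode(name, goal))

# The base scenario and its two platform alternatives.
g.relate("pda-style", "alternative_to", "desktop-style")
g.relate("html-style", "alternative_to", "desktop-style")

# File selection is polymorphic: two concurrently active scenarios.
g.relate("selection-by-type", "parallel_to", "selection-by-checking")

# All alternatives can be augmented by scanning; among the two scanning
# versions, one-button auto-scanning is the preferred option.
g.relate("one-button-auto-scanning", "augments", "desktop-style")
g.relate("two-button-auto-scanning", "augments", "desktop-style")
g.relate("one-button-auto-scanning", "preferred_to", "two-button-auto-scanning")
```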

4 Discussion and Concluding Remarks

From the discussion so far, three main conclusions can be drawn. Firstly, an analysis of the GEC of a system's tasks entails an account of all relevant NFRs in terms of the task T they relate to, the goal g served, the scenarios S which can facilitate them, the function f defining the system's transition from one state to another, and the constraints C which rationalize these transitions. Secondly, NFR-based user interface design requires an analytic approach whereby alternatives are generated and evaluated in terms of suitability, design trade-offs and rationale. All these alternatives should aim to capture plausible interaction scenarios under different softgoal regimes. It is worth noting that this is considered a distinct contribution of the present work, as existing proposals for design rationale capture and retrieval are seldom related to NFRs in the course of user interface design. Thirdly, the designated NFRs raise explicit demands upon the outcomes of HCI design and the corresponding artefacts, such as the user interface software architecture, the interaction styles devised to capture alternatives, and the way in which they are to be managed in the course of user interface development. For instance, requiring that a quality goal realize the auto-detect and auto-effect softgoals for all superior softgoals of the hierarchy in Figure 1 implies a single implementation capable of managing a broad range of interaction styles. This, however, entails novel user interface engineering practices and tools, which at present are not widely available.


References

1. Carroll, J.M., Mack, R., Robertson, S., Rosson, M.B.: Binding objects to scenarios of use. International Journal of Human-Computer Studies 21 (1994) 243-276
2. Jacobson, I.: Object-Oriented Software Engineering: A Use Case Driven Approach. Addison-Wesley, Reading, MA (1992)
3. OMG: Unified Modelling Language 1.3. Object Management Group (1999)
4. Markopoulos, P., Marijnissen, P.: UML as a representation for interaction designs. In: Proc. Australian Conf. on Computer-Human Interaction, CHISIG (2000) 240-249
5. Paternò, F.: Towards a UML for interactive systems. In: Proc. 8th IFIP Working Conf. on Engineering for Human-Computer Interaction (EHCI 01), Springer-Verlag (2001) 7-18
6. Mori, G., Paternò, F., Santoro, C.: CTTE: support for developing and analyzing task models for interactive system design. IEEE Trans. on Software Engineering 28(9) (2002) 1-17
7. Pinheiro da Silva, P., Paton, N.W.: User interface modelling in UMLi. IEEE Software (July/August 2003) 62-69
8. Blankenhorn, K.: A UML Profile for GUI Layout. Master's Thesis, University of Applied Sciences Furtwangen, Department of Digital Media (2004)
9. Mylopoulos, J., Chung, L., Nixon, B.: Representing and using NFRs: a process-oriented approach. IEEE Trans. on Software Engineering 18(6) (1992) 483-497
10. Baumeister, H., Koch, N., Kosiuczenko, P., Stevens, P., Wirsing, M.: UML for global computing. In: Priami, C. (ed.): Global Computing. Programming Environments, Languages, Security, and Analysis of Systems. LNCS 2874, Springer-Verlag (2003)
11. Mori, G., Paternò, F., Santoro, C.: Design and development of multidevice user interfaces through multiple logical descriptions. IEEE Trans. on Software Engineering 30(8) (2004) 1-14
12. Castro, J., Kolp, M., Mylopoulos, J.: Towards requirements-driven information systems engineering: the Tropos project. Information Systems 27 (2002) 365-389
13. Olson, J.S., Moran, T.P.: Mapping the method muddle: guidance in using methods for user interface design. In: Rudisill, M., Lewis, C., Polson, P.B., McKay, T.D. (eds.): HCI Design: Success Stories, Emerging Methods, and Real-World Context. Morgan Kaufmann, San Francisco, CA (1996) 101-121
14. Akoumianakis, D., Savidis, A., Stephanidis, C.: Encapsulating intelligent interaction behaviour in unified user interface artefacts. Interacting with Computers 12 (2000) 383-408
15. Chung, L., Nixon, B., Yu, E., Mylopoulos, J.: Non-Functional Requirements in Software Engineering. Kluwer Academic Publishers (2000)
16. Miller, L.A., Stanney, K.: The effect of pictogram-based design on human-computer performance. International Journal of Human-Computer Interaction 9 (1997) 119-132
17. Nunes, N.J., et al.: Interactive system design with object models. In: Moreira, A., Demeyer, S. (eds.): ECOOP'99 Workshop Reader. Springer-Verlag (1999)
18. Constantine, L.L., Lockwood, L.A.D.: Software for Use: A Practical Guide to the Models and Methods of Usage-Centered Design. Addison-Wesley, Reading, MA (1999)
19. van Harmelen, M., et al.: Object models in UI design. SIGCHI Bulletin 29(4) (1998)
20. Artim, J., et al.: Incorporating work, process and task analysis into industrial object-oriented systems development. SIGCHI Bulletin 30(4) (1998)
21. Elkoutbi, M., Khriss, I., Keller, R.: Generating user interface prototypes from scenarios. In: Proc. 4th IEEE Int. Symposium on Requirements Engineering, Limerick, Ireland (1999)
22. Cysneiros, L.M., Leite, J.C.S.P.: Using UML to reflect non-functional requirements. In: Proc. CASCON 2001, IBM Press, Toronto (2001) 202-216