Adaptive Rich User Interfaces for Human Interaction in Business Processes

Stefan Pietschmann, Martin Voigt, and Klaus Meißner

Technische Universität Dresden
Faculty of Computer Science, Chair of Multimedia Technology
01062 Dresden, Germany
{Stefan.Pietschmann, Martin.Voigt, Klaus.Meissner}@tu-dresden.de

Abstract. In recent years, business process research has primarily focussed on optimization by automation, resulting in modeling and service orchestration concepts implying machine-to-machine communication. New standards for the integration of human participants into such processes have only recently been proposed [1,2]. However, they do not cover user interface development and deployment. There is a lack of concepts for rich business process UIs supporting flexibility, reusability and context-awareness. We address this issue with a concept for building human task presentations from service-oriented UIs. Those User Interface Services provide reusable, rich UI components and are selected, configured and exchanged with respect to the current context.

1 Introduction

Over the past years, the Internet has evolved into a stable and popular application platform. This is especially true for business applications, which heavily employ Web Services to provide functionality in a technology-independent, reusable way. Usually, those applications represent business processes (BPs) that are executed in a service-oriented fashion based on a composition description, the most prominent composition language being WS-BPEL [3]. BPEL focuses on machine-to-machine communication and fully automatic process execution.

Although BP research stems from the modeling and automation of originally human-centered workflows, current process engines and standards like BPEL do not reflect the undisputed importance of human interactions. Common human activities in processes involve data input and validation as well as decision making. Consequently, several vendors like IBM and Oracle provide proprietary BPEL extensions in their engines to support such "human tasks". Of course, their use entails interoperability and portability problems [4]. Promising recent proposals like BPEL4People (B4P) [1] and WS-HumanTask [2] allow for a standardized integration of human-based activities in BPEL processes. What persists with these specifications, however, is the problem of developing and deploying the human task user interfaces: mature standards and tools exist for BP modeling, specification and execution, but no comparable efforts have addressed the presentation layer.


To adequately support human tasks, sophisticated user interfaces are needed. They should, for example, include advanced data visualization techniques (e.g., interactive tables and graphs), allow for multimedia integration (e.g., image slide shows and rich text editors), support collaboration functionality, etc.

In prevalent solutions, UIs are usually generated from proprietary markup code. First and foremost, vendor-specific UI definitions undermine the portability and interoperability of human task specifications. Furthermore, the resulting user interfaces do not meet the necessary requirements: they are usually simplistic, form-based and lead to media disruptions. As an example, users need to open external applications to write a report or look up a route, while the same kind of functionality could be included in a more interactive and intelligent task UI. All in all, there is a lack of concepts for the definition and deployment of flexible and reusable rich UIs in business processes. Thus, development and maintenance of human-involved BPs are time-consuming and costly. Since processes can be accessed from different contexts (user roles and preferences, device characteristics), context-awareness of UIs in this domain poses an additional challenge.

To address the above-mentioned problems, we show how to utilize service-oriented user interfaces at the presentation layer of human tasks. To this end, we propose a concept for coupling a service-oriented UI integration and composition system, developed within the CRUISe¹ project and presented in [5,6], with existing business process engines.

¹ The CRUISe project is funded by the German Federal Ministry of Education and Research under promotional reference number 01IS08034-C.

The remainder of this paper is structured as follows. In Section 2 we present a motivating example and discuss both advances in the integration of human tasks in business processes and strategies for providing the corresponding user interfaces. After giving a brief overview of the CRUISe system in Section 3, our concept for the integrated, declarative UI composition description, the dynamic UI composition based on CRUISe, and the interface between the presentation layer and the underlying human task (engine) are presented in Section 4. Section 5 describes our prototypical implementation with the help of the exemplary process and its corresponding user interface. In Section 6 we conclude this paper and outline future work.

2 Human Interaction in Business Processes

The problem domain of human interaction in BPs can be divided into two parts (cf. Figure 2): (1) Human Task Integration and (2) Human Task Rendering. The former refers to the formal and practical integration of human activities in BPEL-based BPs, including data and role management. The latter deals with issues regarding the UI description, generation and deployment for such human tasks, as well as with its interface to the task engine. We will briefly discuss related work concerning both parts after introducing an exemplary human-involved process which illustrates the necessity of rich UIs for human tasks.

2.1 An Exemplary Business Process

Together with an industry partner we designed a business process which serves as a motivating example and use case for our implementation. To reduce complexity, we limited the process to a sequence of essential (human) tasks; in the real world it would, of course, include additional system tasks, which are independent of our concept. Our use case represents an insurance process (Figure 1). It is started by an incoming notification of claim from a customer (1), which is reviewed by an insurance employee in a human task (2). If the claim is accepted, it is forwarded to a field worker for planning an on-site inspection (3). When he has finished his inspection and handed in the results (4), the case can be decided by an insurance employee (5).

(1) Claim → (2) Estimate → (3) Onsite Prep → (4) Inspection → (5) Assessment

Fig. 1. Human-Involved Insurance Process

As can be seen, upon activation this process includes at least four human tasks. Each of them needs to provide a corresponding user interface that allows the people involved to enter the required data. In task (2), it needs to be decided whether the claim is denied or, alternatively, which field worker it is assigned to. The task UI for (3) should provide information on the case and on how to get to the customer. The UI for task (4) has to provide means for data input from the inspection, e.g., for photos and a textual report. Finally, in (5) the employee has to be provided with preprocessed, visualized data to support his decision-making.

We identified several UI parts that are very common in business processes and would significantly simplify human task interaction. They relate to often-needed input data. As an example, instead of typing in color codes, users can be offered a color picker. In our example, maps can be used to visualize the route to the customer for field workers and to show nearby service providers that may repair damages. Field workers can be provided with a rich text editor for their reports and with media integration, such as an image uploader and browser. For insurance employees, we can utilize rich UI components that provide sophisticated data visualization, such as interactive graphs. In contrast to prevalent solutions, we argue that such rich UI components can make business processes that include human tasks much more efficient and thereby more profitable.


2.2 Integration of Human Tasks in BPs

Following a whitepaper by IBM and SAP, in 2007 all major BP engine vendors jointly released two specifications for standardization by OASIS: BPEL4People (B4P) [1] and WS-HumanTask (WS-HT) [2]. B4P introduces a PeopleActivity in BPEL that allows for the integration of human interactions. These are specified based on WS-HumanTask, which provides an XML syntax for modeling human tasks and notifications. Furthermore, WS-HT defines an API for accessing task instances and controlling their life cycle from a client (often referred to as Task List Client, TLC). According to [7], both specifications provide fair support for common workflow resource patterns.

Definition, generation and deployment of a concrete user interface for human interactions within a business process are not covered by WS-HumanTask. It does define a rendering element which is supposed to enclose UI-specific markup, but its content is unspecified. Related efforts regarding process and task UIs are therefore discussed in the next section.
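The renderings container defined by WS-HT is the extension point our approach builds on. The following minimal task definition fragment sketches it; the htd prefix denotes the WS-HumanTask namespace, while the rendering type names and the embedded content are illustrative assumptions rather than normative syntax:

<htd:task name="EstimateClaim">
  <htd:renderings>
    <!-- a vendor-specific rendering, e.g., consumed by a form or XSLT renderer -->
    <htd:rendering type="acme:forms">
      <!-- proprietary UI markup; WS-HT does not constrain this content -->
    </htd:rendering>
    <!-- a CRUISe-specific rendering embedding a Composition Description (cf. Section 4.1) -->
    <htd:rendering type="cruise:ccd">
      <!-- CRUISe Composition Description (CCD) -->
    </htd:rendering>
  </htd:renderings>
</htd:task>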

2.3 Strategies for Providing User Interfaces for Human Tasks

Research has covered different aspects of human-involved BPs, e.g., model-driven development [8,4], user access control [9] and integration with existing BPEL engines [10]. UI specification and provision in this context have been neglected so far. Since WS-HT does not cover human task rendering either, we will present alternative strategies for service and BP UIs in the following and discuss how they relate to the requirements specified in Section 1.

A pragmatic approach for UIs in SOA is Dedicated Client Applications. They are built with Java, .NET, or the like, and offer a "perfect-fit" UI at best. Service orchestration is either defined implicitly in the program code or facilitated by an underlying process engine. Since the user interface and its process binding are completely custom, this method does not meet our requirements of a reusable, standards-compliant UI provision.

Another popular approach to SOA UIs are Web Portals [11], where local or remote portlets (WSRP) include the services' presentation. Usually, each portlet is developed like a client application. Yet, alternative approaches like WSUI employ UI generation techniques, which are discussed later. Portal-based solutions imply the setup of a heavy infrastructure and entail poor practical reusability due to proprietary vendor-specific extensions. Furthermore, the binding of portals to process engines as offered by SAP and IBM is not standardized at all.

With the advent of "Web 2.0", mashups have become an alternative SOA composition principle [11]. In the business world, they are referred to as Enterprise Mashups – applications which form a "user-centric micro-combination of standards-based internal and external data sources" [12]. Prominent tools and platforms, e.g., from Serena, JackBe and Corizon, offer visual composition mechanisms and widget-based UIs, but do not provide standardized integration with BPEL engines. Research efforts from the workflow point of view cover mashup-like, "Web-centric" composition models [13,14]. However, none of these approaches focuses on the development and deployment of a corresponding (context-aware) web user interface for the resulting mashup application.

A popular strategy for SOA UIs is automatic UI Generation from service descriptions. It usually employs UI-specific extensions to WSDL, such as GUIDD [15]. While research has mainly focused on mobility, multimodality [16] and adaptivity [17] of the resulting service UIs, some works propose to integrate such techniques with business processes, e.g., the XML Interaction Service presented in [18] and the model-to-code transformation in [19]. In commercial systems, like the ActiveVOS engine (cf. Section 5), XSLT is used. UI generation often results in rather simplistic HTML or XForms which lack the desired interactivity and flexibility. More advanced UIs are not supported by existing solutions and would dramatically complicate the UI description and the underlying generation logic. In contrast to these systems, our approach uses the declarative UI description to integrate existing, distributed, rich UI components for human interaction.

Clearly, no current approach allows for a standards-compliant integration of rich, reusable, interactive user interfaces for human tasks in BPEL-based BPs. Component-based web UIs lack a standardized process binding, while integrated, generation-based approaches do not facilitate UIs that are as rich and flexible as needed. Thus, in the next section we present a promising alternative for providing rich UIs in different application domains.

3 CRUISe: Service-Oriented UI Composition

As can be seen from the last section, standardization of the integration of human tasks in BPs is actively driven by several major vendors of BPEL engines. Yet, the definition and deployment of the corresponding UIs are largely unspecified and based on proprietary solutions. In this paper we propose coupling BPEL processes with the service-oriented user interface management system CRUISe [5] to facilitate rich, flexible and reusable user interfaces for human interaction in BPs. While BPEL allows for the orchestration of back end services, CRUISe realizes the orchestration of User Interface Services (UIS) that provide the presentation layer for human tasks, as illustrated in Figure 2.

Fig. 2. Conceptual idea of a CRUISe-based task UI

Concept, architecture and related work of CRUISe have been presented and discussed in [5,6]. In the following, we will briefly outline the parts of CRUISe relevant to the provision of human task UIs. We argue that web applications can be based solely on services that provide data, business logic and user interfaces – we focus on the latter. By using distributed services for the dynamic composition of a web application UI, we can exploit the advantages of service-oriented architectures, like reusability, customizability and technology independence, at the presentation layer.

Figure 3 gives an overview of CRUISe. Its core concepts are User Interface Services (UIS) that encapsulate generic, reusable web UI components. They are dynamically selected, configured and integrated into a homogeneous, web-based UI with the help of the Integration Service, which supports different Integration Contexts. This means that the integration can be carried out on the server as well as on the client (Figure 3 shows the server-side integration).


Fig. 3. Architectural overview of CRUISe

In CRUISe, the presentation layer of web applications is described declaratively in a Composition Description. It specifies the orchestration of several UIS, i.e., their configuration, layout, binding to back end services, and the event flow between them. This description is transformed into an executable web application containing UI placeholders by the Application Generator. Input and output of this generation process are technology-specific and independent of the overall concept. In the case of the prototype presented in [5], the composition description is based on JSP, and the transformation results in a servlet.

Generated applications run in the CRUISe Runtime, which controls the event and data flow specified in the composition description and allows for homogeneous binding of services providing business logic and data. The Runtime may reside on the server or completely on the client. At application initialization, that is, when a client sends a request, the UI placeholders are dynamically filled by embedding integration code for User Interface Services. To this end, the Runtime calls the Integration Service, which is responsible for finding UIS in a UIS Registry that match the given application requirements and context. Those are ranked by their accuracy of fit, and the integration code for the most suitable UIS is returned and dynamically included into the application. As mentioned above, integration code can be, for instance, JSP, portlet, or client-side JavaScript code. When the integration process is finished, UI services are initialized and may load remote UI code or content. Again, this UIS binding can be carried out on the server (by downloading UI components to the web server) or on the client (by loading remote UI components with JavaScript, comparable to Google's APIs).

Overall, this architecture has numerous benefits: it allows for easy, declarative authoring of a web application UI, and it facilitates the composition of web UIs from reusable, configurable, rich components that are provided "as a service" and can thus be integrated dynamically via their interfaces. Since application and Integration Service are decoupled, the latter can be used in different integration contexts, i.e., by different types of applications. Finally, the integration at run time allows for context-aware configuration and exchange of UIS, and thereby adaptive user interfaces. In the next section we show how rich UI components can be provided for human interaction in BPs. Thereby, WS-HT clients form a specific integration context that can benefit from the CRUISe Integration Service.

4 Service-Oriented User Interfaces for BPEL-Based Business Processes

This section covers our concept of binding BPEL engines, which are responsible for the orchestration of back end services, with the CRUISe system orchestrating User Interface Services. This solution provides a separation of concerns between task and UI management and simplifies authoring and maintenance, while still offering rich and adaptive UIs for human tasks. On a side note, it demonstrates the seamless integration and practicability of CRUISe for human-involved processes.

An overview of the concept is given by Figure 4. It comprises three interrelated concerns: (1) the definition of a CRUISe-based human task UI, (2) the integration of User Interface Services as task UIs at run time, and (3) the communication between the UIS-based presentation layer and the underlying human task (engine). All three aspects are discussed in more detail in the following.

Fig. 4. Integration of CRUISe with a business process infrastructure

4.1 Composing the Human Task UI

Our concept integrates seamlessly into the authoring process of a human task. As usual, an author needs to define a task with the means of WS-HumanTask. As mentioned in Section 2.2, WS-HT defines an element named rendering whose content is not covered by the specification and which is supposed to contain an XML-based description of the task's user interface. Figure 5 shows that multiple renderings may be defined, and that we add a CRUISe-specific one (line 2). Because of its declarative, XML-based nature, we can embed a CRUISe Composition Description (CCD) there.

Fig. 5. CRUISe UI definition in WS-HumanTask rendering

The CCD, i.e., the UI description, contains "hot spots" which are later filled with UI components provided by User Interface Services. These hot spots are defined with the help of specific tags (uis:component). In Figure 5, the CCD contains a single UIS declaration (line 3). It contains the properties of the UI component to be integrated, e.g., the mode "route" of a map (lines 4–7), initialization data (lines 8–13), subscriptions for incoming and outgoing events (line 14), the UIS location (line 15) and further configuration data for the UI component, e.g., an API key. Usually, this hot spot would be embedded in a complete, declarative UI description, whose language is not stipulated by CRUISe. Depending on the server environment, i.e., the technology used by the Task List Client, it can be XHTML, JSP, PHP, XAML, or the like.

Line 11 exemplifies access to task instance data: in this example from our prototype (cf. Section 5), the driving directions from the insurance company to a customer are visualized by a UIS. To this end, the customer's address is made available in the task instance and accessed from the UI composition via an XPath expression. This mechanism allows for run-time access to relevant task instance data from the task's presentation layer.
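A minimal sketch of such a hot spot, as it could appear inside the CRUISe-specific rendering of our prototype, is shown below. The element and attribute names, the namespace URIs and the XPath function used to access task instance data are illustrative assumptions and not the exact syntax of Figure 5:

<htd:rendering type="cruise:ccd">
  <div xmlns="http://www.w3.org/1999/xhtml">
    <!-- surrounding declarative UI description (here: XHTML) -->
    <uis:component id="routeMap" location="http://uis.example.org/MapUIS"
                   xmlns:uis="http://example.org/cruise/uis">
      <!-- configuration of the UI component -->
      <uis:property name="mode">route</uis:property>
      <uis:property name="apiKey">...</uis:property>
      <!-- initialization data -->
      <uis:init>
        <!-- fixed start address of the insurance company -->
        <uis:param name="origin">...</uis:param>
        <!-- customer address resolved from the task instance via an XPath expression -->
        <uis:param name="destination" select="htd:getInput('ClaimData')/customer/address"/>
      </uis:init>
      <!-- subscriptions for incoming and outgoing events -->
      <uis:events publishes="routeCalculated" subscribes="addressChanged"/>
    </uis:component>
  </div>
</htd:rendering>

At run time, such expressions are resolved against the task instance data before the UIS definition is forwarded to the Integration Service (cf. Section 4.2).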

4.2 Integration of User Interface Services

WS-HT defines the Task List Client (cf. Section 2.2) as an interface between the task engine and the client. With its help, users can list, view, and execute human tasks. To this end, the TLC makes use of the task engine's standardized web service interface to retrieve and update task instance data.

Figure 4 shows how CRUISe (cf. Figure 3) and the business process infrastructure blend in. As can be seen, the TLC is extended with the server-side CRUISe Runtime. Conforming to the concept presented in Section 3, it contains communication functionality to establish a connection with the CRUISe Integration Service (CIS). For the CIS, the TLC is yet another web application with a specific integration context.

Details of the TLC extension and the binding with the Integration Service are shown in Figure 6. The server-side CRUISe Runtime contains a Parser and a Bridge component. At run time, the TLC requests presentation markup (the renderings) from the task engine. In case of a CRUISe rendering, the CCD is processed by the Parser, which extracts all UIS definitions. After resolving expressions that refer to task instance data, the integration process is executed as described in Section 3. The Bridge forwards UIS definitions to the Integration Service, which returns integration code for the appropriate UIS. Once all integration code has been loaded and composed into a complete task UI by the Runtime, it is returned to the TLC and sent to the client.

Fig. 6. TLC-CRUISe-Binding

UIS Binding takes place on the client. The UIS integration code is executed when the task UI is initialized in the browser and loads remote UI components from UIS servers. This mechanism is comparable to loading a Google Map from a remote server. Event flow between integrated UI components is controlled by the CRUISe Runtime, based on the wiring specified in the CCD.

4.3 UI-HT-Interaction

In WS-HT, every human task has an incoming message (from the BP) and an outgoing message (back to the BP). Both need to conform to a given XML schema defined during the modeling of the process. The compliance of the incoming message is assured by the BP engine and is not our responsibility. Its data can be accessed from the UI via XPath expressions, as mentioned in Section 4.1. The consolidation and validation of the outgoing message, though, is a challenge for the presentation layer. As user-generated data resides in different UI components (provided by different UIS), it needs to be combined and prepared to conform to the given schema. Thus, a CRUISe-based task UI needs to provide means for data consolidation, transformation and submission to the human task engine. For this purpose, a task author includes a specific UIS in the task UI: the TaskSubmission-UIS (TS-UIS). Its structure is illustrated in Figure 7.

Fig. 7. Data Processing by the TaskSubmission-UIS

At design time, the TS-UIS is specified by the task author as the data sink for UI components providing task-relevant data. At run time, it collects all data and stores it in a temporary DOM. For this purpose, the TS-UIS includes additional transformation logic to XML, as JSON is typically used for client-side information exchange. This data aggregation and transformation functionality is fully transparent to the user. The visual representation of the TS-UIS contains a status indicator, which shows whether all necessary information has been supplied by the user, and means to complete the task, e.g., a button. On task completion, the aggregated UI data is mapped to a valid task output message for the human task. This mapping is based on a transformation stylesheet provided by the task author.
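As a sketch of such a stylesheet: assuming the TS-UIS hands over its temporary DOM in the illustrative ui namespace below, and assuming an equally illustrative output schema for the inspection task of our example, the mapping could look as follows:

<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:ui="http://example.org/cruise/ui-data"
    xmlns:ins="http://example.org/insurance/claims">
  <xsl:output method="xml" indent="yes"/>
  <!-- map the data aggregated by the TS-UIS to the task output message -->
  <xsl:template match="/ui:taskData">
    <ins:inspectionResult>
      <!-- textual report entered in the rich text editor UIS -->
      <ins:report>
        <xsl:value-of select="ui:component[@id='reportEditor']/ui:value"/>
      </ins:report>
      <!-- photos provided by the image uploader UIS -->
      <ins:photos>
        <xsl:for-each select="ui:component[@id='imageUploader']/ui:value/ui:photo">
          <ins:photo uri="{@uri}"/>
        </xsl:for-each>
      </ins:photos>
    </ins:inspectionResult>
  </xsl:template>
</xsl:stylesheet>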

4.4 Context-Aware Task UIs

As mentioned in Section 1, an additional requirement for web-based process UIs stems from the heterogeneous usage contexts, e.g., the different devices (PDA vs. desktop PC) and situations (private vs. public) they are accessed from. Instead of integrating UI adaptation logic into the task description, our concept supports a separation of concerns between task management (task engine, TLC) and management of the (adaptive) UI (CRUISe).

Context-awareness in CRUISe can be attained in different ways. For one, context data can directly influence the UIS configuration, e.g., as initialization parameters. In addition, context parameters can impact the UIS selection process. For instance, the availability of necessary plug-ins on the client (e.g., Flash) can be taken into account when deciding which UI component to integrate. Besides usage, user and device context, the process context plays an important role with regard to BPs. Since task instance data can be referenced in the composition description, it can affect the contextualization as well. As a result, we can adapt the UI depending on the task status or relevant process data.

In this paper we do not specifically focus on UI adaptation for human tasks. Yet, since the CRUISe system and its adaptation mechanisms decouple these aspects from the BP engine, context-aware user interfaces can be provided for any standards-compliant task engine.
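To illustrate how context can influence both the selection and the configuration of a UIS, a declaration in the composition description could carry a capability requirement as well as parameters bound to context and task data. The syntax is purely illustrative, as neither the concrete CCD syntax nor the context model is specified here:

<uis:component id="reportEditor" location="http://uis.example.org/RichTextEditorUIS"
               xmlns:uis="http://example.org/cruise/uis">
  <!-- selection: only integrate a UIS whose requirements match the client's capabilities -->
  <uis:requires feature="javascript"/>
  <!-- configuration: initialization parameter bound to device context -->
  <uis:property name="toolbar" select="context:device/screenClass"/>
  <!-- configuration: parameter bound to process/task context -->
  <uis:property name="readonly" select="htd:getInput('ClaimData')/status = 'closed'"/>
</uis:component>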

5 Implementation

To verify our concepts we integrated the CRUISe Integration Service with a BPEL4People engine and tested it with an exemplary business process containing several human tasks. In the following we will present details of our implementation, the process and its corresponding CRUISe-based task UIs. Of the two commercially available BPEL engines that claim support for B4P and WS-HT – Intalio Tempo and Active Endpoints ActiveVOS – we found that only Active Endpoints’ product was completely standards-compliant.


Thus, we decided to base our proof-of-concept prototype on their BPEL engine ActiveBPEL 5 and the corresponding TLC, which are available as open source. Our integration is minimally invasive, because it only affects the ActiveBPEL TLC and relies on the task engine's interface standardized in WS-HT. Hence, our solution works with any standards-compliant human task engine.

In our prototype, the composition description is defined as XHTML containing placeholders for UIS (cf. Section 4.1) and is processed by the XSLT style sheet used by ActiveBPEL to generate the user interface. We extended the ActiveBPEL TLC – a Java web application – with the Rendering Parser and the CRUISe Bridge as described in Section 4.2. Conceptually, both form the server-side CRUISe Runtime. Web service communication with the Integration Service is based on Apache Axis, a framework already used by the TLC. It was necessary to bypass the client-side "same origin policy" [20] for the interaction between the TaskSubmission-UIS and the task engine. Therefore, we added a REST and a SOAP proxy mechanism to the Integration Service and extended it with "JSON with Padding" functionality – a mechanism commonly employed for client-side access to content from external domains.

Fig. 8. Screenshot of a CRUISe-based UI within the ActiveBPEL TLC


To test our implementation, we designed the business process presented in Section 2.1 with the help of the ActiveVOS Designer and used it with the extended TLC. One of the resulting human task UIs is shown in Figure 8. It is embedded in the TLC frame (A), which provides additional information about the task. The CRUISe-based task UI below differs considerably from prevalent, form-based interfaces and offers more interactive and "rich" UI components. For instance, it includes a map (B1 and B2), a rich text editor (C), a picture browser (D) and a charting UI component (E) – all of them realized and integrated as fully configurable and reusable UIS. The reusability is exemplified by the map UIS being integrated twice in different modes, i.e., configurations: B1 (minimized) visualizes a route on a map, while B2 provides textual directions. The TS-UIS on top (F) indicates that data is still missing for task submission. Once it has been entered and validated, a button for task completion appears.

With the help of our prototype we could demonstrate the practicability of our concept. The resulting UIs are more useful, effective and context-aware than those of existing systems, while the authoring cost remains stable.

6 Conclusion and Future Work

Our paper describes how the user interface management system CRUISe can be utilized to provide rich, reusable UIs for human tasks in BPEL processes. The concept presented fills the gap of UI rendering for human interaction left by current standards. By integrating service-oriented UIs, the development and reuse of human task presentations are greatly simplified, and the resulting UIs become richer and context-aware. Since our approach relies on existing standards and decouples UI management from process execution, CRUISe-based UIs can be reused with different WS-HT-compliant BP engines. Apart from these benefits, our concept and prototype show the applicability of CRUISe in different integration contexts – in this case the server-side integration in a WS-HT Task List Client.

In the future we plan to investigate coarser-grained UIS, so that one UIS may provide a complete human task UI. We are further interested in combining multiple human tasks into one integrated user interface and in incorporating collaboration techniques. Within CRUISe, we are focusing on the context-aware selection and configuration of UIS by improving context monitoring and incorporating Semantic-Web-based service matching mechanisms.

References

1. Agrawal, A., et al.: WS-BPEL Extension for People (BPEL4People), Version 1.0 (June 2007), http://www.ibm.com/developerworks/webservices/library/specification/ws-bpel4people/
2. Agrawal, A., et al.: Web Services Human Task (WS-HumanTask), Version 1.0 (June 2007), http://www.ibm.com/developerworks/webservices/library/specification/ws-bpel4people/


3. Alves, A., et al.: Web Services Business Process Execution Language Version 2.0 (April 2007), http://docs.oasis-open.org/wsbpel/2.0/wsbpel-v2.0.pdf
4. Link, S., Hoyer, P., Schuster, T., Abeck, S.: Model-Driven Development of Human Tasks for Workflows. In: Proc. of the 3rd Intl. Conf. on Software Engineering Advances (ICSEA 2008), pp. 329–335 (October 2008)
5. Pietschmann, S., Voigt, M., Meißner, K.: Dynamic Composition of Service-Oriented Web User Interfaces. In: Intl. Conf. on Internet and Web Applications and Services (ICIW 2009), Mestre/Venice, Italy, pp. 217–222. IEEE CPS, Los Alamitos (2009)
6. Pietschmann, S., Voigt, M., Rümpel, A., Meißner, K.: CRUISe: Composition of Rich User Interface Services. In: Proc. of the 9th Intl. Conf. on Web Engineering (ICWE 2009), San Sebastian, Spain. LNCS, vol. 5648, pp. 473–476. Springer, Heidelberg (2009)
7. Russell, N., van der Aalst, W.M.: Work Distribution and Resource Management in BPEL4People: Capabilities and Opportunities. In: Bellahsène, Z., Léonard, M. (eds.) CAiSE 2008. LNCS, vol. 5074, pp. 94–108. Springer, Heidelberg (2008)
8. Holmes, T., Tran, H., Zdun, U., Dustdar, S.: Modeling Human Aspects of Business Processes – A View-Based, Model-Driven Approach. In: Schieferdecker, I., Hartman, A. (eds.) ECMDA-FA 2008. LNCS, vol. 5095, pp. 246–261. Springer, Heidelberg (2008)
9. Thomas, J., Paci, F., Bertino, E., Eugster, P.: User Tasks and Access Control over Web Services. In: IEEE Intl. Conf. on Web Services (ICWS 2007), Salt Lake City, USA, pp. 60–69. IEEE, Los Alamitos (2007)
10. Holmes, T., Vasko, M., Dustdar, S.: VieBOP: Extending BPEL Engines with BPEL4People. In: 16th Euromicro Conf. on Parallel, Distributed and Network-Based Processing (PDP 2008), pp. 547–555 (February 2008)
11. Steger, M., Kappert, C.: User-facing SOA. Java Magazine, 65–77 (March 2008)
12. Crupi, J., Warner, C.: Enterprise Mashups Part I: Bringing SOA to the People. SOA Magazine (XVIII) (May 2008)
13. Pautasso, C.: Composing RESTful Services with JOpera. In: Intl. Conf. on Software Composition 2009, Zurich, Switzerland. LNCS, vol. 5634, pp. 142–159. Springer, Heidelberg (2009)
14. Curbera, F., Duftler, M., Khalaf, R., Lovell, D.: Bite: Workflow Composition for the Web. In: Krämer, B.J., Lin, K.-J., Narasimhan, P. (eds.) ICSOC 2007. LNCS, vol. 4749, pp. 94–106. Springer, Heidelberg (2007)
15. Kassoff, M., Kato, D., Mohsin, W.: Creating GUIs for Web Services. IEEE Internet Computing 7(5), 66–73 (2003)
16. Steele, R., Khankan, K., Dillon, T.: Mobile Web Services Discovery and Invocation Through Auto-Generation of Abstract Multimodal Interface. In: Intl. Conf. on Information Technology: Coding and Computing, vol. 2, pp. 35–41 (2005)
17. He, J., Yen, I.L.: Adaptive User Interface Generation for Web Services. In: Proc. of the IEEE Intl. Conf. on e-Business Engineering (ICEBE 2007), Washington, DC, USA, pp. 536–539. IEEE CS, Los Alamitos (2007)
18. Kuo, Y.S., Tseng, L., Hu, H.C., Shih, N.C.: An XML Interaction Service for Workflow Applications. In: Proc. of the ACM Symposium on Document Engineering (DocEng 2006), pp. 53–55. ACM Press, New York (2006)
19. Torres, V., Pelechano, V.: Building Business Process Driven Web Applications. In: Dustdar, S., Fiadeiro, J.L., Sheth, A.P. (eds.) BPM 2006. LNCS, vol. 4102, pp. 322–337. Springer, Heidelberg (2006)
20. Jackson, C., Bortz, A., Boneh, D., Mitchell, J.C.: Protecting Browser State from Web Privacy Attacks. In: Proc. of the 15th Intl. Conf. on World Wide Web (WWW 2006), Edinburgh, UK, pp. 737–744. ACM, New York (2006)