Manouselis N., and Sampson D. (2004) Multiple Dimensions of User Satisfaction as Quality Criteria for Web Portals. In the Proc. of the IADIS WWW/Internet Conference, Madrid, Spain, October 2004.
MULTIPLE DIMENSIONS OF USER SATISFACTION AS QUALITY CRITERIA FOR WEB PORTALS

Nikos Manouselis
Department of Technology Education and Digital Systems, University of Piraeus, 150 Androutsou Street, Piraeus GR-18532, Greece
and
Advanced e-Services for the Knowledge Society Research Unit, Informatics and Telematics Institute (I.T.I.), Centre for Research and Technology – Hellas (CE.R.T.H.), 42 Arkadias Street, Athens GR-15234, Greece
[email protected]

Demetrios Sampson
Department of Technology Education and Digital Systems, University of Piraeus, 150 Androutsou Street, Piraeus GR-18532, Greece
and
Advanced e-Services for the Knowledge Society Research Unit, Informatics and Telematics Institute (I.T.I.), Centre for Research and Technology – Hellas (CE.R.T.H.), 42 Arkadias Street, Athens GR-15234, Greece
[email protected]
ABSTRACT

In this paper, we present the application of a quality evaluation framework assessing the multiple dimensions that may affect users' satisfaction with a web portal. The framework applies a multi-criteria model that synthesizes the assessments of these multiple dimensions into a set of monitorable quality metrics. We demonstrate the application of the proposed framework in the context of the evaluation of the Greek Go-Digital e-business awareness and training portal for very small and medium enterprises (vSMEs). More specifically, we present how the results of the currently ongoing final evaluation of the Greek Go-Digital portal will be analyzed using this evaluation framework.

KEYWORDS

Web portals, quality criteria, user satisfaction.
1. INTRODUCTION

Web portals are an emerging topic in web technologies, attracting the attention of both industry and the research community. As a result, a number of related studies and commercial applications have been launched in recent years. In this context, the evaluation of web portals is an important issue in addressing user needs and achieving high quality of the provided services. In general, a web portal offers a variety of services that are under continuous development. It is therefore desirable to adopt evaluation methods and instruments that allow the combination of different means of assessment and provide results exploitable in several ways. For example, an interesting perspective is to identify and adopt evaluation methods that go beyond the separate assessment of different operational aspects of web portals, and that provide synthesized results to the key decision makers - being either those financially supporting the service or those involved in the design, development, operation and/or exploitation of the service. Moreover, we can also identify the aspect of assessing the satisfaction of users belonging to
different evaluation groups. The above calls for subjective evaluation methods and tools that can supplement the rest of the evaluation activities (Adelman, 1992), synthesize assessment results, and facilitate the improvement of the offered services. In this paper, we present the initial results of applying a multi-criteria evaluation framework to synthesize the multiple dimensions related to the satisfaction of the users of a web portal under study. More specifically, we present an example of the application of the framework to analyze the initial results of the currently ongoing final evaluation of the Greek Go-Digital vertical portal for vSME e-business awareness and support.
2. BACKGROUND

2.1 Web Portal Generic Features and User Satisfaction

Initially, the term web portal was used to refer to well-known internet search and navigation sites that provided a starting point for web visitors to explore and access information on the World Wide Web (Winkler, 2001). The term "internet portal" or "web portal" began to be used to describe such mega-sites (such as Yahoo!, Excite, AOL, MSN, Netscape Netcenter, and others) because many web visitors used them as a 'starting point' for their web surfing. Since then, portal technologies have significantly matured and specialized, and a diverse range of portal types has been developed and used in different contexts (Portals Community, 2003). A main characteristic of web portals is that they are a special breed of web sites, providing a blend of information, applications and services (Waloszek, 2001). In general, web portals provide a single point of access to information and/or a single point of information interchange (Komoroski et al., 1998; Meehan, 1998). Several researchers also outline the importance of developing web portals that continuously attract users by serving as a gateway to information or internet services. Since web portals mainly aim to serve as a reference point for users with specific interests, the satisfaction of users is a primary goal for the actors involved in the development and operation of a web portal (Zhang et al., 1999). In order to achieve high-quality web sites, designers have to first understand the different quality dimensions that users expect, and then relate the quality characteristics to the design features (von Dran & Zhang, 2002).
In our work, we study user satisfaction along the dimensions related to the following portal features (Waloszek, 2001; Staab et al., 2000; Winkler, 2001; Ivory & Hearst, 2002; Blankenship, 2001; Avouris et al., 2003; Lacher et al., 2001): the web portal content (in terms of contained information or access to external information resources), the web portal design (in terms of providing users with a pleasant, usable and stable environment), the web portal personalization capabilities (in terms of serving users' specific preferences and needs), and the web portal support for the formation of virtual communities of users (in terms of bringing together users with similar interests and needs). The identified features can be further specialized and extended when focusing on more specific categories of web portals. For example, the case of enterprise portals (Hazra, 2002) calls for a careful study of aspects such as data and knowledge management, virtual workspace creation and sharing, high degrees of security, etc. In each case, the generic web portal features and the user satisfaction dimensions for each of them can be appropriately specialized. They can all be considered as factors that affect users' satisfaction with the blend of information and services that web portals consist of.
2.2 Quality Indicators Synthesis

The methodology used for the quality indicators synthesis (Sampson & Manouselis, 2004) engages principles and tools from multiple criteria analysis and is based on a value-focused approach - that is, representing the problem parameters in the form of value functions (Keeney, 1992). The multi-criteria model used builds upon existing methods for assessing and synthesizing user satisfaction (Siskos et al., 1998) and shares the same philosophy as similar approaches that have been applied in different domains (Adelman, 1992; Mayard et al., 2001).
The first step of the methodology concerns the specification of the evaluation system: that is, the definition of the criteria and sub-criteria upon which the different dimensions of user satisfaction with a web portal will be measured. The criteria used should adequately describe the portal as an entity and should be constructed in a manner that does not allow them to overlap. The second step of the methodology concerns the definition of the synthesis model. The actual assessment of the users' satisfaction with the different dimensions of the web portal is carried out at the lower level of the evaluation system. The assessment of users' satisfaction at the higher levels is then synthesized from the lower-level sub-criteria assessment results. Each of the criteria and sub-criteria is assumed to have its own importance, compared to the other criteria at the same level. Gradually, a set of synthesized performance indicators is calculated, depicting the synthesis of the assessment results upon the different criteria according to the importance of each sub-criterion. The third step of the methodology concerns the techniques for the analysis of the evaluation results. In the context of the framework introduced in (Sampson & Manouselis, 2004), the following analysis phases are engaged:
• Descriptive statistical analysis of the initial (not synthesized) evaluation results for each of the portal dimensions assessed (that is, the leaf nodes of the evaluation system). This analysis allows for studying the users' responses to each of the evaluation dimensions separately. It also allows identifying percentages of users with similar responses, frequencies of answers in their responses, and calculating mean values of the responses.
• Descriptive statistical analysis of the synthesized results from all users at each level of the evaluation system. This analysis is based on studying the synthesized user responses regarding the criteria and sub-criteria of each level. It allows for an overall view of users' responses for each criterion, instead of the detailed view of specific dimensions that the previous phase allows.
• Calculation and examination of the global and partial quality performance indicators. This analysis provides the calculation of synthesized performance indicators and allows for their comparison with initially defined goals (for example, achieving a score over 50% in the global performance of the web portal).
• SWOT analysis: a Strengths-Weaknesses-Opportunities-Threats analysis (Johnson et al., 1989) of the performance vs. importance results for each of the portal criteria or sub-criteria is used to examine the 'dynamics' of each criterion's dimensions (in the form of strengths, weaknesses, opportunities and threats). This analysis provides a tool for assessing the portal's possible strengths, giving the opportunity to take advantage of them, and for identifying possible weaknesses and threats, providing an indication of current and future risks.
In the following section, we present the application of the methodology to the real case study of a web portal: the Greek Go-Digital vertical portal for vSME e-business awareness and training. We present how the results obtained so far from the ongoing evaluation of the Go-Digital portal can be analyzed following some of the techniques above.
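To make the second step of the methodology concrete, the weighted additive synthesis can be sketched in a few lines of Python. This is an illustrative sketch only: the criteria names, importance weights and satisfaction scores below are hypothetical examples, not the actual evaluation system or weights used in the study.

```python
# Illustrative sketch of the synthesis model: lower-level satisfaction
# scores (0-100) are aggregated into higher-level quality indicators
# using normalized importance weights. All names, weights and scores
# below are hypothetical examples.

def synthesize(scores, weights):
    """Weighted average of sub-criterion scores (0-100)."""
    total = sum(weights.values())
    return sum(scores[c] * weights[c] for c in scores) / total

# Mean user satisfaction on hypothetical sub-criteria of 'Design'
design_scores = {"Usability": 70.0, "Graphical Design": 60.0,
                 "Technical Integrity": 80.0}
design_weights = {"Usability": 0.5, "Graphical Design": 0.2,
                  "Technical Integrity": 0.3}
design = synthesize(design_scores, design_weights)  # partial indicator

# Global indicator synthesized from the top-level criteria
top_scores = {"Content": 75.0, "Design": design, "Community": 65.0}
top_weights = {"Content": 0.4, "Design": 0.4, "Community": 0.2}
global_indicator = synthesize(top_scores, top_weights)

print(round(design, 1), round(global_indicator, 1))
```

The same aggregation is applied bottom-up at every level of the criteria tree, so the global indicator reflects each sub-criterion according to its relative importance.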
3. CASE STUDY: SATISFACTION OF THE GO-DIGITAL VERTICAL PORTAL USERS

3.1 The Go-Digital Portal

The Greek Go-Digital Programme is a national initiative of the Greek government, co-funded by the Operational Programme "Information Society" of the Greek Ministry of Development and the Operational Programme "Competitiveness" of the Greek Ministry of Economy and Finance, under the 3rd European Community Support Framework. The main objective of the Greek Go-Digital Initiative is to promote the deployment of e-business in very small enterprises (vSMEs) and their familiarization with the Digital Economy. Following EU policies and international trends, the Greek Go-Digital Programme is a massive initiative addressing a target group of more than 50,000 Greek vSMEs as programme participants. In this context, the
"Educational Support of the Go-Digital" Programme aims to address the lifelong training needs of vSMEs and their awareness regarding popular e-business issues and the Digital Economy. The requirement for flexible structures of on-demand training support that can be adapted to individual vSMEs' needs and preferences, along with the particularities of Greece's disperse geographical distribution, called for the design and deployment of a blended e-training strategy. This training framework consists of: on-site training visits by trainers; the setting up and maintenance of a Help Desk service that is both telephone-based and online; and the development and maintenance of the Go-Digital portal (http://www.goonline.gr), providing a set of value-added web-based services to support the e-business training and awareness of vSMEs. The main objective of the Go-Digital e-Business Training and Awareness Vertical Portal is the establishment and operation of a national web-based reference portal to support the e-business awareness and training of the Greek vSMEs. The portal offers numerous web-based services, integrated into the following service centers: the Information services center, the E-learning center, the Activities center, the Community services center, the Go-Digital Programme services center, and the Help-desk services center. The Go-Digital vertical portal incorporates a number of different services that aim to address the dynamic needs of a wide audience of end-users with diverse profiles. It is therefore a web portal under continuous evolution, calling for a well-specified framework for measuring user satisfaction and benchmarking its performance. It therefore constitutes an appropriate case study for the application of the proposed methodology.
Figure 1. The evaluation criteria system used
3.2 Evaluation of the Greek Go-Digital Portal

The initial evaluation of the Go-Digital vertical portal took place during the second half of 2002, right after version 1.0 of the Go-Digital portal was deployed. It constituted a formative evaluation phase that aimed to technically verify and validate the web portal services. A small group of expert users (technical experts, e-business experts, and selected experienced users) performed a thorough expert assessment that provided recommendations for corrections and improvements before version 1.1 of the portal was fully deployed. A summative evaluation phase followed in September 2002, following the basic principles of the framework presented in (Sampson & Manouselis, 2004) and engaging all actors involved in the Go-Digital portal development and operation: that is, decision makers (a selected group of policy makers related to the Go-Digital initiative and providing the funding for the web portal development), portal element experts (involving experts on all elements of the portal, such as content experts, usability experts, e-business experts, etc.) and the end-users (vSMEs participating in the Go-Digital programme, their trainers, and other visitors of the portal).
In the current mature phase of the Go-Digital portal, a final evaluation phase was initiated in June 2004. This evaluation is still ongoing, but its initial results have been collected and are presented in the following paragraphs. The evaluation process closely followed the stages of the framework presented in (Sampson & Manouselis, 2004).
3.2.1 Preliminary analysis phase

In this phase, the evaluation experts supported the decision makers and portal element experts throughout the initial stages of the evaluation. These included the identification of the evaluation objectives, the selection and specification of the evaluation criteria, and the creation and online deployment of the satisfaction measurement questionnaire. The system of evaluation criteria used (Figure 1) was based on a scaled-down (for questionnaire simplicity) elaboration of the user satisfaction dimensions described in detail in (Sampson & Manouselis, 2004).

Figure 2. The satisfaction of the four evaluation groups with the Simplicity dimension
3.2.2 Main evaluation phase

In this phase, the evaluation activities were initiated. The evaluation aimed first at assessing the different user groups' satisfaction with the use of the Go-Digital portal: the initial goal was to achieve an overall assessment of the portal that would exceed a 60% total performance threshold. The evaluation also aimed to compare the results obtained from the different user groups (that is, SMEs, trainers, experts and other visitors). Invitations to access the online evaluation questionnaires were sent electronically to all the evaluation participants, together with directions for exploring all features of the web portal. After one month, the initial results were collected and processed using the first three analysis techniques introduced in Section 2.
3.2.3 Evaluation results analysis phase

In this phase, the evaluation experts collected and synthesized the results from the first month of evaluation (302 responses). Initially, a descriptive statistical analysis of the participants' assessment of each portal dimension was carried out. The results obtained from the four groups were comparatively studied, as demonstrated in the example of Figure 2.
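The descriptive analysis step can be sketched as follows. The list of answers is hypothetical; only the five-point answer scale is taken from the evaluation questionnaire.

```python
# Sketch of the descriptive statistical analysis: turning one group's
# raw questionnaire answers on a single portal dimension into the
# percentage distribution reported in the figures. Data are hypothetical.

from collections import Counter

SCALE = ["Not at all", "Little", "Enough", "Very", "Perfectly"]

def distribution(responses):
    """Percentage of answers at each scale level, in scale order."""
    counts = Counter(responses)
    n = len(responses)
    return {level: 100.0 * counts[level] / n for level in SCALE}

answers = ["Very", "Enough", "Very", "Perfectly", "Enough",
           "Very", "Little", "Enough", "Very", "Perfectly"]
for level, pct in distribution(answers).items():
    print(f"{level}: {pct:.1f}%")
```

Computing such a distribution per evaluation group and per dimension yields exactly the per-group comparisons of Figure 2.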
Figure 3. The distribution of the synthesized responses for each evaluation group on the Design criterion
Figure 4. Comparative presentation of the partial utilities and global utilities of the evaluation groups
Figure 2 comparatively presents the percentage proportions of the different user groups' responses to the question "How satisfied are you with the simplicity of the portal environment?". Although this application of descriptive statistics was useful for studying satisfaction with separate portal dimensions, it did not provide an overall perspective of the web portal. For this reason, two more analysis techniques were engaged:
• First, the synthesis of the responses of each evaluation group was used to calculate the distribution of responses for the higher-level criteria. Figure 3 presents the case of the criterion 'Design', synthesizing the assessments from all the lower-level criteria (namely 'Information Architecture', 'Usability', 'Graphical Design' and 'Technical Integrity', and their corresponding dimensions). It is interesting to note that, in this case, the distribution of synthesized responses on a high-level criterion differs from that of an individual lower-level sub-criterion. For example, the percentage of experts perfectly satisfied by the overall 'Design' of the portal is over 37%, although its 'Simplicity' dimension highly satisfied only 30% of them.
• Second, the synthesis of all responses of each evaluation group was used to calculate the partial and global utility indicators. Figure 4 presents a comparison of the synthesized partial utilities of each
criterion and the global utility (total satisfaction indicator) for all four evaluation groups. For example, the global satisfaction of the trainers user group was calculated to be 78%, which is very satisfactory compared to the targeted 60% threshold. Similarly, the global satisfaction of the SMEs user group was 72%; that is, lower than the satisfaction of the trainers but still over the threshold defined.
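A minimal sketch of how such a group-level satisfaction indicator can be derived and checked against the 60% goal is given below. The linear value function mapping each five-point answer to a 0-100 satisfaction level is an assumption made for illustration; the actual study may use differently shaped value functions, and the sample answers are hypothetical.

```python
# Hypothetical linear value function for the five-point answer scale;
# an assumption for illustration, not necessarily the study's own.
VALUE = {"Not at all": 0, "Little": 25, "Enough": 50,
         "Very": 75, "Perfectly": 100}

def group_satisfaction(responses):
    """Mean satisfaction (0-100) of one evaluation group,
    given its questionnaire answers on a dimension."""
    return sum(VALUE[r] for r in responses) / len(responses)

def meets_threshold(indicator, threshold=60.0):
    """Compare a synthesized indicator with the evaluation goal."""
    return indicator >= threshold

# Hypothetical answers from a small sample of trainers
trainers = ["Very", "Perfectly", "Enough", "Very", "Perfectly"]
indicator = group_satisfaction(trainers)
print(indicator, meets_threshold(indicator))
```

Applied per criterion with the importance weights of the synthesis model, the same idea yields the partial and global utilities compared in Figure 4.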
4. CONCLUSION

The focus on user satisfaction as an element of quality for web applications is an approach already introduced in the area of web engineering (Zhang et al., 1999). In this paper, we presented the application of a methodology that is based on multiple criteria analysis and assesses the multiple dimensions which may affect user satisfaction with a web portal. We demonstrated the application of the methodology in the ongoing evaluation phase of the Greek Go-Digital vertical portal for vSME e-business awareness and training. The analysis tools that have been engaged allow for the comparative analysis of results obtained from different user groups, since they synthesize the results into comparable quality performance indicators. There are several issues yet to be explored. The methodology seems suitable for re-evaluating a web portal at different points in time and examining the results, or for benchmarking web portals that have been developed for similar application contexts. Nevertheless, consistency between the dimensions of the portal features that are examined will have to be ascertained. The presented evaluation system may not cover the needs of all web portals; therefore, future work should also include revisiting the evaluation system specification, taking advantage of the experience from the application of the framework in different case studies.
ACKNOWLEDGEMENT

The work presented in this paper has been partly funded by the Greek Go-Digital Programme and, more specifically, the "Educational Support of Go-Digital" initiative. The Greek Go-Digital Programme is co-funded by the Operational Programme "Information Society" of the Greek Ministry of Development and the Operational Programme "Competitiveness" of the Greek Ministry of Economy and Finance, under the 3rd European Community Support Framework.
REFERENCES

Adelman L. (1992) Evaluating Decision Support and Expert Systems. Wiley, New York.
Avouris N. et al (2003) Website Evaluation: A Usability-based Perspective. In Manolopoulos Y. et al (Eds.), PCI 2001, Lecture Notes in Computer Science 2563, Springer-Verlag, 217-231.
Blankenship E. (2001) Portal Design vs. Web Design. SAP Design Guild, Edition 3. http://www.sapdesignguild.org/editions/edition3/.
Hazra T.K. (2002) Building Enterprise Portals: Principles to Practice. Proceedings of ICSE'02, Orlando, Florida, USA, 19-25 May 2002.
Ivory M., Hearst M. (2002) Improving Web Site Design. IEEE Internet Computing, Usability and the Web Issue, March-April 2002.
Johnson G. et al (1989) Exploring Strategic Management. Prentice Hall, Scarborough, Ontario.
Keeney R.L. (1992) Value-Focused Thinking: A Path to Creative Decision Making. Harvard University Press, London.
Komoroski M. et al (1998) On-line Marketing: Leveraging Portals for Profit. IBC Conferences, Sydney, 23-24 November 1998.
Lacher M.S. et al (2001) A Framework for Personalizable Community Web Portals. Proceedings of the Human-Computer Interaction International Conference, New Orleans, LA, USA, 5-10 August 2001.
Mayard S. et al (2001) A Multi-faceted Decision Support System Evaluation Approach. Journal of Decision Systems, Special Issue on "DSS in the New Millennium", Hermes Science Publishing, Oxford, Vol. 10, No. 3-4, 395-428.
Meehan P. (1998) Internet Portals! The Door or the Store? Gartner Group Research, KA-04-09103, July 1998.
Newman M.W., Landay J.A. (2000) Sitemaps, Storyboards, and Specifications: A Sketch of Web Site Design Practice. Proceedings of Designing Interactive Systems: DIS 2000, New York, August 2000, ACM Press.
Portals Community (2003) Fundamentals: Portal Definition and Types of Portals. PortalsCommunity. http://www.portalscommunity.com/library/fundamentals.cfm.
Roy B. (1996) Multicriteria Methodology for Decision Aiding. Kluwer Academic Publishers, Norwell.
Sampson D., Manouselis N. (2004) A Flexible Evaluation Framework for Web Portals Based on Multicriteria Analysis. In Tatnall A. (Ed.), Web Portals: The New Gateways to Internet Information and Services. Idea Group Inc.
Siskos Y. et al (1998) Measuring Customer Satisfaction Using a Collective Preference Disaggregation Model. Journal of Global Optimization, Vol. 12, 175-195.
Staab S. et al (2000) Semantic Community Web Portals. Proceedings of the 9th International WWW Conference, Amsterdam, The Netherlands, May 2000.
Waloszek G. (2001) Portal Usability – Is There Such A Thing? SAP Design Guild, Edition 3. http://www.sapdesignguild.org/editions/edition3/.
Warner S. (1999) Internet Portals: What Are They and How to Build a Niche Internet Portal to Enhance the Delivery of Information Services. Proceedings of the 8th Asia-Pacific SHLL Conference.
Winkler R. (2001) Portals – The All-In-One Web Supersites: Features, Functions, Definition, Taxonomy. SAP Design Guild, Edition 3. http://www.sapdesignguild.org/editions/edition3/.
Zhang P. et al (1999) Websites that Satisfy Users: A Theoretical Framework for Web User Interface Design and Evaluation. Proceedings of the 32nd IEEE International Conference on System Sciences, Hawaii, USA.