Website User Experience (UX) Testing Tool Development using Open Source Software (OSS)

Ashok Sivaji, Soo Shi Tzuaan
Product Quality & Reliability Engineering
MIMOS Berhad
Kuala Lumpur, Malaysia
[email protected]

Abstract— Website usability and user experience (UX) are crucial measures of website quality in both developing and developed countries. The challenges faced with manual usability testing gave rise to the growth of usability assessment tools in the market. As these tools may be costly, it is important for usability practitioners to justify their benefits and cost. Hence, the objective of this study is to first gather the requirements for website UX testing. This is followed by developing Open Source Software (OSS) called the Ultimate Reliable and Native Usability System (URANUS) to support most of the techniques used for usability testing. The final objective is to benchmark URANUS against existing tools. The methodologies employed during requirement gathering include a literature review and interviews with the target audience. This is followed by the development of URANUS using Rapid Application Development (RAD). Based on the LBUT and RUT comparison performed with existing tools, URANUS turned out to score relatively high in terms of benefits with a low cost of ownership.

Keywords— Ultimate Reliable and Native Usability System (URANUS); Lab Based User Experience Test (LBUT); Remote Usability Testing (RUT); Open Source Software (OSS); Balance Score Card (BSC)

I. INTRODUCTION

The awareness of website usability and UX is becoming a crucial measure of website quality in both developing and developed countries. In Malaysia, it can be observed that the government places high emphasis on promoting good Website and Portal (WP) usability. As per the Malaysia Government Portals & Websites Assessment 2011 report by the Multimedia Development Corporation, usability makes the highest contribution to e-Government WP ranking, i.e. 40%, among the other pillars of measurement such as content, security, participation and services [1]. With the increase in the number of WPs over the years, performing usability testing manually has become tedious and time consuming for usability practitioners. This gave rise to the growth of usability assessment tools in the market.

From the literature review and the experience of the authors, who are usability practitioners, it was found that there are various pros and cons associated with the existing tools. Simply purchasing or leasing one of the existing tools does not guarantee the desired outcomes for usability practitioners; it is important that the benefits of a tool far outweigh its cost. From the requirement gathering performed with usability practitioners, 8 out of 10 concurred that justifying the purchase of a usability tool is challenging. From this, the authors concluded that among the challenges faced by practitioners and small and medium enterprises (SMEs) with the commercial tools is justifying the benefits over the cost.

At the same time, the Malaysian Administrative Modernization and Management Planning Unit (MAMPU), a Malaysian government agency tasked with modernizing public service administration, has embarked on a self-reliance program that can address this challenge. From the Malaysian Public Sector Open Source Software (OSS) Master Plan proposed by Nor Aliah binti Mohd Zahri [2], MAMPU aims for the public sector to embark on the self-reliance phase from 2011 onwards. It is predicted that by 2016, 50% of leading non-IT organizations will use OSS as a business strategy to gain competitive advantage [2]. This paper addresses some of the problems faced by practitioners who had to purchase more than one tool because different usability tools target different usability techniques.

This paper is sponsored by the Product Quality and Reliability Engineering department of MIMOS Berhad, Kuala Lumpur, Malaysia ([email protected]).

II. OBJECTIVE AND SCOPE

The first objective of this study is to understand the current work practices of usability practitioners. These form the requirements for the OSS program to be developed. The second objective is to design and develop an OSS called the Ultimate Reliable and Native Usability System (URANUS) [3] to support most of the usability and UX techniques. URANUS is intended to be a web-based system designed to help perform UX testing in a local lab or a remote environment. It should be flexible enough to be used as either a moderated or un-moderated system. It is envisaged that URANUS will benefit practitioners. The final objective is to benchmark the benefits and cost of URANUS against existing tools in the market. Among the many UX techniques shown in [8], the scope of this paper covers usability testing (informal, lab, remote, moderated, un-moderated) and remote usability testing (RUT) only. RUT is chosen because it is among the methods gaining popularity, while UX testing is chosen because it enables the collection of direct user feedback and is widely used [9]. Although heuristic evaluation is among the popular techniques, it is already covered in another study [10].

III. LITERATURE REVIEW

A. Usability Practitioners' Work Practice

According to the Usability Professionals' Association (UPA) salary surveys from 2005 to 2009, there exist many UX techniques employed by usability practitioners worldwide [8]. Although the ultimate objective for URANUS is to support all of the techniques used by practitioners (refer to [8] for details), the scope of the current release is to support the UX techniques shown in Table I. From this table, notice that informal usability testing is declining slightly, while the lab-based and remote moderated techniques were employed by 54% and 42% of respondents respectively. In 2009, a new technique, remote and un-moderated usability testing, appeared; it is referred to as RUT in this paper. Overall, the usage of techniques has remained mostly stable, with the exception of a 9% decrease in informal usability testing and the addition of remote un-moderated usability testing (RUT) at 18% usage. With that, RUT was added as a requirement of URANUS.

TABLE I. CHANGE IN USABILITY TECHNIQUES USED FROM 2005 TO 2009 [8]

Trend in UX Techniques (percentage of respondents)

No | Technique                                 | 2005 | 2007 | 2009
   | Total respondents (n)                     | 1329 | 1523 | 1786
1  | Usability testing (informal)              | 75%  | 77%  | 68%
2  | Usability testing (in a lab)              | 53%  | 54%  | 54%
3  | Usability testing (remote, moderated)     | 37%  | 42%  | 42%
4  | Usability testing (remote, un-moderated)  | -    | -    | 18%
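The arithmetic behind the trend observations above can be replayed with a short sketch (the dictionary layout and variable names are illustrative, not part of the survey):

```python
# Survey percentages from Table I (UPA salary surveys, 2005-2009) [8].
# The remote un-moderated technique (RUT) has no 2005/2007 values
# because it only appeared in the 2009 survey.
trend = {
    "informal":           {2005: 75, 2007: 77, 2009: 68},
    "lab":                {2005: 53, 2007: 54, 2009: 54},
    "remote_moderated":   {2005: 37, 2007: 42, 2009: 42},
    "remote_unmoderated": {2005: None, 2007: None, 2009: 18},
}

# Change in informal usability testing from its 2007 peak to 2009.
informal_change = trend["informal"][2009] - trend["informal"][2007]
print(informal_change)  # -9, the 9% decrease noted in the text
```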

B. Usability or UX Testing

Usability testing (also known as user experience testing (UET)) or UX testing is a means of obtaining direct user feedback on a product by observing and interviewing users while they perform tasks using the product. When conducted in a lab with a moderator and observers, it is known as lab-based, moderated UX testing. The general process flow is shown in Fig. 1; more details can be found in [9]. Unlike lab or informal UX testing, remote UET usually does not involve a moderator and observer, as it is performed in a different location. Current teleconferencing and videoconferencing technologies also enable remote moderated UX testing. UX testing sessions are conducted with one participant at a time, and it is common for these sessions to be conducted in specialized usability laboratories. Hence, participants recruited for usability tests are required to travel to the usability laboratory. RUT [12] refers to a situation where the moderator and participants are not in the same location. RUT provides several advantages [13]: several participants can perform the test in parallel; participants feel more comfortable performing the UX test in their own environment; and participants from different geographic areas can take part more easily, which reduces the need and cost of travelling to the usability laboratory. However, RUT might lose important information [14], since there is a lack of interaction between participants and the moderator.

Before UX testing starts, UX practitioners attend a kickoff meeting with the project stakeholders to understand the requirements and objectives of the test. This is followed by the recruitment of users, which involves preparing a user screening sheet to interview them. The UX team also develops the test strategy, sets up the environment and performs a pilot run to ensure readiness before the actual testing. The moderator plays a crucial role that involves directly interacting with the participant and gaining qualitative feedback [15].

C. Open Source Software (OSS)

Some of the advantages of OSS are: reduced total cost of ownership (TCO) and cost of operation (COO); increased freedom of choice of software use (reduced vendor lock-in); interoperability among systems; growth of the ICT and OSS industries; growth of the OSS user and developer community; and growth of a knowledge-based society [2]. However, there are also certain risks with OSS, such as the lack of guaranteed updates: users might get stuck using an old version for a few years without any new update from the community, since the community might be too small or offer poor support. In contrast, this risk is minimized for commercial tools, for which full-time resources support the development efforts. The authors argue that the TCO model is not suitable as it only considers the cost, as opposed to considering both cost and benefit. The Total Account Ownership (TAO) represents the degree of freedom with respect to the technology provider. Instead, a balanced scorecard (BSC) method as proposed by Lavazza is more suitable to address all the aspects of OSS in a balanced and complete manner [16][17].

IV. METHODOLOGY

A. Reliability of Survey

It is important to measure the reliability of the survey [8] from which the decision was made to develop the LBUT and RUT UX techniques. An established sample size calculator from Creative Research Systems [11] is used to measure the reliability of the data. This research targets usability practitioners, with an estimated population size of 34,000; this is based on the assumption that across the 34 countries of survey respondents there are around 1,000 usability practitioners per country. Statistically, for a 95% confidence level, a population size of 34,000 and a 3% margin of error, about 1,067 survey respondents are needed. Based on Table I, a 3% margin of error would imply that the 75% of respondents who practiced the informal usability testing technique lies between a lower bound of 72% and an upper bound of 78%. With 1,329, 1,523 and 1,786 respondents in the years 2005, 2007 and 2009 respectively, it can be concluded that the data obtained from this survey has high reliability (low margin of error). Hence, the findings from this survey are relevant and reliable for this research and for the development of URANUS.
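The margin-of-error figures above follow from the standard formula for a proportion with a finite-population correction; the sketch below illustrates the calculation and is not necessarily the exact method used by the Creative Research Systems calculator [11]:

```python
import math

def margin_of_error(p, n, population, z=1.96):
    """Margin of error for a proportion p observed in a sample of n,
    with a finite-population correction for the given population size.
    z = 1.96 corresponds to a 95% confidence level."""
    moe = z * math.sqrt(p * (1 - p) / n)
    fpc = math.sqrt((population - n) / (population - 1))
    return moe * fpc

# Worst case (p = 0.5) for the assumed population of 34,000 practitioners,
# at the required sample size and the three actual respondent counts.
for n in (1067, 1329, 1523, 1786):
    print(n, round(100 * margin_of_error(0.5, n, 34000), 2))  # ~3% at n = 1067
```

At n = 1,067 the worst-case margin of error works out to roughly 3%, matching the figure quoted above, and it shrinks further at the larger respondent counts of the three survey years.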

B. Development Model

URANUS is developed based on the Software Development Lifecycle (SDLC). In order to gather the requirements for URANUS, interviews were conducted with 10 usability practitioners from several academic institutions and organizations. All respondents had been involved in performing an LBUT, both as a moderator and as a participant, at least once in the past 6 months. The questions ranged from understanding the practitioners' work practices, the various instruments or tools used during the LBUT, and their opinion of and experience with OSS, to the key features they need and want from an OSS. The findings from these interviews are shown in Table II. Once the requirements were gathered, the development phase began. From the requirements (Table II, R11), the development team decided to use open source web development tools such as the Hypertext Preprocessor (PHP) and the XAMPP package. The development is based on the generalized process flow and work practice derived in Fig. 1. Rapid application development (RAD) was employed during the development of URANUS to speed up the development process. An extensive research study was also conducted on the availability of existing OSS.

C. Comparison of URANUS: Balance Score Card

An LBUT was performed using URANUS, Tobii and Morae for an e-Commerce website; further details of this study are published in [9]. The study used the tools to perform the end-to-end test process as shown in Fig. 1. According to Lavazza (2007), it is important to look beyond the Total Cost of Ownership (TCO) to justify the cost of OSS in a balanced and complete way. A Balance Score Card (BSC) approach is proposed that extends the TCO to also include the benefits of the OSS developed; the BSC also accounts for the organization's business goals and processes [16]. In this study, the adapted BSC approach includes the adherence of the various tools to the UX process as derived in Fig. 1 and to the requirements of the 10 usability practitioners as shown in Table II.

D. Total Cost of Ownership (TCO)

A Total Cost of Ownership (TCO) analysis has been carried out on various commercially available tools, namely Morae [5], Tobii [6], Loop11 [4] and Userfeel [7], against URANUS, which is an OSS. Each of the tools has a different business, sales and service model. For instance, Tobii includes product sales (the eye tracker) and license fees (lifetime and maintenance), while Morae only imposes a license fee. Userfeel and Loop11 are based on software as a service (SaaS). URANUS, on the other hand, is OSS.

V. RESULTS AND DISCUSSION

Based on the literature review and interviews on work practices, the generalized process flow derived for UX testing is shown in Fig. 1. The summary of the requirements gathered from the interviews is shown in Table II. The third column in Table II shows the status of the development work completed against the user requirements.

Figure 1. Derived UX Process Flow

TABLE II. USABILITY PRACTITIONERS' REQUIREMENTS AND DEVELOPMENT PROGRESS OF URANUS

No  | Requirement                                                                      | Progress
R1  | Convert audio to text format to ease transcribing                                |
R2  | Automatically measure task duration                                              |
R3  | Support multiple LBUT and RUT projects                                           |
R4  | Real time reporting                                                              |
R4  | Capturing the moments of distractions or defects                                 |
R5  | Concurrent moderator, observers and users                                        |
R6  | Moderator ratings (ISO 9241-11) and comments                                     |
R7  | Remote testing                                                                   |
R9  | User/subjective ratings and comments                                             |
R10 | Non-verbal data capture and analysis (eye tracking, gestures)                    |
R11 | Unlimited license access (OSS deployment)                                        |
R12 | Adherence to UX practitioner's business logic                                    |
R14 | Ease of integration with other existing systems (Tobii, Morae, Userfeel, Loop11) |
R15 | Integration of LBUT and RUT into one system                                      |

A. Comparison of Benefits

The URANUS system developed was used by the moderators and observers to develop the usability test plan and set up the test environment. URANUS also facilitated the test execution and the subject and moderator ratings; a UX score is calculated automatically by the system based on effectiveness, efficiency and satisfaction as per ISO 9241-11.

This also includes gathering the user's, observer's and moderator's feedback. Since URANUS enabled the recording of the entire session, retrospective think aloud (RTA) was also performed. Since data was gathered in real time, using URANUS enabled the project stakeholders to view task effectiveness, efficiency and user satisfaction immediately after each task. Since all data was consolidated within URANUS, it also made it easier for the moderators to perform analysis and make recommendations for the improvement of the e-Commerce website. This process was repeated using Tobii Studio and Morae. From the LBUT performed, the benefits of the various tools were compared. A score card of the 20 features of the UX tools is shown in Table III.

TABLE III. COMPARISON OF BENEFITS

Comparison of End-to-End Test Features (Fig. 1)

No | Feature                                          | Morae | Tobii | Userfeel | Loop11 | URANUS
1  | User Recruitment
2  | Test Plan and Protocol Setup (Tasks, Questions)
3  | Hardware or Mobile Device Testing
4  | Software or Website Testing
5  | Moderator Ratings
6  | Subjective Ratings
7  | Live Observation
8  | Live Observation with Note-taking
9  | Voice and Screen Recording
10 | Face Recording
11 | Reporting and Analysis
12 | Eye Tracking Analysis
13 | Result Exporting
14 | Audio to Text (Transcribing)

Comparison of Conformance to User Requirements (Table II)

15 | Task Duration Measure
16 | Multi Project Support
17 | OSS
18 | LBUT
19 | RUT
20 | Defect Capturing

   | Feature Count                                    | 14    | 13    | 10       | 10     | 13
   | Benefit Score Card (%)                           | 70    | 65    | 50       | 50     | 65
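The Benefit Score Card row of Table III is each tool's feature count expressed as a percentage of the 20 benchmarked features; a minimal sketch:

```python
# Feature counts from Table III; the scorecard is simply the fraction of
# the 20 benchmarked features that each tool was observed to provide.
TOTAL_FEATURES = 20
feature_count = {"Morae": 14, "Tobii": 13, "Userfeel": 10,
                 "Loop11": 10, "URANUS": 13}

benefit_score = {tool: 100 * n / TOTAL_FEATURES
                 for tool, n in feature_count.items()}
print(benefit_score)  # Morae 70%, Tobii 65%, Userfeel 50%, Loop11 50%, URANUS 65%
```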

Note that the comparison of benefits is based on Fig. 1; additionally, the requirements were gathered from interviewing the practitioners. A tick indicates that, from the LBUT and RUT performed, the moderators and observers found the feature to exist, while a cross indicates that the feature does not exist. Out of the 20 features, Morae fulfilled 14, followed by Tobii Studio and URANUS with 13 features each. Both Userfeel and Loop11 met 10 of the 20 required features. With this, it can be summarized that Morae has 70% feature coverage, while Tobii Studio and URANUS have 65% coverage each. Meanwhile, Userfeel and Loop11 each cover only 50% of the features.

Based on the comparison results, Morae had the highest score among all the UX tools. However, one downside of Morae Recorder is that the recorded videos are stored in a proprietary format, ".RDG". The only way to generate a video clip from this format is by using Morae Manager. When usability practitioners want to perform analysis, there is no alternative but to use Morae Manager. This creates vendor lock-in and offers practitioners no flexibility without incurring extra conversion time to a more generic format. URANUS is more flexible in that it uses a screen recorder that supports more generic video formats such as MP4, FLV, WMV and SWF. Compared to all the other tools, only Morae provides live observation with note-taking. The strength of Tobii Studio, however, lies in its eye tracking capability, whereby areas of interest or distraction can be captured easily. Unfortunately, this imposes a heavy cost burden, as special hardware (the eye tracker) must be purchased. Among all the tools, only Userfeel manages the recruitment of users automatically. Unlike Morae and Tobii Studio, the other three tools only support software and web-based testing. All tools support the fundamental necessities of UX testing such as preparing the test plan and protocol, subjective ratings, voice and screen recording, and reporting and analysis.

One of the key requirements from the practitioners is for a tool to support automatic audio-to-text conversion, as it is deemed to reduce transcribing time. To date, none of the tools supports this feature. As per the requirements, all the tools support measurement of task duration, multi-project support and LBUT. Userfeel, Loop11 and URANUS are able to support both LBUT and RUT, while Morae and Tobii Studio are suitable for LBUT only. Among all the tools, only URANUS is an OSS. Additionally, since it supports both LBUT and RUT, practitioners are not required to use different tools for different testing strategies. Without URANUS, an organization that had invested in either Tobii Studio or Morae would also need to acquire licenses for Userfeel or Loop11 to conduct RUT. In other words, practitioners would not only need to pay for two different licenses, they would also need to invest time in learning the additional tool. The authors therefore propose URANUS as a suitable tool for both LBUT and RUT.

B. Comparison of Cost

To measure the total cost of ownership (TCO) of URANUS, several scenarios involving criteria such as user recruitment, user participation tokens and license acquisition were considered. First, the estimated number of projects per organization was obtained from the requirement gathering interviews. In the first case, a hypothetical organization that requires LBUT for 25 projects per year, involving 6 users and 10 tasks per project, is considered. The next case is similar but involves RUT. From the first case study, Tobii Studio incurred the highest cost of ownership (i.e., 3.4 times more than URANUS). This is due to the eye tracker, which costs approximately MYR100,000. This is followed by Loop11 and Morae, which were respectively 1.5 times and 1.1 times more expensive than URANUS per year. As shown in Table IV, the licenses for Tobii Studio and Loop11 cost approximately MYR20,000 and MYR26,000 respectively. For the second case study, for RUT, it was found that Userfeel and Loop11 cost 3.6 times and 1.1 times more respectively than URANUS. The license cost for URANUS is virtually zero, compared to MYR158,000 and MYR5,000 for Userfeel and Loop11.

TABLE IV. CASE STUDY 1: TCO FOR LBUT USING UX TOOLS

LBUT for 6 users, 10 tasks/project, 25 projects/year (MYR '000)

No | Item                     | Morae | Tobii | Loop11 | URANUS
1  | User Recruitment         | 35    | 35    | 35     | 35
2  | User Participation Token | 11.3  | 11.3  | 11.3   | 11.3
3  | Moderator-User PC        | 4     | 4     | 4      | 4
4  | Eye Tracker              | 0     | 100   | 0      | 0
5  | License Cost             | 4.5   | 20    | 26     | 0
   | Total                    | 55    | 170   | 77     | 50

TABLE V. CASE STUDY 2: TCO FOR RUT USING UX TOOLS

RUT for 6 users, 10 tasks/project, 25 projects/year (MYR '000)

No | Item                     | URANUS | Userfeel | Loop11
1  | User Recruitment         | 35     | 35       | 35
2  | User Participation Token | 6      | 11.3     | 6
3  | Moderator-User PC        | 4      | 4        | 4
4  | Misc                     | 0.1    | 0        | 0
5  | License Cost             | 0      | 158      | 5
   | Total                    | 45     | 162      | 50
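As a cross-check on Table IV, the LBUT totals and the cost ratios quoted in the text can be recomputed from the component figures (all values in MYR '000; the dictionary names are illustrative):

```python
# Cost components (MYR '000) transcribed from Table IV (case study 1: LBUT).
lbut = {
    "Morae":  {"recruitment": 35, "token": 11.3, "pc": 4, "eye_tracker": 0,   "license": 4.5},
    "Tobii":  {"recruitment": 35, "token": 11.3, "pc": 4, "eye_tracker": 100, "license": 20},
    "Loop11": {"recruitment": 35, "token": 11.3, "pc": 4, "eye_tracker": 0,   "license": 26},
    "URANUS": {"recruitment": 35, "token": 11.3, "pc": 4, "eye_tracker": 0,   "license": 0},
}

# Total cost of ownership per tool, and each tool's cost relative to URANUS.
tco = {tool: sum(parts.values()) for tool, parts in lbut.items()}
baseline = tco["URANUS"]
for tool, total in tco.items():
    print(f"{tool}: {total:.1f} ({total / baseline:.1f}x URANUS)")

# Straight-line depreciation of the MYR100,000 eye tracker over 5 years,
# as used in the COO analysis: MYR20,000 ('000 = 20) per year.
eye_tracker_per_year = 100 / 5
```

Rounding the ratios to one decimal place recovers the 3.4x (Tobii), 1.5x (Loop11) and 1.1x (Morae) figures quoted in the text.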

Since Tobii Studio requires an eye tracker, estimated to cost MYR100,000, a 5-year straight-line depreciation is applied, averaging MYR20,000 per year. Both Tobii and Morae also impose a yearly license upgrade fee. It is of interest to stakeholders to analyze the cost of operation (COO) of performing LBUT over a period of 7 years with these tools, with respect to the first case study.

Figure 2. LBUT Cost of Operation for UX Tools

From this analysis (Fig. 2), it can be seen that after the 5-year depreciation, from the sixth year onwards, the COO using Tobii Studio averages almost MYR60,000 (USD18,868); this, however, excludes hardware maintenance cost. For Loop11, the average cost is MYR76,500 (USD24,057) throughout the 7-year period. The COO using Morae and URANUS is MYR49,000 (USD15,409) and MYR47,000 (USD14,780) respectively. From this, it can be concluded that URANUS has the most economical TCO and COO.

C. Other OSS Benefits

URANUS is still at an early stage of development. It has been published on Google Code in order to promote it to the UX and OSS community; it is anticipated that this will accelerate URANUS development [18]. Google Code is not only meant for developers but also facilitates discussion groups and blogs. Under the project hosting, the source code of URANUS is stored and version controlled, enabling a multi-developer environment. In addition, the user manual and installation guide for URANUS have also been uploaded there. This will enable usability practitioners worldwide to download and use URANUS. Eventually, this is expected to form a community of usability practitioners with a common interest in continuously improving URANUS for the benefit of their professional practice. The users of URANUS are expected to play a few roles: first, to provide new requirements; second, to report bugs or defects for the improvement of URANUS; and third, to contribute directly to the development effort in enhancing URANUS. This is shown in Fig. 3.

Figure 3. URANUS OSS Community

Figs. 4 and 5 show the URANUS LBUT and RUT architectures.

Figure 4. URANUS Architecture for LBUT

Figure 5. URANUS Architecture for RUT

During the LBUT, the moderator and user use URANUS in the testing room (Fig. 4). URANUS is hosted on the server. The observers are able to connect to the URANUS session using Virtual Network Computing (VNC). At the same time, project stakeholders such as project managers can also view real-time results such as the moderator ratings and subjective ratings. All data resides in the database. The voice and screen are recorded using a screen recorder. Fig. 5 shows the URANUS architecture for RUT. The main difference is the feature to record and upload the voice and screen from the remote client's PC. A remote stakeholder is also able to view the results of the RUT from URANUS. This shows that RUT is particularly useful when the users, moderators and stakeholders are at different locations.

VI. CONCLUSION

Based on the literature review and interviews, the work practices of usability practitioners are derived as shown in Fig. 1. The development of URANUS is based on the requirements obtained from the practitioners and on benchmarking against existing tools. Although not all requirements were completed, a comparison with existing UX tools shows that URANUS has a relatively good balance score card of 65%, compared to Morae and Tobii Studio, which score 70% and 65% respectively. This shows that URANUS can be used reliably and will benefit usability practitioners. In addition, the TCO and COO analyses performed using both case studies show that URANUS has the lowest cost of ownership and cost of operation, i.e. URANUS requires the least financial investment compared to the other commercially available tools. URANUS is cheaper simply because it carries no software, hardware or license costs, which is not the case with the other tools. In addition, URANUS provides the extra benefits of OSS such as source code availability and no vendor lock-in. The URANUS source code has been published on Google Code, which will enable usability practitioners worldwide to download and use URANUS. However, for URANUS to be sustainable, like any other OSS, it needs an active community. In the future, URANUS is to address new techniques from the UPA surveys, an enhanced eye tracking calibration process and audio-to-text transcribing capability.

REFERENCES

[1] "Malaysia Government Portals & Websites Assessment 2011 Report", Multimedia Development Corporation. URL: http://www.mscmalaysia.my/codenavia/portals/msc/images/img/government/partner_msc_malaysia/msc_malaysia_worldwide/FinalMGPWA2011.pdf, last accessed 15-2-2012.
[2] Nor Aliah binti Mohd Zahri, "Malaysian Public Sector Open Source Software (OSS) Master Plan", Malaysian Administrative Modernization and Management Planning Unit (MAMPU), Malaysia CIO Conference and MYGOSSCON, Putrajaya, Malaysia, 2011.
[3] Shi-Tzuaan, S. and Sivaji, A., "Open Source Program – URANUS", MIMOS Berhad. URL: http://usability.mimos.my/ums/product.php, last accessed 18-3-2012.
[4] "Loop11". URL: http://www.loop11.com/, last accessed 19-3-2012.
[5] "Morae", TechSmith. URL: http://www.techsmith.com/morae.html, last accessed 19-3-2012.
[6] "Eye Tracking Software Tobii Studio", Tobii. URL: http://www.tobii.com/en/eye-tracking-research/global/products/software/tobii-studio-analysis-software/, last accessed 19-3-2012.
[7] "Userfeel.com Remote Usability Testing". URL: http://www.userfeel.com/, last accessed 19-3-2012.
[8] "UPA Salary Survey", Usability Professionals' Association. URL: http://usabilityprofessionals.org/usability_resources/surveys/SalarySurveys.html, last accessed 12-6-2012.
[9] Sivaji, A., Mazlan, M. F., Shi-Tzuaan, S., Abdullah, A. and Downe, A. G., "Importance of Incorporating Fundamental Usability with Social, Trust & Persuasive Elements for E-Commerce Website", IEEE International Conference on Business, Engineering and Industrial Applications (ICBEIA), pp. 221-226, Kuala Lumpur, Malaysia, 2011.
[10] Sivaji, A., Soo, S.-T. and Abdullah, R., "Enhancing the Effectiveness of Usability Evaluation by Automated Heuristic Evaluation System", Third International Conference on Computational Intelligence, Communication Systems and Networks, pp. 48-53, Bali, Indonesia, 2011.
[11] "The Survey System: Sample Size Calculator", Creative Research Systems, 2007. URL: http://www.surveysystem.com/sscalc.htm, last accessed 24-5-2012.
[12] J. M. Christian Bastien, "Usability testing: a review of some methodological and technical aspects of the method", International Journal of Medical Informatics, vol. 79, pp. 18-23, 2010.
[13] Carol M. Barnum, "International usability testing", in Usability Testing Essentials, Morgan Kaufmann, Boston, pp. 319-354, 2011.
[14] Andres Baravalle and Vitaveska Lanfranchi, "Remote Web usability testing", Behavior Research Methods, vol. 35, no. 3, pp. 364-368, DOI: 10.3758/BF03195512, 2003.
[15] Carol M. Barnum, "Conducting a usability test", in Usability Testing Essentials, Morgan Kaufmann, Boston, pp. 199-237, 2011.
[16] Luigi Lavazza, "Beyond Total Cost of Ownership: Applying Balanced Scorecards to Open-Source Software", ICSEA '07: Proceedings of the International Conference on Software Engineering Advances, IEEE Computer Society, Washington DC, USA, 2007.
[17] Maha Shaikh and Tony Cornford, "Framing the Conundrum of Total Cost of Ownership of Open Source Software", 7th International Conference on Open Source Systems, 2011.
[18] Shi-Tzuaan, S., "URANUS-USABILITY", Google Code. URL: http://code.google.com/p/uranus-usability/downloads/list, last accessed 12-6-2012.
