Journal of Biomedical Informatics 44 (2011) 897–908
Methodological Review
Implementation of a user-centered framework in the development of a web-based health information database and call center

Heather A. Taylor a,b,*, Dori Sullivan a, Cydney Mullen b, Constance M. Johnson a

a Duke University School of Nursing, DUMC 3322, 307 Trent Drive, Durham, NC 27710, USA
b FirstHealth Moore Regional Hospital, 155 Memorial Drive, PO Box 3000, Pinehurst, NC 28374, USA
Article history: Received 21 April 2010; available online 9 March 2011.

Keywords: User–computer interface; Consumer health information; Internet; User-centered design
Abstract

As healthcare consumers increasingly turn to the World Wide Web (WWW) to obtain health information, it is imperative that health-related websites are user-centered. Websites are often developed without consideration of intended users' characteristics, literacy levels, preferences, and information goals, resulting in user dissatisfaction, abandonment of the website, and ultimately the need for costly redesign. This paper provides a methodological review of a user-centered framework that incorporates best practices in literacy, information quality, and human–computer interface design and evaluation to guide the design and redesign of a consumer health website. Following the description of the methods, a case analysis is presented, demonstrating the successful application of the model in the redesign of a consumer health information website with call center. Comparisons between the iterative revisions of the website showed improvements in usability, readability, and user satisfaction.

© 2011 Elsevier Inc. All rights reserved.
* Corresponding author at: The Foundation of FirstHealth, 150 Applecross Road, Pinehurst, NC 28374, USA. Fax: +1 910 695 7501. E-mail addresses: [email protected] (H.A. Taylor), [email protected] (D. Sullivan), [email protected] (C. Mullen), [email protected] (C.M. Johnson).

doi:10.1016/j.jbi.2011.03.001

1. Introduction

Increasingly, consumers are turning to the World Wide Web (WWW) to obtain health information. The Pew Internet & American Life Project (2008) estimates that 75–80% of WWW users have searched for health information online. Additionally, 75% of WWW users with a disability or chronic disease report that their last health information search affected a decision about how to treat an illness or condition [1]. This method of seeking health information influences the healthcare delivery system for both providers and consumers [2–6]. Most notable is that the WWW is increasingly becoming the primary source of health information for healthcare consumers [3,6].

Health information on the WWW is extensive and largely unregulated [7–9]. Consumers are not trained to filter which information is valid, reliable, and relevant to their diagnosis [10–14]. Consequently, consumers may make healthcare decisions based upon information that is inaccurate, incomplete, or insufficiently evidence-based [15,16]. Furthermore, the proliferation of fragmented, inequitable health information on the WWW is difficult to navigate.

Websites are often developed without consideration of intended users' characteristics, literacy levels, preferences, and
information goals, resulting in sites that are not user-centric in their functionality [17]. Technical language, information overload, layout inconsistencies, and disorganization of information lead to further cognitive and navigational challenges [9,16,18]. Moreover, user dissatisfaction often leads to abandonment of the website and ultimately the need for costly redesign [17]. Expense is often cited as the reason for omitting user-centered design [19,20]. However, it is estimated that redesign in the post-implementation phase is 100 times more expensive than in the initial design phase [21].

Numerous user-centered frameworks have been created to guide web development and design [22–25]. These frameworks include various types of design methods such as user, task, environmental, functional, and representational analyses. They also include evaluation analyses consisting of cognitive walkthrough, heuristic evaluation, and small-scale usability studies. The Website Developmental Model for the Healthcare Consumer (Fig. 1) is a framework that distinctively builds upon recognized design and evaluation methods by incorporating principles of human–computer interaction, content-based testing, expert-based testing, and usability evaluation techniques [26]. The model features current standards of practice in website accessibility, credibility, quality, accuracy, and interoperability [27–29]. These best practices in user-centered design take into account user characteristics, tasks, literacy, and environments early in web development, thus optimizing usability and addressing the quality concerns of consumers acquiring health information on the WWW [26].

Fig. 1. The Website Developmental Model for the Healthcare Consumer. Adapted from Johnson C, Turley J. A new approach to building web-based interfaces for healthcare consumers. electronic Journal of Health Informatics 2007;2(2):e8.

This paper describes the methodology of the Website Developmental Model for the Healthcare Consumer and the successful
application of the model in the redesign of a consumer health information website with call center, resulting in a highly usable, understandable, and credible website.

2. Methods

2.1. The Website Developmental Model for the Healthcare Consumer: design methods

The Website Developmental Model for the Healthcare Consumer is a comprehensive user-centered approach to the design and development of a healthcare website for the average consumer [26]. User-focused websites are modeled upon the characteristics of intended users, the purpose of the website, and the users' information goals. Additionally, the literacy level, accuracy, and appropriateness of the website content must be considered. Peer review by experts in the content domain and in usability further contributes to the development of a website that is easy for users to understand. This section provides a brief description of the model's design and evaluation methods. Each of the following analyses describes different but essential considerations in the design or redesign of a website prototype.

2.1.1. User/environmental/task/functional analysis

One of the most important factors in the design of a website is to understand the people who will be using the website so that the
website matches the users' capabilities [26]. To initiate this understanding, a user analysis is conducted to profile the intended users' characteristics, including age, education, race, ethnicity, computer skill level, and information goals [25]. In conjunction with user analysis, environmental analysis examines the physical, social, and cultural milieus in which the website is used [26]. These considerations are essential as they affect users' accessibility, interactions, and understanding of the website.

Task analysis is the process of identifying website functionality, user platforms, input and output formats, website constraints, information categories and flow, and the communication needs of the users [30–33]. Task analysis is a necessary complement to user analysis as it recognizes the information goals of the users, what they want to do on the system, and how they will likely interact with the system to achieve these goals [26]. Additionally, this method of analysis ensures that only task features that match users' capabilities and are essential for the successful completion of tasks are included in the website [26].

In order to examine the structures needed for successful goal completion, a functional analysis is conducted to identify the relationship of tasks within the website, the users' goals and how these goals will be achieved, and the flow of information. This analysis drives the functionality, structure, and overall navigation of the website [26]. Essentially, functional analysis identifies more appropriate or efficient ways to achieve tasks within the system.
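To make these analyses concrete, their outputs might be captured as simple structured records that downstream design decisions can reference. The Python sketch below is a minimal illustration; the field names and example values are assumptions, not artifacts from the model or the study.

    from dataclasses import dataclass, field

    @dataclass
    class UserProfile:
        """Output of user/environmental analysis (illustrative fields)."""
        age_range: tuple           # e.g., (57, 86)
        education: str             # e.g., "high school to doctorate"
        computer_literate: bool
        access_environments: list  # e.g., ["home", "work"]
        information_goals: list    # e.g., ["find a provider", "drug info"]

    @dataclass
    class Task:
        """Output of task/functional analysis (illustrative fields)."""
        name: str
        user_goal: str
        action_sequence: list = field(default_factory=list)
        essential: bool = True     # keep only tasks matching user goals

    # Hypothetical example: a task retained because it maps to a stated goal.
    find_provider = Task(
        name="Locate a healthcare provider",
        user_goal="Find a local provider who accepts my insurance",
        action_sequence=["open Healthcare Providers tab",
                         "filter by specialty", "filter by insurance"],
    )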
2.1.2. Visual-graphical representation

Website content and visual-graphical representations are dictated by the purpose, information flow, and users of the website [26]. Consumer health information websites should be written at a 6th- to 8th-grade reading level and provide images related to the text. Websites that include interactive tools, such as online chat, require careful consideration of the information displays. Most importantly, the graphical representations and information displays should be purposeful and should help clarify the website's content [26].

2.1.3. Comparative analysis

Comparative analysis examines different aspects of similar websites [26]. This analysis provides insight into alternative website functionality, visual-graphical representation, usability aspects, navigation, and user platforms. Essentially, it allows the web developer to extract good design ideas and eliminate poor design features from similar websites [30].

2.2. The Website Developmental Model for the Healthcare Consumer: evaluation methods

The model's evaluation methods, such as cognitive walkthrough, heuristic evaluation, and small-scale usability studies, are applied to the developed prototype to identify potential problems early in the redesign process and to guide necessary website revisions [26]. These evaluation methods are imperative and have been shown to dramatically improve website efficiency, effectiveness, and functionality [17,34–36].

2.2.1. Cognitive walkthrough

Cognitive walkthrough is an evaluation process guided by a set of questions that identify the users' goals and the ease of accomplishing these goals [26]. This process can identify problems that a first-time user would encounter and establishes how well a first-time user is able to complete tasks within the website without formal instruction [37]. To conduct this method of evaluation, it is essential that the web developers clearly define the intended users, the core system tasks, and the order of each step needed to complete those tasks. Cognitive walkthrough is necessary to identify functionality and ease-of-use design problems within the system before conducting user-based testing [26].

2.2.2. Heuristic evaluation

Heuristic evaluation uses interface design guidelines and principles of human–computer interaction to identify the majority of usability problems within a website [38]. Essentially, a small group of experts in website design and human cognition examines the website for violations, assigns each violation a severity score, and generates solutions to the identified violations. Although many usability problems are identified with this evaluation, additional evaluation methods, such as user-based testing, are essential to identify further usability issues unapparent to the expert evaluator [38].
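As a concrete illustration of how such an evaluation might be recorded, the following Python sketch logs violations against named heuristics and tallies them by heuristic and severity. The severity values follow the 1 (cosmetic) to 4 (catastrophic) scale described later in the case analysis; the example violations are hypothetical.

    from collections import Counter
    from dataclasses import dataclass

    @dataclass
    class Violation:
        location: str     # page or component where the problem occurs
        description: str
        heuristic: str    # e.g., "Consistency", "Match", "Visibility"
        severity: int     # 1 = cosmetic ... 4 = catastrophic

    # Hypothetical entries, echoing the kinds of problems reported later.
    violations = [
        Violation("search page", "button labeled 'Find-It!' not 'Search'",
                  "Consistency", 2),
        Violation("registration page", "technical jargon in field labels",
                  "Match", 3),
    ]

    # Tally violations per heuristic and pull out the major/catastrophic ones.
    by_heuristic = Counter(v.heuristic for v in violations)
    majors = [v for v in violations if v.severity >= 3]
    print(by_heuristic, len(majors))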
2.2.3. Standards of practice

Current standards of practice have been developed by governmental and non-governmental organizations to address concerns regarding the quality, credibility, and accessibility of health information on the WWW. The Health on the Net (HON) Foundation HONcode focuses on the provision of reliable and credible information on healthcare websites [27]. The HONcode consists of eight principles that cover the author(s)' qualifications, complementarity of the information, privacy of personal data, citation of sources, balanced evidence, support contact information, and funding and advertising policies. Essentially, the HONcode aims to standardize the quality of health information on websites.

In order to further standardize the quality of information on the WWW and protect against commercial influence on the content of a website, the American Medical Association (AMA) developed guidelines to provide principles for developers of health information websites [28]. Similar to the HONcode, these guidelines focus on user privacy and access, funding, author credibility, sponsorship, and advertising, yet also incorporate usability principles. Implementation of these guidelines provides users with insight into the credibility of health information on the website.

In addition to providing credible, transparent health information on the WWW, it is also imperative that information is accessible to all users across different types of platforms and devices. The World Wide Web Consortium (W3C) is an international organization that creates standards and guidelines to facilitate "One Web". One of the standards that supports the W3C's principle of a "Web for All" is the Web Content Accessibility Guidelines 1.0 [29]. These guidelines provide standards and direction on creating Web content that is accessible to users on different platforms. The goal of these guidelines is to develop a website that is interoperable, equitable, and accessible by all users.

2.2.4. Small-scale usability studies

Small-scale usability studies, the most common form of user-based testing, are an evaluation method used to determine whether users are able to find the information they want within the website prototype and to evaluate any subsequent iterative revisions [26,37]. These studies require that participants talk aloud to describe what they are thinking and doing as they use the website [39]. This technique is aimed at collecting information on the participants' mental processing while carrying out a task, thus allowing the investigator to discover potential usability problems with the website [39]. Evidence supports that five participants are sufficient to identify the most important usability problems within a website [40,41].

2.2.5. Content-based testing

Content-based testing evaluates the readability of the website [26]. Various tools use a readability formula such as Flesch–Kincaid, Fry, or Simple Measure of Gobbledygook (SMOG) [42–44]. For consumer health websites it is useful to also assess the cohesiveness of the text using the Readability Assessment Instrument (RAIN) or the Suitability Assessment of Materials (SAM) [45,46].

2.2.6. Expert-based testing

Expert-based testing uses experts in the content domain to evaluate the website content for accuracy, reliability, and quality [26]. This evaluation method can be considered a peer review of the website content. The purpose of the peer review is to identify and resolve inaccuracies in the content. This process assists in validating the reliability and appropriateness of the content for the intended user population [26].

3. Results
3.1. The Website Developmental Model for the Healthcare Consumer: design results

In this section, a case analysis based on the redesign of a consumer health-related website using the Website Developmental Model for the Healthcare Consumer is presented (IRB approved). The order of the phases described below is the appropriate sequence of the design and evaluation methods to be utilized in the redesign of a website.

In February 2009, the development of a consumer health-related website for navigating the community through the healthcare
organization and the surrounding communities was supported by the philanthropic entity of a not-for-profit healthcare network serving 15 counties in the mid-Atlantic region. Initially, the funding organization supported development of the website as a resource tool to help its members locate healthcare providers, community resources, and health information. However, as the project progressed, it was recognized that health and community resource information was fragmented among various agencies within the organization and community and that the scope needed to be expanded to the community.

Development of the website began with unstructured interviews with key stakeholders within the community and organization to determine the goals, objectives, and functionality of the website. These initial meetings led to the development of a basic prototype that utilized a social networking format. However, stakeholders consistently voiced concern that the adopted design might deter users unfamiliar with, or uncomfortable with, this format. As a result, the website was redesigned from a social networking format to a comprehensive web-based database with online and call center support. Consequently, the various analyses previously outlined were conducted to design and evaluate the redesigned website. The results illustrate the feasibility and effectiveness of the model in the redesign of any healthcare consumer website.

Data obtained from the analyses were analyzed with the Statistical Package for the Social Sciences (SPSS) for Windows Version 11.0 (α = .05, two-tailed tests). Task completion rates were analyzed with descriptive statistics, including mean, percentage, standard deviation, and frequency. A one-way multivariate analysis of variance (MANOVA) was conducted to determine whether there were any significant differences between the three usability studies on the four Computer System Usability Questionnaire subscales. Analysis of variance (ANOVA) was utilized to compare the task success rate, defined as the average task completion rate in the study group, between the three usability studies. Content analysis was used to generate themes from the comparative analysis and content-based testing.

3.1.1. Phase 1: user/environmental/task/functional analysis for the redesigned prototype

To determine the needs of the intended users of the website, a web-based questionnaire was emailed to a convenience sample of 300 philanthropic members over 21 years old, regardless of race and sex. The web-based 11-item questionnaire consisted of both multiple-choice and open-ended questions querying user demographics, information desired from a health information website, previous types of health information searches, problems most often encountered on the WWW, and computer usage and experience. In addition to the web-based questionnaire, unstructured interviews were conducted following stakeholder presentations, requesting feedback regarding system functionality, content, and intended users' computer literacy.

A total of 15 web-based questionnaires were returned, a 5% response rate. Of the responses received, 100% (n = 15) of the respondents were Caucasian and computer literate, 67% (n = 10) were male, ages ranged from 57 to 86 years old, and education levels varied from high school to doctorate/professional degrees. The environmental analysis section of the survey revealed that 93% (n = 14) of respondents access the WWW from home daily, with 40% (n = 6) accessing the WWW from work daily.
The task analysis demonstrated that the respondents conduct a variety of activities on the WWW: 100% (n = 15) email correspondence and online shopping, 93% (n = 14) using an online map for driving directions, 87% (n = 13) searching online directories, 60% (n = 9) registering for a class and listening to a radio or news broadcast, 53% (n = 8) conducting online chats, and 33% (n = 5) searching for online health information. Respondents reported that they want a consumer health website to include information on local healthcare providers, community resources, health and health-related classes and events, health information by condition, and prescription drug information. When asked about problems most often encountered on the WWW, 40% (n = 6) cited broken links and links to commercial websites.

3.1.2. Phase 2: comparative analysis

Three governmental consumer health information websites were then examined, focusing on comparisons of key functionality, navigational tools, usability aspects, and user platforms. The analysis showed that a variety of health information is essential in consumer health websites, including information on health conditions, the latest health news, interactive health tools, community resources, and healthcare provider information. Furthermore, the analysis revealed that additional features, such as the ability of the website to interface with mobile devices and alternative data display representations including audio and enlarged text, need to be considered to increase accessibility and enhance usability for users with visual impairment. All of the websites examined employed multiple user platforms, including MP3 audio reports and smartphone applications.

3.2. The Website Developmental Model for the Healthcare Consumer: evaluation results

3.2.1. Phase 3: creation of the prototype

The design analyses guided the redesign of the prototype from a social networking format to a web-based database supporting online and call center information assistance. During this phase, the website functionality, layout, and navigational tools were outlined. The prototype included searches of healthcare providers, assisted living facilities, skilled nursing facilities, home health agencies, and community resources by alphabet, by specialty, and through an advanced search. The advanced search included expanded fields such as provider gender, insurance accepted, and availability of same-day and weekend appointments. Additionally, the prototype provided a searchable prescription drug assistance program database, a latest health news feed, links to credible health information websites, and a community-wide events calendar. Other features included an online support button to connect with a live navigator via online chat and an online driving directions interface for all listings within the website.

3.2.2. Phase 4: cognitive walkthrough

To compare first-time users' and the developers' conceptual models of the intended pathway of each task, the eight tasks users were most likely to perform within the website were identified, such as site registration, locating a healthcare provider, and finding disease-specific health information. For each task, the action sequences were performed and documented in a spreadsheet. Next, each task within the website was examined by asking: (1) what are the users' goals, (2) is the action obviously available, (3) does the action or label match the goal, and (4) is there good feedback. This analysis identified six usability issues. The majority of the usability issues, 67% (n = 4), were problems with "Consistency". For example, all of the website search filters were labeled "Filtered By" and the button to complete the search was labeled "Find-It!", rather than "Search By" and "Search". The other two identified problems related to "Match", such as the use of technical language on the registration page that may not be intuitive to users.
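A minimal sketch of how each step of a task might be checked against these four questions follows; the task, actions, and evaluator answers are hypothetical stand-ins for the study's spreadsheet entries.

    # Record answers to the four walkthrough questions for each action.
    QUESTIONS = (
        "goal clear", "action obviously available",
        "label matches goal", "good feedback",
    )

    walkthrough = {
        "task": "Locate a healthcare provider",
        "steps": [
            # (action, evaluator's yes/no answers to the four questions)
            ("open Healthcare Providers tab", (True, True, True, True)),
            ("press 'Find-It!' button",       (True, True, False, True)),
        ],
    }

    # Collect every (action, question) pair that failed the walkthrough.
    findings = [
        (action, QUESTIONS[i])
        for action, answers in walkthrough["steps"]
        for i, ok in enumerate(answers)
        if not ok
    ]
    print(findings)  # -> [("press 'Find-It!' button", 'label matches goal')]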
3.2.3. Phase 5: heuristic evaluation

Heuristic evaluation of the website was conducted using Nielsen's ten usability heuristics [37]. The ten principles for interface design include: Visibility (users should always be informed about
the website's status through appropriate feedback and display of information); Match (the image of the website perceived by users should match users' mental model of the website, and the site should speak the users' language rather than use technical terms); User Control (the website should provide cancel, undo, or return functions); Consistency (standards and conventions in web design should be followed); Error Prevention (it is best to design interfaces that prevent errors from occurring in the first place); Recognition (users should not be required to memorize a lot of information to carry out tasks, as instructions for the website's use should be visible as warranted); Flexibility (provide users the flexibility of creating customizations and shortcuts to accelerate their performance); Minimalist (refrain from extraneous information, as it is a distraction and a slow-down); Error Message (error messages should be informative enough that users can understand the nature of errors, learn from errors, and recover from errors); and Help (always provide help when needed) [37].

The website was independently evaluated, working through all functions of the website, to generate a list of heuristic violations according to the outlined list of heuristics. The identified violations were documented in a table, noting location, type of problem, and the violated heuristic. Nielsen's Severity Rating Scale, a scale from 1, signifying a cosmetic problem, to 4, signifying a catastrophic problem requiring immediate attention, was then applied to each of the usability violations identified by the heuristic evaluation [30]. The assigned ratings were based on the proportion of users who would experience the problem, the impact it would have on their experience with the site, and whether the problem would affect users only the first time they encountered it or would persistently bother them [30]. Lastly, proposed solutions were generated for each of the identified violations.

Fig. 2. Frequency of heuristic violations by severity.

A total of 14 usability problems were identified, with 57% (n = 8) categorized as minor and 43% (n = 6) as major (Fig. 2). "Consistency" (n = 5) and "Minimalist" (n = 5) were the most frequently violated heuristics. Six of these violations were minor usability problems, such as the use of unnecessary uppercase lettering. Four of the violations were major usability problems, including inconsistency in the location of the online support icon throughout the website. "Visibility" (n = 2) was the second most frequently violated heuristic. One violation was minor, as there was no alert to users that external links exited the user from the website. The other problem was categorized as major, as there was no user feedback on location within the website or on how to return to the previous page without using the browser toolbar's back button. Lastly, there was one minor "User Control" violation, as there was no cancel function to clear the Planned Giving Response/Request
Form. A major problem, identified as a "Help" violation, was that the online help icon was not clearly defined. Before proceeding to user-based testing, the website prototype was revised to correct the usability problems identified in this evaluation.

3.2.4. Phase 6: small-scale usability studies

Three rounds of small-scale usability studies were conducted using talk-aloud methods in a private setting. Study participants were required to: (1) be 21 or older, (2) be computer and WWW literate (have used a computer and the WWW for at least three months), (3) be able to speak and read English, (4) be mentally capable of informed consent, (5) be reachable by telephone, and (6) be willing to travel to the testing site.

After obtaining informed consent, participants were taught the talk-aloud technique. First, it was explained to participants that the interest is in what they say to themselves as they read and/or examine the website. Participants were encouraged to say out loud everything that they would say to themselves silently, just as if they were alone in the room speaking to themselves. To practice talking aloud, participants were given two practice problems: a two-digit addition problem and a word scramble, to demonstrate how to say out loud everything the participant might silently think. An addition problem was chosen rather than a two-digit multiplication problem to reduce participant stress during the exercise.

Testing took approximately one hour, during which time participants were asked to do the following: (1) answer questions about their demographic information and frequency and use of the WWW, (2) complete a health literacy test, (3) complete twelve common tasks within the website while talking aloud, and (4) complete a questionnaire rating their overall impression of and satisfaction with the site. During testing, the investigator documented observations, participant reactions, and whether each task was completed, including the number of prompts, errors, and time to complete each task. Audio recordings of the users' experiences were also used to aid in documenting problems discussed and manually recorded by the investigator. Tasks were considered successfully completed if finished in less than 3 min, without prompts, using the intended actions. Prompts were provided within the 3 min if the participant required assistance.

Participants' satisfaction with the website was measured using the Computer System Usability Questionnaire, a mixed-method instrument consisting of 19 questions on a Likert scale from strongly disagree (1) to strongly agree (7), with 4 as neutral [47]. The questionnaire measures users' overall satisfaction and satisfaction with system usefulness, information quality, and interface
quality. The Computer System Usability Questionnaire is highly reliable, with a coefficient α > 0.89 [47].

Health literacy was measured with "The Newest Vital Sign" instrument [48]. This instrument consists of a nutrition label with six quantitative questions measuring math, reading and comprehension, and abstract reasoning skills. Participants with more than four correct responses are unlikely to have low literacy, while those with fewer than four correct responses may have limited literacy. The Newest Vital Sign instrument is reliable, with a Cronbach's α > 0.76 in English [48].

In the first two rounds of usability testing, a total of 20 participants, 15 for the first round of testing and five for the second round, were recruited from the philanthropic entity's membership. The final five participants for the third round of testing were recruited through a mailing to hospital volunteers who had expressed potential interest in volunteering for a health informatics program. The following sub-sections outline the identified usability problems and comparisons across tests of the modified prototypes.

3.2.4.1. First round of usability testing. The 15 participants' ages ranged from 34 to 80 years old, with levels of education varying from some college to Master's degrees and an average health literacy score of 5.27. Seventy-three percent (n = 11) were female, and 100% (n = 15) were Caucasian and computer literate. The overall task completion rate for the twelve selected tasks was 85% (n = 191, SD 16.22). Overall, nine major usability problems were discovered, with 100% (n = 9) violating "Match". For example, when asked to locate an exercise class, participants intuitively searched under the Community Resource tab rather than the Calendar tab. Additionally, participants commented that the graphics on the website did not match the intended user population. The images were not considered age appropriate, such as the homepage image of a young woman typing on a computer. Participants expressed that these younger images might initially dissuade them from searching the website. Following the first round of usability testing, the website was modified to resolve the identified usability problems. A second round of usability testing was then conducted to test the effectiveness of, and user satisfaction with, the website revisions.
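As an aside, the task-success criterion used in these rounds (completion in under 3 min, without prompts, using the intended actions) can be applied mechanically once each attempt is logged. The Python sketch below is illustrative; the field names and sample records are assumptions, not the study's session logs.

    # Apply the stated success criterion to logged task attempts.
    def task_succeeded(seconds: float, prompts: int, intended: bool) -> bool:
        """Success = under 3 minutes, no prompts, intended action path."""
        return seconds < 180 and prompts == 0 and intended

    # Hypothetical attempt records for one participant.
    attempts = [
        {"task": "register on site", "seconds": 95, "prompts": 0, "intended": True},
        {"task": "find exercise class", "seconds": 200, "prompts": 1, "intended": False},
    ]

    rate = sum(
        task_succeeded(a["seconds"], a["prompts"], a["intended"]) for a in attempts
    ) / len(attempts)
    print(f"task completion rate: {rate:.0%}")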
3.2.4.2. Second round of usability testing. The five participants' ages ranged from 28 to 78 years old; 60% (n = 3) were male, 80% (n = 4) were Caucasian, 20% (n = 1) were multiracial, and 100% (n = 5) were computer literate, with baccalaureate to doctorate/professional degrees and an average health literacy score of 5.27. The overall task completion rate for the twelve original tasks improved to 100% (n = 75, SD 0.00).

Following the second round of usability testing, in an effort to adhere to HONcode guidelines, a "Contact Us" link was added to the website homepage to provide website contact details. Additionally, based upon stakeholder feedback, a new main tab was added to detail information related to end-of-life issues, such as advance directives, long-term care, and financial considerations. As a result of adding substantial new content, a third round of usability testing was conducted using the original identified tasks.
3.2.4.3. Third round of usability testing. The five participants' ages ranged from 28 to 81 years old; 100% (n = 5) were Caucasian and computer literate, 80% (n = 4) were female, levels of education ranged from some college to doctorate/professional degrees, and the average health literacy score was 5.4. The overall task completion rate was 92% (n = 69, SD 5.58). Four new usability problems, all related to the new content, were identified and categorized as minor. "Match" (n = 3) was the most frequent violation. For example, participants clicked on the new "Contact Us" page for online assistance rather than the online support button on the homepage (Fig. 3). Similarly, when asked to determine a retirement center's address, the first inclination of two participants was to search under the new "Planning for Your Future" tab, as one of its subcategories is "Long-Term Care". That page describes the differences among long-term care options rather than providing the searchable facility database found under the "Healthcare Providers" tab. The one violation of "Consistency" involved the use of saturated light blue lettering for the hyperlink to the directions feature.

Fig. 3. Website prototype homepage.
Fig. 4. Comparison of task completion rates for the three rounds of usability testing.
3.2.4.4. Task success. Successful completion of each task by the participants was compared within each round of testing, as was the overall task completion rate. The ANOVA showed that differences in the overall task completion rate between the three rounds of testing were marginal (p < .10). However, as shown in Fig. 4, the mean completion of individual tasks increased overall across the rounds of testing.
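For illustration, such a comparison can be run as a one-way ANOVA over per-participant completion rates, as sketched below in Python. The rates shown are placeholders (only summary statistics were reported in the study), so the output is not the study's result.

    # One-way ANOVA across the three rounds of usability testing.
    # The per-participant completion rates are illustrative placeholders.
    from scipy import stats

    round1 = [0.75, 0.83, 0.92, 1.00, 0.67, 0.83, 0.92, 0.75,
              0.92, 1.00, 0.83, 0.75, 0.92, 0.83, 0.83]  # 15 participants
    round2 = [1.00, 1.00, 1.00, 1.00, 1.00]              # 5 participants
    round3 = [0.92, 0.92, 1.00, 0.83, 0.92]              # 5 participants

    f_stat, p_value = stats.f_oneway(round1, round2, round3)
    print(f"F = {f_stat:.2f}, p = {p_value:.3f}")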
3.2.4.5. User satisfaction. The MANOVA was used because the subscales are moderately, linearly correlated and conceptually related. As expected, the MANOVA did not show a statistically significant difference in user satisfaction between the iterative versions of the website (p > .25), as all participants ranked the website between agree and strongly agree. Although the sample size is small, the MANOVA test was used based on the relationship of the dependent variables. We also conducted separate ANOVAs on the dependent variables and still found no significant differences between the three usability studies on the Computer System Usability Questionnaire subscales (all ps > .05). Fig. 5 shows the mean responses for each of the subscales of the questionnaire across the three rounds of usability testing. With the exception of interface quality, which dropped from 6.53 to 6.33 in the second round of testing, user satisfaction improved with each iterative revision of the website.

Fig. 5. Comparison of user satisfaction in the three rounds of usability testing.

3.3. Phase 7: modify the prototype
The results of the evaluation methods were used to iteratively modify the redesigned website. These analyses also provided insight into additional content and functionality that needed to be added to the website. Following the final modifications, we conducted the concluding evaluation methods.

3.4. Phase 8: content-based testing

Although literacy concerns were not detected in usability testing, analysis of the website's content using the Flesch–Kincaid readability formula revealed an average reading level of 11.4. Subsequent revision to improve the readability resulted in an average Flesch–Kincaid Grade Level of 7.3. Research suggests that healthcare consumer information should be written at a 6th- to 8th-grade level, with the 6th-grade level being ideal for readability by the majority of consumers [49].
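For reference, the Flesch–Kincaid Grade Level is computed from average sentence length and average syllables per word as 0.39*(words/sentence) + 11.8*(syllables/word) - 15.59. The sketch below implements this published formula with a crude vowel-group syllable counter, so its estimates are approximate; the sample sentence is a hypothetical stand-in for website content.

    # Flesch-Kincaid Grade Level with a naive syllable approximation.
    import re

    def count_syllables(word: str) -> int:
        # Count vowel groups; a rough stand-in for true syllabification.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def fk_grade(text: str) -> float:
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return (0.39 * (len(words) / sentences)
                + 11.8 * (syllables / len(words)) - 15.59)

    sample = ("County of residence affects Medicare payment for care "
              "in a skilled nursing facility.")
    print(f"Estimated grade level: {fk_grade(sample):.1f}")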
3.5. Phase 9: expert-based testing
In order to further evaluate the content, ten members of the organization’s outcomes management staff were recruited through a targeted email. One week prior to the focus group study, the participants were emailed an introduction to the consumer health-related website with an embedded link to the website and the evaluation criteria. The assessment criteria included evaluation of the website’s content, accuracy, display of information, and
readability. Before initiation of the study, the investigator obtained signed informed consent, including authorization for audio-recording. The investigator then facilitated the focus group discussion, requesting narrative feedback and discussion on the content evaluation criteria.

Essentially, two themes emerged: missing information and literacy concerns for the intended users. Participants identified the need to add a county search filter to the skilled nursing facilities advanced search, as county of residence affects Medicare reimbursement for this level of care. Additionally, participants expressed that adding a user-friendly description of Medicare coverage for skilled nursing facilities would be extremely useful, since this is a question that they often encounter in practice. The second theme was that the website's literacy level might still be too high for the intended users. Participants felt that, without lowering the literacy level, a portion of intended users might not be able to navigate or understand the site.
3.6. Phase 10: final evaluation

To test the effectiveness of the model, we employed an expert in usability engineering to evaluate the website's compliance with standards of practice in web design, accessibility, quality, and usability. The evaluation criteria included Nielsen and Tahir's homepage usability guidelines [24] and standards such as the HONcode, the AMA guidelines for medical and health information websites on the Internet, and the W3C Web Content Accessibility Guidelines 1.0 [27–29]. At the conclusion of the evaluation, the identified usability problems were assigned heuristic attributes to facilitate the generation of themes.

A total of 35 usability problems were identified. Of the problems identified, 17% (n = 6) were cosmetic, 51% (n = 18) were categorized as minor, 29% (n = 10) as major, and 3% (n = 1) as catastrophic. "Visibility" (n = 12) was the most frequently violated usability heuristic. One of these violations was cosmetic: a blank page that was under construction was not labeled "Under Construction". Seven of the "Visibility" violations were minor usability problems; for example, there was no explanation of how users' email addresses would be used in the registration process. Three problems were categorized as major, such as the lack of an introduction to the search conditions for the prescription assistance database. The one catastrophic problem was that pop-up windows were used to access the "Details" page of all community resource and healthcare provider listings. Pop-ups may be confusing for users and are blocked by some web browsers.

"Match" (n = 10) was the second most frequently violated heuristic. One of the violations was cosmetic, such as inappropriately bolded and underlined text. Eight of the problems were categorized as minor, including the call center and online support icons being two different sizes. One problem was categorized as major: the website URL resolved primarily to the .com web address rather than the .org address that would signify not-for-profit status.

The third most violated heuristic was "Consistency". Of the seven violations, two were cosmetic, such as inconsistent bullet styles within the website. One violation was categorized as minor, such as the inconsistent ordering of the records table on one page. The remaining four violations were major, including the inconsistent coloring of visited versus unvisited hyperlinks. There were additionally three "Minimalist" and two "Flexibility" violations. Of the three "Minimalist" problems, two were cosmetic and one was minor; for example, the call center phone number was listed four times on the homepage. Of the two "Flexibility" violations, both were major, such as the absence of alternatives to auditory and visual content. Finally, there was one minor "User Control" violation: at the end of the registration page there was only a registration button and no undo or cancel button.

3.7. Phase 11: make final modifications for website release

Following the final evaluation, further modifications of the website were made prior to the piloted release in March 2010. These modifications included the addition of exit disclaimers to external links, enhanced link descriptors, and removal of the majority of pop-up windows.

4. Discussion

This study demonstrates that the Website Developmental Model for the Healthcare Consumer can be successfully applied to the initial design or redesign of a consumer health-related website. The methods employed led to iterative revisions resulting in improvements in the website's usability, content, and overall user satisfaction.

The design analyses initiated the user focus that was maintained throughout the redesign of the website. These analyses were essential in order to match the users' characteristics and needs to the functionality, content, and layout of the website. The unstructured interviews led to an early redesign that would have been costly if identified after the rollout of the site. Consumer health websites that are not user friendly lead to user dissatisfaction and ultimately abandonment of the website. The cost of user-centered design is often cited as the reason for omitting its use in the website design process; however, previous work shows that user-centered design is economically attractive, since the cost of fixing a website after development is 100 times higher than fixing it early in development [30]. However, since there is no gold standard for one particular usability method, it is difficult to compare tests, because these tests measure different aspects of website usability. In order to conduct an accurate economic comparison, these methods would need to be equivalent in scope. Two decades of research conducted by Hix et al. demonstrate that progressing from expert evaluation to user-based statistical evaluation to formative evaluation to summative evaluation is an efficient and cost-effective strategy in user-centered design [50–53]. The return on investment in user-centered design is achieved through savings in development time and money, decreased cost of redesign, decreased user training costs, decreased user error, and increased user satisfaction, thus ensuring that users will actually use the website.

In approaching user-centered design, it is important to consider which methods will have the greatest impact on development of the website. A number of usability inspection and testing methods have been presented in the case study as components of a model utilized to reduce usability problems in the early iterative design of a consumer health-related website. However, we recognize that time, resources, and levels of expertise are often barriers to website developers conducting usability evaluation. Developers require a clear understanding of how these factors affect which usability inspection method(s) to select when conducting usability evaluation and testing. Comparative effectiveness studies of usability methods provide insight into these considerations.

Expert-based methods, such as heuristic evaluation, guidelines such as those proposed by HON, W3C, and the AMA, and cognitive walkthrough have been increasingly used in the evaluation of web-based consumer health information and clinical interfaces [54–66].
Similarly, user-based methods, such as small-scale usability studies, are widely used in evaluating consumer health websites [11,36,58,60,67–72]. In comparison studies of these methods, heuristic evaluation has been found to be a quick, cost-effective, and
intuitive method for usability evaluation [23,37]. Notably, in comparison to other usability inspection methods, such as small-scale usability studies, cognitive walkthrough, and software guidelines, heuristic evaluation identified more problems than the comparison methods [73,74]. However, small-scale usability studies revealed more severe and recurrent problems than heuristic evaluation and cognitive walkthrough [74–76]. Similar findings were noted in a study comparing heuristic evaluation to cognitive walkthrough, although only for expert evaluators [77]. The need for expert evaluators is cited in the literature as a disadvantage of heuristic evaluation. However, Nielsen's research on the role of expertise in the effectiveness of heuristic evaluation demonstrates that two or three double experts (those with both usability and system domain experience), or three to five evaluators with usability expertise, generally identify 75% of usability problems, whereas 14 novice evaluators are needed to find similar numbers of problems [78].

Generally, small-scale usability studies utilizing targeted end-users identify the remaining usability problems unapparent to the heuristic evaluator. In our case analysis, small-scale usability studies identified nine major, recurrent usability problems that were previously undetected by heuristic evaluation. The employed tasks brought out rich insights and information regarding the usability of the website. The usability problems identified in the first round of testing were immediately revised, resulting in marked improvement on these tasks in the second round of testing. The third round of testing was conducted solely because of the significant website functionality changes mandated by stakeholders and standards of practice, such as website contact details and content guidelines. The overall task completion rate and the completion rates for several individual tasks decreased in the third round of testing, following the addition of new content. This finding exemplifies that any additions or changes to websites must be followed by usability testing to validate revisions of the interface.

The case analysis findings support the use of only five participants in small-scale usability studies. Although a total of nine problems were found by the first 15 participants, eight of the problems (89%) were identified by the first five participants. Early studies suggested that five participants are sufficient for small-scale usability studies, since 80% of usability problems will be identified by only five participants [38,79]. Later research argues that 15 participants are appropriate, as five participants will discover 85% of a website's usability problems, 10 will discover 95%, and 15 will discover 97% [40]. Most recently, based upon their research experiences, Tullis and Albert recommend the use of five participants from each demographic representative of the users of the website [41]. They find this recommendation most applicable when the scope of the user-based testing is limited (5–10 tasks) and the audience is well defined [41]. For the organization's philanthropic membership, the study participants were representative of this user group. However, further testing is needed in order to generalize the website to the overall regional population, with its varying education and literacy levels.
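These sample-size figures trace back to the Nielsen–Landauer detection model, in which the proportion of problems found by n participants is 1 - (1 - λ)^n. The sketch below assumes the commonly cited average per-participant detection rate λ = 0.31, which is an assumption rather than a value estimated from this study; with other values of λ the curve shifts, which is why published percentages differ.

    # Nielsen-Landauer model of problem discovery in usability testing.
    # lam = 0.31 is the commonly cited average detection rate per
    # participant, not a value estimated from this study.
    def proportion_found(n: int, lam: float = 0.31) -> float:
        return 1 - (1 - lam) ** n

    for n in (5, 10, 15):
        print(n, f"{proportion_found(n):.0%}")
    # With lam = 0.31 this prints roughly 84%, 98%, and 100%, in the
    # neighborhood of the percentages cited above.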
Although there was no relationship between participants' age or literacy and their understanding of the website, it should be noted that health literacy testing ensures that problems with the website are related to usability rather than literacy. The 2003 National Assessment of Adult Literacy (NAAL) indicates that 36% of American adults, 78 million people, have basic or below-basic health literacy skills [80]. In addition, 11 million people in the 2003 National Assessment were unable to complete the survey because they lacked basic literacy skills [80]. These findings are significant, since current evidence demonstrates that individuals with limited literacy have poorer health outcomes due to misunderstanding of written instructions, disease management
information, and prescription label warnings [81–83]. Therefore, it is imperative to address health literacy in the development of a healthcare consumer website to facilitate the understandability and readability of the website content.

In comparing user satisfaction between the three rounds of small-scale usability studies, although the differences were not statistically significant, user satisfaction increased in three of the four subscales: overall user satisfaction, system usefulness, and information quality. The remaining subscale, interface quality, decreased slightly in the second round of testing, yet there was no feedback to provide insight into this finding. Despite the fact that the participants were asked to review webpages that contained more detailed content in the third round of testing, this scale did not decrease in the third round. The high overall satisfaction with the website throughout all three study groups, averaging agree to strongly agree on all measures, signifies the user focus that was maintained throughout the developmental process.

Content- and expert-based testing validated the accuracy and appropriateness of the information for the intended users and led to improvements in readability, although further improvements in the readability of the content are planned. Our expert evaluation identified numerous, previously undetected usability problems and missing functionality. Conducting the expert evaluation earlier in the developmental life cycle, with only five participants in the formative evaluation, would have been sufficient and, moreover, cost-effective.

Nielsen's usability slogans effectively summarize the concepts behind user-centered design: "Designers are not users. Users are not designers. Your best guess is not good enough" [30]. As clinicians, we are potentially the users, perhaps the designers, and our best guess is not good enough. The number of problems uncovered by the usability engineer at the end of the project demonstrates that involving developers with experience in cognitive science, human factors, or usability engineering early in the design process is needed to ensure a strongly user-centric design. Furthermore, in considering cost-benefit tradeoffs in user-centered design, the case analysis supports that, if time and resources are limited, heuristic evaluation is a quick, effective method that can identify and resolve the majority of usability problems in an interface when used in early testing. Incorporation of guidelines, specifically W3C and HON, into the heuristic evaluation serves as a checkpoint to ensure that the interface incorporates best standards in accessibility, credibility, and quality of information. Although our analysis showed that five participants are sufficient for small-scale usability studies, this method requires substantial time for planning, participant recruitment, and implementation, in addition to study funding; these resources are often limited in practice.
5. Limitations

The response rate of our web-based questionnaire was below the currently published averages for web-based questionnaires. In health research studies, response rates for web-based versus mailed questionnaires vary, with web-based questionnaires generally drawing lower response rates (about 10% lower on average) than mailed questionnaires [84]. Concerns regarding anonymity, safety, privacy, and confidentiality are often cited as barriers to completing web-based questionnaires [85–87]. Another barrier is the generational divide known as the "gray gap". A significant shift in Internet access occurs around age 55 [86]. Approximately 52% of 50–54 year olds go online, yet only 43% of 55–59 year olds and just 34% of 60–64 year olds do. These percentages continue to decline with age, as only 23% of 65–69 year olds go online [86]. In our case study, 3.5% of the survey
population were 50–54 years old, 3.5% were 55–59 years old, 5% were 60–64 years old, and 63% were 65 years old or older.

Despite our low response rate, a web-based questionnaire remains a good method for collecting demographic information and data on Internet access and usage. Recent studies show that participants prefer web-based questionnaires to mailed questionnaires or telephone interviews [88–90]. Benefits of web-based questionnaires include decreased data entry and coding errors, faster response times, enhanced participant recruitment, and the ability to efficiently follow up with large cohorts [87,88,91]. However, if initial response rates are low, it is important to use additional methods, such as incentives and reminders during survey administration, to increase the response rate [92,93]. Carefully designing the questionnaire according to good usability principles is important as well [93]. These methods will be employed when using web-based questionnaires in future studies.

Furthermore, we recognize that our study participants were mostly senior, Caucasian, well educated, and computer literate. This was not initially considered a study limitation, as demographic data for the originally targeted population indicated that the largest user group was 80–89 years old, followed by 70–79 years old. However, the defined user population expanded during development as the organization's strategic plan for the website changed. Additional testing with a representative sample of regional users with widely dispersed demographics needs to be considered in order to further test the website's usability.

Running tests of significance on such a small sample is generally not recommended; however, we wanted to see whether there were any effects, while recognizing that the effects would have to be large to be detected at this sample size. We cannot conclude that there are no effects, because effects might have been found with a larger sample.

A limitation of the website itself is that it was not developed with full consideration of users with visual disabilities, as alternative information displays and user platforms were not incorporated into the website. This is an important consideration that will be addressed in future improvements to the website. It should also be noted that representational analysis and keystroke-level modeling were not conducted in the case analysis, as these methods are generally used when comparing an old design to a new design. Since the initial prototype was not fully developed, these analyses were not applicable to the development of our website.

6. Conclusion

Utilizing a user-centered framework, such as the Website Developmental Model for the Healthcare Consumer, led to the development of a consumer health-related website that is usable and contains credible, accurate health information. The framework's design methods helped to determine the system functionality and content, while the evaluation methods tested the usability and understandability of the website for the initially intended users. The model's methodology incorporates best practices in literacy, information quality, and human–computer interface design and evaluation that are applicable to the cost-effective design and development of any consumer health-related website.
However, if time, resources, and expertise are barriers to conducting a complete usability assessment, heuristic evaluation is an efficient and intuitive tool that can be readily used by web developers.

Role of the funding source

This project was funded by the Foundation of FirstHealth. The Foundation did not have a role in the study design, collection,
analysis and interpretation of the data, or in the writing of the report. The Foundation of FirstHealth supports the authors’ decision to submit the paper for publication. Acknowledgments The authors acknowledge Suzanne Wilson, MD and Debbie Delong, RN for their helpful discussions and comments on this project, Julie Thompson, PhD for her assistance with the data analysis, John Crichton, MBA for his support in the development of the website, and Judith Hays, PhD for providing feedback on earlier versions of this paper. The authors also gratefully acknowledge the Foundation of FirstHealth, 150 Applecross Road, Pinehurst, NC 28374, for supporting this work. References [1] Pew Internet & American Life Project. The engaged e-patient population; 2008. . [2] Eysenbach G. Recent advances: consumer health informatics. BMJ 2002;320(7251):1713–6. [3] Heese BW, Nelson DE, Kreps GL, Croyle RT, Arora NK, Rimer BK, et al. The impact of the Internet and its implications for health care providers: findings from the first health information national trends survey. Arch Intern Med 2005;165:2618–24. [4] Broom A. Virtually hea@lthy: the impact of internet use on disease experience and the doctor–patient relationship. Qual Health Res 2005;15(3):325–45. [5] McMullan M. Patients using the internet to obtain health information: how this affects the patient–professional relationship. Patient Educ Couns 2006;63(1–2):24–8. [6] Murray E, Lo B, Pollack L, Donelan K, Catania J, White M, et al. The impact of health information on the internet on the physician–patient relationship. Arch Intern Med 2003;163(14):1727–34. [7] Impicciatore P, Pandolfini C, Casella N, Bonati M. Reliability of health information for the public on the World Wide Web: systematic survey of advice on managing fever in children at home. BMJ 1997;314:1875–9. [8] Eysenbach G, Powell J, Kuss O, Sa ER. Empirical studies assessing the quality of health information for consumers on the World Wide Web: a systematic review. J Am Med Assoc 2002;287(20):2691–700. [9] Berland GK, Elliot MN, Morales LS, Algazy JI, Kravitz RL, Broder MS, et al. Health information on the internet: accessibility, quality, and readability in English and Spanish. J Am Med Assoc 2001;285(20):2612–21. [10] Bernstam EV, Shelton DM, Walji M, Meric-Bernstam F. Instruments to assess the quality of health information on the World Wide Web: what can our patients actually use? Int J Med Inform 2005;74(1):13–9. [11] Eysenbach G, Köhler C. Does the internet harm health? Database of adverse events related to the Internet has been set up. BMJ 2002;324(7331):239. [12] Kortum P, Edwards C, Richards-Kortum R. The impact of inaccurate health information in a secondary school learning environment. J Med Internet Res 2008;10(2):e17. [13] Morahan-Martin JM. How Internet users find, evaluate, and use online health information: a cross-cultural review. CyberPsychol Behav 2004;7(5):495–510. [14] Bessell TL, McDonald S, Silagy CA, Anderson JN, Hiller JE, Sansom LN. Do Internet interventions for consumers cause more harm than good? A systematic review. Health Expect 2002;5:28–37. [15] Crocco AG, Villasis-Keever M, Jadad AR. Two wrongs don’t make a right: harm aggravated by inaccurate information on the internet. Pediatrics 2002;109(3):522–3. [16] Roberts JM, Copeland KL. Clinical websites are currently dangerous to health. Int J Med Inform 2001;62(2–3):181–7. [17] Johnson CM, Johnson TR, Zhang J. A user-centered framework for redesigning health care interfaces. J Biomed Inform 2005;38(1):75–87. 
[18] Eysenbach G, Jadad AR. Evidence-based patient choice and consumer health informatics in the information age. J Med Internet Res 2001;3(2):e19.
[19] Badenoch D, Tomlin A. How electronic communication is changing healthcare: usability is a main barrier to effective electronic information systems. BMJ 2004;328(7455):1564.
[20] Bias RG, Mayhew DJ. Cost-justifying usability: an update for the Internet age. San Francisco (CA): Morgan Kaufmann Publishers; 2005.
[21] Pressman RS. Software engineering: a practitioner's approach. seventh ed. New York (NY): McGraw-Hill; 2009.
[22] Seffah A, Gulliksen J, Desmarais MC. Human-centered software engineering: integrating usability in the software development lifecycle. Netherlands: Springer; 2005.
[23] Mayhew DJ. The usability engineering lifecycle. San Francisco (CA): Morgan Kaufmann; 1999.
[24] Nielsen J, Tahir M. Homepage usability: 50 websites deconstructed. Indianapolis (IN): New Riders Publishing; 2002.
[25] Shneiderman B, Plaisant C. Designing the user interface: strategies for effective human–computer interaction. Reading (MA): Addison Wesley Longman; 2009.
[26] Johnson C, Turley J. A new approach to building web-based interfaces for healthcare consumers. Health Inform J 2007;2(2):e8.
[27] Health on the Net. The HONcode in brief; 1997.
[28] Winker MA, Flanagin A, Chi-Lum B, White J, Andrews K, Kennett R, et al. Guidelines for medical and health information websites on the internet: principles governing AMA web sites. J Am Med Assoc 2000;283(12):1600–6.
[29] World Wide Web Consortium (W3C). Web Content Accessibility Guidelines 1.0; 1999.
[30] Nielsen J. Usability engineering. Boston: Academic Press; 1993.
[31] Hackos JT, Redish JC. User and task analysis for interface design. New York (NY): John Wiley & Sons, Inc.; 1998.
[32] Kirwan B, Ainsworth LK, editors. A guide to task analysis. London: Taylor & Francis; 1993.
[33] Jenkins DP, Stanton NA, Salmon PM, Walker GH. Cognitive work analysis: coping with complexity (human factors in defense). Burlington (VT): Ashgate Publishing Company; 2009.
[34] Duncan V, Fichter DM. What words and where? Applying usability testing techniques to name a new live reference service. J Med Libr Assoc 2004;92(2):218–25.
[35] Choi J, Bakken S. Heuristic evaluation of a web-based educational resource for low literacy NICU parents. Stud Health Technol Inform 2006;122:194–9.
[36] Stoddard JL, Augustson EM, Mabry PL. The importance of usability testing in the development of an internet-based smoking cessation treatment resource. Nicotine Tob Res 2006;(Suppl. 1):S87–93.
[37] Nielsen J, Mack RL, editors. Usability inspection methods. New York (NY): John Wiley & Sons, Inc.; 1994.
[38] Dix A, Finlay J, Abowd GD, Beale R. Human–computer interaction. third ed. England: Pearson Education Limited; 2004.
[39] Ericsson KA, Simon HA. Protocol analysis: verbal reports as data. revised ed. Cambridge (MA): MIT Press; 1993.
[40] Faulkner L. Beyond the five-user assumption: benefits of increased sample sizes in usability testing. Behav Res Methods Instrum Comput 2003;35(3):379–83.
[41] Tullis T, Albert B. Measuring the user experience: collecting, analyzing, and presenting usability metrics. New York (NY): Morgan Kaufmann Publishers; 2008.
[42] Aronson AR, Bodenreider O, Chang HF, Humphrey SM, Mork JG, Nelson SJ, Rindflesch TC, Wilbur WJ. The NLM indexing initiative. In: Proceedings of the AMIA fall symposium; 2000. p. 17–21.
[43] Fry EB. A readability formula that saves time. J Reading 1968;11:513–6.
[44] McLaughlin GH. SMOG grading – a new readability formula. J Reading 1969;12:639–46.
[45] Singh J. RAIN: readability assessment instrument manual. second ed. Midlothian (VA): ONE Research Institute; 2003.
[46] Doak CC, Doak LG, Root JH. Teaching patients with low literacy skills. Philadelphia (PA): J.B. Lippincott Company; 1996.
[47] Lewis JR. IBM computer usability satisfaction questionnaires: psychometric evaluation and instructions for use. Int J Hum Comput Interact 1995;7(1):57–78.
[48] Weiss BD, Mays MZ, Martz W, Merriam Castro K, DeWalt D, Pignone MP, et al. Quick assessment of literacy in primary care: the newest vital sign. Ann Fam Med 2005;3(6):514–22.
[49] Weiss BD. Health literacy and patient safety: help patients understand. Manual for clinicians. second ed. Chicago (IL): American Medical Association and American Medical Association Foundation; 2007.
[50] Gabbard JL, Hix D, Swan JE. User-centered design and evaluation of virtual environments. IEEE Comput Graph Appl 1999;19(6):51–9.
[51] Hix D, Swan JE, Gabbard J, McGee M, Durbin J, King T. User-centered design and evaluation of a real-time battlefield visualization virtual environment. In: Proceedings of the IEEE VR'99 conference. Houston (TX); 1999. p. 1–8.
[52] Gabbard JL, Swan JE, Hix D, Lanzagorta MA, Livingston MA, Brown D, et al. Usability engineering: domain analysis activities for augmented reality systems. In: Proceedings SPIE; 2002. p. 445–57.
[53] Hix D, Gabbard JL, Swan JE, Livingston MA, Höllerer TH, Baillot Y, et al. A cost-effective usability evaluation progression for novel interactive systems. In: Proceedings of the 37th Hawaii international conference on system sciences; 2004. p. 1–10.
[54] Lathan CE, Sebrechts MM, Newman DJ, Doarn CR. Heuristic evaluation of a web-based interface for internet telemedicine. Telemed J 1999;5(2):177–85.
[55] Choi J, Bakken S. Heuristic evaluation of a web-based educational resource for low literacy NICU parents. Stud Health Technol Inform 2006;122:194–9.
[56] Karahoca A, Bayraktar E, Tatoglu E, Karahoca D. Information system design for a hospital emergency department: a usability analysis of software prototypes. J Biomed Inform 2010;43(2):224–32.
[57] Allen M, Currie LM, Bakken S, Patel VL, Cimino JJ. Heuristic evaluation of paper-based web pages: a simplified inspection usability methodology. J Biomed Inform 2006;39:412–23.
[58] Nahm E, Preece J, Resnick B, Mills ME. Usability of health web sites for older adults: a preliminary study. CIN 2004;22(6):326–34.
[59] Tang Z, Johnson TR, Tindall RD, Zhang J. Applying heuristic evaluation to improve the usability of a telemedicine system. Telemed J E Health 2006;12(1):24–34.
[60] Ebenezer C. Usability evaluation of an NHS library website. Health Info Libr J 2003;20:134–42.
[61] Ostergren M, Karras BT. Active Options: leveraging existing knowledge and usability testing to develop a physical activity program website for older adults. AMIA Annu Symp Proc 2007:578–82.
[62] Anselmo MA, Lash KM, Stieb ES, Haver KE. Cystic Fibrosis on the Internet: a survey of site adherence to AMA guidelines. Pediatrics 2004;114(1):100–3.
[63] Bouchier H, Bath PA. Evaluation of websites that provide information on Alzheimer's disease. Health Inform J 2003;9(1):17–32.
[64] Zeng X, Parmanto B. Evaluation of web accessibility of consumer health information websites. AMIA Annu Symp Proc 2003:743–7.
[65] Zayas-Caban T, Marquard JL, Radhakrishnan K, Duffey N, Evernden DL. Scenario-based user testing to guide consumer health informatics design. AMIA Annu Symp Proc 2009;55:719–23.
[66] Horsky J, Kaufman DR, Oppenheim MI, Patel VL. A framework for analyzing the cognitive complexity of computer-assisted clinical ordering. J Biomed Inform 2003;36:4–22.
[67] Johnson CD, Zeiger RF, Das AK, Goldstein MK. Task analysis of writing hospital admission orders: evidence of a problem-focused approach. In: Proc AMIA; 2006. p. 389.
[68] Fonda SJ, Paulsen CA, Perkins J, Kedziora RJ, Rodbard D, Bursell SE. Usability test of an internet-based informatics tool for diabetes care providers: the comprehensive diabetes management program. Diabetes Technol Ther 2008;10(1):16–24.
[69] Wu RC, Orr MC, Chignell SE, Straus SE. Usability of a mobile electronic medical record prototype: a verbal protocol analysis. Inform Health Soc Care 2008;33(2):139–49.
[70] Kushniruk AW, Triola MM, Borycki EM, Stein B, Kannry JL. Technology induced error and usability: the relationship between usability problems and prescription errors of a handheld application. Int J Med Inform 2005;74:519–26.
[71] Duncan VD, Fichter DM. What words and where? Applying usability testing techniques to name a new live reference service. J Med Libr Assoc 2004;92(2):218–25.
[72] Dabbs AD, Myers BA, McCurry KR, Dunbar-Jacob J, Hawkins RP, Begey A, et al. User-centered design and interactive health technologies for patients. CIN 2009;27(3):175–83.
[73] Bailey RW, Allan RW, Raiello P. Usability testing vs. heuristic evaluation: a head-to-head comparison. In: Proceedings of the human factors and ergonomics society, 36th annual meeting. Santa Monica (CA); 1992. p. 409–13.
[74] Jeffries R, Miller J, Wharton C, Uyeda K. User interface evaluation in the real world: a comparison of four techniques. In: Proceedings of the conference on human factors in computing systems (CHI 91). New Orleans (LA); 1991. p. 119–24.
[75] Karat CM, Campbell R, Fiegel T. Comparisons of empirical testing and walkthrough methods in user interface evaluation. In: Proceedings of CHI'92; 1992. p. 397–404.
[76] Lewis CH, Polson PG, Wharton C, Rieman J. Testing a walkthrough methodology for theory-based design of walk-up-and-use interfaces. In: Proceedings of the ACM CHI'90 conference on human factors in computing systems. New York; 1990. p. 235–42.
[77] Desurvire H, Kondziela J, Atwood M. What is gained and lost when using evaluation methods other than empirical testing. In: Proceedings of the conference on human factors in computing systems (CHI 92). Monterey (CA); 1992. p. 125–26.
[78] Nielsen J. Finding usability problems through heuristic evaluation. In: Bauersfeld P, Bennet J, Lynch G, editors. Proceedings of the ACM CHI'92 international conference on human factors in computing systems; 1992. p. 373–80.
[79] Virzi RA. Refining the test phase of usability evaluation: how many subjects is enough? Hum Factors 1992;34(4):457–68.
[80] US Department of Education. The health literacy of America's adults: results from the 2003 National Assessment of Adult Literacy. Publication No. 2006-483; 2006.
[81] Schillinger D, Barton LR, Karter AJ, Wang F, Adler N. Does literacy mediate the relationship between education and health outcomes? A study of a low-income population with diabetes. Public Health Rep 2006;121(3):245–54.
[82] Morris NS, MacLean CD, Littenberg B. Literacy and health outcomes: a cross-sectional study in 1002 adults with diabetes. BMC Fam Pract 2006;7:49.
[83] Wolf MS, Davis TC, Tilson HH, Bass III PF, Parker RM. Misunderstanding of prescription drug warning labels among patients with low literacy. Am J Health Syst Pharm 2006;63(11):1048–55.
[84] Shih TH, Fan X. Comparing response rates from web and mail surveys: a meta-analysis. Field Methods 2008;20(3):249–71.
[85] Van Selm M, Jankowski NW. Conducting online surveys. Qual Quant 2006;40(3):435–56.
[86] Pew Internet & American Life Project. Wired seniors: four million Americans aged 65 and older are online, sending email to family members and surfing for important information; 2001.
[87] van Gelder MMHJ, Bretveld RW, Roeleveld N. Web-based questionnaires: the future in epidemiology? Am J Epidemiol 2010;172(11):1292–8.
[88] Akl EA, Maroun N, Klocke RA, Montori V, Schünemann HJ. Electronic mail was not better than postal mail for surveying residents and faculty. J Clin Epidemiol 2005;58(4):425–9.
[89] Rankin KM, Rauscher GH, McCarthy B, Erdal S, Lada P, Il'yasova D, et al. Comparing the reliability of responses to telephone-administered versus self-administered web-based surveys in a case-control study of adult malignant brain cancer. Cancer Epidemiol Biomarkers Prev 2008;17(10):2639–46.
[90] Touvier M, Méjean C, Kesse-Guyot E, Pollet C, Malon A, Castetbon K, et al. Comparison between web-based and paper versions of a self-administered anthropometric questionnaire. Eur J Epidemiol 2010;25(5):287–96.
[91] Kroth PJ, McPherson L, Leverence R, Pace W, Daniels E, Rhyne RL, et al. Combining web-based and mail surveys improves response rates: a PBRN study from PRIME Net. Ann Fam Med 2009;7(3):245–8.
[92] Couper MP, Kapteyn A, Schonlau M, Winter J. Noncoverage and nonresponse in an internet survey. Soc Sci Res 2007;36(1):131–48.
[93] Edwards PJ, Roberts I, Clarke MJ, DiGuiseppi C, Wentz R, Kwan I, et al. Methods to increase response to postal and electronic questionnaires. Cochrane Database Syst Rev 2009;(3):MR000008.