Jl. of Educational Multimedia and Hypermedia (2011) 20 (4), 361-385.
Exploring the Design, Development and Use of Websites through Accessibility and Usability Studies

Alan Foley
Syracuse University, USA
[email protected]

In this paper, data obtained from a university website accessibility and usability validation process are analyzed and used to demonstrate how the design process can affect the online experience for users with disabilities. Interviews, observations, and use data (e.g., where users clicked on a page or what path was taken through a site) were collected. Findings indicate that using automated validation tools does not necessarily ensure complete accessibility. Students with low vision found many of the pages hard to use even though automated validation did not indicate issues for visual disabilities. While the pages were accessible for blind users, low vision students who did not use specialized software had access problems. Findings from this study are used to present principles for web designers interested in creating and testing usable and accessible websites.
According to federal data (U.S. Government Accountability Office, 2009), students with disabilities represented nearly 11% of all postsecondary students in 2008, a number that has almost tripled over the past 20 years (Steele & Wolanin, 2004). According to the 2000 U.S. Census, almost 50 million people (about 19% of all Americans over age 5) reported having a disability. By the year 2000, among children and youth under age 21, the percentage receiving federally mandated education services for students with disabilities had risen to 13%, or 6 million students. Students with learning disabilities (LD) constitute the largest single group, ranging (in various studies) from 46% to 61% of all students with disabilities. The percentage of students with disabilities who complete high school increased from 61% in 1986 to 78% in 2001. These students increasingly graduate with standard diplomas and are academically qualified to participate in higher education.
Additionally, students with disabilities are similar to their peers without disabilities with regard to age, race, and the schools they attended. As methods of teaching with technologies continue to proliferate, and as more educational materials are digitized, understanding the implications of disability in technological environments is a critical issue.

Increasingly, advanced technology systems are being deployed to facilitate and support educational experiences. These technologies can be formal instructional technologies, those that have gone through the instructional design process, including “institutional” applications like Blackboard or other course management systems (e.g., Desire2Learn, Angel, or Moodle). The level and scope of technology use exist on a continuum across higher education, but it is rare to find a course without some technology component. Additionally, many administrative and educational functions are now online, and campus websites are now gateways to many aspects of higher education. Once merely static informational sites, university websites are now information portals.

While these systems are becoming ubiquitous, they are developed with little functional understanding of disability, and there are no design principles informed by research on disability to guide educational website development. This results in technology development that does not work for people with disabilities. For example, in an analysis of a number of predominant online educational tools, the American Foundation for the Blind (2008) found that almost one third of respondents (N = ~100) who used assistive technology to access online educational tools reported either that the experience was unreliable or inconsistent or that they were unable to access or use the tool at all. Web accessibility and usability are two concepts that can help drive the development of online educational tools that reach the broadest number of learners.
The challenge for designers of educational websites is to create materials that are engaging, appropriate, and accessible. The first two terms immediately resonate with educators, and the third term, “accessible,” probably does as well, but the exact meaning of the term might be ambiguous. Despite the prevalence of disability in educational contexts, there has been little research on the effects of web accessibility and usability in web design practices. One possible reason for this is that web accessibility standards (discussed below) are the result of an extensive development process involving people with disabilities, web designers, multimedia designers, engineers, and other technical experts. While not perfect, it is generally accepted among web developers that adherence to web accessibility standards will improve access to the web for people with disabilities. Because of the time and effort devoted to developing the web standards, it is possible that there is a presumed consensus on the effects of accessibility techniques.
Another reason for the lack of research in this area is the variation in the ways different people experience disability. For example, it is fairly safe to say that a site that does not conform to the lowest levels of web accessibility standards and does not provide alt text, or includes large portions of text in a format a screen reader cannot read, will not be accessible for a blind user; however, individual users will have different experiences (Burgstahler, Corrigan, & McCarter, 2004) depending on the type of screen reader they use or how technically savvy they are. Designers knowledgeable about the accessibility standards can make assumptions about the site’s use, but the reality of how the site works will vary from user to user.

This research explores issues related to the overarching question: what design features and processes are critical for ensuring and improving the accessibility and usability of websites for users with disabilities? More specifically, the following research questions are explored:

• How does the functional accessibility of a website designed with accessibility in mind differ from the results of automated testing?
• Do automated accessibility validation tools ensure that a website will be functional for students with disabilities?
• How do accessibility and usability overlap in practice?

To illustrate the challenges and issues involved in accessible and usable web design, the concepts of accessibility and usability will be compared and contrasted. Results from a usability and accessibility validation process on a university website that was redesigned with accessibility in mind will be analyzed. Data from this process will be used to demonstrate how the design process can affect the online user experience. This analysis will also be used to illustrate how the design of websites affects students with disabilities.
Finally, findings from this study will be used to present principles for web designers interested in creating and testing usable and accessible websites.

Web Accessibility and Usability

In this section, the terms web accessibility and usability will be discussed and the specific use of these concepts in this research will be defined. Both usability and accessibility can be applied in a variety of contexts (e.g., building design); in this paper, these terms will be used solely to refer to web page design and development. Traditionally, educators have accommodated individuals’ needs for special materials or alternative formats without changing courses (Bowe, 2000). Components of a standard curriculum or website are modified in some way to make them accessible to an individual with a disability. Conversely, usability generally refers to the functionality of a website for a broad group of people.
Web accessibility involves making websites and online materials accessible to individuals with disabilities who might use assistive technologies or alternative techniques, such as changing font sizes and colors, to access the Internet (Clark, 2003). Web accessibility concomitantly describes several processes: the ability of the user to access information electronically; the effort made by the designer to enable a page to function with assistive devices and multiple technologies; and an understanding of the nature of differences that might span the audience of a particular website. Often, efforts toward accessibility will greatly increase the usability of a site as well. Paradoxically, in some cases, efforts to make websites accessible for one group of people may have adverse effects on accessibility (or usability) for others. For a person with a disability, the challenge of accessibility is to identify the tools (e.g., a screen reader) that will provide the most convenient access to web-based and other electronic information. For the designer, the challenge of accessibility is to remove the obstacles that prevent these tools from functioning properly.

Web accessibility standards have existed in various forms for the last decade (Foley & Regan, 2002; Thatcher, 2006), and accessibility has entered into legal and technical standards for web development (e.g., Section 508, the WAI’s WCAG, and various state policies). While the standards have existed for some time, there is still a paucity of research on the effects of implemented standards, both generally and specifically with respect to educational web design (including web-mediated instruction and distance education). In one of the few studies on web accessibility standards in educational contexts, Opitz, Savenye, and Rowland (2008) studied the effects of implementing Section 508 and WCAG guidelines in the creation of instructional, web-based learning modules for adolescents.
They evaluated participants’ accuracy of response and response times and found that students who used a site that met accessibility standards scored higher on accuracy of response than those who used a site that did not meet the standards.

In contrast, there is a general field of practice in usability studies; however, most work conducted in computer usability studies is in general areas like design and computer science and is not applied to instructional environments (e.g., Chen, 2001; Lehman, 2008; Marchetto, Ricca, & Tonella, 2008; Nielsen, 2000; Subraya, 2006). Usability generally refers to the process of bringing the users’ perspective into the design process (Reeb, 2008) and making the product user-friendly. Usability is not limited to the practice of web design and, in fact, is widely practiced as part of product design in everything from kitchen tools (e.g., OXO Good Grips products) to automobiles. Reeb (2008) identifies five attributes that usability testing processes measure:
• Easy to learn: The user can quickly go from not knowing the system to getting some work done with it.
• Efficient to use: Once the user has learned the system, a high level of productivity is possible.
• Easy to remember: The infrequent user is able to return to the system after some period of not having used it, without having to relearn everything again.
• Few errors: Users do not make many errors during the use of the system, or, if they do make errors, they can easily recover from them. Also, no catastrophic errors should occur.
• Pleasant to use: Users are subjectively satisfied by using the system; that is, they like it. (p. 9)

Table 1
Usability and Accessibility Testing Purposes and Methods Used in this Study

Usability
  Purpose: Assesses how users react to, and interact with, the website
    Method for testing: Use testing
  Purpose: Can allow the user to express personal impressions of the resource, such as satisfaction, utility, value, helpfulness, benefits, frustration, and self-efficacy
    Method for testing: Interview

Accessibility
  Purpose: Assesses how well the website allows users with disabilities to access information; most testing is based on the WCAG or Section 508 of the Rehabilitation Act
    Method for testing: Automated validation of all test pages using Cynthia Says and spot checks of pages using the WAVE validator
    Method for testing: Manual page validation
The relationship between usability and accessibility is an important one because these terms are often used interchangeably, although they are in fact quite different in terms of scope, philosophy, and method (see Table 2). When applied properly, website design that incorporates both accessibility and usability principles becomes much more functional.
Table 2
Usability and Accessibility Compared

Usability                        Accessibility
Broadest Audience                Validation Tools
“Look & Feel”                    Standards
Efficiency of Use                Access to Content
“User-friendly”                  Assistive Technology
User-based Design                Users with Disabilities
Subjective Satisfaction          Legal Requirements
Error Frequency & Severity
Web accessibility and usability are two disciplines with a common focus but divergent practices. Both endeavors rely on standard sets of techniques to ensure a consistent experience across a diverse set of users. Both rely on creative individuals to build and deliver great sites and experiences that have an impact on the user. Both seek to extend the reach of the user and link individuals together to form a stronger, collective whole. However, despite the common theory that links them, web accessibility and web usability do not share a common set of practices. The distinction between the two concepts can be stated simply: accessibility without usability is possible; usability without accessibility is not.

The Technical and Legal Basis for Accessibility

The World Wide Web Consortium,1 or W3C, leads what is perhaps the most comprehensive web accessibility standards initiative. The W3C is the organization responsible for the standardization of a wide variety of web-related technologies (e.g., HTML, CSS, and Ajax). The W3C also coordinates a project known as the Web Accessibility Initiative (WAI), the focus of which is to publish standards for web accessibility, another set of standards for software used to create web pages, and a set of standards for browsers used to view web content. The W3C’s Web Content Accessibility Guidelines2 (WCAG) were released in 1999 (“Web Content Accessibility Guidelines 1.0,” 1999) and constituted the first major effort to establish accessibility guidelines for web design.3 The WCAG is not a legal mandate but rather a comprehensive set of guidelines to ensure accessibility.

1. http://www.w3.org/
2. http://www.w3.org/TR/WCAG10
3. This standard consists of 14 guidelines, each with 3 levels of checkpoints. Priority One checkpoints are those that the web developer must satisfy to ensure that the page itself is accessible. Priority Two checkpoints are those that the web developer should satisfy to ensure that certain groups will be able to access information on the web page. Priority Three checkpoints are those the web developer may complete to ensure that all content on the page is fully accessible. Typically, these guidelines are interpreted to mean that the first and second priorities are the most realistic and affect the most users, and that Priority Three guidelines are considered technically difficult, costly to implement, and limited in applicability.
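Some accessibility checkpoints require human judgment, but others are directly computable. One example is the color-contrast requirement later formalized in WCAG 2.0, which defines a relative luminance for each color and a contrast ratio between foreground and background. The following Python sketch implements those published formulas; the function names are my own, chosen for illustration, and this is not code from any particular validation tool:

```python
def _linear(channel):
    # Convert an 8-bit sRGB channel (0-255) to linear light, per WCAG 2.0.
    c = channel / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    # WCAG 2.0 relative luminance: weighted sum of the linearized channels.
    r, g, b = (_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color1, color2):
    # (L1 + 0.05) / (L2 + 0.05), where L1 is the lighter color's luminance.
    l1, l2 = sorted((relative_luminance(color1),
                     relative_luminance(color2)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a white background gives the maximum possible ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

WCAG 2.0 level AA requires a ratio of at least 4.5:1 for normal-size text, which gives designers an objective test for the color choices discussed below.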
There are a variety of methods for meeting the needs of users with disabilities, whether through assistive devices such as a screen reader or through presentational standards. The WCAG guidelines attempt to reflect the requirements of as many of these users as possible. What is often lost in the policy language is that users with disabilities by no means represent a uniform category, nor can they be neatly divided into subcategories. Within the disability groups outlined by the WCAG, there is a spectrum of issues and technologies. The case of individuals with visual disabilities is a good example. This group includes, but is not limited to, blind users, users with low or impaired vision, and users with color deficits. Each group has a specific set of needs, often using a different set of tools to address those needs. For example, a blind user may use a screen reader to read the content of a web page aloud. In order for a page to be read by a screen reader, the page has to have text associated with all components of the page, including images. In contrast, a user with low vision may need the page to be rendered in large print. Another user may be color-blind and may find pages with red-blue color combinations difficult to read.

U.S. Federal and State Policy

In addition to the WCAG guidelines, there are legal mandates for accessibility. In the United States, Section 508 of the Federal Rehabilitation Act4 sets standards for web pages designed or maintained by federal agencies. Section 508 requires that electronic and information technology developed or purchased by the federal government be accessible to people with disabilities, but it does not directly apply to the private sector or to higher education. While both the WCAG 2.0 and the Section 508 guidelines are informative for universities developing their websites, neither set of guidelines is legally mandated for them.
However, the Americans with Disabilities Act (ADA) requires colleges and universities to provide services to their students with disabilities that are equal to those services provided to students without disabilities (Seale, 2006). There is a small body of case law referring to the ADA and Internet accessibility, but there are no clear guidelines in the ADA with respect to web accessibility. Often, the issue of web accessibility is framed as making a website ADA compliant; however, the ADA does not yet specify accessibility requirements for the Internet. There are several sets of standards designed to ensure minimal levels of accessibility, the most prominent being Section 508.5 There are automated tools that can verify certain elements of those standards in web pages; however, validation tools like Cynthia Says6 (CS) are merely initial steps in ensuring accessibility. Because of the variance in how people with disabilities encounter and use websites, as well as variations in technology, there are few accessibility items an automated tool can really validate. For example, CS can determine the presence of alt text (the text offered in place of images) and can, to some degree, assess its validity; however, it takes real people to determine whether the alt text is appropriate.

4. The regulations referred to as Section 508 are actually an amendment to the Workforce Rehabilitation Act of 1973.
5. http://www.section508.gov/

Methods

The research design for this project was emergent. Because the researcher sought to observe and interpret a website in use, it was neither possible nor appropriate to finalize research strategies before data collection had begun (Patton, 2001). Exploring both the design process of the website and the experiences of students with disabilities using the site is an effective method of providing naturalized generalizability (Stake, 1978). A number of authors have suggested various methods for evaluating software and hardware for accessibility and usability (e.g., Rubin & Chisnell, 2008). In much of this literature, a user-centered approach to design is suggested (e.g., Bruseberg & McDonagh-Philp, 2000; Crowston, Sieber, & Winn, 2007; Dervin & Nilan, 1986; Spurgin, 2006; Wilson, 2006). This entails involving the user at each step of the design process, with involvement typically taking the form of interviews and observation of the user while engaged with the software or hardware. This engagement typically involves the user completing several tasks deemed essential to the purpose of the software or hardware. Theofanos and Redish (2003) used this type of usability-testing protocol to study how individuals with visual impairments accessed and used the Internet. They analyzed participants in four different sessions, each session requiring the participant to engage in between 7 and 11 different scenarios.
Rubin and Chisnell (2008) detail four different types of usability tests. Most consistent with the purpose of this research is an exploratory test because it is in this phase, typically carried out early in the development of a product, that the participant is asked to explicate their thinking processes. The interaction of user and website is observable, but the process guiding that interaction is within the student. In addition to the usability tests conducted in this study, the researcher has conducted numerous accessibility analyses of educational websites and has analyzed policy related to web accessibility. This previous research yielded a standard set of evaluation tools and techniques that are further employed in this analysis (Foley, André, Petri, Felix, & Hunzicker, 2006; Foley & Regan, 2002).

6. http://www.contentquality.com/
The “style guide” for the new university website was also analyzed. Because the guide contained all the design guidelines for the new site, it was an important element in evaluating the new website. Table 3 connects the research questions for this study with data sources and research methods.

Table 3
Question/Method Matrix

Question: How does the functional accessibility of a website designed with accessibility in mind differ from the results of automated testing?
  Data Source: New site “style guide” and specifications; Method: Document analysis
  Data Source: Automated analysis data; Method: Automated validation tools
  Data Source: Video data; Method: Personal and usability interviews

Question: Do automated accessibility validation tools ensure that a website will be functional for students with disabilities?
  Data Source: Validation tool results; Method: Automated validation tools
  Data Source: Interview notes; Method: Observation
  Data Source: Video data; Method: Personal and usability interviews

Question: How do accessibility and usability overlap in practice?
  Data Source: Interview notes; Method: Observation
  Data Source: Video data; Method: Personal and usability interviews
  Data Source: Questionnaire; Method: “Think aloud” protocol
Participants

The participants in this study were undergraduate and graduate students at a research-intensive university. All participants were volunteers recruited through an email sent by the Student Disability Services (SDS) office only to students registered for services with SDS. To ensure the confidentiality of the student participants, SDS sent out the recruitment email, collected responses, and scheduled the testing sessions with the students. Participating students were compensated $40 for their time by SDS. Four students ultimately participated in the usability testing sessions, all of them experienced Internet users. When asked roughly how many hours a week they spent using the Internet, including email, responses ranged from 15 to 50 hours a week, with the average being 36 hours. While these responses were estimates, it was apparent that all of the students were familiar and comfortable with Internet conventions. During the testing protocol, the students were encouraged to
“think aloud,” as they felt comfortable, to describe their thought processes and concerns. Table 4 summarizes the students’ disability types and the assistive technology they used.

Table 4
Disability Types of Students in the Study

Student 1: Low vision (evident; used AT)
  Assistive technology used: Screen magnifier and screen reader
Student 2: Low vision (evident; changed settings)
  Assistive technology used: No specialized software; changed computer settings
Student 3: Dyslexia (hidden disability, but voluntarily disclosed)
  Assistive technology used: Screen reader (Apple VoiceOver)
Student 4: Learning disability (hidden disability)
  Assistive technology used: No specialized software
Three of the four students voluntarily chose to disclose their disability, and those three students also talked at length about how they experienced their disability and how they used technology to address some of the difficulties it presented. The fourth student neither used assistive technology, nor disclosed a disability, nor had a disability the researcher could readily discern. This is an interesting point, because it captures much of the ambiguity around hidden disability. Because the researcher knew the student had a disability (the student was on the SDS list, which requires self-registration) and has worked with students with disabilities for many years, he could assume the student had a learning disability. Despite the knowledge that the student had a disability, the exact nature of the disability remained hidden.

Two of the students were users of specialized assistive technology and were skilled in the tools they were using. Although these tools were being used to accommodate different categories of disability (one a visual disability, the other a hidden disability), the assistive technologies these two students used were screen reading and/or screen magnification tools. One student with a hidden disability used a screen reader (in this case VoiceOver, which is part of the Macintosh operating system) as needed. Another student did not use assistive technology but made a series of modifications to the display settings on the test computer that replicated the settings the student used on their personal computer. Although the student was proficient in making these changes, doing so still took about 90 seconds at the beginning of the testing session. In the “Settings” menu (Windows XP), the student made the following adjustments:
• Display > Appearance: font size set to Extra Large
• Control Panel > Display: resolution set to 800x600

Figure 1 illustrates the manual process this student had to complete to be able to use the test computer.
Figure 1. The screen of a low vision student manually changing settings to increase the screen viewing size (the picture in the lower right hand corner is a web cam recording through Morae; the webcam was used to capture audio).

This particular student changed the settings on every computer he used in this manner. Low vision is an excellent disability category for this type of research because low vision is a general term that refers to a variety of conditions that can exist in varying degrees or combinations. The loss of central vision creates a blur or blind spot, but side (peripheral) vision remains intact. This makes it difficult to read, recognize faces, and distinguish most details in the distance. Loss of peripheral vision is typified by an inability to distinguish anything to one side or both sides, or anything directly above and/or below eye level. Central vision remains, however, making it possible to see
directly ahead. Loss of peripheral vision is sometimes referred to as “tunnel vision.” Blurred vision causes both near and far objects to appear out of focus, even with the best conventional spectacle correction possible. Generalized haze causes the sensation of a film or glare that may extend over the entire viewing field. Extreme light sensitivity exists when standard levels of illumination overwhelm the visual system, producing a washed-out image and/or glare disability. People with extreme light sensitivity may actually suffer pain or discomfort from relatively normal levels of illumination.

Procedure

Each student was asked to access and browse the new university website using the screen reader or other technologies that they typically use to access the Internet. Each session took 30-45 minutes. Study participants were presented with a web page that served as a “jump page.” The jump page contained links to the test pages; all the students had to do was move down the list. The jump page was plain text and links, with no graphics, and was 100% accessible when validated against both Section 508 and WCAG standards. The jump page linked to 7 selected pages on the new site. These pages were selected either because they were representative of the rest of the site or because they included a special feature like video. Depending on the technology the participants used to access the Internet, researchers either videotaped the computer screen while the participants were browsing the university website or used screen-recording software called Morae7 (Table 1 details the test areas and methods). No identifying information was gathered in this process; the software recorded where the user moved the mouse or cursor on the computer screen. Data collected in this manner included the following:

• Screen text: all text appearing on the screen during recording.
• Window events: when a window or dialog gets focus, is opened, closed, or resized.
• Mouse clicks: left, right, middle, and single- and double-clicks.
• Keyboard activity: keystrokes the user makes.
• Web page changes: browser events such as when and where a user navigates between web pages.

The web pages used in the testing protocol were evaluated for their accessibility compliance. The site redesign had already been in process for over a year before the user testing, and accessibility had been a consideration; however, accessibility had only been talked about in abstract terms such as compliance with Section 508 or the site being “ADA compliant.”8 Because of this, there had been no specific design guidelines regarding accessibility. While one of the goals of the site redesign was to ensure accessibility, the development team did not really know how accessible the site was. This is a very important point. In almost every conversation the researcher had with members of the university group responsible for the site redesign, the importance of accessibility was brought up. The university’s administration has made disability issues a priority, and the chancellor has been a strong advocate for both web accessibility and assistive technology services on campus. Unfortunately, accessibility was not a part of the formal agreement with the design firm contracted to design the new site, so there was no basis for accessibility in the site’s development. Because of this, as accessibility issues emerged in the testing process, the design firm was not accountable for redesigning the site without additional payment. The style guide lays out specifications for the use of images, color, layout, and media, but none of the specifications address how to make those elements accessible or how to assess them for usability. Because the university did not make a specification for accessibility (e.g., that the site will conform to Section 508), the design and the style guide do not take accessibility into account, and any pages developed from the style guide will potentially have accessibility problems.

In order to get an accurate sense of how the enacted design translated to an accessible experience, each page was evaluated using WAVE (web accessibility evaluation tool)9 and Cynthia Says. Both of these tools provide validation against the Section 508 standards but provide different types of feedback. WAVE shows the original web page with embedded icons and indicators to mark accessibility issues, while Cynthia Says provides feedback by listing issues in a tabular format.

7. http://www.techsmith.com/morae.asp
8. It is important to note that the ADA does not deal directly with the accessibility of the Internet. This may be due to the fact that the Internet was just emerging at the passage of the ADA.
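As a concrete illustration of the kind of check such validators automate, the following Python sketch scans markup for img elements whose alt attribute is missing or empty. This is my own illustrative code, not the actual WAVE or Cynthia Says implementation, and it also shows why human judgment remains necessary: an empty alt attribute is legitimate for purely decorative images, so a flagged element is a question for a reviewer, not automatically an error.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flag <img> elements with a missing or empty alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            # Missing alt and alt="" are flagged alike; a human must decide
            # whether the image is decorative (empty alt is then correct)
            # or informative (real alt text is required).
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "(no src)"))

checker = AltTextChecker()
checker.feed('<img src="seal.png" alt="University seal">'
             '<img src="spacer.gif">')
print(checker.missing)  # ['spacer.gif']
```

Even this trivial check mirrors the limitation noted earlier: the tool can report that alt text is present or absent, but only a person can judge whether “University seal” is an appropriate description.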
As noted earlier, use of these tools should not be considered definitive analyses of a page but rather a good initial step in an accessibility validation process. Because of the way these tools validate to the standards, false positives are rare. For example, automated tools typically look for elements that should be included, like ALT text, and fail sites when those elements are not present. It is more typical for an automated evaluation to point out several issues that need are in need of subjective assessment. None of the pages used in the test sessions actually passed the automated validation but generally only had small errors or deviation from the 508 standard that would require human assessment to determine the impact of the variance. There were 7 tasks in the testing protocol. Each task asked the user to find a particular piece of information (e.g., the location of the Office of Disability Services), website location (e.g., the Registrar’s office), or information (e.g., the dates of spring break in 2010). Each task included follow-up questions. The goal of the testing session was not to test the users’ abilities 9. http://wave.webaim.org/
but rather to assess how easy or difficult the various pages were to use. The researcher guided the user through the protocol and then asked follow-up questions at the completion of each task.

Task 1

Task 1 instructed the students to navigate to the main page of the university library from the university home page. This task had several substeps, with the primary step consisting of finding and following a link to the university library from the newly redesigned university homepage. Figure 2 contains a wireframe representation of the home page that illustrates how content on the page is organized. The page contains 148 links, some of which are visible only in pop-up windows. The link required for the task is located at the bottom of the page.
Figure 2. Layout Elements of Test Page.
Task 2

Task 2 involved navigating to a page intended for alumni and completing two sub-steps: viewing a video on the page and finding calendar information. This task presented the most technical variety of the seven tasks, mainly because it included video (and the video's audio) content. Video is commonly problematic from an accessibility standpoint (Chisholm & May, 2008; Clark, 2003). It can present issues for people with visual impairments if information is primarily conveyed visually, and for people with aural disabilities if captioning is not available. If the video interface is not device independent, then people with physical disabilities who use alternative input devices might face challenges.

Task 3

Task 3 asked the student to navigate to the undergraduate online application page and find the online application link. This task was included because finding and accessing online forms is an important part of how students use the university's website. Ideally, the students could have tested the application form itself; unfortunately, the application page was not ready for evaluation when the tests were run.

Task 4

Task 4 was based on the "Admissions and Financial Aid" page and asked the students to review the "Financial Aid FAQ" section. This page did not present significant issues for the students; however, observing the students interacting with the FAQ did raise some concerns.

Tasks 5-7

Tasks 5-7 will not be described in detail because they did not provide significant additional information. From a development perspective, these tasks were important because they provided more detailed information on the design templates and how users encountered them. The tasks were less specific, primarily because of the limited number of pages available on the development site.
Results

In this section, the student participants' issues and strategies for navigating selected tasks are presented, along with the accessibility and usability issues they faced. A significant factor in the students' site navigation was their level of technology background and expertise. The student who experienced the most difficulty in navigating the site was a student with low vision who used the least assistive technology but whose page viewing strategies would have
easily been improved via widely available assistive technology. While this student was reconfiguring the computer, the researcher made a statement that assumed the student was using a screen reader. The student responded, "I actually didn't bother with Jaws [a common screen reader] because I have some vision and the more [technology] things I have to learn the less time I spend learning." This statement is an excellent illustration of the variation in technology use among students with disabilities.

The student with low vision who relied on manual modification of the display settings had considerable trouble finding the link to the library and completing the task. In order to find the link, this student scrolled horizontally and vertically across the page in a "z" pattern that took almost 30 seconds. Because the student had zoomed the page to such a high magnification, he could not see the entire page and relied on a method of scanning that allowed him to see only a very small portion of the page at a time. This process has been called the "soda straw" approach because it is similar to trying to read a page through a soda straw, seeing only a small portion of the page at a time (see Figure 3). Even though the page conformed to Section 508 standards, it presented accessibility and usability issues for this student.
Figure 3. Illustration of the "Soda Straw" Approach to Reading a Web Page with a Screen Reader. (Screen reader users experience the same sensation on an even more granular level, in that they hear one word of a page at a time.)
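Simple arithmetic shows why this scanning strategy is so slow; the magnification factors below are assumed for illustration, since the study did not record the student's exact zoom level:

```python
# Illustrative arithmetic with assumed magnification factors: at a
# magnification factor m, only 1/m of the page's width and 1/m of its
# height fit on screen, so the visible fraction of the page is 1/m**2.
def visible_fraction(magnification: float) -> float:
    """Fraction of a page visible at a given zoom factor."""
    return 1.0 / magnification ** 2

for m in (2, 4, 8):
    print(f"{m}x zoom -> {visible_fraction(m):.1%} of the page visible")
```

At 4x zoom, a user sees only about one sixteenth of the page at once, so a 148-link page becomes a long sequence of "soda straw" views to be scanned one by one.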
Conversely, for the student with low vision using assistive technology, the same task was fairly straightforward. Both the screen magnification software and the screen reader presented a list of links (read aloud and/or textually displayed) upon loading the page.

The student with a hidden disability who used a screen reader could not complete Task 1 (finding the library link). After searching for about 25 seconds, he said, "I know the library's address, so I would just type it in at this point." He also indicated that he would start searching submenus to see if it was embedded somewhere. Even when told where the link was geographically on the page, he could not find it. This student's difficulty suggests that the page's navigation structure and the sheer number of links made the page difficult to navigate.

In Task 2, all of the students were able to access and view the video and answer questions about its content, a finding that would indicate a fair degree of usability of the page; however, the video was neither captioned nor accompanied by a transcript. The video was prominently displayed on the page, which made it easy to find. The second part of Task 2 proved more challenging for several of the students. This sub-step asked students to find information in an "Event Information" box in the lower middle section of the page. The student with a hidden disability who used a screen reader could not complete this sub-step at all and indicated that it required navigating too much text for a specific piece of information.

All of the students successfully completed Task 3, finding the link to the application form; however, several commented that they found the process rather complicated given the importance of the task (getting prospective students to apply to the university). The link to the application itself was in a long list of links on the right-hand side of the page.
The text of the page contained directions that pointed the students in a variety of places, and it was unclear to the students testing the page whether the various links ultimately led to the application. The researcher confirmed that most of the links maintained the sidebars and links of the other pages, so the options presented to the students remained the same; because this was not clear initially, however, it confused all the student participants. This is a clear violation of the usability principle of error prevention and recovery: a site should keep users from making or experiencing errors (either errors in the application or errors the users might make themselves) and give them clear ways to recover from errors. The student who did not have a visual disability and who did not use assistive technology had no difficulty completing any of the tasks.
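The "list of links" view that screen readers and screen magnifiers presented to the students can be emulated with a short sketch; the markup below is hypothetical and not the university's actual page. With 148 links flattened into a single list, finding "Library" becomes the same kind of scanning problem the students faced:

```python
# Sketch of the links-list view assistive technology presents: every
# anchor on the page, flattened into one list. Markup is hypothetical.
from html.parser import HTMLParser

class LinkLister(HTMLParser):
    """Collects the visible text of each <a> element on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._in_a = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_a = True
            self.links.append("")

    def handle_data(self, data):
        if self._in_a:
            self.links[-1] += data.strip()

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_a = False

lister = LinkLister()
lister.feed('<a href="/">Home</a> <a href="/admissions">Admissions</a> '
            '<a href="/library">Library</a>')
print(lister.links)  # ['Home', 'Admissions', 'Library']
```

On a three-link page this list is trivially usable; on the 148-link home page tested here, a user must listen to or scan a large share of the list before reaching the link needed for the task.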
Discussion

The findings of this study suggest important considerations for website designers and for instructional designers who are developing web-based instructional spaces. An important concept confirmed in this study is that two computer users with the same diagnosed disability (in this case, low vision) might experience that disability very differently; no two people experience disability in the same way. For example, a person who has been blind since birth has a different experience of blindness than a person who lost their vision as a young adult. The four students who tested the website in this project had disabilities that could be roughly categorized as low vision, cognitive disability, and learning disability, but their individual approaches to their disability and their uses of technology varied greatly.

How does the functional accessibility of a website designed with accessibility in mind differ from the results of automated testing? Even though the website evaluated in this study was theoretically designed to be accessible, its functional accessibility and usability varied with the students using it. In the case of the website evaluated (and many websites designed with accessibility in mind), Section 508 was the basis for both design and automated validation. Section 508 is an important guideline for creating accessible web pages, but it represents a low level of accessibility, the rough equivalent of WCAG Level One compliance. Because of this, web pages designed to meet the 508 standard might not have a very high level of functional accessibility; that is, those pages will present more problems for more users. Even though the pages tested in this project met the 508 standards, they still presented challenges for some of the students using the site. The findings from this study indicate that even though a page was screen-reader accessible, it might not be accessible for all students with visual disabilities.
The overarching strength of the new site, from an accessibility perspective, is that it was designed with accessibility as a goal, and the people involved in its design were willing to engage in testing to refine the site's accessibility and usability. This is an important point; however, as is too often the case, the site was well on its way to completion when the accessibility and usability testing was conducted. Stating the importance of accessibility in a website's design is an important counterbalance to the perspective that making a site accessible will cost time, effort, and resources, which makes accessibility an undesirable goal.
When the researcher engaged in the project, he thought that there was formal documentation in the site's design process that mandated accessibility. As the website and its accompanying documentation (the style guide) were examined more closely, it became clear that accessibility had never been formally specified at any point in the design process. Neither accessibility guidelines nor usability metrics are mentioned in the 48-page style guide. This is a possible explanation for the presence of some of the structural (e.g., page layout and content density) accessibility and usability issues that emerged in the validation process.

Do automated accessibility validation tools truly ensure that a website will be functional for students with disabilities? While the site did pass an automated test, it still presented significant challenges for some students. Based on the evaluation in this project, the website as designed presented significant challenges to students who rely on enlarging the site in order to view it. The new website did not provide adequate support for users with visual impairments who did not use a screen reader. Because the site was designed with a highly visual orientation, users who relied on enlarging the screen had difficulty finding discrete bits of information on the site. While the website was designed to meet accessibility standards, and did validate for visual disabilities, much of the site was highly unusable and significant portions were not accessible to students who did not use a screen reader. While all the students were able to use the site, the student with a hidden disability who used a screen reader expressed frustration at several points, indicating he would have given up or sought to complete the task at hand in another manner. The uncaptioned videos in the test pages would have made the content contained in them inaccessible to deaf users.
The most obvious difference between the automated testing and the user testing was that while the site did pass an automated test, it still presented significant challenges for some students. The overall site layout and the use of large print-style images presented challenges for low vision students, and the complicated navigation structure presented challenges for a student with a hidden disability. Neither the lack of captions nor the lack of a transcript would have caused the page to fail an automated validation process, but the page would not conform to §1194.22 of Section 508, which requires an equivalent experience for all users. Users should be able to follow the dialog and action in a multimedia file as it occurs, so captions, whether open or closed, must be timed to coincide with those events as they occur. A user with an aural disability would not have been able to access the content in the video at all.
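One common mechanism for delivering such timed captions is a WebVTT file referenced by the video player, in which each cue carries explicit start and end times synchronized with the audio. The snippet below is a hypothetical illustration of cue timing (the caption text is invented, not the university's content), with a minimal parser that extracts the timings so they could be checked against the video:

```python
# Hypothetical WebVTT captions for the alumni-page video, plus a
# minimal cue parser. Each cue's start/end times tie the caption text
# to the moment the corresponding dialog or action occurs.
import re

CAPTIONS_VTT = """WEBVTT

00:00:01.000 --> 00:00:04.000
Welcome to the university alumni page.

00:00:04.500 --> 00:00:08.000
This spring's reunion events are listed below.
"""

CUE_RE = re.compile(r"(\d\d:\d\d:\d\d\.\d{3}) --> (\d\d:\d\d:\d\d\.\d{3})")

def cue_times(vtt: str):
    """Return (start, end) pairs for every cue in a WebVTT string."""
    return CUE_RE.findall(vtt)

cues = cue_times(CAPTIONS_VTT)
print(cues)
```

An automated checker can verify that a caption file exists and that its cues are well formed, but only human review can confirm that the captions accurately transcribe the audio, which is the equivalence §1194.22 actually requires.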
How do accessibility and usability overlap in practice? While they seem discrete in theory, accessibility and usability overlap in important and sometimes confusing ways in practice, particularly in the area of hidden disabilities. The content density of some pages tested could potentially make it difficult for students with learning and reading disabilities to use the site. This is, in fact, a usability issue even for users with no disability, as it creates cognitive overhead while users try to sort signal from noise. This is an important finding because many students in higher education have "hidden" disabilities, a generic term that simply means that a person's impairment or condition is not obviously apparent or visible. People with hidden disabilities often are not easily recognized as having a disability. Hidden disabilities can include visual impairments like low vision or colorblindness, physical difficulties such as a repetitive stress injury, learning disabilities, being hard of hearing, and mental health conditions.

Consideration of students with hidden disabilities is important because of two trends. The first is the increasing number of students with learning disabilities enrolling in colleges and universities (DaDeppo, 2009; Heiman & Precel, 2003). In 1988, 1% of college freshmen at four-year institutions were identified as having a learning disability. According to a report by the American Council on Education (Henderson, 2001), this grew to 2.4% in 2000. The National Center for Education Statistics (NCES), sponsored by the U.S. Department of Education, reported that 7.1% of all college students in the U.S. in 2004 were identified as having a learning disability (Snyder & Dillow, 2007).
These numbers probably do not represent the full number of students with disabilities, because students are not required to register for disability services, may not have a "diagnosis," or might not be aware they have a disability for which they could receive accommodations.

Considerations for research on usability and accessibility. Gaining access to groups of individuals with disabilities as research subjects presents several challenges. At the most fundamental level, identifying and recruiting research participants can be difficult. In higher education, students with disabilities are not required to register for services, and the level of services provided to them varies from school to school. Finally, it is important that the research itself does not present additional challenges for students with disabilities, who often face challenges that other students might not. For example, a student with a visual or learning disability might have to wait for an audio version of a text that the rest of the class had at the beginning of the semester, or an assignment might take
considerably longer because of a physical disability (like multiple sclerosis) or a cognitive disability (like dyslexia). Researchers should consider the impact of data collection activities that ask students with disabilities to spend time and energy they might need elsewhere.

Principles for Accessible and Usable Web Design

Designing for accessibility and usability often requires a paradigm shift for designers. In this section, general principles informed by this research project are offered for web designers interested in creating and testing usable and accessible websites.

Design for people, not technologies. Web developers usually assume that people with disabilities will be using assistive technologies, so they design accessibility for those technologies, not for the actual people with disabilities.

Accessibility and usability are a team effort. Engaging in testing with assistive technologies is an important step, but it is not without challenges. For example, when designers used to working in a visual medium encounter a screen reader, they usually discover it is a terribly confusing medium to work with. Most web developers do not have screen readers and often do not have access to them. Designing for accessibility requires an organization's commitment to acquiring and understanding the various assistive technologies that people might use to access the web. People build websites, and they make design and development decisions based on perspective, experience, and training. Training for all designers should cover the general issues and challenges faced by users with disabilities. This may include discussion of assistive devices such as screen readers, individual disabilities, and relevant accessibility policy. Having a design team that appreciates the benefits of accessibility and genuinely wants to make the website as accessible as possible for as many users as possible is critical to the ultimate success of any organization-wide accessibility effort.
Do not rely solely on automated tests. Automated validation tools are great resources for quickly assessing the overall status of a page; however, these tools do not provide robust information on more subjective issues like the appropriateness of ALT text. In addition to their limitations in assessing accessibility, these tools do not assess usability at all. Information on the strategies and techniques users employ when navigating a site can only be obtained through user evaluation and testing.
Engage in accessibility and usability efforts throughout the web development process. Usability and accessibility are processes, not products, and should be an integral part of web design methods. Leaving accessibility and usability verification until late in the process is a common practice in web development and unfortunately often results in sites that require costly and time-consuming retrofitting, which may or may not happen.

Engage a diversity of users in testing. Because of the diversity of users both with disabilities and without, it is difficult for designers to gauge how accessible and usable a site is, or will be, unless they engage in testing with real users. It will not be possible to exhaustively test every disability type or user category; however, engaging in the testing process provides valuable perspective and feedback on a site's design. Most designers are trained to think about the visual user interface (UI), with the mouse as the primary input device for navigation. In addition to needing guidance on how to navigate content outside of the visual UI, designers also require some sense of the most frustrating issues that people with disabilities have when navigating sites. The experiences of some of the students in this project reflect this: while the website met broad accessibility guidelines, it still presented significant challenges for certain users. Many web developers, even if they do include a variety of possible users, may come to view people with disabilities as outliers, since most participants in use tests will not have disabilities. It is important for web developers to avoid thinking of people as numbers and making decisions along those lines (e.g., "Only one person had trouble with the site, and that was because they had everything magnified. Overall, the site is fine for pretty much everyone.").

Limitations

The small number of participants in the study is a possible limitation.
While the nature of this research was exploratory and descriptive, the researcher would have liked to include more participants. The challenges in conducting this type of research with students with disabilities are multifaceted. For example, it is easier to identify and recruit participants for some disability categories than for others; it can be difficult to identify and conduct research on hidden disabilities if for no other reason than that individuals with those types of disabilities are less likely to self-identify. Another limitation is that the type of task-based usability study used in this project is somewhat artificial. Assigning users tasks assumes that the tasks selected by
the researcher or designer are the same ones that a user will engage in as they use the site. Acknowledging the difference in perspective between researcher and user is important; however, there are no clear alternatives to this type of research.

Conclusion

Web accessibility and usability are important and timely issues. Developing sites with high levels of accessibility and usability opens doors for individuals with disabilities in ways that were not previously possible and ensures that sites are usable for the broadest possible audience. A comprehensive approach to web accessibility and usability is multifaceted and touches on all aspects of the web design process, from the identification of standards to the implementation of an organization's site. User testing is an important and powerful tool in developing websites that are useful and accessible.

References

American Foundation for the Blind. (2008, November). Distance learning: How accessible are online educational tools? Retrieved December 6, 2009, from http://www.afb.org/Section.asp?SectionID=3&TopicID=138&DocumentID=4492

Blass, E. (2001). What's in a name? A comparative study of the traditional public university and the corporate university. Human Resource Development International, 4(2), 153–172.

Bruseberg, A., & McDonagh-Philp, D. (2000). User-centred design research methods: The designer's perspective. In Integrating Design Education Beyond 2000 Conference Proceedings (pp. 4–6). Brighton, UK.

Burgstahler, S., Corrigan, B., & McCarter, J. (2004). Making distance learning courses accessible to students and instructors with disabilities: A case study. Internet and Higher Education, 7(3), 233–246.

Chen, Q. (2001). Human computer interaction: Issues and challenges. Hershey, PA: Idea Group Pub.

Chisholm, W., & May, M. (2008). Universal design for web applications: Web applications that reach everyone (1st ed.). Sebastopol, CA: O'Reilly Media.
Clark, J. (2003). Building accessible websites. Indianapolis, IN: New Riders Pub.

Cobb, P., Confrey, J., diSessa, A., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9–13.

Crowston, K., Sieber, S., & Wynn, E. (Eds.). (2007). Virtuality and virtualization (Vol. 236). Boston, MA: Springer. Retrieved from http://www.springerlink.com/content/jm106t786t764w34/

DaDeppo, L. M. W. (2009). Integration factors related to the academic success and intent to persist of college students with learning disabilities. Learning Disabilities Research & Practice, 24(3), 122–131. doi: 10.1111/j.1540-5826.2009.00286.x

Dervin, B., & Nilan, M. (1986). Information needs and uses. Annual Review of Information Science and Technology, 21, 3–33.
Foley, A., André, B., Petri, K., Felix, M., & Hunzicker, D. (2006). A public consortium emphasizes the importance of LMS accessibility. Campus Technology.

Foley, A., & Regan, B. (2002). Web design for accessibility: Policies and practice. AACE Journal, 10(1), 62–80.

Heiman, T., & Precel, K. (2003). Students with learning disabilities in higher education. Journal of Learning Disabilities, 36(3), 248.

Henderson, C. (2001). College freshmen with disabilities, 2001: A biennial statistical profile. Washington, DC: American Council on Education. Retrieved from http://www.eric.ed.gov:80/ERICWebPortal/contentdelivery/servlet/ERICServlet?accno=ED458728

Lehman, T. (2008). Making library web sites usable: A LITA guide. New York, NY: Neal-Schuman Publishers.

Marchetto, A., Ricca, F., & Tonella, P. (2008). A case study-based comparison of web testing techniques applied to AJAX web applications. International Journal on Software Tools for Technology Transfer (STTT), 10(6), 477–492. doi: 10.1007/s10009-008-0086-x

Nielsen, J. (2000). Designing web usability. Indianapolis, IN: New Riders.

Opitz, C., Savenye, W., & Rowland, C. (2008). The effects of implementing web accessibility standards on the success of secondary adolescents. Journal of Educational Multimedia and Hypermedia, 17(3), 387–411.

Patton, M. Q. (2001). Qualitative research & evaluation methods (3rd ed.). Thousand Oaks, CA: Sage.

Reeb, B. (2008). Design talk: Understanding the roles of usability practitioners, web designers, and web developers in user-centered web design. Chicago, IL: Association of College and Research Libraries.

Rubin, J., & Chisnell, D. (2008). Handbook of usability testing: How to plan, design, and conduct effective tests (2nd ed.). New York: Wiley.

Seale, J. (2006). E-learning and disability in higher education: Accessibility research and practice. New York: Routledge.

Snyder, T. D., & Dillow, S. A. (2007, July). Digest of Education Statistics 2006 (NCES 2007-017). Washington, DC: National Center for Education Statistics. Retrieved from http://www.eric.ed.gov/ERICWebPortal/contentdelivery/servlet/ERICServlet?accno=ED497523

Spurgin, K. M. (2006). The sense-making approach and the study of personal information management. In Proceedings of the PIM'2006 Workshop (a SIGIR'2006 workshop). Available at http://pim.ischool.washington.edu/pim06/index.htm

Stake, R. E. (1978). The case study method in social inquiry. Educational Researcher, 7(2), 5–8.

Steele, P., & Wolanin, T. R. (2004). Higher education opportunities for students with disabilities: A primer for policymakers. Washington, DC: Institute for Higher Education Policy. Retrieved from http://www.ihep.org/Publications/publications-detail.cfm?id=59

Subraya, B. M. (2006). Integrated approach to web performance testing: A practitioner's guide. Hershey, PA: IRM Press.

Thatcher, J. (2006). Web accessibility: Web standards and regulatory compliance. New York, NY: FriendsofED, distributed by Springer-Verlag.

Theofanos, M. F., & Redish, J. (2003). Bridging the gap: between accessibility and usability. interactions, 10(6), 3–51. doi: 10.1145/947226.947227

U.S. Government Accountability Office. (2009). Higher education and disability: Education needs a coordinated approach to improve its assistance to schools in supporting students (No. GAO-10-33). Washington, DC: U.S. Government Accountability Office. Retrieved from http://161.203.16.70/products/GAO-10-33
Web content accessibility guidelines 1.0. (1999, May 5). Retrieved March 18, 2010, from http://www.w3.org/TR/WCAG10/

Wilson, T. D. (2006). On user studies and information needs. Journal of Documentation, 62(6), 658–670.

Wisdom, J. R., White, N. A., Goldsmith, K. A., Bielavitz, S., Davis, C. E., & Drum, C. (2006). An assessment of web accessibility knowledge and needs at Oregon community colleges. Community College Review, 33(3–4), 19–37. doi: 10.1177/009155210603300302