Assistive Web Browsing with Touch Interfaces

Faisal Ahmed, Muhammad Asiful Islam, Yevgen Borodin, I.V. Ramakrishnan

Stony Brook University, Computer Science Department, Stony Brook, NY 11790
{faiahmed, maislam, borodin, ram}@cs.sunysb.edu

ABSTRACT

This demonstration presents a touch-based directional navigation technique for touch interfaces (e.g., iPhone, MacBook) aimed at people with visual disabilities, especially blind individuals. Such interfaces, coupled with text-to-speech (TTS) systems, open up intriguing possibilities for browsing and skimming web content with ease and speed. Apple's seminal VoiceOver system is an exemplar of bringing touch-based web navigation to blind people, but it has two major shortcomings: the "fat finger" and "finger fatigue" problems, which this paper addresses with two proposed approaches. A preliminary user evaluation of a system incorporating these ideas suggests that they can be effective in practice.

Categories and Subject Descriptors

H.5.2 [Information Interfaces and Presentation]: User Interfaces; H.5.4 [Information Interfaces and Presentation]: Hypertext/Hypermedia - navigation

General Terms

Algorithms, Human Factors, Experimentation, Design.

Keywords

Touch interface, Web Accessibility, Navigation, Granularity

1. INTRODUCTION

For blind users, touch interfaces can provide more flexibility in accessing various types of content. In the context of web accessibility, this added flexibility has the potential to facilitate easier navigation and content skimming [1, 3, 4]. Presently, the only assistive-technology product that supports a usable touch-based web browsing interface is Apple's VoiceOver [5], which has two types of interface: a swipe interface and a drag interface.

(i) In VoiceOver's swipe interface, gestures only allow forward or backward traversal of the DOM tree, so users have to make many swipes to reach an element that is spatially close but far away in the tree. For example, Figure 1-a illustrates how a swipe to the right on the MacBook touchpad shifts the reading position one table cell to the right (Figure 1-b). To go one row down, however, one has to keep swiping to the right until the end of the current row, as shown with dotted lines on the screen. This leads to the "finger fatigue" problem caused by continuous swiping: to navigate from one arbitrary text element to another, the user may have to make as many swipes as there are elements between the two.

Figure 1. VoiceOver Gesture Interface on MacBook

(ii) The dragging interface allows VoiceOver users to read the screen content by "touching the content." All visible content on the screen is rendered on a 2-D canvas, where each element occupies some "real estate." Users can browse by directly touching the screen of an iPhone or iPad with one finger and then dragging the finger across the screen; on MacBook laptops, however, users interact with the touchpad, whose coordinates are mapped to those of the screen, so dragging the finger over some touchpad area r makes VoiceOver read the content on the canvas covered by the corresponding screen area R. The end-user experience with this interface is often frustrating. First, on iPhones and even iPads, web page elements can be too small to be touched with precision. Worse yet, the laptop touchpad surface is typically much smaller than the screen. As shown in Figure 2, navigating a table can become an exercise in futility: if the user drags the finger horizontally along a table row, it is very easy to skip over table cells, shift to a different row, or even read something outside the table. This is sometimes referred to as the "fat finger" problem, because the finger covers more than one webpage element or table cell at a time. Second, since the area occupied by different elements on the canvas varies considerably, the user does not know how far to drag the finger over the touchpad surface before reaching the next element.

Figure 2. VoiceOver Dragging Interface on MacBook
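The "finger fatigue" cost of purely linear swiping described above can be sketched as a minimal model (this is an illustration of the traversal cost, not VoiceOver's implementation; the element names are our own):

```python
# Each swipe advances exactly one element in document (DOM) order, so a
# spatially adjacent element can still be many swipes away.

def swipes_needed(dom_order, current, target):
    """Swipes required to move focus from `current` to `target` when each
    swipe moves one step forward or backward in document order."""
    return abs(dom_order.index(target) - dom_order.index(current))

# A 3x3 table flattened into DOM order: the cell directly below the
# top-left cell is spatially adjacent but three swipes away.
table = ["r1c1", "r1c2", "r1c3",
         "r2c1", "r2c2", "r2c3",
         "r3c1", "r3c2", "r3c3"]
print(swipes_needed(table, "r1c1", "r2c1"))  # 3
```

In general, for an n-column table, moving one row down costs n swipes in this model, which is exactly the dotted-line traversal shown in Figure 1.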

2. OVERVIEW OF OUR APPROACH

In this paper we develop algorithmic techniques to address the "finger fatigue" and "fat finger" problems. Specifically, we propose two ideas. The first is to extend the swipe interface by explicitly associating directions with gestures: a swipe moves the focus to the next or previous element along the direction of the swipe. Using the direction and the currently focused element (i.e., the web element currently being read), we find the "next" element along that direction. The second idea is to impose a bound on the drag length in the dragging interface: when the distance spanned by the dragging finger exceeds this bound, the focus is set on the next element along the drag direction, thereby eliminating the problem of skipping content chunks. To illustrate, edges a, b, c, d, and e in Figure 3-a denote five single-finger movements, and a', b', c', d', and e' in Figure 3-b denote the corresponding web elements spanned by these movements. Notice that in all cases these are consecutive elements. The bound on the distance the finger has to travel to reach the next element can be set as a parameter.

Copyright is held by the author/owner(s). ASSETS'10, October 25–27, 2010, Orlando, Florida, USA. ACM 978-1-60558-881-0/10/10.
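The two ideas above can be sketched as follows, under our own simplifying assumptions: each web element is reduced to the (x, y) center of its bounding box, "next along a direction" means the nearest element whose displacement projects positively onto the swipe direction, and the 40-pixel default bound is illustrative. None of the names below are taken from the HearSay implementation.

```python
# Directional next-element selection plus a bounded drag step.

DIRS = {"right": (1, 0), "left": (-1, 0), "down": (0, 1), "up": (0, -1)}

def next_element(elements, focused, direction):
    """Return the id of the nearest element lying in `direction` from the
    focused element; `elements` maps element id -> (x, y) center."""
    fx, fy = elements[focused]
    dx, dy = DIRS[direction]
    best, best_dist = None, float("inf")
    for eid, (x, y) in elements.items():
        if eid == focused:
            continue
        # Keep only elements "ahead" of the focus along the direction.
        if (x - fx) * dx + (y - fy) * dy <= 0:
            continue
        dist = (x - fx) ** 2 + (y - fy) ** 2
        if dist < best_dist:
            best, best_dist = eid, dist
    return best

def drag(elements, focused, direction, drag_length, bound=40):
    """Bounded drag: the focus advances one element per `bound` pixels
    dragged, so elements can no longer be skipped mid-drag."""
    for _ in range(int(drag_length // bound)):
        nxt = next_element(elements, focused, direction)
        if nxt is None:
            break
        focused = nxt
    return focused

# A 2x2 table laid out on the canvas: a downward swipe from the top-left
# cell lands on the cell directly below it, and a 45-pixel rightward drag
# advances exactly one cell.
cells = {"r1c1": (0, 0), "r1c2": (100, 0),
         "r2c1": (0, 50), "r2c2": (100, 50)}
print(next_element(cells, "r1c1", "down"))  # r2c1
print(drag(cells, "r1c1", "right", 45))     # r1c2
```

The bound parameter here plays the role of the configurable distance threshold mentioned above: a long drag crosses the row one cell per `bound` pixels instead of jumping unpredictably.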

Figure 3. Prototype of the Directional Dragging Interface

Figure 4. Mean and Std. Dev. of task completion time

Following each task, participants rated the difficulty of completing it on a 5-point Likert scale, from 1 (Strongly Disagree) to 5 (Strongly Agree), to obtain their subjective opinions of the HS and VO systems (Table 1). Subjects admitted that they often experienced difficulty while browsing the web with Apple's VoiceOver because of the random skipping of elements on the page (the coarse-granularity problem). They strongly agreed that table navigation would be easier if this coarse granularity were handled in a controlled manner; by imposing a threshold, HS is able to do so. They also strongly agreed that HS gave them a much better idea of the layout of a table than VO did, because of the extended swipe and drag interface.

3. PRELIMINARY USER STUDY

To evaluate our touch-based directional navigation system, we recruited two blind subjects. Two screen readers were used in the evaluation: (1) the touch interface of Apple's VoiceOver (VO), which has two types of navigation, navigation with gestures and navigation by dragging; and (2) the touch-based interface of HearSay (HS) [2], which uses the directional navigation algorithms described in this paper. For our experiments we chose large tables from Wikipedia pages containing columns and cells of varying sizes. Tables served as good benchmarks for evaluating touch-based navigation because they allowed us to design tasks that were consistent and measurable. Three tasks were formulated for the evaluation: the first was to add up all the values in a particular column, the second was to find the value of a particular cell, and the last was to read out all the cells in the same row or column. Participants completed these tasks with HS and with VO - 3 tasks with each system, for a total of 3 x 2 = 6 trials. The 3 tasks performed with a given non-visual browser (HS or VO) were conducted sequentially, and the system order (HS/VO) was counter-balanced using a 4-cell design.

Table 1. Average 5-point Likert-scale ratings (1 = Strongly Disagree to 5 = Strongly Agree); standard deviations in parentheses

Statement                                                                 HS          VO
I found this task difficult                                               2.5 (0.71)  4.5 (0.71)
Finding out information from a particular table cell is more efficient    4.5 (0.71)  2.0 (1.41)

4. RESULTS
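The roughly two-fold difference in mean task-completion time reported in this section can be checked with a line of arithmetic (the values below are the paper's reported means, not a recomputation from raw data):

```python
# Reported mean seconds per task for each system.
hearsay_mean = 109.66    # HearSay (SD 42.16)
voiceover_mean = 260.0   # VoiceOver (SD 93.17)

# Speedup factor of HearSay over VoiceOver.
print(round(voiceover_mean / hearsay_mean, 2))  # 2.37
```

A ratio of about 2.37 is consistent with the claim that subjects were at least twice as fast with the directional navigation system.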

During the study, we measured the time it took each subject to complete each task; Figure 4 shows the statistics of the measured times. The results show that subjects were able to find the required information at least twice as fast with the directional navigation system as with VoiceOver: participants spent on average 109.66 sec (Std. Dev. 42.16) per task using the HearSay system and 260 sec (Std. Dev. 93.17) per task using VoiceOver.

5. CONCLUSION AND FUTURE WORK

In this paper we extended the swipe and drag interfaces to cope with the "fat finger" and "finger fatigue" problems. The essence of the approach is to find the next element in a given direction. Preliminary experimentation suggests that blind users indeed find the interface useful, and that directional navigation with our touch interface has the potential to improve the end-user web browsing experience by giving the user more control.

6. REFERENCES

[1] Bigham, J. P., Prince, C. M., and Ladner, R. E. WebAnywhere: a screen reader on the go. W4A, 2008.
[2] Borodin, Y., Ahmed, F., Islam, M. A., Feng, S., Puzis, Y., Melnyk, V., Dausch, G., and Ramakrishnan, I. V. HearSay: a new generation context-driven multi-modal assistive web browser. WWW, 2010.
[3] Kane, S. K., Bigham, J. P., and Wobbrock, J. O. Slide Rule: making mobile touch screens accessible to blind people using multi-touch interaction techniques. ASSETS, 2008, 73-80.
[4] Miyashita, H., Sato, D., Takagi, H., and Asakawa, C. Aibrowser for multimedia: introducing multimedia content accessibility for visually impaired users. ASSETS, 2007.
[5] VoiceOver. http://www.apple.com/accessibility/voiceover/
