ACCESSIBILITY ISSUES SIMULATION FOR MOBILE APPLICATIONS

Jan Vystrcil, Zdenek Mikovec, Marek Fronc, Jiri Drbalek
Czech Technical University in Prague, Faculty of Electrical Engineering
Karlovo nam. 13, 121 35 Prague 2, Czech Republic
Phone: +420 224 357 647
{vystrjan|xmikovec|froncmar|drbaljir}@fel.cvut.cz

ABSTRACT

In this paper we have identified and analyzed important accessibility issues typical for the world of mobile devices and compared them with common sets of accessibility guidelines. We have filled the identified gap by implementing a simulation tool that gives mobile application developers a convenient way to simulate these issues with any mobile platform. The tool is developed as a transparent layer that can be placed over a mobile platform emulator and provides effects such as reflection, finger occlusion, tremor, or simulations of visual impairments like tunnel vision, blurred vision or color blindness.

KEYWORDS

Accessibility, Mobile application, Impairment simulation

INTRODUCTION

The importance of mobile applications is growing rapidly, as they provide access to a variety of very useful services. Moreover, the penetration of mobile phones into everyday life is higher than that of any other ICT. The smartphone in particular has become a useful assistant in everyday life, and impaired users have found it a very powerful supportive device in various situations. It is fair to say that every impaired ICT user uses at least a mobile phone. It is therefore necessary to keep the accessibility of mobile applications in mind in order to avoid excluding impaired users. Unfortunately, the field of mobile applications and devices lacks sufficient effort towards accessibility. Ignoring the accessibility of mobile applications and devices affects above all users with visual impairments, who form a relatively large segment of the market. We therefore focus primarily on this user group.

PROBLEM DESCRIPTION

A lot of effort has been put into desktop application accessibility, and some approaches and practices can easily be reused in the field of mobile devices and applications (e.g., the importance of accessibility APIs to enable TTS for blind users, or image contrast issues for low vision users). However, the mobile environment brings new issues that are not present in the world of desktop computers and thus need specific handling (e.g., lighting conditions).

There are several well established sources of guidelines that describe the accessibility requirements of users with different kinds of impairments. These guidelines should serve as a basis for building guidelines for mobile application accessibility. All of them, however, focus strictly on the applications themselves and do not take into account the physical environment, which also influences accessibility in the mobile context. There are also tools helping with the accessibility of web pages in the mobile environment, such as the W3C MobileOK Checker [3]. Existing guidelines should be modified and extended with solutions targeting the specific issues of mobile applications and physical environments.

An important source of guidelines that should be used primarily is the set of W3C WCAG guidelines [1]. WCAG guidelines version 2.0 are organized according to four key principles: perceivable, operable, understandable, and robust [2]. We will follow these principles.

Following the perceivable principle, we concentrate on presenting information in a way that the user can perceive and understand efficiently and easily. We need to provide good descriptions for all non-textual content. Functional elements (entities) need to be named according to what they are expected to do. Contrast between text and its background is important, as is the possibility of smoothly increasing the font size. Perception in the world of mobile applications is mostly influenced by the quality of the mobile device display (size, colors, brightness, contrast) and by the conditions of the physical environment (especially visual and acoustic). A very important problem is a combination of issues, such as a sharp change of environmental lighting combined with a light sensitivity impairment of the user. All these situations can cause accessibility issues.

The operable principle concentrates on the possibility of interacting with all elements of the user interface. On the desktop, a lot of effort is put into keyboard accessibility. In the world of mobile devices there are typically reduced keyboards, and recently the majority of newly developed mobile devices do not have any HW keyboard at all. They are equipped with SW keyboards only, which are much more complicated to use for users with low vision or blindness. Finally, the operational HW keys (like home, back, search) are more and more often designed in a way that cannot be perceived haptically without activating their function. The SW keyboards, large touch screens and haptically unrecognizable HW keys represent a source of completely new accessibility issues.

The understandable principle concentrates on intuitive and consistent design of controls, and on good readability and understandability of text (e.g., indication of the language of a given text). It focuses on the issues users face when filling in forms and on preventing mistakes. If mistakes do occur, the system should inform the user about the error in a proper way and guide them to correct it easily.
In the mobile environment we face a much higher number of interruptions of the user's work (such as interaction with passing people) and the necessity to perform several tasks in parallel (filling in a form while crossing the street). If we imagine the combination of these situations with the user's impairment, we end up with very complicated situations that have to be checked against the accessibility guidelines.

The robust principle covers the last area. It concentrates on making applications robust in the sense that they can be interpreted in the same way on a wide range of devices, including special assistive technologies. In the mobile world this is an even more challenging task, as there is a wide variety of mobile platforms which are developing rapidly.

All of the above mentioned principles can be applied in the world of mobile phones, and some cases deserve special attention. One example is filling in large forms on small displays, especially when half of the display may be occupied by the software keyboard. In cases where part of the display is covered while filling in a form, a good traversal order between the items of the form is necessary. We have to ensure that the context is not lost and provide information about which item is currently displayed and being filled in. This problem often occurs when the display layout changes from portrait to landscape mode and vice versa. In the following section we focus on the specific issues of mobile applications used in the physical environment.

Issues specific for mobile world

In comparison to desktop computers or laptops, which are usually used indoors (at home, in the office or in a restaurant), mobile devices are often used in very different physical environments: in public transport, on the street, in the park, etc. We have therefore analyzed typical issues that can occur in these environments.

One common problem is the low contrast of the user interface on the display. This situation is especially painful when the display is in direct sunlight (see left side of Figure 1), or when the display reflects the surrounding landscape or the face of the user like a mirror. Another visibility problem is caused by reflections that change while moving, for example the changing reflections of the countryside when travelling by bus or train. Finally, if we take into account critical combinations of environmental conditions and a particular visual impairment, we can identify further problematic situations. For example, if a sharp change in the lighting conditions occurs (like leaving a tunnel, or a strong source of light suddenly appearing behind the display) and the user suffers from a light sensitivity impairment, it can take much longer until the eyes accommodate to the new situation. This can lead to serious accessibility issues (like missing an important alert, or being unable to react in a given time). Using a mobile device on the move brings several other issues, such as display tremor, which makes especially small low-contrast text hard to read. A simple numeric check of text contrast is sketched below.
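Low contrast can at least be screened for numerically before any environmental effect is added on top of it. The following is a minimal sketch, in Java, of the contrast ratio computation defined in WCAG 2.0 [1]; the class and method names are our own and are not part of the guidelines or of the simulator described later in this paper.

// Minimal sketch: WCAG 2.0 contrast ratio between two sRGB colours.
public final class ContrastCheck {

    // Convert one 8-bit sRGB channel to linear light, per WCAG 2.0.
    static double linearize(int channel) {
        double c = channel / 255.0;
        return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
    }

    // Relative luminance of an sRGB colour given as 0xRRGGBB.
    static double relativeLuminance(int rgb) {
        double r = linearize((rgb >> 16) & 0xFF);
        double g = linearize((rgb >> 8) & 0xFF);
        double b = linearize(rgb & 0xFF);
        return 0.2126 * r + 0.7152 * g + 0.0722 * b;
    }

    // Contrast ratio in the range 1:1 to 21:1.
    static double contrastRatio(int foreground, int background) {
        double l1 = relativeLuminance(foreground);
        double l2 = relativeLuminance(background);
        double lighter = Math.max(l1, l2);
        double darker = Math.min(l1, l2);
        return (lighter + 0.05) / (darker + 0.05);
    }

    public static void main(String[] args) {
        System.out.printf("Black on white: %.2f:1%n", contrastRatio(0x000000, 0xFFFFFF)); // prints 21.00:1
    }
}

WCAG 2.0 level AA requires a ratio of at least 4.5:1 for normal text, so a check like this can flag risky colour combinations even before the user interface is viewed through the reflection or tremor effects described below.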

Figure 1: Reflection on the display and occlusion of the display with finger

Another set of specific issues is related to the user's interaction with the mobile device. Touch screen devices bring yet another issue in the form of occlusion of the display by the interacting fingers (see right side of Figure 1). The information we are looking for is displayed under the finger and we simply do not see it. This occlusion can be even bigger if the user suffers from a visual impairment such as low vision in one eye and blindness in the other.

DEMONSTRATIVE USE CASE

The following use case demonstrates the importance and usage of the simulation tool. It is an example of a typical real life situation that can easily happen when visually impaired users use smartphones.

Mobile Instant Messaging client Use Case

Thomas is a 25 year old musician, a piano player. Since childhood he has suffered from Retinitis Pigmentosa connected with tunnel vision, which causes severely limited peripheral vision and a very narrow viewing angle. Thomas is travelling by train from his concert in Gainsborough home to Sleaford. On the train he meets a new friend, Sebastian, who saw him playing at the concert. They talk about music during the train journey; it is inspiring for both of them and they really enjoy it. They find out that they are from different towns, and Sebastian would like to invite Thomas to take part in a concert in his city. Thomas is delighted and suggests exchanging Instant Messaging (IM) contacts to arrange everything for the organization of the concert. So Thomas switches on his Android smartphone and goes through the menu to start the IM application. After the application performs an automatic log-in, he locates and touches the "Add contact" button. Sebastian dictates his user ID for the IM service to Thomas, who types it into the Add contact form and touches the "Save" button. At that same moment the train is passing through countryside with bad and unstable GSM signal, and the smartphone loses its Internet connection. At the bottom of the display the phone shows, for a few seconds, a toast notification about problems with the Internet connection and an error while saving the contact to the online server contact list. Because Thomas is checking the user ID typed at the top of the display, he does not see the notification: due to his tunnel vision he is concentrating on a different part of the display. He thinks everything is saved correctly, switches off the smartphone and puts it back into his pocket. The next day he finds that Sebastian's new IM contact was not saved, and he is really disappointed when he realizes that he has probably lost an important business contact.

How the simulation tool will help to discover the issue

The above mentioned problem with the overlooked toast message can be simulated by switching on the visual impairment effect for the simulation of tunnel vision. With this effect applied, the developer can easily detect the problem with the overlooked error message presented in the form of a toast. A simplified sketch of such a tunnel vision effect follows.
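For illustration, tunnel vision of the kind described in the use case can be approximated by blacking out everything outside a small circular region around the point the user is currently looking at. The Java sketch below is our own simplified illustration, not the implementation used in the tool (which, as described later, renders its effects with Cocoa and Core Image); the gaze position and radius are assumed parameters.

import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.geom.Area;
import java.awt.geom.Ellipse2D;
import java.awt.geom.Rectangle2D;
import java.awt.image.BufferedImage;

// Sketch of a tunnel vision effect: everything outside a circular
// "window" around the point of gaze is blacked out.
public final class TunnelVisionEffect {

    // gazeX/gazeY: point the user is looking at; radius: remaining field of view in pixels.
    public static BufferedImage apply(BufferedImage frame, int gazeX, int gazeY, int radius) {
        BufferedImage out = new BufferedImage(frame.getWidth(), frame.getHeight(),
                BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = out.createGraphics();
        g.drawImage(frame, 0, 0, null);

        // Build the mask: the whole frame minus the visible circle.
        Area mask = new Area(new Rectangle2D.Double(0, 0, frame.getWidth(), frame.getHeight()));
        mask.subtract(new Area(new Ellipse2D.Double(gazeX - radius, gazeY - radius,
                2.0 * radius, 2.0 * radius)));

        // Paint the masked area black; a softer edge would be more realistic.
        g.setColor(Color.BLACK);
        g.fill(mask);
        g.dispose();
        return out;
    }
}

With the visible circle centred on the Add contact form at the top of the display, the toast notification at the bottom never appears in the filtered image at all, which is exactly the situation Thomas ran into.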

ISSUES THAT SHOULD BE SIMULATED

Some situations can be easily simulated by the developer, but some cannot. For example, the issues related to a small screen can be evoked by using emulators or by deploying the application to the target device. However, dynamically changing reflections on the display, or a display trembling because of a bus ride, are hard to simulate with common tools. Changes of the environmental light can also affect the ability of visually impaired users to read the display. Simulation of different light sources and their intensity, dynamic changes and special situations (like moving reflections) cannot easily be reproduced in the developer's office. Occlusion of the display by a finger touching it, combined with a visual impairment, is very hard to simulate without a dedicated simulator. We have focused on these hard to simulate situations and developed a simulator that can easily provide developers with an illusion of these situations. The list of situations that should be simulated is the following:

• Physical environment issues
  o Static reflection
  o Dynamic reflection
  o Display tremor
  o Finger occlusion
• Visual impairments
  o Tunnel vision
  o Blurred vision
  o Color blindness
• Combined issues
  o Reflection + Finger occlusion
  o Light sensitivity + Dynamic reflection
  o Display tremor + Blurred vision

We tried to select the most important situations according to the frequency of occurrence and the severity of the accessibility problem.

SIMULATION TOOL

In comparison to the world of desktop and web applications, there are many mobile platforms that are rapidly emerging and disappearing on the market, and all the work related to accessibility can easily be lost with a discontinued mobile platform. It is therefore essential to develop simulation tools that are as independent as possible of the mobile platform emulators and integrated development environments. We have developed such a simulator, the Mobile Impairment Simulation tool [5], which is implemented as a standalone application. It is totally independent of the mobile platform for which the mobile application is being developed. The simulation is done by means of on-the-fly manipulation of the pixels rendered on the screen, typically by some mobile platform emulator; a simplified sketch of this capture-and-filter loop is given below.
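To make this pixel manipulation concrete, the following Java sketch shows one possible capture, filter and display loop, in the spirit of the Java based version planned in the conclusion; the current prototype itself uses Cocoa and Core Image on Mac OS X. The emulator window position, the 100 ms refresh period and the reuse of the TunnelVisionEffect sketch from the use case section are all illustrative assumptions.

import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.image.BufferedImage;
import javax.swing.ImageIcon;
import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.Timer;

// Sketch of the simulator's core loop: periodically grab the screen region
// occupied by the mobile platform emulator, run the selected effect over the
// pixels, and show the result in the simulation window.
public final class SimulationLoop {

    public static void main(String[] args) throws Exception {
        Rectangle emulatorRegion = new Rectangle(100, 100, 360, 640); // assumed emulator position and size
        Robot robot = new Robot();

        JFrame simulationWindow = new JFrame("Impairment simulation (sketch)");
        JLabel canvas = new JLabel();
        simulationWindow.add(canvas);
        simulationWindow.setSize(emulatorRegion.width, emulatorRegion.height);
        simulationWindow.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        simulationWindow.setVisible(true);

        // Re-capture and re-filter roughly ten times per second.
        new Timer(100, e -> {
            BufferedImage frame = robot.createScreenCapture(emulatorRegion);
            // Any effect could be plugged in here; TunnelVisionEffect is the earlier sketch.
            BufferedImage filtered = TunnelVisionEffect.apply(frame,
                    frame.getWidth() / 2, frame.getHeight() / 2, 80);
            canvas.setIcon(new ImageIcon(filtered));
        }).start();
    }
}

Because the loop only ever sees a bitmap, any mobile platform emulator (Android, Windows Phone 7, iPhone) can run underneath it without modification.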

The simulator consists of three parts:

• Main menu bar (see top of Figure 2), where it is possible to choose the effects for simulation
• Filter settings window (see right side of Figure 2), where the parameters of an effect can be adjusted
• Simulation window (see the blurred part in the centre of Figure 2), where the simulation effects take place

Figure 2: Screenshot of Mobile Impairment Simulator with blur effect placed over the Android emulator [6]

Different mobile platform emulators are shown in Figure 2 (Android) and Figure 3 (Windows Phone 7 and iPhone). Currently, several effects are implemented that cover a subset of the situations described in the previous section (a sketch of the blurred vision effect follows the list):

• Physical environment effects
  o Static reflection
  o Display tremor
  o Finger occlusion
• Visual impairment effects
  o Tunnel vision
  o Blurred vision
  o Color blindness
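As an example of what one of these effects can look like outside Core Image, the following Java sketch approximates the blurred vision effect shown in Figure 2 by repeatedly convolving the captured frame with a small Gaussian-like kernel; the kernel values and the use of multiple passes are our own choices and are not taken from the tool.

import java.awt.image.BufferedImage;
import java.awt.image.ConvolveOp;
import java.awt.image.Kernel;

// Sketch of a blurred vision effect: a 3x3 Gaussian-like kernel applied
// several times to the captured frame. More passes give a stronger blur.
public final class BlurredVisionEffect {

    private static final float[] GAUSSIAN_3X3 = {
            1 / 16f, 2 / 16f, 1 / 16f,
            2 / 16f, 4 / 16f, 2 / 16f,
            1 / 16f, 2 / 16f, 1 / 16f
    };

    public static BufferedImage apply(BufferedImage frame, int passes) {
        ConvolveOp blur = new ConvolveOp(new Kernel(3, 3, GAUSSIAN_3X3),
                ConvolveOp.EDGE_NO_OP, null);
        BufferedImage result = frame;
        for (int i = 0; i < passes; i++) {
            result = blur.filter(result, null); // each pass widens the effective blur radius
        }
        return result;
    }
}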

The static reflection effect provides a preview of the mobile application when, for example, fluorescent lamps on the ceiling reflect on the display (see left side of Figure 3). The developer can easily adjust the strength of the static reflection effect and check whether the user interface is still readable; a simple compositing sketch of this idea follows below. The finger occlusion effect is implemented by a modified mouse pointer to which a photo of an index finger or thumb is attached (see right side of Figure 3). Using a mobile device while walking or travelling by car or bus makes the display tremble and introduces problems with reading text; the display tremor effect is simulated by a random motion blur. Many of these effects can also be used to check the usability of the visual design for users with no visual impairment.
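The strength of the static reflection can be modelled as the opacity with which a photograph of the light source is composited over the captured frame. The Java sketch below illustrates that idea under this assumption; the prototype itself renders the effect with Core Image.

import java.awt.AlphaComposite;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

// Sketch of the static reflection effect: a photo of e.g. a fluorescent lamp
// is composited over the captured frame with a developer-controlled strength.
public final class StaticReflectionEffect {

    // strength: 0.0 (no reflection) to 1.0 (reflection fully opaque).
    public static BufferedImage apply(BufferedImage frame, BufferedImage reflectionPhoto, float strength) {
        BufferedImage out = new BufferedImage(frame.getWidth(), frame.getHeight(),
                BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = out.createGraphics();
        g.drawImage(frame, 0, 0, null);
        g.setComposite(AlphaComposite.getInstance(AlphaComposite.SRC_OVER, strength));
        // Stretch the reflection photo over the whole display area.
        g.drawImage(reflectionPhoto, 0, 0, frame.getWidth(), frame.getHeight(), null);
        g.dispose();
        return out;
    }
}

For example, apply(frame, lampPhoto, 0.35f) would produce a faint reflection, while values close to 1.0 make the underlying user interface almost unreadable.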

Figure 3: Screenshot of simulated effects - reflection of fluorescent lamp and finger occlusion

The current prototype of the Mobile Impairment Simulator has been developed for Mac OS X and uses the Cocoa framework [4] for rendering the effects. Most of the filters are implemented using Core Image, Apple's image processing technology, which leverages programmable graphics hardware whenever possible to provide near real-time processing. The Core Image API provides access to built-in image filters for both video and still images and supports the creation of custom filters. Unfortunately, this technology is very closely tied to the Mac OS X environment and there is currently no similar multi-platform solution.

Limits of this approach

As mentioned previously, the Mobile Impairment Simulator is fully independent of any mobile platform emulator; there is no relation between the mobile platform emulator and the Mobile Impairment Simulator. The static image rendered by the mobile platform emulator is periodically sent to the Mobile Impairment Simulator, where it is modified according to the requested simulation effect. Therefore we do not know the semantics of the objects on the display (buttons, labels, input fields, etc.), and thus issues like a wrong tab-traversal order cannot be simulated. This is the limit of the approach we have used for simulation. On the other hand, some kind of integration can be useful, as it can make the development process smoother and faster. A useful extension could be a plugin for the developer's IDE that would allow automatic start and convenient parameterization of the Mobile Impairment Simulator.

CONCLUSION

In this paper we have described several accessibility issues related specifically to the use of mobile devices. These issues are different from those we face in the area of standard desktop computers. We have introduced a way of effectively simulating these aspects and we have developed a prototype of the Mobile Impairment Simulation tool that can simulate these issues. The Mobile Impairment Simulation tool will be the subject of the third phase of pilot tests with mobile application developers carried out within the ACCESSIBLE project. In future versions of the Mobile Impairment Simulator we are planning to implement more effects simulating visual impairments and dynamic changes of the physical environment. The dynamic reflections can be simulated by playing video sequences with reflections in the simulator window. To provide independence of the host operating system we will implement a Java based version of the Mobile Impairment Simulator, so that it will be possible to run it on any operating system.

ACKNOWLEDGEMENT

This work was partially funded by the EC FP7 project ACCESSIBLE - Accessibility Assessment Simulation Environment for New Applications Design and Development, Grant Agreement No. 224145.

REFERENCES

[1] W3C WCAG 2.0 Guidelines (accessed 20.7.2011) <http://www.w3.org/TR/WCAG20>
[2] W3C WCAG 2.0 principles (accessed 20.7.2011) <http://www.w3.org/TR/UNDERSTANDING-WCAG20/intro.html#introductionfourprincs-head>
[3] W3C MobileOK Checker (accessed 26.7.2011) <http://validator.w3.org/mobile/>
[4] Cocoa framework (accessed 2.8.2011) <http://developer.apple.com/technologies/mac/cocoa.html>
[5] Mobile Impairment Simulation tool (accessed 2.8.2011) <http://cent.felk.cvut.cz/hci/accessible/www/index-mis.htm>
[6] Android Emulator (accessed 2.8.2011) <http://developer.android.com/sdk/>
