Enhancing Computer Accessibility for Disabled Users – A Kinect-Based Approach for Users with Motor Skills Disorder

Loay Alzubaidi, Ammar Elhassan, Jaafar Alghazo
College of Computer Engineering and Science, Prince Mohammad Bin Fahd University
Email: [email protected]

Abstract— In this paper, we present an application that addresses problems encountered by users with motor skills impairment when using PCs. The application utilizes the Microsoft Kinect sensor and its Visual Studio SDK to interact with this novel device, originally intended for gaming but now increasingly popular in learning, multimedia, and entertainment systems. Preliminary results from prototype testing show that the system is usable and has good potential. The initial domain of the application is teaching the Muslim holy book (Quran), although the ideas and application software can be adapted as a learning tool for students with disabilities in general.

Keywords— 3D Gestures; Audio; HCI; Hands-Free; Kinect; Motor Skills Impairment; Quran; Recitation; Video.

I. INTRODUCTION

Computer users with disabilities are being offered more and more tools, both hardware- and software-based, to facilitate the use of computers and electronic systems. Poorly designed systems are no longer accepted, and a whole area of computer science (Human Computer Interaction - HCI) is now dedicated to designing ergonomic, user-friendly interfaces for all users, including those with disabilities. In this paper, which is part of a one-year research project on developing tools and technologies for teaching disabled users to read the Muslim holy book (Quran), we introduce a software application that addresses difficulties faced by users with disabilities. The initial phase of this research project targets children with motor skills impairment. The application uses, as learning aids, the now popular array of sensors and kits developed for the games industry. Due to the popularity of SDKs and the abundance of Visual Studio development expertise for the Microsoft platform, we utilize the Microsoft Kinect sensor, originally developed for the Xbox 360 games console, to take advantage of its support for movement, voice, and gesture recognition in application development [1]; the sensor's ability to detect movement from up to 20 human body points, such as the head, hands, knees, and feet, is quite useful for the purposes of this work.

A. Related Work
There is a good volume of work in this area today. Some research has resulted in redesigned PC peripherals suitable for users with disabilities; e.g., Jang et al. [2] introduced a novel mouse suitable for PC users with physical impairment. Preliminary indicators were that users performed better with the customized mouse than with the traditional standard mouse available in the marketplace. Other work has been carried out on the use of Kinect in different HCI applications; R. Francese et al. [3] present two systems designed for 3D gestural user interaction on 3D geographical maps. Their proposed idea relies on the Kinect as a key component for detecting 3D gestures in a Human Computer Interaction application for 3D geographical maps. Other work includes various games developed for children with disabilities, such as the Super-Pop Project, which developed a game for children with motor skills impairment aided by the Microsoft Kinect sensor [4]. F. P. Martin et al. [5] presented a paper outlining MAS, a flexible and scalable software platform designed to help people with disabilities. The software integrates technologies that enable users to control adaptive games designed to explore, measure, and develop the social and cognitive skills of children with disabilities. N. Baloian et al. [6] presented a paper on modeling educational software for people with disabilities. The paper described common aspects of and differences in the process of modeling real-world applications involving tests and evaluations of cognitive tasks for people with reduced visual or auditory cues.

This work is part of a research project funded by the NOOR Foundation at Taibah University, Madinah, Kingdom of Saudi Arabia (http://www.nooritc.org).

J. Small et al. [7] presented a paper on Web accessibility for people with cognitive disabilities. The study investigated individuals with developmental cognitive disabilities (DCD) navigating W3C accessibility-compliant websites, and concluded that the (2005) Web accessibility guidelines do not sufficiently address the needs of people with cognitive disabilities. A. Anderson and C. Rowland [8] presented empirically derived draft technical specifications for a suite of tools to evaluate the cognitive load of Web pages. S. Baqai et al. [9] presented a vision for leveraging the emerging role of Semantic Web technologies to provide efficient and flexible means of knowledge modeling, storage, publishing, reasoning, and retrieval from distributed Quranic knowledge sources. Finally, H. J. Hsu [10] presented a paper on the potential of Kinect in education as an interactive technology and discussed how it can facilitate and enhance teaching and learning.

The Kinect SDK is a developer toolkit, a set of software libraries, for developing applications that utilize the Microsoft Xbox 360 camera/motion games sensor. The SDK provides an excellent programming interface to the Kinect system. The Kinect for Windows SDK includes drivers that interact with the hardware and provide a set of API functions for reading data and status from the camera, sensors, microphone, and motor. The SDK supports the Microsoft Visual Studio programming languages, including C# and XAML.

This paper presents an example of the design and implementation of a complete system utilizing current state-of-the-art technologies to serve as an interactive tool that enhances the teaching and learning process. The application in this case aims to teach the Holy Quran to students with disabilities; the same concept can be applied to other learning material and other populations of the community. The authors aspire to develop this system to serve as a learning aid for broader use in the teaching community at later stages of the project.

II. OBJECTIVES

The main aim of this project is to design a suite of hands-free applications for educational purposes that allow disabled users to use a PC to learn the Holy Quran (initially). These tools will support a hands-free operation mode for users with motor skills impairment. The Microsoft Kinect sensor and SDK are capable of detecting up to 20 points, or joints, in the human body [11], as shown in figure 1 below. This research project produced an application that enables users to operate the majority of PC functions with their head as the exclusive control point for the entire application, thus targeting users with physical disabilities, particularly those affecting motor skills. Users are able to play audio and video media to listen to and watch Quranic lessons and recitations from various sources while controlling the application with only head motion. The main feature of the application is an ellipse (cursor) that tracks the user's head motion as an alternative control point to the mouse in regular applications. The user is required to hold the ellipse over the selected button for three seconds to perform the click event and play the selected resource.
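The dwell-click behavior described above (holding the head-tracked ellipse over a button for three seconds to trigger a click) can be sketched as follows. This is an illustrative Python sketch of the logic only; the actual application is written in C# against the Kinect SDK, and the class and method names here are our own:

```python
import time

class DwellClicker:
    """Fires a 'click' when the cursor stays inside one button region
    for a full dwell period (three seconds in the paper's prototype)."""

    def __init__(self, dwell_seconds=3.0):
        self.dwell_seconds = dwell_seconds
        self._current_button = None
        self._entered_at = None

    def update(self, button_under_cursor, now=None):
        """Feed the button currently under the head-tracked cursor
        (or None). Returns the button's name when a click fires."""
        now = time.monotonic() if now is None else now
        if button_under_cursor != self._current_button:
            # Cursor moved onto a different button (or off all buttons):
            # restart the dwell timer.
            self._current_button = button_under_cursor
            self._entered_at = now
            return None
        if (self._current_button is not None
                and now - self._entered_at >= self.dwell_seconds):
            clicked = self._current_button
            # Require the cursor to leave and re-enter before the
            # same button can be clicked again.
            self._current_button = None
            self._entered_at = None
            return clicked
        return None
```

On each Kinect frame, the application would pass in whichever button the cursor ellipse currently overlaps (or None), and play the selected resource when a name is returned.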

Fig. 1 Sensing Points Supported by Kinect [11]
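Mapping the tracked head joint (one of the 20 skeleton points in figure 1) to an on-screen cursor amounts to normalizing the head's offset within a small "comfort zone" and scaling it to screen pixels, so that small head motions move the cursor edge to edge. The sketch below is illustrative Python, not the application's C# code, and the comfort-zone ranges and screen size are assumed values:

```python
def head_to_cursor(head_x, head_y,
                   screen_w=1920, screen_h=1080,
                   range_x=0.20, range_y=0.15):
    """Map a head-joint position (meters in skeleton space: x right,
    y up, origin at the sensor) to screen pixels. range_x / range_y
    define the assumed head-movement comfort zone that spans the
    full screen."""
    # Normalize the head offset to [0, 1] within the comfort zone.
    nx = (head_x + range_x) / (2 * range_x)
    ny = (range_y - head_y) / (2 * range_y)  # invert: screen y grows downward
    # Clamp so the cursor never leaves the screen.
    nx = min(max(nx, 0.0), 1.0)
    ny = min(max(ny, 0.0), 1.0)
    return (round(nx * (screen_w - 1)), round(ny * (screen_h - 1)))
```

A production version would also smooth the joint positions across frames (e.g., a moving average), which relates to the cursor-stability limitation noted in the conclusion.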

III. APPLICATION COMPONENTS

A. Hardware
The main hardware components used in this project are:
• Kinect for Xbox 360 sensor
• Personal computer with a 2.5-GHz (or faster) processor
• Dedicated Windows 7-compatible graphics card that supports DirectX 9.0c, with 2 GB of on-board RAM
• 4 GB of RAM (6 GB recommended)

B. Software
The main software components used in this project are:
• Microsoft Visual Studio, including C# and XAML
• Kinect SDK for Visual Studio
• Windows 7 or Windows 8 as the host development and testing operating system

C. Application Interface Hierarchy
The application interface hierarchy features the following options, as illustrated in figure 2 below.

Start and Configure: These use cases are exclusive to the Admin, who can be a helper to the disabled user, such as a parent, sibling, or classmate. The system's complexities are hidden so that the admin tasks can be performed by anyone able to use a PC running Microsoft Windows. These tasks include copying the files to the user's workstation, plugging the Kinect sensor in via USB, restarting the application, or rebooting the system altogether.

Mode: The user selects the Audio or Video option. The system plays only the 'amma part within these two categories. While the Video option plays the common Hafs-Asim recitation, the Audio option supports all seven recitations. Other parts of the Quran and other recitations shall be added as development progresses.

Application Interfaces: The application consists of three menus:

a) The Main menu allows the user to select Audio or Video modes (Figure 4).

Fig. 2 Application Interface Hierarchy

D. Use Case Diagram
The main functions of the system are shown in figure 3 (Use Case Diagram) below, which shows three actors and eight use cases. Use cases represent the main functionality categories that the system supports; some of the important use cases are explained here:

Fig. 4 Application Main Page

b) The Video sub-menu allows the user to watch Quran recitations from the 'amma part. By selecting a certain surah (chapter), the user starts the learning/teaching video (Figure 5).

Fig. 5 Video Sub-Menu (‫)اﻟﻤﺮﺋﻴﺎت‬

Fig. 3 Use Case Diagram

c) The Audio sub-menu allows the user to access the seven types of recitations (Figure 6) and, from those, listen to a chapter from the 'amma part in the selected recitation.

Fig. 6 The Audio Sub-Menu with the Seven Recitations (‫)اﻟﺼﻮﺗﻴﺎت‬

IV. RESULTS

A complete system was designed and developed that enables users with physical impairment to interact with an application for learning the Holy Quran. The customized prototype was tested on a selected user population to provide insight into the expected behavior and usefulness, or lack thereof, of the final product.

Pilot Study
Although no field tests were conducted with disabled users, lab testers emulated the targeted use of the system by using their head exclusively to control and operate it. The usability test results show that the application meets its intended use and provides a hands-free multimedia environment for disabled users to learn the Holy Quran. The results indicate that all tasks were 100% completed by both able and disabled users. The satisfaction rate for each task level (which indicates the difficulty of the task) is 100% for able users and approximately 57% for disabled users, while the test-level satisfaction (which measures the user's impression of the overall ease of use) remained 100% for able users and approximately 73% for disabled users. Zero errors were recorded for both user groups on all tasks. The results of the usability test are shown in table 1 below. The task time (the duration a user takes to complete the task) shows that, as expected, it always takes longer for disabled users to complete a task. Finally, usability problems causing issues or delays occurred repeatedly for disabled users but never for able users. We note here that the disabled users operated the system in full, including the hands-free function, while the able users used the standard hand-operated mouse with the system.

Table 1 Usability Metrics – Empirical Results

Task: Select Audio Option
Metric                                        Disabled User   Able User
Completion Rate                               1               1
Usability Problems                            1               0
Task Time (seconds)                           5               2
Task Level Satisfaction (10 worst, 1 best)    4               1
Test Level Satisfaction (10 worst, 1 best)    3               1
Errors                                        0               0

Task: Select Chapter 113 followed by Chapter 110
Metric                                        Disabled User   Able User
Completion Rate                               1               1
Usability Problems                            3               0
Task Time (seconds)                           18              5
Task Level Satisfaction (10 worst, 1 best)    7               1
Test Level Satisfaction (10 worst, 1 best)    3               1
Errors                                        0               0

Task: Navigate back to Main menu
Metric                                        Disabled User   Able User
Completion Rate                               1               1
Usability Problems                            0               0
Task Time (seconds)                           4               2
Task Level Satisfaction (10 worst, 1 best)    2               1
Test Level Satisfaction (10 worst, 1 best)    2               1
Errors                                        0               0
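The 57% and 73% satisfaction figures quoted above can be reproduced from the disabled-user scores in Table 1 if each 1-10 score s is read as a satisfaction of (10 - s)/10 and averaged over the three tasks. This mapping is our inference, not one stated by the authors:

```python
def satisfaction_pct(scores):
    """Average satisfaction for a list of 1-10 scores (1 best, 10 worst),
    reading each score s as (10 - s)/10. This mapping is an inference
    from the paper's reported percentages, not the authors' stated method."""
    return round(100 * sum((10 - s) / 10 for s in scores) / len(scores))

# Disabled-user scores from Table 1, tasks 1-3:
task_level = satisfaction_pct([4, 7, 2])   # task-level satisfaction -> 57
test_level = satisfaction_pct([3, 3, 2])   # test-level satisfaction -> 73
```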

It is evident that the system can be operated successfully using the user's head exclusively. Users can switch between the various menu options by maneuvering the cursor to the desired screen and button, as in the screenshots above, and by using the "hold the cursor" option to simulate a click, they have indeed been able to operate the full spectrum of functions successfully.

V. CONCLUSION AND FUTURE WORK

The system was designed and implemented according to the requirements set for this project. The customized system provides a hands-free environment for students with disabilities and serves as a learning aid for teaching the Holy Quran. The same concept can be expanded to teach students other subjects (science and maths). The current system is intended only for users with physical impairment, though it can be used by a wider range of users with disabilities. A few functions can be added to make the system more comprehensive. Some of these will be added in the second iteration of this application, while other modifications depend on sensor technology development and will have to wait until new generations of sensors are developed and released. Additional functions are derived from the challenges faced while testing the system, which include:

• No left/right double clicks, only single clicks
• Cursor stability: cursor movement and control is not as smooth as with a standard mouse
• No drop-down menu support
• No audio control for users with visual impairment/blindness

Statistical usability analysis based on actual field tests of the system, comparing disabled users with able-bodied users, is currently unavailable. The system will be subjected to a comprehensive field test on the target user population, and a thorough statistical analysis of the results will be carried out. The system currently supports only part of the Holy Quran; this can be considered a limitation, yet it is the intent of the authors and developers to cover the Holy Quran in its entirety in future releases. Work on the second edition of the application will commence soon and will include a new requirement, in addition to those listed above: utilizing the Kinect's voice recognition capabilities to enable blind users to exploit the complete capabilities of the system for learning the Holy Quran.

VI. POTENTIAL OF THIS SYSTEM

Even though this system is initially designed as a customized interactive environment for teaching and learning the Holy Quran for students with disabilities, its potential uses are far greater. It is the intention of the authors to set up a multidisciplinary research group drawing on maths, science, education, and linguistics to design a comprehensive, interactive system for students with disabilities that serves as a learning aid supplementing and complementing the school curriculum.

VII. REFERENCES

[1] Kinect for Windows. Retrieved on 8 March 2013 from http://www.microsoft.com/en-us/kinectforwindows/
[2] M. Jang, J. Choi, S. Lee (2010), "A customized mouse for people with physical disabilities", The 12th International ACM SIGACCESS Conference on Computers and Accessibility, Oct. 25-27, 2010, Orlando, Florida, USA.
[3] R. Francese, I. Passero, G. Tortora (2012), "Wiimote and Kinect: Gestural User Interfaces add a Natural Third Dimension to HCI", Advanced Visual Interfaces, 2012, Capri Island, Italy.
[4] The Super Pop Project. Retrieved on 8 March 2013 from http://www.engadget.com/2012/12/14/superppproject-ga-tech-kinect/
[5] F. P. Martin, R. C. Palcios, A. Garcia-Crespo (2009), "MAS: Learning Support Software Platform for People with Disabilities", 1st ACM International Workshop on Media Studies and Implementations that Help Improving Access to Disabled Users, Oct. 23, 2009, Beijing, China.
[6] N. Baloian, W. Luther, J. Sanchez (2002), "Modeling Educational Software for People with Disabilities: Theory and Practice", Proceedings of the ACM Conference on Assistive Technologies (ASSETS 2002), Edinburgh, Scotland, UK, July 8-10, 2002.
[7] J. Small, P. Schallau, K. Brown, R. Appleyard (2005), "Web Accessibility for People with Cognitive Disabilities", Conference on Human Factors in Computing Systems (CHI 2005), Portland, Oregon, USA, 2005.
[8] A. Anderson and C. Rowland (2007), "Improving the Outcomes of Students with Cognitive and Learning Disabilities: Phase 1 Development for a Web Accessibility Tool", 9th International ACM SIGACCESS Conference on Computers and Accessibility, October 15-17, 2007, Tempe, AZ, USA.
[9] S. Baqai, A. Basharat, H. Khalid, A. Hassan, S. Zafar (2009), "Leveraging Semantic Web Technologies for Standardized Knowledge Modeling and Retrieval from the Holy Quran and Religious Texts", International Conference on Frontiers in Information Technology, 16-18 December 2009, Pakistan.
[10] H. J. Hsu (2011), "The Potential of Kinect in Education", International Journal of Information and Education Technology, Vol. 1, No. 5, December 2011.
[11] Microsoft Kinect Sensor. Retrieved on 8 March 2013 from http://msdn.microsoft.com/en-us/library/hh438998.aspx
