Experiences on Haptic Interfaces for Visually Impaired Young Children

Saija Patomäki, Roope Raisamo, Jouni Salo, Virpi Pasto, and Arto Hippula
Tampere Unit for Computer-Human Interaction (TAUCHI)
Department of Computer Sciences
FIN-33014 University of Tampere, Finland
+358-3-2158876
{saija, rr, jouni, vp, arto}@cs.uta.fi
ABSTRACT
Visually impaired children do not have equal opportunities to learn and play compared to sighted children. Computers have great potential to correct this problem. In this paper we present a series of studies in which multimodal applications were designed for a group of eleven visually impaired children aged from 3.5 to 7.5 years. We also present our testing procedure, specially adapted for visually impaired young children. During the two-year project it became clear that with careful design of the tasks and proper use of haptic and auditory features, usable computing environments can be created for visually impaired children.
Categories and Subject Descriptors
H.5.2 [Information interfaces and presentation]: User Interfaces – auditory (non-speech) feedback, haptic I/O, user-centered design.

General Terms
Design, Experimentation, Human Factors.

Keywords
Visually impaired children, blind children, Phantom, haptic feedback, learning, haptic environment

1. INTRODUCTION
People with disabilities have problems with computers. Visually impaired persons do not have equal access to the services and benefits that computers can bring. Graphical user interfaces improved the usability of computers in general, but this development had severe consequences for the blind: the dominant feedback channel, graphics, either hinders or totally prevents their use of the computer. Nevertheless, disabled people can benefit from using computers, because computers can be their most important technical aids to learn, to receive information, to use services, and to participate in society. Their problem is the lack of appropriate user interfaces.

We have developed three multimodal learning and play environments which were tested with visually impaired young children. We used the Phantom device [8] as part of an integrated display system called the Reachin System [7]. The aim of the two-year project was to test in practice designs for haptic applications aimed at young visually impaired children. We gathered useful information on features that are characteristic of the Phantom, such as virtual materials, shapes, objects and magnets. We also wanted to gain knowledge about the utilization of virtual space and about navigation and orientation in 3D space. The ultimate goal was to develop useful games and applications that give our special user group the opportunity to become acquainted with computers, so that they are encouraged to use and benefit from the technology later in their lives.

2. USERS
The target group was chosen to be severely visually disabled children. Our users were eleven children from this group, initially of ages 3.5–6.5 years. Distributions of age, gender and impairment of the testing group are shown in Table 1.

Table 1. Distributions of age, gender and impairment in the testing group (November 2002).

  age      girls   boys   had residual sight
  3.5        2       2            2
  4          2       0            2
  5          1       1            2
  6          1       0            1
  6.5        0       2            1
  total      6       5            8
All three haptic environments constructed during the project were tested with the chosen test group. In all three phases the user group remained almost the same. Eleven children (Table 1) tested the haptic elements in phase one (October–November 2002); two of them were pilot testers and nine were actual testers. In phase two (May–June 2003) the children were from four to seven years old; eight of the ten were actual testers and two were pilot testers. In the third phase (November–December 2003) the children were aged from 4.5 to 7.5 years; of the eleven testers, one was a pilot tester and ten were actual testers. In addition, before these tests the third environment was pilot tested with three sighted children.
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. ICMI’04, October 13–15, 2004, State College, Pennsylvania, USA. Copyright 2004 ACM 1-58113-954-3/04/0010...$5.00.
The level of impairment varied within the testing group: three children were totally blind and three had fair residual sight with glasses. Altogether eight of the children had partial vision, but only five saw well enough to make any use of visual feedback.
The children formed a heterogeneous test group. Besides their ability to see, they differed in many ways that affect a child's ability to accomplish the tasks. Factors that had an influence were age, gender, the ability to concentrate, and other skills such as fine motor control. For example, a mature pen grasp proved to be a crucial skill when handling the device. Since blind children often have other disabilities besides the visual impairment alone, their impact cannot be ignored. The parents' activity in training the child's skills and the structure of the family, for example the number of siblings, may also have an impact on the child's development. It is obvious that all of these numerous factors could not be fully taken into account in analyzing the results.

3. MULTIMODAL ENVIRONMENT
We used a Reachin Display [7] that includes a Phantom Desktop device [8] and a display with stereoscopic CrystalEyes 3D glasses. The Phantom is a haptic interface that senses six degrees of freedom and generates accurate force feedback to simulate touch. It is operated with a pen-like stylus attached to a robotic arm that generates the force feedback (Figure 1).

Figure 1. A child is using the Phantom device in the test.

The Reachin Display system utilizes a dual-processor computer. A Magellan space mouse was used as another input device. The device consists of one large handle and several smaller buttons.

4. TESTING PROCEDURE
During the project, which lasted for two years (2002-2003), an established testing procedure was developed. Tests were carried out in the usability laboratory. The laboratory was composed of two rooms separated by a one-way transparent mirror. In the control room the test organizer observed and videotaped the test, while in the test room the child used the device with the guidance of the test assistant. In the test room there was also one parent, who was asked not to communicate with the child. The presence of the parent was meant to support and comfort the child, who was working with a device in a strange environment. Both the control room and the test room were introduced thoroughly to the child before each session. Usually the children wanted to study the environment by touching it with their hands to feel secure and to orientate themselves.

A test started with an informal chat with the child and the parent to gather preliminary data and to give safety and confidence to the child. Next there was an interview. In the last two phases the child was asked for some details about the device, the former applications, and the previous test arrangements. Then the purpose of the test, the observation and the videotaping were explained to the parent at a general level. The parent was told that the test could be discontinued whenever needed. It was particularly emphasized that the goal of the test was to test the application and the device, not the child. Some of the virtual objects in the application were brought alive by constructing similar tangible objects of real materials. We called these touchable hand-made objects real-world models. The purpose of these models was to ease the child's comprehension of the virtual objects. The test began with the exploration of the real-world models if they were in use in that particular test. The tangible model was examined first with the hands and then with a plastic stylus (seen in Figures 2 and 3) to get used to the idea of touching objects with a stick, which is a crucial working method when using the Phantom as the haptic device. The materials, the parts of the space and the directions were also shown.

The child got familiar with the machinery, i.e., the stylus and the button, before the actual testing took place. The children practiced the use of the devices with a training application that was a simple static virtual user interface. The very same user interface was in use in the actual testing, but augmented with tasks. With the training application the child was able to practice navigation and the use of the machinery, and he or she got familiar with the different parts of the virtual space. The real-world models and/or the training application were in use in all test phases. In the first phase the tangible paths and materials were similar to the virtual ones in the application. In the second phase the real-world models were not in use but the training application of the user interface was. Finally, in the third phase both the real-world model of the user interface and the training application were in use. In each phase practicing was carried out first with the help of an assistant and then independently.

The testing phase followed the training. The goal was that the child would use the device as independently as possible. In each test an imaginative tale was told that was linked to the work with the device. This way the tasks made sense to the child, which supported concentration. Before the tests it was agreed that the assistant would help the child only in a few cases, for example if the device got overheated or the position or the hold of the stylus became too difficult. In practice she had to intervene more often than we had expected, to an extent that varied between the users.
The timing of the breaks was not planned in advance, because we decided that the children could rest whenever they needed to. After all the tasks were done, the child taught the parent to use the device and the application. This way we got to know how the child had learned and comprehended the application and the machinery. Lastly,
the final interviews that focused on details about the application were carried out with the child and the parent.
During the tests the children were given haptic, auditory and visual feedback. In the path tracking task the auditory feedback was simple: only a beeping sound to indicate the position in the space, and music to reward the children for completing the task. The beeping sound moved from left to right as the stylus was moved from left to right, and the pitch of the tone was altered when the stylus was moved upwards or downwards. When the child found their way to the end of the path, the music started. In the tale, the child was guiding baby moles through tunnels to their grandmother's house.
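The position-to-sound mapping described above can be sketched roughly as follows. The function name, coordinate ranges and base frequency are our own illustrative choices, not details of the original implementation:

```python
def audio_feedback(x, y, x_min=-1.0, x_max=1.0, y_min=-1.0, y_max=1.0):
    """Map a stylus position to a stereo pan and a beep pitch.

    Horizontal position controls panning (-1 = full left,
    +1 = full right); vertical position raises the pitch,
    as in the path-tracking task.
    """
    # Normalize coordinates to [0, 1].
    nx = (x - x_min) / (x_max - x_min)
    ny = (y - y_min) / (y_max - y_min)
    pan = 2.0 * nx - 1.0            # -1 (left) .. +1 (right)
    base_hz = 440.0                 # invented base frequency
    pitch = base_hz * (2.0 ** ny)   # one octave higher at the top
    return pan, pitch

# A stylus at the horizontal centre, bottom of the space,
# beeps centred at the base pitch:
pan, pitch = audio_feedback(0.0, -1.0)
```

Continuously updating such a mapping as the stylus moves gives the child a non-visual sense of position in the plane.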
During the tests diverse data were collected: the tests were videotaped and log files of the application use were stored. The evaluation of the usability tests was based on the information gathered in both ways. For example, the time sequences that indicated the children's various holds of the stylus or the assistant's interference in the usage were marked down from the videotape to draw conclusions about the usability of the device. Attention was also paid to the children's comments, gestures and expressions. The evaluation of task completion was based largely on the log files, but the videotapes were also used to get an idea of the manner of carrying out the tasks and the degree of assistance the child was provided. From the log files we got accurate information on, for instance, task initiation and completion times, the touching of virtual objects, overheating of the device, and so on.
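Completion times of this kind can be computed from timestamped log events. The log format and event names below are hypothetical, invented only to illustrate the analysis; the actual files were the project's own application logs:

```python
from datetime import datetime

# Hypothetical log excerpt; the real log format is not documented here.
LOG = """\
12:00:01 task_start path_1
12:01:31 touch wall
12:02:45 task_complete path_1
12:03:00 task_start path_2
12:05:10 task_complete path_2
"""

def task_durations(log_text):
    """Return the seconds from task_start to task_complete per task."""
    starts, durations = {}, {}
    for line in log_text.splitlines():
        stamp, event, *rest = line.split()
        t = datetime.strptime(stamp, "%H:%M:%S")
        if event == "task_start":
            starts[rest[0]] = t
        elif event == "task_complete":
            durations[rest[0]] = (t - starts[rest[0]]).total_seconds()
    return durations
```

For the excerpt above, `task_durations(LOG)` yields 164 seconds for the first path and 130 for the second; other events, such as touches, are simply skipped.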
5. PHASE 1: THE HAPTIC ELEMENTS
In the first testing phase some haptic elements were implemented. The elements consisted of various surfaces and paths. The purpose of the test was to familiarize the children with the Phantom device and to find out whether the device was suitable for them. The idea of the test [6] was to compare the hand-made real-world models with similar virtual ones presented with the computer.

First, the tangible materials – sponge, sandpaper, the backside of a mouse mat and a glass washboard – were explored with the hands, and then with the plastic stick (Figure 2). The real materials were glued on a carton box. After practicing with the real materials, the virtual surfaces were explored one by one and a similar real material was to be found on the box. The artificial surface was on the floor of a virtual room whose walls restricted the area so that the stylus could not get lost. In the story, the rooms were told to belong to the members of the Bear family, and the different materials were explained to be the various carpets with which the bears had wanted to decorate their rooms.

Figure 2. The box of different real-world materials (left) and the virtually implemented sponge material (right).

The second task was to track differently shaped paths (Figure 3). First, the real models were used to help the children understand the paths. The children probed the tangible models with both hands and then with the stick. Afterwards, the children tracked the virtual paths twice, with and without auditory feedback. The path shapes were a direct line, roof-top, saw-tooth and castle wall. The purpose was to keep the stylus at the bottom of the track and to move along the path. The path was engraved in the work surface; the surrounding surface was slippery and the ground surface was coarse. If the stylus accidentally came out of the path, the child had to find the track again and continue following it with the stylus.

Figure 3. A real castle wall path model (left) and a virtually implemented castle wall path model (right).

5.1 Results
The haptic elements were tested with nine children. The easiest material for them was sandpaper (8/9) and the hardest was the backside of a mouse mat (4/9). Sponge and glass washboard were recognized equally well (6/9). Hard surfaces were easier to recognize than soft surfaces, perhaps due to the additional auditory feedback of hard surfaces caused by the mechanics of the Phantom device. These sounds imitate the real ones fairly accurately. The children were also noticeably more eager to touch the hard materials than the soft ones.

The easiest paths to track were the direct line (6/9) and roof-top (5/9) paths. The most difficult were the saw-tooth and castle wall paths (3/9), because the children had difficulties particularly in coping with the sharp corners of the path.

5.2 Lessons learned
Since the children got stuck in the corners of the line, we decided to try out rounded corners and wider paths. The rough base material of the path may also have affected tracking, because the stylus slides more easily on a smooth surface. At first the surfaces in the application were flat and had little friction, but the pilot testing showed that such surfaces were hard to distinguish from the mechanical stop felt when the stylus reaches its physical limits. The surfaces were made rougher and friction was increased before the first test phase. The new surfaces gave more haptic feedback, but some of them were made a bit too grainy and the users got stuck when trying to move. These grainy surfaces were smoothened before the second test phase, and because the surfaces thereby became more similar to each other, friction was used more to give them a different feel. This solution was also used in the third testing phase.

It was noticed that an essential factor in successful tracking of the paths was the correct pen grasp and the correct position of the stylus. This was hard for the majority of the children, mainly because their fine motor control was not at a sufficient level for the Phantom device. Some hardships with the device resulted from ergonomics, because the device is designed for adult users.
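An engraved path of this kind can be thought of as a groove in a height field: points within a given distance of the path's polyline are lowered. The sketch below uses invented dimensions and is only an illustration of the geometry; the original surfaces were authored in the Reachin environment:

```python
def groove_depth(p, path_points, width=0.02, depth=0.01):
    """Depth of the engraved track at a 2-D point p.

    Inside the groove (within width/2 of the polyline) the surface
    is lowered by `depth`; elsewhere it stays flat. Widening `width`
    corresponds to the wider, easier paths tried after the first tests.
    """
    def seg_dist(p, a, b):
        # Distance from p to the line segment a-b.
        ax, ay = a; bx, by = b; px, py = p
        vx, vy = bx - ax, by - ay
        length2 = vx * vx + vy * vy
        t = 0.0 if length2 == 0 else max(
            0.0, min(1.0, ((px - ax) * vx + (py - ay) * vy) / length2))
        cx, cy = ax + t * vx, ay + t * vy
        return ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5

    d = min(seg_dist(p, a, b) for a, b in zip(path_points, path_points[1:]))
    return depth if d <= width / 2 else 0.0

# A castle-wall path as a polyline (coordinates are illustrative):
castle = [(0.0, 0.0), (0.2, 0.0), (0.2, 0.1),
          (0.4, 0.1), (0.4, 0.0), (0.6, 0.0)]
```

Rounding the polyline corners with short arcs would remove the sharp inner angles where the children's styluses got stuck.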
In the tests it became clear that the given visual feedback did not necessarily ease the tasks for the children who could see something. As half of the visually impaired children could see something, the visual feedback dominated, which reduced their concentration on the tactile sensation. In addition, as the visual feedback was given on the Reachin display without the mirror, the position of the monitor caused the visual feedback to differ somewhat from the haptic feedback, which in some cases made the children move the stylus in the wrong direction. The visual feedback is meant to be viewed through a transparent mirror, but we had to remove the mirror for the safety of the children: they could not see it, would not have benefited from it, and there was a risk of accidents if it were present. In the normal usage situation the mirror makes the visual feedback congruent with the haptic feedback.
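The incongruence can be pictured as a rotation between the haptic workspace frame and the displaced monitor's frame. The sketch below is purely illustrative: the 45-degree angle stands in for the mismatch introduced by removing the mirror, and the real calibration is handled inside the Reachin system:

```python
import math

def haptic_to_screen(p, angle_deg=45.0):
    """Rotate a haptic-workspace point about the x-axis into a
    monitor frame tilted by angle_deg. With a zero angle the two
    frames coincide and visual and haptic feedback are congruent.
    """
    a = math.radians(angle_deg)
    x, y, z = p
    return (x,
            y * math.cos(a) - z * math.sin(a),
            y * math.sin(a) + z * math.cos(a))

# With no tilt, a point maps to itself:
assert haptic_to_screen((1.0, 2.0, 3.0), angle_deg=0.0) == (1.0, 2.0, 3.0)
```

With a nonzero angle, an "upward" movement on screen no longer matches an upward movement of the stylus, which is one way to understand why the children sometimes moved in the wrong direction.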
6. PHASE 2: THE GAME ENVIRONMENT
In the second phase we designed a more game-like application. All the instructions needed for the game were given by the characters of the story. The tale taught the children about the seasons. The voices of brother, sister, mother and father Badger and Artsi the ant were played by five different people. The spoken lines were recorded beforehand and then added to the application. In the story the children walked together with the agent ant from season to season around the year. The members of the Badger family lived in four rooms that represented the four seasons. The Badgers asked the children to do various chores in their rooms.

In this phase visual feedback was not used, but haptic and auditory feedback were present. Note that the children did not see the graphics of Figures 4-7. There was a rich variety of sounds that included music, the voices of the characters, and the sounds of nature and of different objects. The sounds of nature were wind, rain, thunder and birdsong. The sounds of objects included, among other things, clock, sewing machine, balloon, and door sounds. Another new feature in the game environment was magnetic objects. With the magnetic objects it was possible for the agent to grab the stylus and transport it from one place to another, and to make areas that attract the stylus. The magnetic objects were supposed to ease navigation in the virtual environment. Besides the Phantom device, a separate physical button was in use. With the button the child could open doors in the virtual world and make selections.

In the user interface there was one room with four doors (Figures 4-7). The room was surrounded by a corridor through which the navigation from door to door took place. In the corridor a sound typical of the current season was played. The content of the room depended on which door was used, and when the child was in the room, only the door entered was available for exit. Navigation and the opening of doors in this room-corridor user interface were practiced first with a training application.

In the first room (Figure 4) the aim was to find an alarm clock that had fallen to the floor. A ticking sound that moved horizontally between the loudspeakers following the stylus guided the user to the clock: when the stylus was on the right side of the clock, the ticking was heard from the left, and the other way around. On the floor there were various obstacles, furniture that the child might bump into. The furniture had to be bypassed by going around it with the stylus.

Figure 4. The first task was to find the alarm clock. It is the small rectangle in the upper left hand side of the room.

The second room (Figure 5) contained a snow tunnel. The goal was to find the mailbox through the tunnel and then to get back to the starting point. There were two different routes to choose from.

Figure 5. In the winter room the goal was to move along the tunnel and to find the mailbox.

The third task was to compare various virtual surfaces (Figure 6). In the task there were three pieces of fabric on the floor: one large piece and below it two smaller pieces. From the small pieces the child was to choose the fabric that felt the same as the large piece. The fabrics were replaced with new ones when the correct one was found. This was repeated four times with different samples of surfaces.

Figure 6. The purpose in the third task was to find the equivalent fabrics.

In the fourth room (Figure 7) there were four balloons and the task was to break them. The balloons were in the center of the room, and first the child had to find them. A balloon gave a creaky sound when the stylus hit its surface. When the stylus was under a balloon it had to be drawn toward the user, and when it was over a balloon it had to be pushed away.
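The balloon interaction amounts to a contact test plus a break direction that always points through the balloon's centre: from below that means pulling toward the user, from above pushing away. The following is a simplified sketch with invented geometry and names, not the original application code:

```python
def balloon_feedback(stylus, balloon, radius=0.05):
    """Contact test and break direction for a balloon task.

    Returns (touching, push): whether the stylus touches the
    balloon surface (trigger the creaky sound), and the unit
    vector from the stylus through the balloon's centre, i.e.
    the direction the child must push to break it.
    """
    d = [b - s for s, b in zip(stylus, balloon)]
    dist = sum(c * c for c in d) ** 0.5
    touching = dist <= radius
    push = tuple(c / dist for c in d) if dist > 0 else (0.0, 0.0, 0.0)
    return touching, push

# Stylus just under a balloon centred at the origin:
touching, push = balloon_feedback((0.0, -0.03, 0.0), (0.0, 0.0, 0.0))
```

In this sketch the push direction for a stylus under the balloon points upward through the centre, matching the rule the children had to discover.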
Figure 7. In the final room there were balloons that had to be found and then broken by pushing them.
6.1 Results
For this user group all the tasks in the game environment were too hard, since the children had no visual feedback. The first task was especially difficult: none of the children found the alarm clock. The location of the clock was not easy, because it was far from the door on the other side of the room, and the virtual obstacles also hindered the search. A further difficulty was linked to the 45-degree angle of the haptic device, which made the stylus slide downwards due to gravity while the clock was upwards.

In the second room only one child found the mailbox by herself. The return trip was made independently by three of the children. The return was easier to cope with because the stylus slides downwards partly by itself due to the 45-degree angle of the haptics. When the stylus was in a corner of the path, the child had to come up with the idea that the continuation of the path had to be found by moving the stylus in different directions.

In the third task, where the fabrics had to be recognized, merely one or two of the children succeeded in each part of the task. The children had difficulties in exploring the three materials on the floor because they obviously could not piece together where they were in the room. The other problem was to hold the stylus on the material and press the button at the same time.

Since there were eight users, the balloons had to be broken altogether 32 times. Only three balloons were broken during the whole test series. First, the balloons were too hard to find, and second, when the children did find a balloon they usually pushed the stylus in the wrong direction.

The room-corridor user interface model proved to work nicely with the children. Half of the children could find their way from door to door through the corridor effortlessly. The doors were also easy to open with the separate button. Finding and opening the doors were successfully facilitated with the magnet feature: the door magnet attracted the stylus and held it against the door while the button was pushed. The children were also excited about the lively characters in the application. Although the lines of the characters were quite long, most of the children were not bored. On the contrary, the children listened to the characters patiently and with great interest. The lines were entertaining without leaving aside the informative value of the speech. Some of the children even tried to talk back to the characters, but sadly the characters could not reply. In future designs speech recognition could augment the fantasy world.

In the tests it became evident that the implementation was realistic enough in creating the feeling of virtual surfaces, because the children constantly tried to touch the virtual objects with their free hand beneath the robotic arm. One partially sighted child even looked under the stylus several times and obviously could not believe that in reality there was nothing. The children also tried to reach for the materials in the third room.

6.2 Lessons learned
In the tests of the game environment it became clear that we had succeeded in designing a room-corridor navigation interface that was usable for the children. Although the idea of a single room whose content depends on the door used may seem confusing, the concept actually worked very well. In the tests the children did not express in any way that a single room instead of many would have been confusing for them. In addition, this model may be clearer for the blind than for sighted people, because the interaction is based purely on haptic and auditory feedback and they may perceive the rooms as different.

Since in this phase we did not give the children anything but haptic and auditory feedback, the partially sighted children were seeking something else to look at in their surroundings. This is because sight is a dominant sense for a human being. We conclude that visual feedback should be included if the child can benefit from it.

With the various features that the Phantom offers it is possible to ease the handling of the device. The magnets turned out to be very handy when we wanted to transport the child to a required place or to help the child hold the stylus in a certain place. Separate physical buttons were used instead of virtual buttons. Both the large handle and the small buttons of the Magellan worked very well with the Phantom.

One thing that puzzled us in the game environment was that, in the end, we could not be sure how the children comprehended the three-dimensionality of a virtual room. As we wanted their notion of the virtual room to be as similar as possible to that of sighted people, we made a real model of the space. The tangible model was made from carton (Figure 8). The children got familiar with the user interface first with the real model using the plastic stylus, and then with the training application.

7. PHASE 3: THE LEARNING ENVIRONMENT
The same interface model was kept in the learning environment. This time the user interface was implemented with six doors and the tasks inside the rooms were changed.

The theme of the learning environment was the recognition of animals. Since the tasks in the game environment had been too difficult for these children, we deliberately designed easier tasks in this phase.

Figure 8. The real model of the navigation interface of the learning environment was made from carton.

In addition to haptic and auditory feedback we decided to give visual feedback, since in the previous tests it was noticed that the partially sighted children wanted it. As we knew from the experiments of the first phase that the Reachin display arrangement could not be used, we used a separate display placed on the left-hand side of the Phantom (Figure 1). Seven of the children could see something, but only five of them benefited from the visual feedback in these tests.
In the first room (Figure 9) the task was simply to concentrate on listening to information while holding the stylus gently as automatic transportation took the child from place to place. The child was brought in turn to the nests of a mouse, a squirrel and a capercaillie, and each place was accompanied by information on the animal. With this task the child got accustomed to moving along with the transportation magnet.
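A transportation magnet can be modelled as a spring-like attraction toward an anchor point that is moved along a route, dragging the stylus from nest to nest. The sketch below is our own illustration with invented constants; it does not use the Reachin API, which provided the real behaviour:

```python
def magnet_force(stylus, anchor, k=80.0, f_max=3.0):
    """Spring-like attraction pulling the stylus toward an anchor.

    k is a stiffness and f_max a safety cap on the force magnitude,
    both invented for illustration. Moving the anchor slowly along
    a path transports the stylus with it.
    """
    f = [k * (a - s) for s, a in zip(stylus, anchor)]
    mag = sum(c * c for c in f) ** 0.5
    if mag > f_max:                      # clamp to a safe maximum force
        f = [c * f_max / mag for c in f]
    return tuple(f)

# Far from the anchor the force saturates at f_max:
fx, fy, fz = magnet_force((0.0, 0.0, 0.0), (1.0, 0.0, 0.0))
```

Clamping matters in practice: a child who grips the stylus tightly, as two of the children here did, should feel a firm but bounded pull rather than an unbounded spring force.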
Figure 9. In the first room, magnets slid the stylus automatically around the room.

The aim of the second task (Figure 10) was to find the animals hidden in the room. The purpose of the task was to find out whether the children had comprehended the notion of the room and were able to point at the floor, the ceiling and the walls with the stylus. The stylus had to be pressed against a surface for a short moment, after which the picture of an animal was shown and the sound and the poem of the animal were played.

Figure 10. The light areas on the sides of the second room reacted to the contact of the stylus.

The room of sounds (Figure 11) was the third place the children visited. On each side of the room there were four areas that played sounds when they were touched with the stylus; thus, there were altogether 24 different sounds in the room. For example, the floor had sounds of nature, the upper wall had sounds of vehicles and the right-hand wall had sounds of instruments. The children were encouraged to find as many sounds as possible and to wander around freely in the room. The aim of this task was to find out which areas of the 3D space dominated.

Figure 11. When touched with the stylus, the rectangular areas played sounds in the third room.

The task in the fourth room (Figure 12) was to place ten animals, based on the sounds they made, on the farm or in the forest. For example, when a dog barked, the child had to decide whether the animal belonged to the domestic animals or to the animals living in the forest. To place an animal on the farm the stylus had to be moved to the right, and to place it in the forest the stylus had to be moved to the left. If the child chose the wrong side, he or she was advised to try again. By pressing the separate button the sound of the animal could be heard again. When the stylus was moved to the correct place, the picture of the animal appeared and the child got information on the animal in question.

Figure 12. Based on the sound, the animals had to be sorted into farm or wild animals in the fourth room.

In the fifth room (Figure 13) the task was to choose the requested material from two surfaces on the floor of the room. The child had to find the material that matched a given adjective. There were four pairs of materials to be recognized one by one. For example, in the first part of the task the child was asked to find the soft surface when the offered materials were sponge and glass washboard. The child had to explore both materials and then choose one or the other. Choosing a material took place so that the stylus had to touch the material to be chosen before the button was pressed.

Figure 13. In the fifth room the children explored materials with the stylus.

In the last room of the learning environment there was no special task. The room simply rewarded the children with the Ti-Ti the Bear song for the work they had done with the device and the application.

7.1 Results
The first task, where the automatic transportation was in a key role, went very well with eight of the children. Two of the children instead constantly ripped the stylus from the magnet, because they could not concentrate and the stylus was also too hard for them to handle.

In the second task the children had to touch each side of the room with the stylus. The easiest sides to find were the floor (9/10) and the lower wall (8/10). The most difficult sides to find were the right-hand wall (3/10) and the left-hand wall (4/10). The upper wall and the ceiling were found equally well (6/10). The results of the room of sounds (the third room) are in line with these. Figure 14 illustrates that the lower part of the space dominated strongly. All the test subjects were right-handed, which can also be seen in the
picture. Interestingly, on the ceiling the upper parts were used more often than the lower parts of the surface. These findings on space utilization can be used when planning the placement of virtual objects in virtual environments.
If the pen grasp covered at least approximately 90% of the test time, working with the Phantom ran smoothly.

Table 2. The number of overheating warnings per test subject and the proportion of the test time during which the device was grasped like a pen.

subject     1    2    3    4    5    6    7    8    9   10
gender      F    F    F    F    F    M    M    M    M    M
warnings    0    6    0    5   10   20  107   14   53   45
pen grasp  90%  89%  98% 100%  2%   9%  11%   1%  15%  11%
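The gender split in Table 2 can be summarized directly from its raw numbers; a quick sketch over values transcribed from the table (variable names are ours):

```python
# Warnings and pen-grasp proportions transcribed from Table 2
# (subjects 1-5 were girls, 6-10 boys).
warnings  = {"F": [0, 6, 0, 5, 10], "M": [20, 107, 14, 53, 45]}
pen_grasp = {"F": [90, 89, 98, 100, 2], "M": [9, 11, 1, 15, 11]}  # % of test time

summary = {
    g: {
        "warnings range": (min(warnings[g]), max(warnings[g])),
        "median pen grasp %": sorted(pen_grasp[g])[len(pen_grasp[g]) // 2],
    }
    for g in ("F", "M")
}
# The girls' warning counts (0-10) do not even overlap the boys' (14-107),
# and the median pen-grasp proportions (90% vs. 11%) differ just as sharply.
```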
It is also known that the majority of visually impaired children are multiply disabled, and multiple disabilities are more common for boys than for girls. Our purpose was to test the applications with children who are only visually impaired. However, since diagnoses of other possible disabilities are hard to establish before school age, we cannot be sure that the visual impairment is the only handicap of every test subject. Our guess is that this striking difference between the genders in handling the device might also result from undiagnosed disabilities.
Figure 14. Different areas touched in the room of sounds. The darker the area, the more often it was touched.
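A picture like Figure 14 can be produced by simply histogramming contact points: each wall gets a coarse grid, and every stylus contact increments one cell. A sketch of that bookkeeping (the class name and the grid size are our choices, not part of the original applications):

```python
from collections import Counter

class TouchHeatmap:
    """Accumulate stylus contacts per wall into a coarse grid.

    Cells with high counts correspond to the dark areas of a plot like
    Figure 14; comparing walls shows which parts of the 3D space dominate.
    """

    def __init__(self, cells=4):
        self.cells = cells
        self.counts = Counter()  # (wall, row, col) -> number of contacts

    def record(self, wall, u, v):
        """Register a contact at wall coordinates u, v in [0, 1)."""
        row = min(int(v * self.cells), self.cells - 1)
        col = min(int(u * self.cells), self.cells - 1)
        self.counts[(wall, row, col)] += 1

    def hottest(self, n=1):
        """Most-touched cells, i.e. the darkest areas of the plot."""
        return self.counts.most_common(n)
```

Logging every contact event through `record` during a session and then calling `hottest` yields the dominance ordering discussed in the results, without storing raw motion traces.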
7.2 Lessons learned
With careful task planning that takes the benefits and constraints of the Phantom into account, it is possible to make usable interfaces for young visually impaired children. For example, from phase two to phase three the navigation interface remained almost the same, but the tasks were eased and redesigned. Figure 15 shows that the children's independent working time with the device increased in most cases: they did not need the assistant's guidance to the same degree as in phase two. Test subjects seven and eight had not completed the tasks in the game environment, which is why they have a value only for the learning environment.
8. DISCUSSION
During the tests it became clear that the children's level of fine motor development affected how well they could track the paths. The ability to concentrate both on the instructions and on working with the device also played an important role. These factors largely determined whether the Phantom was suitable for a child or not.
As our subjects were children, their level of linguistic development was a crucial factor in how well they succeeded in exploring the materials. If a child already had a concept for a certain material in his or her mind, it was more likely that he or she would also recognize the virtual material. If things have names, they are easier to remember.
The children also came up with imaginative words that described the surfaces in their minds. For example, the glass washboard was a radiator or a guitar, the mouse mat a drum, and the sponge a trampoline. They naturally produced figures of speech that felicitously corresponded to the haptic elements. So, although the assistant had made up a core story, the children often participated in creating a whole new story or continued the tale as they pleased. It might be wise to offer a virtual world to the children and simply sit back and see where their fantasy guides them. Different tangible buttons and haptic and audio features could end up as particles of their own tales, plays, or games. Tangible Viewpoints [5] is an example of this kind of system, and it could work excellently with blind children.
Figure 15. The independent working time increased when the tasks were purposefully eased.

Unexpectedly, gender seemed to be a key factor in using the Phantom: the girls were distinctly more patient and proficient with the device than the boys. The boys lacked the ability to concentrate both on the given information and on working with the device. There was a clear difference between the boys and the girls in the number of Phantom overheating warnings, which indicate severity of usage, as can be seen in Table 2: the girls got from 0 to 10 warnings, while the boys got from 14 to 107.
Sjöström [10] has made several applications for the Phantom device and has tested them informally with visually impaired people. His study started with games and educational programs aimed especially at blind children, and we used his results when designing our games. He has also identified a few factors that affect the use of the Phantom [9]: firstly, motor skills; secondly, the ability to concentrate; and lastly, the ability to understand the tactile experience, which is a cognitive process. Our findings support his results.
A possible explanation for successful usage was the ability of pen grasp. Table 2 shows that the majority of the girls held the stylus correctly for almost the whole test time.
The most serious constraint of the Phantom is the difficulty of getting a general impression of an object or space. This is caused by the single point of contact the device is based on. Still, according to Yu et al. [11], this one-point method is good at providing kinesthetic rather than cutaneous sensation. When we consider how the notion of a shape is built without visual feedback, motion is the key factor. The problem is that blind people cannot feel the objects systematically, because they cannot know in which direction the stylus has to be moved to touch an object.
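One standard remedy, which the magnet feature of these applications also exploits, is an attractive force that pulls the stylus gently toward a target so the user does not have to guess the direction. A minimal spring-like sketch of such a force (function name, gain, and cap are illustrative assumptions, not the Reachin API):

```python
import math

def attraction_force(stylus, target, gain=0.8, radius=0.05):
    """Spring-like pull of the haptic stylus toward a target point.

    Outside `radius` the force grows with distance (capped at 1.5 N),
    guiding a user who cannot see which way to move; inside `radius`
    the force is zero so the object itself can be explored freely.
    Positions are 3D tuples in metres; the result is a force vector in N.
    """
    d = [t - s for s, t in zip(stylus, target)]
    dist = math.sqrt(sum(c * c for c in d))
    if dist <= radius:
        return (0.0, 0.0, 0.0)  # close enough: release the pull
    scale = min(gain * (dist - radius), 1.5) / dist  # magnitude per unit vector
    return tuple(scale * c for c in d)
```

In a haptic loop this would be evaluated at the device's servo rate and summed with the surface-contact forces; the zero-force zone keeps the magnet from fighting the child's own exploration once the object is reached.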
Jansson and Ivås [4] examined whether a short practice period with the Phantom would improve performance with the device. They found that most of their adult subjects improved, but there were large individual differences in coping with the device. Likewise, in our tests with the children there were large differences between the children as operators of the machine. Some of our test subjects, despite their young age, handled the device very well, whereas others would have needed more practice.

Jansson and Billberger [3] have used the Phantom in experiments on how accurately and how fast small 3D objects can be identified compared with real objects examined by hand. According to their research, grasping an object with bare hands is considerably faster and more accurate for feeling its shape than using the Phantom stylus.

Colwell et al. [1] used Immersion Corporation's Impulse Engine 3000 [2] with blind and sighted subjects. Their conclusion was that blind persons were more discriminating in their judgements of the coarseness of artificial textures. They also noticed that the shapes of objects could be felt but the objects often remained unidentified. The same phenomenon was present in our results.

Yu et al. [11] tested embossed line graphs followed with the one-point touch of the Phantom stylus and suggested that an engraved form should be used at least in line graphs. According to their results, friction and surface texture made embossed graphs easier to distinguish from one another. In our study this result was found to apply also to interfaces for visually impaired children.

9. CONCLUSIONS
In this paper we have presented the design and implementation of multimodal environments for visually impaired children. Our main emphasis was on haptics augmented with auditory and visual stimuli. Physical models were especially useful when testing the haptic elements and virtual environments with the children. During these tests we developed a testing procedure that takes into account the needs of young visually impaired children. The results and the testing procedure will be useful as research and development in this field continue. In the future we are planning to use simpler haptic devices, such as tactile or force feedback mice, with under seven-year-old blind children. Our further research with the Phantom device will be directed at over seven-year-old visually impaired children, since then we can guarantee the level of motor development required by the device.

10. ACKNOWLEDGMENTS
This work was supported by the Nordic Development Centre for Rehabilitation Technology (NUH) and the Academy of Finland (grant 202180). We thank the children and their parents for their help.

11. REFERENCES
[1] Colwell, C., Petrie, H., Kornbrot, D., Hardwick, A. and Furner, S. Haptic virtual reality for blind computer users. Proceedings of ASSETS 1998, ACM Press, 1998, 92-99.
[2] Immersion Corporation. http://www.immersion.com
[3] Jansson, G. and Billberger, K. The Phantom Used without Visual Guidance. The First Phantom Users Research Symposium (PURS 99), Heidelberg, Germany. http://mbi.dkfz-heidelberg.de/purs99/
[4] Jansson, G. and Ivås, A. Can the Efficiency of a Haptic Display be Increased by Short-Time Practice in Exploration? Proceedings of the Haptic Human-Computer Interaction Workshop, University of Glasgow, UK, 2000, 22-27.
[5] Mazalek, A., Davenport, G. and Ishii, H. Tangible Viewpoints: A Physical Approach to Multimedia Stories. Proceedings of ACM Multimedia '02, ACM Press, 2002, 153-160.
[6] Raisamo, R., Mannonen, M., Pasto, V., Salo, J., Patomäki, S. and Hippula, A. Usability testing of haptic and auditory interfaces for visually impaired children. In Craddock, G.M., McCormack, L.P., Reilly, R.B. and Knops, H.T.P. (eds.), Assistive Technology – Shaping the Future (AAATE 2003), Assistive Technology Research Series, Vol. 11, IOS Press, 2003, 979-983.
[7] Reachin Technologies AB. http://www.reachin.se
[8] SensAble Technologies Inc. http://www.sensable.com
[9] Sjöström, C. The IT Potential of Haptics: Touch Access for People with Disabilities. Licentiate thesis, Certec, Lund Institute of Technology, 1999.
[10] Sjöström, C. Non-Visual Haptic Interaction Design: Guidelines and Applications. Doctoral dissertation, Certec, Lund Institute of Technology, 2002.
[11] Yu, W., Ramloll, R. and Brewster, S. Haptic Graphs for Blind Computer Users. Proceedings of the Haptic Human-Computer Interaction Workshop, University of Glasgow, UK, 2000, 102-107.