Paper-based Multimedia Interaction as Learning Tool for Disabled Children

Franca Garzotto and Manuel Bordogna
Department of Electronics and Information, Politecnico di Milano
Via Ponzio 34/5, 20133 Milano (Italy)
[email protected], [email protected]

ABSTRACT

The purpose of our research is to support the cognitive, motor, and emotional development of severely disabled children in the school context. We designed and implemented a set of novel learning experiences that are both low-cost and easily customizable, and combine the visual communication paradigm of Augmentative and Alternative Communication (AAC) with multimedia tangible technology. Using an application framework developed at our lab (called "Talking Paper"), teachers and therapists can easily associate conventional paper-based elements (e.g., PCS cards, drawings, pictures) with multimedia resources (videos, sounds, animations), and create playful interactive spaces that are customized to the specific learning needs of each disabled child. Paper-based elements work as visual representations of the concepts children must learn, as communication devices, and as physical affordances for interacting with multimedia resources. The paper presents the approach and its application in a real school context, highlighting the benefits for both disabled and non-disabled children. The latter were involved as co-designers of multimedia contents and learning activities. Their creative participation favored group bonding and increased tolerance and sense of community in the classroom, so that the overall project became a means for real inclusive education.

Categories and Subject Descriptors

K.3.0 [Computers and Education]: General; H.5.2 [Information Interfaces and Presentation]: Multimedia Systems, User Interfaces

General Terms

Design, Human Factors

Keywords

Disabled children, Learning, Augmentative and Alternative Communication (AAC), Inclusive Education, Paper-based Interaction, Tangibles, Multimedia, Design

INTRODUCTION

Children with severe cognitive, linguistic, and motor disabilities often use Augmentative and Alternative Communication (AAC) aids to communicate and interact with other people [3]. One AAC paradigm is visual communication, which relies on visual symbols and images as language elements. The most widely used low-tech tools for this form of communication are the so-called PCS (Picture Communication Symbols) systems. They employ single tangible picture cards that represent objects, activities, people, and events, as well as more abstract concepts like feelings (e.g., happiness, sadness, disappointment) or relationships (e.g., friendship, before, after). These cards can be hung on the wall, moved around on a table, or collected in a "wallet" that can be carried around; they are used by language-disabled children to form a message, or as supplementary visual prompts by caregivers.

The purpose of our research is to combine this AAC visual communication paradigm with multimedia tangible technology, addressing children with severe cognitive, linguistic, and motor disabilities in the school context. The key idea of our approach is to link multimedia resources (videos, sounds, or animations) to paper-based visual elements (PCS cards, drawings, pictures), and to create playful hybrid interaction spaces that engage children with digital and tangible contents, in order to develop cognitive associations between different representations and to improve language and communication skills.

To validate the appropriateness of this approach, we first carried out a contextual study in a public primary school that hosts both non-disabled students and children with medium-high motor and cognitive impairments, and in a rehabilitation center for language-impaired people. This preliminary work resulted in a deep understanding of children's needs, of teachers' and language therapists' goals, and of their practices and constraints. Together with disability experts, we then designed and implemented a set of learning experiences, using a scenario-based method and an application framework employing RFID technology. Finally, we carried out a pilot evaluation at school, involving severely disabled children for a period of three months.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. IDC 2010, June 9-12, 2010, Barcelona, Spain. Copyright 2010 ACM 978-1-60558-951-0/10/06...$10.00.

A number of studies pinpoint the benefits of tangible interaction for children's learning in general [18, 19, 20, 24, 26, 28, 30] and for children with special needs in particular [14, 16]. Still, very few works explore the effectiveness of this interaction paradigm for children with multiple severe disabilities, or consider the gamut of design factors that contribute to making tangible learning experiences effective for this target group. In particular, to our knowledge the potential of paper-based interaction as a learning means for severely disabled children in a real school context has never been investigated.

"grasp", and "move" an object, as well as confidence in their own actions.

Requirements on Children's User Experience

PCS-based communication
Since PCS are the main form of communication for severely language-disabled children, any experience for communication skills development should make use of this visual language.

A further novelty of our work is that we involved the classmates of disabled children as co-designers of educational experiences. A number of examples exist of participatory design methods that engage children as technology design partners [8, 12]. Still, children typically participate in the design of interactive products for which they are the intended users. In our case, non-disabled children contributed to the design of technological experiences intended for their disabled friends; they acted as proxies and assumed the role that in other design approaches is taken by end users. Besides being original from a research viewpoint, this approach brings a number of benefits for non-disabled children, and represents a way to achieve real inclusive education.

Each child is unique
Cognitive, linguistic, and motor abilities differ significantly even among children diagnosed with the same type of disability, such as autism or cerebral palsy. Each disabled child has a unique set of skills, which may vary over time. An effective learning experience must match the individual's needs in order to develop or augment her abilities. Contents and tasks must be customized to the child's particular profile and tailored to her developmental, linguistic, and cognitive level at each specific time.

THE CONTEXTUAL STUDY

Tangible Interaction

The preliminary contextual study was carried out at the public primary school "Rainbow" in Lodi (Milano) and at the rehabilitation center Benedetta Intino in Milano. The school currently hosts 190 children, 16 of them with severe motor or cognitive disabilities. The therapeutic center is one of the most advanced in our country and assists between 2,000 and 2,500 disabled people every year.

The use of tangible material stimulates multiple senses and promotes the development of cognitive functions and perceptual and bodily skills.

Use of familiar situated material
The learning experience should make use of contents children like and are familiar with, e.g., music, videos, stories, voices of educators, classmates, or relatives, or images of known environments or situations. This makes learning more "situated" and is a means for consolidation, as discussed below.

We visited the school several times and participated as observers in all types of educational activities of both disabled and non-disabled children. At the therapeutic center, we watched many hours of videos showing examples of therapeutic sessions for language development, and discussed with language pathologists the equipment, materials, and procedures they use with disabled children. Finally, we carried out semi-structured interviews and three focus groups involving regular teachers, educators specialized in children with special needs, the school principal, and a language pathologist who assists teachers and children at school.

Consolidation
Most disabled children have limited long-term memory and, in normal education sessions, they have difficulty retaining a learned concept or skill from one session to the next. It is necessary to design activities that on the one hand introduce novelty and create new challenges, and on the other hand include moments in which children repeat and consolidate what they have previously understood. It is crucial to set a proper frequency (e.g., a session two or three times per week), and to unfold the experience over a relatively long period, i.e., several weeks or months. It is also important to use content related to known subjects, or material that children have engaged with before in other school activities or at home, as mentioned above.

Educational Goals

Educational activities for children with severe cognitive, linguistic and motor disabilities attempt to achieve a wide spectrum of benefits:

• Cognitive benefits - to reinforce children's understanding of PCS symbols, to extend their vocabulary, and to improve their capability to form very simple "phrases" and to understand very simple narrative structures [27].

• Affective benefits - to promote interest, engagement, and a positive attitude towards specific educational activities and the school in general, to increase self-esteem, and to build positive relationships with schoolmates.

• Psychomotor benefits - to improve control and coordination of simple movements such as "point to",

Engagement
To foster interest, motivation, and engagement, the educational activities should be perceived more as a game than as a learning or therapy session. Multimedia in all its forms (videos, animations, cartoons, interactive stories) is a powerful ingredient for making an educational experience more fun. In addition, the proposed tasks should be "sufficiently" challenging, without being discouragingly hard or boringly easy; they should progress through increasing levels of difficulty as the experience unfolds,


with new concepts and linguistic elements progressively being introduced, unlocking new opportunities along the way, and stimulating children to reach for the boundaries of their skills.

Users can activate and control the multimedia resources by placing the RFID reader on the corresponding paper-based items.

The main characteristic of our technological approach is that it implements the end-user development paradigm, which promotes the idea of building systems that enable non-programmers "not only to personalize computer applications and to write programs, but to design new computer based applications without ever seeing the underlying program code" [6]. We adopted a software application framework developed at our lab, called Talking Paper, which, in a few clicks, allows even inexperienced users with no hardware or software know-how to configure the resources needed for a learning experience, i.e., to install the RFID reader and to link RFID tags to multimedia resources or to built-in multimedia control behaviors. Talking Paper was originally designed to be used by teachers and "normally functioning" children. Its simplicity and effectiveness for creating learning and play experiences have been tested in a number of primary schools, e.g., to build hybrid storytelling spaces integrating multimedia and paper-based contents. For the purposes of the project reported in this paper, we extended the original version of Talking Paper with a set of functionalities to support the management of a large number of tags and multimedia resources, including proper functions to organize the storage of, and access to, multimedia material and its links to RFID tags.
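The tag-to-resource linking described here can be pictured as a simple lookup table: each tag id maps either to a media file or to a built-in control behavior. The following Python fragment is our own illustrative sketch under assumed names, not the actual Talking Paper code or API:

```python
# Sketch of tag linking and dispatch (hypothetical names, not Talking Paper's API).

CONTROL_BEHAVIORS = {"suspend", "stop", "continue"}

class TagTable:
    """Maps RFID tag ids to media files or built-in control behaviors."""

    def __init__(self):
        self._links = {}

    def link(self, tag_id, target):
        # target is either a media file path (video, sound, animation)
        # or one of the control words above.
        self._links[tag_id] = target

    def on_tag_read(self, tag_id):
        """Decide what the player should do when the reader sees a tag."""
        target = self._links.get(tag_id)
        if target is None:
            return None                 # unknown/untagged card: no feedback
        if target in CONTROL_BEHAVIORS:
            return ("control", target)  # e.g. suspend the running video
        return ("play", target)         # start the linked media item
```

Note that untagged cards are simply absent from the table, so reading them produces no response at all; the scenarios below exploit exactly this behavior to signal wrong answers.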

Concentration and Timing
Most disabled children have a limited capacity for concentration, and it is difficult to keep them focused even on a simple task for a long time. Each activity should be designed to be relatively short (3-5 minutes), and a session of activities should last no more than 30-45 minutes.

Requirements on Technology

Easy tailorability
The strong need for customized intervention for each child implies that technology must be open, highly flexible, and adaptable, to enable the customization of contents and tasks to the child's characteristics and her evolving needs over time. Technology should not be "closed", i.e., offer a frozen set of contents and activities, nor should it be customizable by expert programmers only. Schools typically cannot afford to hire computer specialists for this purpose, and disability specialists are not programmers. Customization should be so simple that inexperienced users (e.g., teachers, educators, therapists) can achieve it autonomously. Technology should be regarded as an open "construction kit" [21] that can easily be filled with new contents and configured to create educational experiences appropriate to the needs of the specific child, structuring different learning resources and tasks as the evolution of the child's abilities makes them necessary.

Our approach meets most of the technological requirements that emerged from the contextual study. By supporting the interactive use of PCS cards and other paper-based visual material linked to attractive multimedia resources, we combine the benefits of tangible interaction with the potential of multimedia for engagement and learning. The Talking Paper system meets the requirements for easy tailorability, because of its framework nature and its tested paradigm of end-user development. In addition, Talking Paper meets the organizational constraints on costs, considering the current prices of RFID readers and tags. Finally, the whole set of hardware (an ordinary PC, the RFID reader, and the self-installation CD-ROM) can easily be moved around.

Portability and Low Cost
Considering the very limited budget of the public school system, technology should be affordable in terms of equipment and maintenance costs, and it should be easily installable in different physical contexts and situations of use, at school or at home.

TECHNOLOGICAL APPROACH

From a technological perspective, the key idea of our approach is to associate physical visual contents with multimedia resources on the screen, and have the child engage with these contents using paper-based interaction. Paper-based elements play multiple roles: they work as representations of concepts children must learn, as visual communication devices, and finally as physical affordances for interacting with the multimedia digital space. They can be physically moved around on a surface, and represent a tangible, easily reconfigurable interface for exploring and controlling multimedia elements on the computer.

DESIGNING THE EXPERIENCE

Effective technology per se provides only the enabling condition for creating a learning experience. What makes it useful and engaging is, obviously, how users work with the technology: the true challenge is to design appropriate contents and tasks that meet children's needs and educators' goals. This experience design work involved all the disability experts who participated in the contextual study and the research group from our lab.

The solution for creating such hybrid interaction spaces is extremely simple and low-cost, and makes use of RFID technology to bridge the gap between digital and paper-based visual material. Each paper element is equipped with an RFID tag associated either with a multimedia item (video, animation, voice, music) or with a multimedia control behavior ("suspend", "stop", "continue").

Our target was disabled children who suffer from severe impairments in body movement and muscle coordination, difficulty in controlling fine-grained movements (e.g., grasping small objects), and limited cognitive functions such as memory, problem-solving, attention, and visual and


language comprehension. They understand 40 to 70% of speech and non-verbal communication (i.e., conventional gestures and facial expressions), but are not able to speak. They normally communicate through facial expressions, body movements, sounds, and portable VOCAs and PCS.

Materials

The digital educational resources comprise a set of short (20-30 second) videos, each one presenting a "situation" from the child's school life. In each video situation there is one "key" action, performed by one or two classmates, which the child is familiar with but which does not yet belong to the child's vocabulary, i.e., the child does not know the PCS card for this action. The digital material also includes a set of 5-10 second "congratulations" videos; each one shows the educator pronouncing a simple phrase (e.g., "Andrea is running"), applauding and warmly congratulating the child.

Experience design was devoted to creating the multimedia and tangible materials needed to achieve the learning goals mentioned in the previous section, and to defining children's activities with these contents. Initially, for approximately one month, our work unfolded along an iterative cycle of design sketching, experience prototyping, short tests with disabled children, evaluation, tuning, and partial redesign of contents and activities. From the knowledge gained during this preliminary design phase, we abstracted the characteristics of contents and tasks that are appropriate to meet the educational goals for the above profile. The final output of this design process was a set of activity scenarios.

For each "situation", the tangible material includes: i) a tagged PCS phrase, linked to the corresponding congratulations video; it describes a situation in terms of "subject-action" or "subject-action-complement" (e.g., "Andrea-running", "Maria - going to - bathroom") and comprises one unknown PCS symbol for the key action;

Scenarios, a well-known HCI design technique [4, 22], are "stories of use". In our approach, a scenario provides an articulated narrative around a disabled child trying to achieve a given goal (or set of goals) using paper-based interaction with a given set of educational resources. A scenario is structured in the following components: Child's profile; Goal (the educational benefits that the child is supposed to achieve by performing a set of tasks); Materials (the tangible and digital resources employed in the Activity); Activity (defining the tasks proposed to the child); Context (defining the environment in which the Activity takes place).
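The scenario components listed above can be captured in a small data structure. The following is a hypothetical sketch in Python (field names mirror the paper's components; this is our own illustration, not an artifact of the original framework):

```python
# Sketch of the activity-scenario structure (illustrative, assumed names).
from dataclasses import dataclass, field
from typing import List

@dataclass
class Scenario:
    profile: str                          # child's profile
    goal: str                             # educational benefits targeted
    materials: List[str] = field(default_factory=list)  # tangible and digital resources
    activity: str = ""                    # tasks proposed to the child
    context: str = "school resource room" # environment in which the Activity takes place
```

A concrete instance would then list, for example, the tagged PCS phrases and situation videos of Scenario #1 in `materials`.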

ii) a tagged picture linked to a video situation and showing the face(s) of the classmate(s) performing in the video; iii) a tagged PCS action card that represents the key action in the phrase and the new symbol to learn, and is linked to the corresponding video situation; iv) a set of identical but untagged PCS phrases that are used for "control" purposes, and an A3 picture of the whole class on cardboard.

Activity

At the beginning of the session, the educator presents to the child a set of 4-5 tagged pictures of her school friends and asks her to select one and to discover what happens when she moves the RFID reader on a picture (Figure 1). After seeing the associated video (as many times as the child likes), the corresponding PCS phrase is presented and explained to the child, pinpointing the new symbol.

Modeling design specifications in terms of activity scenarios was intended to make our design solutions more understandable and reusable for educators and therapists working in the future with similar children on similar goals in the school context. In our scenarios the profile is that of the target group defined above. The goals are refinements of those identified in our contextual study. All activities are assumed to take place in the school "resource room", i.e., a space where disabled children carry out specialized activities that cannot take place in a regular classroom, either because they require special equipment or because they might be disruptive to the rest of the class (such as those to improve language development).

After this task has been repeated several times, a friend's picture is presented again, together with two PCS phrases. Only one is correct, i.e., it refers to the key action rendered in the video in which the friend performs; this phrase is interactive and linked to the congratulations video. The wrong one is untagged and passive. The child is invited to choose the right phrase, point at it, and discover what happens on the screen. The positive feedback, i.e., the congratulations video, confirms correct choices. The absence of feedback, when the choice falls on an untagged, incorrect phrase, indicates a wrong answer. The same task is repeated several times, increasing the number of wrong PCS phrases among which the child must make her choice.
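The feedback pattern of this task (only the tagged, correct phrase triggers a video; untagged phrases give no response) can be sketched as follows. This is illustrative code with assumed names, not part of the actual system:

```python
# Sketch of the choice task: one tagged correct phrase among untagged wrong ones.
import random

def build_choice_round(correct_phrase, wrong_phrases, n_wrong):
    """Present the correct (tagged) phrase mixed with n_wrong untagged ones."""
    cards = [correct_phrase] + random.sample(wrong_phrases, n_wrong)
    random.shuffle(cards)
    return cards

def feedback(tag_links, chosen_card):
    # Untagged cards are simply absent from the link table, so choosing a
    # wrong phrase yields None, i.e., no feedback on the screen.
    return tag_links.get(chosen_card)
```

Increasing task difficulty across sessions then amounts to calling `build_choice_round` with a larger `n_wrong`.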

For lack of space, we report here some examples of activity scenarios, which comprise progressively more ambitious goals and increasing levels of task complexity.

Scenario #1: Enriching the vocabulary

Goal

The purpose is to extend the child’s vocabulary, i.e., to expand the set of PCS representations she knows and can use to communicate. The child should learn to recognize a known concept rendered through a video, and associate a new, previously unknown PCS representation to this concept, so that she becomes able to use it to communicate that concept.

At the end of a session of tasks, the child is engaged in the progressive creation of her own "interactive poster" of the class. She is asked to place the tagged PCS cards for the key actions she has progressively learned during the session on the A3 cardboard picture, near the faces of the children who perform those actions in the videos (Figure 2).


Multimedia resources include a two-minute animation of the entire story and a set of 20-25 second animations, one for each scene. Tangible resources comprise: i) two tagged story posters. These are A2-sized (42 x 59.4 cm) interactive cardboard sheets depicting the whole story, with the scenes visually arranged along a path (Figure 3). The path provides a visual connecting thread among the different scenes and facilitates the child's understanding of the narrative flow. One poster is linked to the animation of the entire story, and includes big paper-based "control buttons" for playing and suspending it. The other poster contains one tag for each scene, linked to the scene animation.

Figure 1. Scenario #1 - interacting with friends’ pictures

ii) For each scene, two identical A4 sheets presenting the scene and two identical PCS phrases symbolically describing it (Figure 4). One sheet and the corresponding phrase are tagged and linked to the scene animation, the other sheet and phrase are untagged. iii) one untagged A2 cardboard showing the story path only on a white background.

Figure 2. Scenario #1 - the interactive poster of the class, progressively created by the child

Scenario #2: Understanding more complex communication structures

Goal

The purpose is to enrich the child's capability of understanding and using relatively complex PCS phrases, developing cognitive skills of recognition, association, and synthesis through interactive storytelling. Through this scenario the child should learn to:

Figure 3. Scenario #2 - Interacting with the Little Red Cap story poster

• recognize the main narrative components of a known story, i.e., the key scenes of the story;
• build associations between a story component and the corresponding symbolic representation [11], i.e., PCS "phrases" that have some degree of complexity;
• re-create the representation of the story from its components, by putting scenes together to form a whole in a meaningful order.

Materials

Contents are related to characters and actions of a popular children's tale that the child knows well from picture books read by adults and schoolmates in the classroom or at home (e.g., "Little Red Cap"). Educators can assume a basic understanding of the story subject, and focus on specific learning tasks. The story is de-structured into a few (3-6) simple scenes (e.g., "Little Red Cap encounters the wolf").

Figure 4. Scenario #2 – tagged PCS phrases (on the left) and scene cards (on the right)

83

Activity
Initially, the child interacts with the tagged story poster linked to the full story animation, watches the video, and learns how to play and suspend it (Figure 3). Then the child discovers the contents and interaction possibilities of the second story poster, the one linked to individual scene animations. Initially she randomly activates the animations; then she practices playing the story scene after scene, in the proper order, to consolidate her understanding of the logical flow of events.

she must employ more complex cognitive skills to place scenes along the timeline in the correct order. Initially, the child freely explores the interactive pictures and plays the corresponding videos. Then she learns to build the association between a picture and its PCS representation, following the pattern of tasks defined in Scenario #2. Finally, the child builds her own story. She is given the empty cardboard strip, the unordered set of tagged pictures and the corresponding tagged PCS phrases, and is asked to place each picture and phrase one after the other, in the proper order (Figure 6).

The activity proceeds with the child looking at each scene sheet and the corresponding PCS phrase, and activating the corresponding animation. Then she is exposed to one scene and a set of PCS phrases (of increasing cardinality, from 2 up to the number of scenes) and is asked to recognize the correct phrase for the current scene. As in Scenario #1, when she tries to play the video, only the correct phrase works, since the others are untagged and not interactive. The child also learns to build associations in the opposite direction, between a PCS representation and its corresponding scene. The paradigm is similar: the child is exposed to a PCS phrase and a number of scenes, and has to pick out the correct scene matching the phrase. After a number of repetitions of the above tasks, the child is asked to put together the whole story from its components, arranging tagged scene sheets and the corresponding PCS phrases in the proper order on the path of the empty untagged A2 poster (Figure 5).

Scenario #3: "My" story

Goal

The learning focus is on understanding temporal concepts ("before", "after") and their linguistic representations, as a prerequisite for developing more complex narrative skills. An additional goal is to help the child strengthen her self-esteem and capacity for self-expression, by creating a tangible interactive story about herself and her life at school.

Materials

Multimedia and tangible elements are related to moments of the child's everyday life at school and involve the child as the main character and starring actor. The digital resources include a set of 15-20 second videos, each one recording a child's activity, e.g., arriving on the bus, answering roll call, working in the classroom, playing with classmates during the break, or performing in the theater or in the music laboratory. Each video is linked to a tagged picture that offers a preview of the activity. The tangible material also includes tagged and untagged PCS phrases as in Scenario #2, plus a long white 60x30 cm cardboard strip that represents an empty timeline on which the child can place tagged pictures to create her interactive story.

Figure 5. Scenario #2: The final interactive story “Little Red Cap”

Activity

Most tasks follow the patterns adopted in Scenario #2. Still, in this case the child has never seen the whole story before and has to build it from scratch. In addition, she cannot take advantage of visual clues (e.g., the story path) to arrange the proper flow of scenes on the strip. Therefore

Figure 6: Scenario #3 - “My” interactive story (“Anna’s morning at school”)


ENGAGING THE WHOLE CLASS

Emotional benefits

In the design and implementation of the scenarios, we involved all the non-disabled classmates (forty students) of the two disabled children engaged in the evaluation (discussed in the next section). Classmates participated in a group discussion devoted to identifying the elements of classroom life that normally attract their friends' attention, as well as the stories they like most. For example, they noticed that one of the children is particularly attracted when her friends make a joke, and proposed (for Scenario #1) to include materials and tasks about "joking", which did not belong to the disabled children's vocabulary. Classmates also pointed out that their disabled friends enormously enjoy answering the roll call, and thus suggested that one of the videos for the interactive story to be created in Scenario #3 could deal with this event.

Anna and Paolo's interest, engagement, and enjoyment were evident in a number of situations. They always looked happy and smiling when they had to go and "play with the paper" in the resource room. They tried to report at home what they were doing and attempted to explain it to their families, something they very seldom do. Anna's mother called the school educator with these words: "What are you doing that is so special and fantastic? Every Monday and Wednesday Anna comes back home so excited and attempts to tell me something about the computer!" At the beginning of the project, the children were at the same time suspicious of and attracted by the interactivity of the visual paper elements. After watching a few videos, they became more and more enthusiastic, and more curious to discover new things in each session. The video showing the educator's applause for a correct choice was very rewarding, motivating the children to repeat the task several times.

In addition, non-disabled children performed as starring actors in the videos for Scenario #1 and as secondary actors in Scenario #3, and took pictures for the various educational tasks. Finally, they participated in the creation of the paper material and in the tagging process. Video recording and editing were done by our staff, while educators created all the PCS symbols and phrases, as well as the drawings for the story scene cards and posters employed in Scenario #2. Classmates painted the drawings, helped tag the different elements, and used the Talking Paper framework to link them to videos and animations.

The whole project induced an increase in self-esteem and strengthened the disabled children's relationships with their classmates. The children were very proud of the interactive posters they created in Scenarios #1 and #2, and wanted to show them to their friends. When the final version of the Little Red Cap interactive poster was ready, it was demonstrated to the whole class, and the children received warm applause: they felt like protagonists and clearly perceived that they were doing "something special" that was highly appreciated by their friends.

PILOT EVALUATION

The evaluation involved two disabled children, a girl (Anna) and a boy (Paolo), aged 10 and 11, whose profiles correspond to the design scenarios. Both children have a diagnosis of severe spastic diplegia (probably due to cerebral palsy of genetic origin), which strongly affects body movement and coordination and has impoverished their cognitive development (also limited because of the lack of specialized treatment in early childhood).

Cognitive benefits

By effect of this project, educators noticed a significant improvement of linguistic and narrative capability, especially if compared with the results of regular educational activities. Linguistic skills were tested with Anna in a semi-formal way. For two weeks, she was exposed to two-four new PCS symbols per session, and practiced again with these symbols in one or two following sessions. Differently from the usual practice, these symbols were not added to her PCS card portable wallet thus she never used it again for the two following weeks. At this point we included the symbols in her wallet and put her in situations where she had to communicate using them: she performed absolutely correctly, using the proper symbols to form her messages. This was considered an excellent result by educators and language therapists, being the learning and retention rate much higher than normal. In average, she usually learns one or two new symbols per week, and if not exercised, she tends to forget a new symbol after 7-10 days.

The two children performed the whole set of scenarios defined at the end of the design phase, over period of three months, with the support of their respective specialized educators. The authors participated as observers, and videorecorded each activity. At the end of each session, we discussed with educators the problems encountered by the child, her or his behavior and reactions, also analyzing the video recordings. At the end of the whole pilot study, we carried on a series of interviews with each educator, children’s parents, and the language therapist, to evaluate the actual benefits gained by children and identify the issues that need to be addressed to extend the project to other disabled subjects at school. BENEFITS

Children improved the capability of organizing concepts in a meaningful order - a prerequisite for developing more advanced narrative skills. At the beginning, they could not work on story creation as required by Scenario #2 without the help of educators. They managed to perform this task alone only after several sessions. But when they proceed to Scenario #3, they immediately understand the assignment

Both children achieved a number of benefits, although at different degrees and in different moments because of their individual differences in terms of disability and personality. In addition, the whole project was beneficial for non disabled children participating in the design process, as discussed at the end of this section.

85

and autonomously performed the temporal arrangement of scenes of “their” story on the timeline sheet.

Improvement of autonomy and motor control

Initially, the children could interact with the system only with close assistance from the educators. They (especially the boy) did not immediately grasp how to use the RFID reader properly: having limited motor control, they tended to move the reader around without a precise focus, activating videos and animations randomly, and could point to a specific area only with the help of an adult. Still, we noticed a progressive refinement of coordination and motor control. Because of their motivation and enjoyment of the experience, the children really wanted to learn to do things by themselves. By the end of the first month, both Anna and Paolo had become more autonomous and had learned to place the reader where they wanted or where they were asked to point - a result that was judged extraordinary by the educators and the language therapist.

The use of a cabled RFID reader was something that worried us at the beginning of the project. In principle, this device introduces a level of indirectness between the child's senses and the paper interface; in addition, the cable potentially represents an impediment. Still, the device proved to be effective, and the children liked it. According to the therapists, using a "big" device with gross pointing capability as an interaction proxy gives a child a stronger sense of materiality and is useful for children with no fine-grained motor control. In addition, the reader was used as a means of communication: by grasping and moving the reader, the child externalizes her will to do something; by moving it around, beating it on the table, or throwing it away, she expresses intentions and feelings - "I'm anxious to continue", "I'm angry and want to stop". The cable was long enough not to constrain the children's movements, and it was useful when they wanted to throw the reader away in moments of discouragement or irritation.

Benefits for non disabled children

At the end of the pilot study, the non disabled students were asked to write a reflection on the project as a homework assignment. The whole class also participated in a two-hour group discussion on what they felt about, and had learned from, the project. From this material and from the teachers' interviews, we could identify a number of benefits also for these children, who did not participate as users but worked as co-designers of the educational experience. There was enormous excitement about the dramatization of the experience - all of them wanted to be actors! They obviously enjoyed working with multimedia and tangible technology, and the teachers noticed an improvement in their computer skills. In addition, by carrying out these activities in groups, they developed collaboration skills.

More important, the project increased the atmosphere of mutual understanding and tolerance in the classroom, enforced positive relationships among disabled and non disabled children, stimulated authentic social cohesion, and promoted a stronger sense of community. In this school, non disabled children are normally encouraged to interrelate with their disabled classmates. In most situations, they play with them, and sometimes they also "take the role of teacher", e.g., reading them a book. But in this project the "normal functioning" children perceived even more strongly the responsibility of working with and for their disabled friends, and participated with all their energy and heart. As a child wrote in his homework: "It was so fantastic! We should call other children like Anna in our class, so we can do this project again and again". This project gave him the opportunity to do something new and special, and to value diversity.

RELATED WORK

A wide variety of tools are available from industry and academia to support Augmentative and Alternative Communication (AAC). Most of these aids are mainly intended to supplement or replace speech; they range from simple tools that do not involve digital technology, such as PCS (Picture Communication Symbols) systems, to more sophisticated solutions that employ various devices and interaction paradigms "to speak for" the impaired child. For example, commercial systems like VOCAs (Voice Output Communication Aids) are voice synthesizers that generate recorded spoken text on demand, activated using oversized buttons as the interface. Visual communication technology ranges from small handheld devices like "Say it! SAM Communicator" or Lingraphica SmallTalk for iPhone, to larger touch screen devices like Dynavox and Activity Pad. The main drawback of most of these systems is that they are closed, i.e., they incorporate a large amount of visual communication aids and educational resources, but are not extensible with customized material, or customization requires professional training and strong programming expertise. A valuable attempt to meet the need for easy tailorability is MOCOTOS [15], developed in an academic context. This AAC tool is a cell-phone-sized portable device that comes preinstalled with a comprehensive library of PCS cards, but allows users to add custom cards by taking pictures, scanning in material, or creating digital drawings or images. MOCOTOS also provides custom audio cues for cards, rapid access to the card library, and rearrangement of card displays to support multiple activities.

Few AAC interactive products are explicitly designed to help disabled children both to communicate and to learn. The most notable example is LinguaBytes [14, 16], a modular, tangible play-and-learning system created to stimulate the language and communication skills of toddlers (with a developmental age between 1 and 4 years) with multiple disabilities. Generated through multiple iterations of designing, building, and testing experiential prototypes in therapeutic settings, the current version of the system comes preinstalled with a wide set of digital resources: words, interactive stories, and accompanying games and exercises. LinguaBytes is also equipped with tangible input materials - story booklets, PCS cards, puzzles, and programmable RFID labels that can be attached to objects of play - and capitalizes on tangible interaction to make the learning experience more engaging and fun, at the same time enhancing children's perceptual and motor skills. The richness of the learning resources makes LinguaBytes a flexible and adaptable learning environment. Its major drawback is the lack of functionality that would allow non programmers, i.e., educators and therapists, to include new content.
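The extensibility gap noted above - letting educators rather than programmers attach new content to tangible elements - ultimately comes down to maintaining an editable association between tag identifiers and media files. A minimal sketch of that idea follows; all names are hypothetical, as neither LinguaBytes' nor Talking Paper's internals are described here:

```python
# Minimal sketch of a tag-to-media association for an extensible
# paper-based system. All names are hypothetical illustrations, not
# the actual implementation of any system discussed in this paper.
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class MediaResource:
    kind: str  # "video", "sound", or "animation"
    path: str  # file played when the tagged paper element is scanned

# Editable table: an educator adds one row per tagged card, drawing,
# or poster area; no programming expertise beyond editing this mapping.
tag_table: Dict[str, MediaResource] = {
    "04A1B2": MediaResource("video", "applause.mp4"),
    "04A1B3": MediaResource("sound", "pcs_happiness.wav"),
}

def on_tag_read(tag_id: str) -> Optional[MediaResource]:
    """Called when the child points the RFID reader at a paper element."""
    return tag_table.get(tag_id)  # None for untagged areas: nothing plays
```

Because the mapping is plain data rather than code, adding or replacing content is a content-authoring task, not a software-engineering one - which is precisely the tailorability property the closed systems above lack.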

The last decade has seen a number of tangible interaction platforms appear, both in research and on the market, that exploit interactive storytelling to support communication and early narrative development in young children. Today there is a wide choice of options, depending on whether children want to listen to stories, interact with them, or tell a story on their own. Examples range from commercial digitally augmented books like LeapFrog Tag Junior books, to narrative environments that allow children to use paper-based drawings or tangible characters for creating or interacting with a multimedia narrative (e.g., KidPad [9] and PageCraft [5]), to more complex playground environments or room-sized immersive storytelling spaces (e.g., [1, 2, 7, 10, 23]). While the learning benefits of "traditional", i.e., non tangible, interactive storytelling for language impaired children have long been acknowledged [31], it is debatable whether the tangible interactive environments mentioned above are truly suitable for severely disabled subjects, as their reported uses involve "normal functioning" children only.

CONCLUSIONS AND FUTURE WORK

As highlighted in the previous section, the idea of using digital technology for AAC visual communication is certainly not new, nor is the use of tangible interaction for learning purposes, either for "normal functioning" [18, 19, 20, 24, 26, 28, 30] or for disabled children [13, 14, 16, 17]. Still, our work contributes to knowledge and research on interactive technology for disabled children (language impaired children in particular) in a number of respects.

Most works in the existing literature on innovative assistive technology investigate the use of multimedia and tangible interaction in therapeutic contexts, e.g., rehabilitation centers, and evaluate technological solutions involving disabled children who have some degree of autonomy and independence. In contrast, we investigated the use of tangible technology in a real school context, for a significantly long time, engaging severely disabled children. In addition, our approach employs paper - a very humble, low-cost tangible material that is conventionally used at school and that children are familiar with. Furthermore, we offer an open environment that educators (and children) can use to create customized experiences that meet the unique needs of each disabled child. Finally, our study provides a more comprehensive understanding of how tangible technology can be used in educational settings that involve disabled children, and of the range of benefits that can be expected. In particular, we broaden the perspective on the role of non disabled children in the design of assistive interactive technology. This process typically involves adults only - educators, therapists, parents, interaction designers, technologists. In contrast, we engaged all classmates in the creation of the learning experiences for their disabled friends, pinpointed the benefits for both disabled and non disabled children resulting from this participation, and highlighted how the overall process can become a means for real inclusive education.

The ultimate goal of inclusive education is to end all forms of discrimination, enabling all students, including those with disabilities and "special needs", to learn and participate effectively within general educational systems. Clearly, the development of communication capabilities is a fundamental "functional" ingredient for the inclusion of disabled children. Still, inclusive education is something more profound. As mentioned in [25], "When inclusive education is fully embraced, we abandon the idea that children have to become 'normal' in order to contribute to the world.... We begin to look beyond typical ways of becoming valued members of the community, and in doing so, begin to realize the achievable goal of providing all children with an authentic sense of belonging." We should regard inclusive education as a process of reciprocity in which the disabled child is not integrated, but integrates; she triggers inclusion, rather than being "included". By involving "normal functioning" students in the creation of learning experiences for their disabled classmates, we implemented the principle that individual differences between students are a source of richness and not a problem, and that the stakeholders of inclusion include all children, not just the disabled ones. Together they create strategies and mutual opportunities for learning, invention, innovation, and expression.

Our work represents a novel case study in which low-tech, paper-based multimedia interactive experiences were designed for, and successfully used by, severely cognitively and language impaired children. So far our research has systematically involved only two disabled children; obviously, further studies are needed to confirm the benefits of our approach at a larger scale. In addition, it is important to investigate whether and how our solutions are generalizable, i.e., to which degree the technology, materials, or activities we have developed are appropriate for other forms of disability.

Our solutions were tested in a school that is particularly open to innovation and actively strives to achieve inclusive education. Still, we would like to extend our approach to other schools that involve disabled students but have limited experience with novel assistive technology projects and are less equipped in this respect. For this purpose, in the context of the L4A ("Learning for All") national project, we are developing a training plan for teachers, in an attempt to promote our approach at a larger scale.

ACKNOWLEDGMENTS

The authors are enormously grateful to all participants in this project - children, parents, educators, and therapists - at Scuola Arcobaleno in Lodi (Milano) and at the Benedetta D'Intino center in Milano. This work was partially supported by the National Project L4A ("Learning for All") - Grant Num. RBNE07CPX 001.

REFERENCES

1. Alborzi, H., et al. Designing StoryRooms: Interactive Storytelling Spaces for Children. Proc. DIS 2000, ACM Press (2000), 95-104.
2. Benford, S., et al. Designing Storytelling Technologies to Encourage Collaboration Between Young Children. Proc. CHI 2000, ACM Press (2000), 556-563.
3. Beukelman, D. and Mirenda, P. Augmentative and Alternative Communication: Supporting Children and Adults with Complex Communication Needs. Paul H. Brookes, Baltimore, 2005.
4. Borchers, J.O. A Pattern Approach to Interaction Design. John Wiley & Sons, 2001.
5. Budd, J., Madej, K., Stephens-Wells, J., Jong, J., Katzur, E., Mulligan, L. PageCraft: learning in context: a tangible interactive storytelling platform to support early narrative development for young children. Proc. IDC 2007, ACM Press (2007), 97-100.
6. Burnett, M., Cook, C., Rothermel, G. End-user Software Engineering. CACM 47, 9 (2004), 53-58.
7. Cassell, J. and Ryokai, K. Making Space for Voice: Technologies to Support Children's Fantasy and Storytelling. Personal and Ubiquitous Computing 5, 3 (2001), 169-190.
8. Druin, A. The role of children in the design of new technology. Behaviour and Information Technology 21, 1 (2002), 1-25.
9. Druin, A., et al. KidPad: A Design Collaboration Between Children, Technologists, and Educators. Proc. CHI '97, ACM Press (1997).
10. Fontijn, W. and Mendels, P. StoryToy the Interactive Storytelling Toy. Proc. Pervasive Computing, 2005.
11. Barrett, M. The Development of Language. Psychology Press, 1999.
12. Garzotto, F. Broadening children's involvement as design partners: from technology to "experience". Proc. IDC 2008, ACM Press (2008), 186-194.
13. Guha, M.L., Druin, A., Fails, J.A. Designing with and for children with special needs: An inclusionary model. Proc. IDC 2008, ACM Press (2008), 61-64.
14. Hengeveld, B., Hummels, C., Overbeeke, K. Tangibles for Toddlers Learning Language. Proc. TEI '09, ACM Press (2009), 161-168.
15. Hayes, G. and Monibi, M. Mocotos: Mobile Communications Tools for Children with Special Needs. Proc. IDC 2008, ACM Press (2008), 121-125.
16. Hengeveld, B., et al. The Development of LinguaBytes: An Interactive Tangible Play and Learning System to Stimulate the Language Development of Toddlers with Multiple Disabilities. Advances in Human-Computer Interaction, Article 381086, 2008.
17. Hornof, A. Designing with children with severe motor impairments. Proc. CHI 2009, ACM Press (2009), 2177-2180.
18. Markopoulos, P., Read, J., MacFarlane, S., Hoysniemi, J. Evaluating Children's Interactive Products. Morgan Kaufmann, 2008.
19. Marshall, P. Do tangible interfaces enhance learning? Proc. TEI '07, ACM Press (2007), 163-170.
20. Price, S., et al. Using 'tangibles' to promote novel forms of playful learning. Interacting with Computers 15, 2 (2003), 169-185.
21. Resnick, M. and Silverman, B. Some Reflections on Designing Construction Kits for Kids. Proc. IDC 2005, ACM Press (2005).
22. Rosson, M.B. and Carroll, J.M. Usability Engineering: Scenario-based Development of Human-Computer Interaction. Academic Press, London, 2002.
23. Stanton, D., et al. Classroom Collaboration in the Design of Tangible Interfaces for Storytelling. Proc. CHI 2001, ACM Press (2001), 482-489.
24. Stringer, M., et al. Teaching Rhetorical Skills with a Tangible User Interface. Proc. IDC 2004, ACM Press (2004), 11-18.
25. UNESCO. Inclusive Education: The Way to the Future. http://www.ibe.unesco.org/National_Reports/ICE_2008/brazil_NR08.pdf (inspected January 2010).
26. Verhaegh, J., Fontijn, W., Hoonhout, J. TagTiles: optimal challenge in educational electronics. Proc. TEI '07, ACM Press (2007), 187-190.
27. Vygotsky, L.S. Mind in Society: The Development of Higher Psychological Processes. M. Cole et al. (Eds.). Harvard University Press, Cambridge, MA, 1978.
28. Xie, L., Antle, A.N., Motamedi, N. Are tangibles more fun? Comparing children's enjoyment and engagement using physical, graphical and tangible user interfaces. Proc. TEI '08, ACM Press (2008), 191-198.
29. Zaman, B., Vanden Abeele, V., Markopoulos, P., Marshall, P. Tangibles for Children, the Challenges. CHI 2009 Extended Abstracts, ACM Press (2009), 4729-4732.
30. Zuckerman, O., Arida, S., Resnick, M. Extending tangible interfaces for education: digital Montessori-inspired manipulatives. Proc. CHI 2005, ACM Press (2005), 859-868.
31. Waller, A., O'Mara, D., Tait, L., Booth, L., Hood, H. Conversational Narrative and AAC - a Case Study. Augmentative and Alternative Communication 17 (2001), 221-232.