MyWord: Enhancing Engagement, Interaction and Self-Expression with Minimally-Verbal Children on the Autism Spectrum through a Personal Audio-Visual Dictionary

Cara Wilson, Margot Brereton, Bernd Ploderer and Laurianne Sitbon
Queensland University of Technology, Brisbane, Australia
{cara.wilson, m.brereton, b.ploderer, laurianne.sitbon}@qut.edu.au

IDC 2018, June 19–22, 2018, Trondheim, Norway

ABSTRACT

Digital technologies to support children on the autism spectrum often offer predefined content for modelling, communicating and training. However, children may not relate to the content, and it may not match their own personal interests and motivations. This paper investigates the use of MyWord, an interest-based, child-led technology, as an exploratory probe. This audio-visual dictionary app supports a child to build their own personalised catalogue of favourite words, images and audio over time. We undertook a field study over two school terms in an autism-specific primary school with 12 minimally-verbal children aged 5 to 8 and their teachers and speech therapists. Findings indicate that creating dictionary entries involved processes of personal choice, representation of the self and interests, and dynamic action and play. Use of personally and contextually relevant words enhanced engagement, interaction and self-expression. We contribute a novel, flexible, interest-based technology, and reflections on its use in autism-specific school contexts. We highlight the importance of the child's lived experience and holistic child-led approaches to technology design.

Author Keywords

Autism; audio-visual dictionary; interest-based technology; classroom interaction; self-expression; engagement; motivation; child-led; child-centred

ACM Classification Keywords

CCS → Human-centered computing → Human computer interaction (HCI) → Empirical studies in HCI

INTRODUCTION

Educational technologies designed for children on the spectrum are often targeted towards teaching very specific, pre-defined skills, such as emotion recognition [25], social skills for specific situations [18], or specific behaviours [42]. They also often involve complex overhead navigation and metacognitive processes. However, if the learner is not interested in or motivated by this pre-defined content, it can present a barrier to learning [45]. Since children on the spectrum often have very specific areas of interest and knowledge which can provide a great source of motivation [45], it seems there is an opportunity to springboard learning through their known interests. For example, typically in word games or digital dictionaries, 'A' is for 'Apple', but if a child on the spectrum were enabled to choose what 'A' stood for, their specific interests may lead them to choose, for example, 'Air Conditioner' instead. In this research, we sought to approach learning from this perspective, asking how we can begin by building on the children's own interests and motivations, in order to engage in meaning-making in a standard part of the school curriculum: learning to read and spell words.

This approach may be particularly important in the support of minimally-verbal children on the spectrum. It is estimated that around 30% of children on the spectrum are minimally-verbal and may communicate with: no spoken language at all; atypical non-speech sounds only; a few words or phrases in limited contexts; or echolalic language, i.e. the child repeats the language of others but does not generate it [40]. Verbal communication is used throughout school curricula, and typical verbal and written language tasks may therefore present challenges for minimally-verbal children [28]. We ask: could using the child's own personal interests and individual choices motivate engagement in classroom curricula and activities that centre on the spoken and written word?

To address this question, we developed a very simple audio-visual dictionary app called MyWord, expressly focused on building a personal collection of words. MyWord allows children to create a digital dictionary of words that mean something to them. The app was designed so that a child initially takes their own photos to insert into a dictionary format and then can add an audio recording to these, such as the sound of the word, enabling them to build a personalised catalogue of words over time.


In this paper, we present a study exploring how children, teachers and speech therapists react to, explore and integrate the app in an autism-specific classroom setting. We contribute a novel, personal, interest-based technology, which departs from most language learning technologies by focusing on using the children's own content to encourage attention to and engagement with new words. The approach is child-led and embraces the child holistically, as the words represent their own interests and choices. We reflect that the novelty of this research lies in its focus on 1) minimally-verbal children's use of the technology, 2) personalisable approaches to engagement with words and 3) concepts usually reserved for tangible and multimodal designs, such as movement and dynamic action.

Figure 1. Screenshots from the MyWord app showing (a) the alphabet home dictionary screen, (b) the entries collected under the letter F and (c) Cody's favourite fire engine

RELATED WORK

Autism: Neurodiversity and Ability over Disability

Children on the autism spectrum¹ typically experience challenges with social interaction, social communication and repetitive behaviours, interests and/or activities [2]. As Greenspan et al. [14, p.21] discuss, terms such as disorder can demoralise and create negative expectations, as opposed to pinpointing a child's unique strengths. Baron-Cohen [6] champions the concept of neurodiversity, which suggests that there is no one way for a brain to be "normal", and emphasises what a person can do over what they cannot. In Human-Computer Interaction (HCI), Ability-based Design [46] addresses the importance of interactive technologies which adapt to users' abilities, skills and contexts, as opposed to catering solely to their perceived inabilities. Ability-based Design proposes seven principles which should be upheld in design: ability, accountability, adaptation, transparency, performance, context and commodity. This work subscribes to the philosophy of Ability-based Design and extends it further by designing to appeal to the user's personal interests and motivations.

¹ There is much debate around the terminology used to describe autism. In this paper, in adherence with guidelines on best practice by Kenny et al. [23], we use the terms "children on the autism spectrum" and "children on the spectrum" when discussing our child participants.

Interest-based Approaches to Autism Support in Design

Children on the spectrum often have very specific areas of interest. Research has shown that language, social communication (e.g. fluidity and frequency) and social sensitivity increase when children on the spectrum's Special Interest Areas (SIAs) are engaged and encouraged, while self-stimulation and distraction behaviours decrease [45]. SIAs are vital elements of self-image and motivation and can be encouraged and leveraged by those who support children across multiple settings such as home and school. Educational technologies designed for children on the spectrum are often targeted towards teaching very specific, pre-defined skills, such as emotion recognition [25], social skills for specific situations [18], or specific behaviours [42]. However, if the learner is not interested in or motivated by pre-defined content, it can present a barrier to learning [45]. Increasingly, work in HCI has begun to address the importance of personal approaches to technology design with children on the spectrum. Frauenberger et al. [12] and Spiel et al. [39] explored co-design with children on the spectrum (who are largely verbal) to develop their own personalised technologies, demonstrating the importance of personal approaches and of encouraging children's creativity and engagement. Here, we also seek to support personalised learning through technology, but note that providing individual design support for each child involves significant resources. Instead, our designs must work in a classroom context in which the technology must fit flexibly within the curriculum and provide a satisfying experience to teachers, teacher aides, speech therapists and up to six minimally-verbal children at any given time.

Our previous research has shown that interest-based planning and scheduling technologies can enhance social interaction, verbal communication and engagement in class activities in autism-specific classrooms [43]. We consider here how such an approach may work in the context of a) engagement in word learning (oral and written) and b) minimally-verbal children on the spectrum.

Technologies for Learning with Children on the Spectrum

Antle [4, p.27] outlines that, within Child-Computer Interaction, the aim is often to "facilitate engaged and playful learning, rather than supporting adult work practices". Indeed, research in this area is increasingly exploring approaches such as multimodal interaction [29, 30], motion-based touchless interfaces [7] and mixed tangible and software technologies, such as digital living media systems [15].

For example, Linguabytes [17] provides a play-and-learning platform based on developmental opportunities and individual interaction styles with toddlers with disabilities. This Augmentative and Alternative Communication (AAC) device provides a very engaging tangible interface for word learning, but we note it relies on predefined words and tangibles to represent them, rather than the child choosing their own words from their own experiences. In choosing to explore a personalised app for word learning, we recognise the quality of experience that tangibles offer. However, we sought to use an app with which children could photograph tangibles in their own environments, in order to create a portable, broadly appropriable approach that blends the flexibility of digital technologies with the qualities of the tangible world. Other technologies provide digitised versions of prevalent communication methods, such as Motocos, a PECS-like (Picture Exchange Communication System) app [32], which is useful as it is based on the existing PECS communication system. However, again, these did not explore children integrating their own images or audio.

Examples of customisable technologies for children on the spectrum include Bartoli et al.'s [7] Bubble Game, Space Game and Shape Game. These are touchless motion-based games designed to, for example, improve the speed and accuracy of motor-visual coordination, and are designed to be easily customisable by therapists and parents based on each child's individual needs. Our aim is to build on customisable, needs-based concepts such as these, but from a holistic perspective which steers away from assessment and accuracy and focuses on broader concepts of engagement. Previous work shows that providing children with a platform through which to direct their own learning makes learning relevant to the child [e.g. 43]. In line with Greenspan [14], we suggest following the child's lead in order to establish joint attention and greater cycles of interaction. This way, personal interest may be identified and leveraged in order to expand their area of interest. As Hurst and Tobias [19] note, providing increased control, autonomy and empowerment over design elements of assistive technologies can improve the adoption and integration of such technologies into daily life. Garzotto et al. [13] combine paper-based tangibles with multimedia for children with severe cognitive, linguistic and motor difficulties in school contexts, producing scenario-based methods and concepts for integrating RFID technology in such contexts. We take inspiration from this work, but focus here on readily available mobile devices, as these are largely used and usable by our population of minimally-verbal children on the spectrum.

In summary, tangible and multimodal approaches provide great support for learning words. Apps offer the potential to blend tangible approaches with the flexibility, accessibility and applicability of digital technologies. Here, we look at how simple apps, provided on the iPad, a widely available device (and extensible to other tablet platforms), may facilitate engaged and playful learning [4].

AAC and Commercial Technologies

AAC apps support the communication challenges that may be experienced by individuals on the spectrum [26, 36]. However, most AAC technologies provide third-person icons, audio, cartoonised images or graphic symbols [32] instead of first-person photos and audio, which may be more personally relevant. With predesigned content, these technologies do not build on the child's own life and self-expression. An extensive search of mobile app stores, commercially available products and related academic work revealed that there are many sophisticated AAC and literacy apps, such as Spectronics Clicker [37], Proloquo2Go [33], Knowji Vocab [24], Choiceboard Creator [38], Lingraphica SmallTalk [27] and Dynavox [10]. While some of these apps do allow children to add their own photos and voices, they involve a significant amount of overhead navigation and are complex, requiring the user to employ broad metacognitive processes in order to use them. For example, Clicker [9], a popular suite of apps used in Australian schools with children on the spectrum, begins with building sentences by using simple words and pictures. However, its premise is that a child can fathom that words in a sequence connect together to make something meaningful and can sustain attention and interest in building the sentence. Each page contains several word and picture choices to construct the sentence, which may be too complex. We needed, and thus created, a very simple app in which a child could focus on single words of great interest. Knowji Vocab requires users to choose the word that fits in a sentence, while being provided with the dictionary definition and multiple visual representations of the word. Proloquo2Go allows the user to insert images, but this is one small feature within a much bigger and more complex system. Proloquo2Go and Dynavox products are globally available and popular. We found these rely heavily on preloaded icons which, although perhaps helpful in providing a commonly understood language of icons, can often be presented in large volumes on busy, information-heavy screens, and again do not allow for children's own customisation. Those which do allow for customisation can be time-consuming and taxing in their configuration or end-user programming [32]. AAC devices are also often inflexible, indiscreet, bulky and/or expensive [16, 26]. Thus, we found these unsuitable for exploring a person-centred approach with minimally-verbal children.

In summary, we outline that there exists a wealth of engaging technologies which aim to support communication and learning with children on the spectrum. In terms of AAC, few technologies designed commercially and within HCI focus on child-led, personally relevant content such as photos and audio. Further, they often seek to measure and assess rather than to support individual interests and strengths. Interest-based approaches have focussed on designing with verbal children on the spectrum, but have yet to embrace their minimally-verbal peers.

STUDY DESIGN

MyWord is a personal audio-visual dictionary iPad app. It works by enabling the user to create entries for individual words and to add their own photos and audio recordings to each word. The app is simple in layout, comprising a home screen (in which users can type their name and take a profile photograph of themselves), a search function, and a word game function. Words in the personal dictionary are organised through an alphabet screen (Fig. 1a). Users can click on a letter to find existing words beginning with that letter. They can click on the '+' icon to add a new word (Fig. 1b). Users are then prompted to enter the word they would like to create an entry for. The app then provides the option to use an existing photo or to take a new one. One or more photos are then inserted under the word as a new entry (Fig. 1c). Following this, users have the option to create an audio recording pertaining to the entry. Upon returning to the alphabet screen, a new tile is created showing the entry, beside other entries for that letter (Fig. 1b). The app is designed to be flexible in use, so that it can either be used independently by children or in conjunction with proxies, depending on the child's abilities.
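To make the entry structure described above concrete, the sketch below models a dictionary entry (a word, one or more photos and an optional audio recording) and the grouping of entries by opening letter that drives the alphabet screen. It is an illustrative sketch only: the paper does not report MyWord's implementation, so the types and names used here (DictionaryEntry, MyWordDictionary) are our own assumptions rather than the app's actual code.

```swift
import Foundation

// Hypothetical model of a single MyWord entry: the word itself, the child's
// own photos, and an optional audio recording (e.g. the child saying the word).
struct DictionaryEntry {
    let word: String
    var photoFiles: [URL]   // photos taken or chosen by the child
    var audioFile: URL?     // optional recording added after the photos
    let created: Date
}

// The personal dictionary groups entries under the first letter of each word,
// mirroring the alphabet home screen (Fig. 1a) and the per-letter tiles (Fig. 1b).
struct MyWordDictionary {
    private(set) var entriesByLetter: [Character: [DictionaryEntry]] = [:]

    // Adding a word creates a new tile under its opening letter.
    mutating func add(_ entry: DictionaryEntry) {
        guard let first = entry.word.uppercased().first else { return }
        entriesByLetter[first, default: []].append(entry)
    }

    // Tapping a letter lists the existing words beginning with that letter.
    func entries(for letter: Character) -> [DictionaryEntry] {
        guard let key = String(letter).uppercased().first else { return [] }
        return entriesByLetter[key] ?? []
    }
}

// Example: an entry like Cody's fire engine (Fig. 1c), with no audio recorded yet.
var dictionary = MyWordDictionary()
dictionary.add(DictionaryEntry(word: "Fire engine",
                               photoFiles: [URL(fileURLWithPath: "photos/fire-engine.jpg")],
                               audioFile: nil,
                               created: Date()))
print(dictionary.entries(for: "F").map { $0.word })   // prints ["Fire engine"]
```

A flat, per-letter structure of this kind keeps every interaction one step away from the child's own content, which matches the single-word, low-navigation layout described above.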

MyWord was not specifically designed as an AAC or learning tool, but rather as a working prototype to be used in the spirit of flexibility and open-ended implementation in autism-specific classrooms. The idea was to garner an understanding of how a simple yet highly personalisable app may be able to support individual strengths and interests and address common challenges of those on the spectrum. The concept of MyWord arose from a conversation between two parent-researchers, one whose child on the spectrum always asked her to type her favourite words into Google, and the other whose child was always keeping lists of her favourite words. Combining these ideas with the known phenomenon that children on the spectrum are often very engaged with things of personal interest to them [45] (e.g. smoke alarms, washing machines, wheels, fans) led us to develop a prototype to explore how children on the spectrum might use a personal dictionary to support their interests and strengths. This concept was explored in a pilot study with one child and one parent, showing potential to encourage collaborative curation of content between parent and child, ownership of entries, and exploration of new words [44].

To evaluate engagement with MyWord we conducted a field study in two classroom settings. The participants comprised 12 children on the spectrum (aged 5 to 8; 1 female, 11 males), two teachers and one speech therapist at an autism-specific primary school in Brisbane, Australia. The inclusion criteria specified that child participants would be pupils at the participating school, and that teacher participants indicated a willingness to be involved in the study. Advice from parents and professionals was sought on each child's suitability for the study, and initial participant observations by the first author were used to confirm suitability. Older and more verbally-able children were excluded. All children in the study were already familiar with iPads, using them at home and across settings, even though fine motor skills may still have been developing for some.

Classroom Settings and Participants

The children had varying levels of abilities and all bar one were minimally-verbal. As this study wholeheartedly strives to support strengths in lieu of dwelling on difference, and as many of the children share common characteristics with their classmates, these abilities are described from a whole-class perspective. Class 1 was a reception class taught by Teacher A. It consisted of six children (all male) aged 5 to 6 years old, who had just started primary school. Much of the early education for Class 1 surrounded activities of daily living, such as sitting still, paying attention to the teacher and class activities, and learning to go to the bathroom. These children were typically still learning to control their bodies and used various self-stimulatory techniques to get proprioceptive feedback from their bodies. Children in this class typically communicated in one-word repetitions or in noises and utterances. Their receptive language skills were just beginning to develop and, over time, there was a marked increase in their ability to follow teacher instructions. Generative and functional language skills were still an area under significant development in Class 1. A typical goal for children at this stage is to learn specific sight words, such as 'the', 'to', 'on' and 'at'. These are typically learned by rote (i.e. repetition), often when children are expected to be seated at their desks, paying joint attention to the interactive whiteboard, while a passive visual display shows the target words and static icons or cartoonised images depicting the word. In contrast, the children in Class 2 tended to be much more accustomed to school. They often had greater verbal abilities and more familiarity with words and reading. Class 2 was older, consisting of six children aged 7 to 8 years old (5 male, 1 female), and was taught by Teacher B. These children were typically as minimally-verbal as those in Class 1; however, their receptive language was more fine-tuned. So, for example, children in Class 2 could follow instructions to a greater degree, had more control over their behaviours and were more independent. However, it is important to note that there were still many significant communication challenges in Class 2, and none of the children produced generative or functional language (hence they are minimally-verbal).

Figure 2. (a) Kelvin with the Speech Therapist and his two favourite things - his Cookie Monster toy and the trampoline, (b) Rory's entry for the sight word 'in' and (c) an image from Matthew's MyWord showing him with one of his gnomes

Classroom Observations, Interviews and Analysis

The aim of the field study was to explore how MyWord is used in an autism-specific classroom context. Given the exploratory nature of the research, our approach was predominantly qualitative. The study ran over two class terms (20 weeks, plus school holidays). Each child was given an iPad for the duration. After an initial introductory session with the first author, the teachers and children were given free rein over how to use the app. Names have been changed in line with ethical consent.

Data was collected regularly through visits to the school by the first author, where participant observations and semi-structured interviews were undertaken. In addition, diary data was collected in the form of a Teacher Reflective Log, which asked teachers to detail changes and developments in the children's engagement and interaction through use of the app. Data was collected in the form of field notes, audio recordings, video recordings, text input and photographs. In total, we recorded 30 hours of classroom interactions (both with the app and in the form of observational scanning) and conducted 8 separate interviews with school staff, including teachers, speech therapists, teacher aides and team leaders. Participant observations were conducted on ten occasions, typically lasting between 2 and 4 hours. Audio recordings, video recordings and field notes were transcribed and analysed following a thematic approach [8] to examine how children and teachers engaged with MyWord and how they integrated it into their classroom activities. This work was carried out over a relatively long period of time, while visiting the field site fortnightly, and, as such, our themes and concepts were continuously evolving and iterating as new understandings and data came to light. Our data did not consist of one homogenous data set collected during one experimental paradigm. Instead, we engaged in reflective strategising and reframed themes as they emerged over time, using a Debrief O'clock approach [41]. Central to this process was the use of video data, which enabled the team to investigate in detail how engagement unfolded in the classroom and to share the observations with all authors. We analysed the video data using a video interaction analysis (VIA)-inspired approach [20] with an interdisciplinary team of four academics, all of whom were involved in the study and its ethics agreement. This process involved watching, discussing and theming chunks of video data together, specifically as a means to verify broader themes from observations in more fine-grained detail. One of the key tenets of VIA is to ground the understanding of particular phenomena in real data which can be scrutinised through replay and viewed by multiple people. As per [20], we identified phenomena of interest in the video recordings by questioning whether they were typical or atypical, and the team had robust discussions around these potential phenomena. The unit of analysis was the individual child, within the class setting. Further, analysis of the data was discussed and reflected on with teachers and speech therapists, in acts of member checking and in order to clarify any points of uncertainty between the analysts. Below we describe the three main themes in the use of MyWord identified in this analysis: engagement in academic tasks, interaction with other children and teachers, and self-expression of strengths and interests.

FINDINGS

We explored how a digital tool which allows children to create their own catalogue of personally relevant words and related photographs and audio recordings is used in autism-specific classroom settings. We did not attempt to measure and quantify learning, nor to measure whether children could spell or read particular words. Rather, we sought to understand the nature of the whole activity when it is based on a child's interests and choices, through class observation and interviews with teachers, speech therapists and teacher aides. Our participants reported: enhanced engagement in, and attention to, academic tasks; enhanced interaction between peers and between children and adults; and enhanced self-expression regarding their own interests, all surrounding the activity of creating a personal digital dictionary. Teachers suggested that, overall, these factors led to greater attending to, engaging with, and connecting meaning to, new words.

Self-Representation and Interests foster Engagement in Academic Tasks

MyWord was frequently used to engage children in academic tasks and to foster engagement in whole-class activities. Teachers reported using the app on a daily basis as part of their lessons, with some children indicating they would like to use it more (perhaps three or four times a day) and others less (for example, when a child was displaying challenging behaviour and refrained from joining in activities). Overall, we observed that children whom teachers had previously identified as finding it challenging to attend to class activities, such as learning sight words by rote on the interactive whiteboard, were showing greater joint attention to the words in their MyWord app. Teacher A describes how the app enables engagement through the process of creating meaning: "I think it helps overall, because rather than just looking at a picture, they're seeking out the item as well and having to write the word in, having to say the word."

In Class 1 (5-6 year olds), academic tasks often revolved around learning new words by using technologies such as the interactive whiteboard. The children in this class were still very young, and teachers reported that they experienced challenges with attention and engagement in class tasks, such as learning new words. Thus, in this class, the teacher and the speech therapist designed activities with MyWord which encouraged the children to be physically represented in the photos pertaining to each word. In one example, Teacher A's chosen sight word for the day was 'in', a word which she noted was a difficult concept to teach as a standalone word. The teacher and speech therapist decided that a fun and play-oriented way to explore this word would be to photograph each child 'in the box'. This then became a whole-class activity, with each child getting 'in' the box in their own way (Fig. 2b). The speech therapist noted: "They've got so many links back to the word [due to] that activity where we actually create the entry".

Both teachers and the speech therapist noted that they found benefit in using a digital dictionary which affords taking one's own photos, as opposed to using pre-loaded, pre-selected images. The concept of the focus moving from third person (e.g. a character or cartoon, as is common in visual dictionaries) to first person (a photo of the self or something that represents their own interests) was discussed by Teacher A: "I think it means more to have actually taken the photo yourself because then you've also got that residual memory of performing the action as well". This suggests a level of ownership over the content created. The speech therapist suggests that this is a more connected way of learning sight words: "So it's not just actually the word but it's the picture that was associated with that word too. [It's] the feeling or the memory of it."

In Class 2, while the children still featured in the photos, it was more common to see images of items or activities of interest to them. For example, Freddy chose to photograph the world map in the classroom, having just used it in a previous lesson learning about The World. In a later interview, Teacher B reported that Freddy often travelled overseas with his family and, as such, was interested in maps, specifically in linking countries and their names. Although minimally-verbal, Freddy used the app to take photos of different sections of the map, carefully copying the names of the countries into the task bar and quietly recording himself attempting to say the word into the audio recorder. Upon successfully doing so, he became very excited, jumping, clapping and laughing. Upon review with Teacher B, he was similarly motivated and physically engaged, especially when being praised for his excellent spelling. Further, the teacher used this event as inspiration to begin encouraging the children to integrate curriculum-based content into their MyWord apps: "[He's] completely involved in the process and he's integrating that in the app because he loves travel."

These examples illustrate that a key strength of MyWord was that it supported children to represent themselves and their interests in the creation of MyWord entries (e.g. Fig. 2a). When discussing the importance of using self-representation to engage the children's interest in attending to new words, Teacher B commented: "Children love looking at themselves! It is far more engaging for them. Also, when re-watching something that they are "starring" in, it is bringing back a memory, and is thereby more engaging. Some of my students struggle to attend to information. Using their own photos tends to increase their ability to attend to the information. It's not always foolproof, but definitely higher chances of them attending if the pictures are of themselves or someone they know very well compared to an unknown person."

Engaging Children in Physical Action

We observed that teachers used the app to support existing classroom strategies for management of engagement and attention, in this case, that of the 'movement break'. Teachers incorporate small chunks of time (roughly 5-10 minutes) in every session (of roughly 45 minutes) for children to expend physical energy doing an activity they enjoy, such as bouncing on the trampoline. This then allows the children to refocus more clearly on the class task at hand. The speech therapist described this as "emotion regulation".

Teacher B noted on the use of action and movement: "Our students greatly benefit from the opportunity to get up and move around a little between table-top activities. While taking the photos with specific students, the other students were able to explore and play in whatever the different environment was that we were taking the photos in, which built more meaning and memories around the focus word, provided some exercise and movement, resulting in them feeling more regulated and ready to attend and focus when they returned to their desk for more tasks in the classroom".

The speech therapist described how the teaching team incorporated these movement breaks into the process of creating entries for MyWord. In this case, the class was creating their own entries for the word "goes", and the speech therapist commented: "We had them all lined up at the beginning of the [pirate ship] bridge and then we just let them loose going back and forth and the bridge is really rocky and fun and we were recording on the iPads "Alan goes over the bridge, Alan goes. Goes"". Teacher B postulated that: "Because auditory processing of verbal information is often limited for children with ASD, I feel that "doing" the word demonstrates the meaning of a word in a way that "telling" them would never get across. It also creates a solid, visual reference for the students to refer back to so that they have created a "visual" word meaning when they re-look at their picture, which they can link to a real event, thereby clarifying the meaning of the word."

Social Interaction with Peers and Teachers to Create Entries

In order to create the MyWord entries, interaction was required on several levels, such as social, environmental, child-to-child and child-to-adult, all of which depended on individual differences and how the children chose to represent a new word in their dictionary. Teacher A relayed: "When taking photos for MyWord, it usually gave us a good excuse to create a social event – everyone going out to the playground for a play while photos were being taken".

As the children in Class 1 were younger and in need of greater structure and guidance, each entry was created one-to-one between the child and either their teacher or the speech therapist, rather than independently. This meant that the other children had to socially navigate and to wait their turn while the teacher and speech therapist were engaged with other children. In Class 2, the children were more interested in recording items of interest independently. This could involve interaction with the teacher (e.g. to convey the desire to go to the playground to take a photo of the swing), other children in the class (e.g. to ask them to take the photo of them on the swing) and the environment (e.g. to navigate the busy playground and convey the intent to others). Teacher B reported that she found that using MyWord was "a nice way to wake up that little social mind and awareness without it having to be forced and structured all the time".

A critical element in these interactions was the choice given to children in the creation of MyWord entries. This was particularly relevant for Class 2, where children were interacting with their peers, taking photos of each other and adding these as entries to their own MyWord apps. For example, when given free use of the app, Freddy asked to take a photo of his friend, Toby. This enabled social negotiation between the two boys as they decided where to take the photo and what to call it, settling for 'My friend Toby'. Again we see that personal, lived social experience can play an important role in engaging a child's interest in the learning of words, specifically in working together to spell the words and record the audio.

Combining Physical Action, Play, and Social Interaction

Teachers and speech therapists found that the app provided an opportunity to create fun activities around attending to new words, which in turn provided potential for increased social interaction. We observed that, in the typical learning environment when the app was not in use, children sat at their desks while the teacher ran through a number of sight words on the interactive whiteboard. However, through the dynamic action afforded by the use of the app, fun, play and social opportunities arose. The speech therapist noted: "I think that the fun element, it does just cement it. So I feel like, maybe it's actually reinforcing the language in the word, it's also encouraging kids to go back and look over, because it's pictures of them and it's fun and funny!"

Teacher A notes: "Any lesson that we do which involves movement around the room is a lot more engaging. That's because there's a lot of awareness of 'Well when's it going to be my turn? What are the other people doing? Where have I got to be?'"

So, we see that the action afforded by the dynamic activities involved in creating MyWord entries can be inherently social, leading to organic engagement in activities such as turn-taking and social negotiation between peers, which are classically challenging for children on the spectrum. Further, the speech therapist commented on the importance of play and sociality in relation to Donald, a 5 year-old boy who is minimally-verbal. She describes how this was supported by the app when playing on the pirate bridge: "He just runs from one side to the other and he's hearing the language that goes along with that, repeatedly. And watching the others do it, waiting his turn. And the phrase structure as well as the word itself is really important, it's repeated over and over. So I think that, when we've actually got them in an action that's already fun, you've created joint attention and more meaning". Here we see that watching and listening to others create their MyWord entries may provide greater connection to the word's meaning, through repetition of the word in a social context.

Self-Expression of Interests

As previously discussed, children on the spectrum may often have areas of very specific interest. Indeed, throughout the duration of the study we found that child participants would use MyWord to capture, explore and convey these interests. For example, Matthew is very interested in garden gnomes and he used them throughout his MyWord entries, expanding from 'gnome' to also incorporate them into his entries for 'garden', 'toys' and 'green' (Fig. 2c).

In Class 2, the older class, each child was found to create entries and use MyWord in their own individualised way, e.g. making entries showing their favourite toys and activities such as "fire truck", "my boots" or "outside". The children often chose to capture and describe objects or activities of interest. The teacher often provided broad class instruction, and children then autonomously searched for and selected their own item, object or person of interest to photograph and add to MyWord. Teacher B notes that the children's agency and independence was built up over time, beginning with more structured teacher input, to a point where children's individual strategies for use are now becoming clear: "That's the first session that Freddy has just gone off on his merry little way. Prior to that it was very adult-driven "what are you going to take a photo of next?" and "choose a letter". So, they're starting to come up with their own ways of using it I think. And you see the difference between how they all use it". Teacher A summarised the differences in choice and agency between the two classes: "The ones who are independent with it are definitely choosing their own things, they aren't in the photo so much because they are choosing the items." These individual differences are central to the initial conception of the MyWord app: that each child finds different words motivating, and that we can use technologies to identify and engage these interests in learning.

Use in Class 1, the younger class, was much less individualised on the surface, with teachers providing the majority of instruction to the class as a whole, something the teachers reported as highly common when children are still so young and experiencing developmental delays. However, we found that children explored their own methods of personal representation in the content. Although the teacher provided instruction on which word was to be the focus for the activity and helped direct children in how to enact it, the children provided input commensurate with their abilities. For example, when instructed to take a photo of 'in' the box, Rory chose to sit in the box (Fig. 2b), Donald stepped one foot into the box, while Andrew chose to take an action shot of himself jumping into the box. Although a simple example, this may suggest that each child is developing their own personal understanding of the word through their self-expression in the course of the action and content creation.

DISCUSSION

We found that the processes involved in using MyWord in autism-specific classrooms included: a) representation of the self, others, and interests; b) dynamic action and play in the creation of entries; and c) varying levels of personal choice and agency in the creation of entries. We see these processes as interlinked and equally important in the creation of meaning around the words represented in the audio-visual dictionary entries. Teachers and speech therapists reported that undertaking these processes while using the app led to: enhanced engagement in class and social tasks (particularly in the learning of new words); enhanced child-to-child and child-to-adult interaction; and enhanced self-expression of personal interests. We reflect upon the use of MyWord and consider how these lessons might generalise to design recommendations for other technologies for autism-specific classroom settings and beyond.

In terms of use over time, we found that the app was used in increasingly integrated (for teachers) and independent (for children) ways as the study progressed. Teachers and speech therapists naturally integrated the app with classroom activities, specifically in the teaching of sight words, and progressively relied on MyWord as the main support tool in this activity over time. Children became more independent in using the app as they became more familiar with it and began to experiment with using it in new ways. Familiarity and confidence may explain this progression of use; however, further analysis is beyond the scope of this paper.

Self-Representation and Personal Content Creation

One of the main benefits of this approach is that it enables individualised content-creation, either child- or teacher-led [43]. We found that teacher and speech therapist strategies were key to structuring certain aspects of engagement; however, in addition, children employed their own strategies for meaning-making through use of the app. Typically, new words are introduced to our child participants through use of icons and cartoons on the interactive whiteboard. We found that including images of the child as first-person protagonist in their dictionaries helped them to contextualise the word and encouraged greater attention and engagement in the task at hand. By using personally and contextually relevant photos, it became apparent that children displayed greater ownership of the entries they were creating, something which the expert participants reported was very important in engaging with meaning-making around a new word. This is in line with [5, 7, 19], who discuss the importance of flexibility in the adoption of a technology, particularly for those with additional needs.

Design recommendation: We suggest that designs may benefit from allowing space for minimally-verbal children, and indeed all children, to create their own content, instead of pre-defined or third-person materials.

We note that MyWord enables children on the spectrum and the people who support them to see how their interests and engagement with words change over time. Rather than comparing to norms and metrics made across populations, it looks at individual vocabulary and creative content from a holistic perspective, telling us who this person is and what motivates them. This research extends Frauenberger et al.'s work, which shows the importance of technology which is personal and personalisable [12]. It suggests how open-ended technologies which afford customisation and appropriation [5, 13, 19] can help us move beyond assistance and intervention [11] to allow children on the spectrum to record, express and share their interests on a personal level.

Design recommendation: We highlight the potential for holistic tracking of interests, strengths and abilities over time, as opposed to quantitative metrics and normalised measurements.

Dynamic Action and Play

Beyond being represented in the app, we found that the actions which led to the creation of the representation were key [34]. Teachers and speech therapists suggested that the doing of the word in creating the entry served to strengthen connection to the meaning of the word itself, providing a shortcut to attention and engagement. We found that providing this physical connection to the doing of the word made sight words more personally relevant and therefore memorable to the child, something that teachers and speech therapists noted is difficult to achieve in rote learning of sight words. The connection to the words became much stronger as they were enacted and embodied. This connection to 'residual memory' and embodiment echoes research which suggests physical movement helps humans express language and retrieve mental items [1, 3, 4, 29, 30, 31].

Through regular movement breaks and physical activities around the use of the app, children's social interaction was seen by teachers and speech therapists to increase. Children were involved in many more instances of social negotiation, social play and social integration; the teachers noted that, through using the app, these forms of interaction became an organic or natural part of their daily routines. This is important as children on the spectrum often face challenges in social interaction and benefit from increased practice at it. Our experts reported that this was certainly the case for the child participants in our study. Malinverni et al. [29, p.332] emphasise the importance of physicality in children's engagement with the world and highlight Ackermann's [1] notion of 'acting-in-the-world' as key to knowledge construction. Indeed, there are many interesting mobile technologies to encourage play and social interaction [21, 22] and learning through touch-based games [7]. We see potential for further exploration of these tangible and environmental elements.

We found that teacher and speech therapist strategies for implementation involved many play-based activities, as these made the connection to the word most salient. Existing work on play-based and embodied learning technologies supports this view [3, 4, 7, 21, 22, 31, 35]. Perhaps critically different here is that the app itself is not particularly playful, nor is it gamified. It is simply an audio-visual dictionary. It is simple and expressive enough that teachers and children can appropriate it in playful ways to their own ends.

Design Recommendation: Simple and expressive designs enable minimally-verbal children and their adult proxies to use them in playful, embodied ways.

Although designed as a personal app, because it was sufficiently simple, the app afforded child-centred, flexible approaches to its use. It thus became incidentally social, engaging children in turn-taking, joint attention, waiting, listening and playing with peers during the content creation activities.

Personal Choice and Agency

We found that children of all ages and stages found ways in which to showcase their personal interests and abilities. For some it was showing that they were learning how to attend to the teacher's instructions; for others it was showing independent decision-making and creativity. We also found differences in how the app was used across the two age groups. Children in the older group took more initiative and took photos of things important to themselves. In the younger group, in contrast, children took more images of themselves doing the word. Approaches used by early years therapists, such as Greenspan et al. [14], encourage parents to follow the child's lead in order to establish joint attention and greater cycles of interaction, and offer great potential in the design of technologies. We think this is particularly so when technologies are designed to support children to interact socially through the technology. The possibility to share word entries in a class dictionary is a natural extension of the approach taken here. At present each child has their own dictionary on their own iPad and they show and share using their personal device, but there is potential to facilitate further sharing by, for example, submitting words to a class dictionary or enabling swapping of words.

Design recommendation: Extending personal apps to broader whole-class tools may be useful in understanding and extending the sociality afforded by technology.

Through use of the app in an exploratory setting, children were supported to build a clear catalogue of their interests. These included: items (e.g. fire engine), people (e.g. my friend Toby), themselves (e.g. I am going across the bridge), activities (e.g. playing ball), personal strengths (e.g. I am good at maths) and environments (e.g. the playground, my house). We observed many instances of children's very specific interests being conveyed through the app, e.g. gnomes, countries, door handles, favourite clothing. We also note that a word of specific interest opened up scope for the child to engage in other words related to the topic, e.g. 'gnome' led to 'garden' and 'green'. In line with Greenspan et al. [14], we suggest that personal interest may be leveraged in order to expand a child's area of interest, and designs should allow space for interests to be explored. MyWord itself did not dictate with whom agency and choice of subject for the entries would lie; however, its open-ended nature allowed for different ways of initiating the process of content creation, depending on each individual user's own choices. As [19] note, providing increased control, autonomy and empowerment over design elements of technologies can improve the adoption and integration of such technologies into daily life.

Previous work shows that providing children with technologies through which to make personal choices [39] and with a platform through which to direct their own learning makes learning relevant to the child [43]. Through building a personal catalogue of words, images and audio recordings, the children have claimed centrality and autonomy in their own engagement, interaction and expression, shaping and building a learning resource in line with their interests and abilities.

Design recommendation: Supporting minimally-verbal children to exercise their choice and agency is possible. However, tools and technologies often fail to support thoughtful choices, leading to a lot of random clicking and selecting of complex visual functions. We suggest supporting choice and designing in ways that encourage thoughtful selection by following the child's lead.

114

Designing for Different Abilities

IDC 2018, June 19–22, 2018, Trondheim, Norway

Extending Ability-based Design Principles

to capture the moment and the movement. Teachers suggested this would then would give a greater embodied sense of what e.g. ‘in the box’ meant and provide greater understanding of the social interactions at play and a clear link back to the dynamic action at the point of data creation. An important constraint was that teachers did not always have time to review existing content in MyWord with their students. To help address this issue, teachers suggested to design a ‘class dictionary’, a way to aggregate entries from the whole class so that teachers could share and discuss them together on the interactive whiteboard. We feel this will enable MyWord to further support child-to-child social interaction through a group review process in the classroom.

From an ability-based design perspective, it appears that MyWord satisfies the relevant criteria outlined by Wobbrock et al. [46]. Firstly, through actively encouraging the inclusion of children’s own interests and strengths in audio-visual formats, MyWord inherently focuses on the child’s ability not disability and “strives to leverage all that users can do” [46, p.8]. Secondly, it satisfies the accountability stance as it is iteratively updated and changed in line with live responses from participants, rather than requiring the users to change in line with the technology. Thirdly, its focus on user-created content and personalisation make it user-adaptable. Fourthly, it is transparent in that it is fully customisable by the user. Fifthly, it satisfies the performance principle by “regarding user performance, and may monitor, measure, model, or predict that performance” [ibid]; MyWord provides a tool through which to regard development over time. Finally, it satisfies the commodity principle as it “comprise[s] lowcost, inexpensive, readily available commodity hardware and software” [ibid] i.e. the iPad and app software, with potential for use on cheaper platforms.

We previously posed the question; can using a child’s own personal interests and choices motivate engagement in curricula activities such as engaging with spoken and written words? Our small steps towards answering this find that encouraging the inclusion of the self and of personal interests in digital technologies may help to convey the needs, interests and strengths of minimally-verbal children on the spectrum in a classroom setting. MyWord supported minimally-verbal children to show their abilities, to create content, to interact socially, and to express themselves.

Extending this, our work with MyWord also leads us to consider the importance of motivation, as well as ability. Ability is contextual, whereas motivation is personal. While the above principles tend to focus on elements of accessibility, which are undoubtedly important, these principles do not necessarily take into account a user’s motivations for use, interests, nor their agency in using.

CONCLUSION

Our study presents insights into the ways in which minimally-verbal children on the spectrum used a personal audio-visual dictionary app to capture, explore and convey their personal interests and choices, while engaging with the written and spoken word, a core part of the Australian Curriculum. Through processes of self-representation through photos and audio, dynamic action and play, and expression of personal choice, the children in our study were found to exhibit enhanced engagement in class tasks, enhanced social interaction, and enhanced self-expression. We suggest that the leveraging of personal interests and encouragement of personal choice and agency can motivate minimally-verbal children on the spectrum to be creative, engaged and enthusiastic about words, engaging those for whom ‘A’ may not stand for ‘Apple’, but for ‘Air Conditioner’ instead.

Design recommendation: The implications here may be that we ought to build on accessibility guidelines to further include concepts of motivation, agency, engagement, social interaction and self-representation which look beyond concepts of physical and intellectual competency and towards the individual self and what this means for design. Limitations and Opportunities

This type of approach can extend to other important parts of the curriculum, such as building sentences, numeracy, planning and so on. Pragmatically, there may be limits to the personal creation and sharing of content, but these are not yet well understood and remain a topic for further research. Comparative studies are difficult to conduct in autism research, and a limitation of this study is that it is hard to know how much personally invested approaches improve the learning experience. We have discussed our findings in terms of interaction rather than "learning" per se, as providing a quantifiable and measurable tool for learning new words was not our goal, although further work may consider this.

ACKNOWLEDGEMENTS

We would like to thank our child and adult participants for their continuing enthusiasm and insights in this project.

SELECTION AND PARTICIPATION OF CHILDREN

The child participants in this study were selected because they attend an autism-specific primary school which is a member of the Autism Cooperative Research Centre, Australia. Teachers who expressed an interest in collaborating in the project were asked to identify which children they thought would most benefit from the study. In each case, the teacher felt that the whole class would benefit, regardless of age or ability, and so all children in the class were included. Parental consent was obtained through information and consent forms which adhered to QUT ethical consent procedures.


In addition, child information sheets using pictures were provided to the children, and the researcher and teachers demonstrated to each child what to expect from using the app and from involvement in the study.

REFERENCES

1. Edith Ackermann. 2004. Constructing Knowledge and Transforming the World. In A Learning Zone of One's Own, M. Tokoro and L. Steels (eds.). IOS Press.
2. American Psychiatric Association. 2013. Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5). Arlington, VA: American Psychiatric Association.
3. Alissa N. Antle, Ylva Fernaeus, and Paul Marshall. 2009. Children and embodied interaction: seeking common ground. In Proceedings of the 8th International Conference on Interaction Design and Children (IDC '09). ACM, New York, NY, USA, 306-308.
4. Alissa N. Antle. 2009. LIFELONG INTERACTIONS: Embodied child computer interaction: why embodiment matters. interactions 16, 2 (March 2009), 27-30.
5. Saskia Bakker, Elise van den Hoven, and Berry Eggen. 2012. FireFlies: supporting primary school teachers through open-ended interaction design. In Proceedings of the 24th Australian Computer-Human Interaction Conference (OzCHI '12). ACM, New York, NY, USA, 26-29.
6. Simon Baron-Cohen. 2017. Editorial Perspective: Neurodiversity – a revolutionary concept for autism and psychiatry. Journal of Child Psychology and Psychiatry 58, 6: 744-747.
7. Laura Bartoli, Franca Garzotto, Mirko Gelsomini, Luigi Oliveto, and Matteo Valoriani. 2014. Designing and evaluating touchless playful interaction for ASD children. In Proceedings of the 2014 Conference on Interaction Design and Children (IDC '14). ACM, New York, NY, USA, 17-26.
8. Virginia Braun and Victoria Clarke. 2006. Using thematic analysis in psychology. Qualitative Research in Psychology 3, 2: 77-101.
9. Clicker. 2014. Retrieved April 1, 2018 from https://www.autismapps.org.au/language/clickersentences/
10. Dynavox. 2015. Retrieved April 1, 2018 from https://www.tobiidynavox.com/
11. Christopher Frauenberger, Judith Good, and Narcis Pares. 2016. Autism and Technology: Beyond Assistance & Intervention. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '16). ACM, New York, NY, USA, 3373-3378.
12. Christopher Frauenberger, Julia Makhaeva, and Katharina Spiel. 2016. Designing Smart Objects with Autistic Children: Four Design Exposés. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI '16). ACM, New York, NY, USA, 130-139.
13. Franca Garzotto and Manuel Bordogna. 2010. Paper-based multimedia interaction as learning tool for disabled children. In Proceedings of the 9th International Conference on Interaction Design and Children (IDC '10). ACM, New York, NY, USA, 79-88.
14. Stanley Greenspan, Serena Wieder, and Robin Simons. 1998. The Child with Special Needs: Encouraging Intellectual and Emotional Growth. Addison-Wesley.
15. Foad Hamidi and Melanie Baljko. 2017. Engaging Children Using a Digital Living Media System. In Proceedings of the 2017 Conference on Designing Interactive Systems (DIS '17). ACM, New York, NY, USA, 711-723.
16. Gillian R. Hayes, Sen Hirano, Gabriela Marcu, Mohamad Monibi, David H. Nguyen, and Michael Yeganyan. 2010. Interactive visual supports for children with autism. Personal and Ubiquitous Computing 14, 7: 663-680.
17. Bart Hengeveld, Caroline Hummels, Kees Overbeeke, Riny Voort, Hans van Balkom, and Jan de Moor. 2009. Tangibles for toddlers learning language. In Proceedings of the 3rd International Conference on Tangible and Embedded Interaction (TEI '09). ACM, New York, NY, USA, 161-168.
18. Juan Pablo Hourcade, Stacy R. Williams, Ellen A. Miller, Kelsey E. Huebner, and Lucas J. Liang. 2013. Evaluation of tablet apps to encourage social interaction in children with autism spectrum disorders. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '13). ACM, New York, NY, USA, 3197-3206.
19. Amy Hurst and Jasmine Tobias. 2011. Empowering individuals with do-it-yourself assistive technology. In Proceedings of the 13th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS '11). ACM, New York, NY, USA, 11-18.
20. Brigitte Jordan and Austin Henderson. 1995. Interaction analysis: Foundations and practice. The Journal of the Learning Sciences 4, 1: 39-103.
21. Wendy Keay-Bright. 2006. ReActivities: Autism and Play. Digital Creativity 17, 3: 149-156.
22. Wendy Keay-Bright. 2009. ReacTickles: playful interaction with information communication technologies. International Journal of Arts and Technology 2, 1: 133-151.
23. Lorcan Kenny, Caroline Hattersley, Bonnie Molins, Carole Buckley, Carol Povey, and Elizabeth Pellicano. 2015. Which terms should be used to describe autism? Perspectives from the UK autism community. Autism 20, 4: 442-462.
24. KnowjiVocab. 2018. Retrieved November 20, 2017 from https://itunes.apple.com/us/app/knowjivocab-3-6-audio-visual-vocabularyflashcards/id868123899?mt=8
25. Peter Leijdekkers, Valerie Gay, and Frederick Wong. 2013. CaptureMyEmotion: A mobile app to improve emotion learning for autistic children using sensors. In Proceedings of the IEEE International Symposium on Computer-Based Medical Systems (CBMS 2013), Porto, 381-384.
26. Janice Light and David McNaughton. 2014. Communicative competence for individuals who require augmentative and alternative communication: A new definition for a new era of communication? Augmentative and Alternative Communication 30, 1: 1-18.
27. Lingraphica Smalltalk. 2015. Retrieved April 1, 2018 from https://itunes.apple.com/au/app/smalltalkconversational-phrases/id403058584?mt=8
28. Margaret Lubas, Jennifer Mitchell, and Gianluca De Leo. 2014. User-centered design and augmentative and alternative communication apps for children with autism spectrum disorders. SAGE Open 4, 2.
29. Laura Malinverni, Edith Ackermann, and Narcis Pares. 2016. Experience as an Object to Think with: from Sensing-in-action to Making-Sense of action in Full-Body Interaction Learning Environments. In Proceedings of the Tenth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '16). ACM, New York, NY, USA, 332-339.
30. Laura Malinverni, Brenda Lopez Silva, and Narcis Pares. 2012. Impact of embodied interaction on learning processes: design and analysis of an educational application based on physical activity. In Proceedings of the 11th International Conference on Interaction Design and Children (IDC '12). ACM, New York, NY, USA.
31. Brenna McNally, Mona Leigh Guha, Leyla Norooz, Emily Rhodes, and Leah Findlater. 2014. Incorporating peephole interactions into children's second language learning activities on mobile devices. In Proceedings of the 2014 Conference on Interaction Design and Children (IDC '14). ACM, New York, NY, USA, 115-124.
32. Mohamad Monibi and Gillian R. Hayes. 2008. Mocotos: mobile communications tools for children with special needs. In Proceedings of the 7th International Conference on Interaction Design and Children (IDC '08). ACM, New York, NY, USA, 121-124.
33. Proloquo2Go. 2008. Retrieved July 16, 2016 from https://itunes.apple.com/gb/app/proloquo2go/id308368164?mt=8
34. Toni Robertson. 2012. Actual bodies are ageing bodies. In Proceedings of the 2nd International Body in Design Workshop, held at OzCHI '12, November 26-30, 2012.
35. Yvonne Rogers, Sara Price, Geraldine Fitzpatrick, Rowanne Fleck, Eric Harris, Hilary Smith, Cliff Randell, et al. 2004. Ambient wood: designing new forms of digital augmentation for learning outdoors. In Proceedings of the 2004 Conference on Interaction Design and Children: Building a Community (IDC '04). ACM, New York, NY, USA, 3-10.
36. Howard C. Shane, Sarah Blackstone, Gregg Vanderheiden, Michael Williams, and Frank DeRuyter. 2012. Using AAC technology to access the world. Assistive Technology 24, 1: 3-13.
37. Spectronics Clicker. 2016. Retrieved November 20, 2017 from http://www.spectronics.com.au/product/clicker-7anz-australian-new-zealand-version
38. Jennifer Stephenson. 2016. Using the Choiceboard Creator™ app on an iPad© to teach choice making to a student with severe disabilities. AAC: Augmentative and Alternative Communication 32, 1: 49-57.
39. Katharina Spiel, Julia Makhaeva, and Christopher Frauenberger. 2016. Embodied Companion Technologies for Autistic Children. In Proceedings of the Tenth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '16). ACM, New York, NY, USA, 245-252.
40. Helen Tager-Flusberg and Connie Kasari. 2013. Minimally Verbal School-Aged Children with Autism Spectrum Disorder: The Neglected End of the Spectrum. Autism Research 6, 6: 468-478.
41. Jennyfer Taylor, Alessandro Soro, Paul Roe, Anita Lee Hong, and Margot Brereton. 2018. "Debrief O'Clock": Planning, Recording, and Making Sense of a Day in the Field in Design Research. In Proceedings of the 36th Annual ACM Conference on Human Factors in Computing Systems (CHI 2018). ACM, Montreal, Canada, April 21-26.
42. Tracy Westeyn, Gregory Abowd, Thad Starner, Jeremy Johnson, Peter Presti, and Kimberley Weaver. 2012. Monitoring children's developmental progress using augmented toys and activity recognition. Personal and Ubiquitous Computing 16, 2: 169-191.
43. Cara Wilson, Margot Brereton, Bernd Ploderer, Laurianne Sitbon, and Beth Saggers. 2017a. Digital Strategies for Supporting Strengths- and Interests-based Learning with Children with Autism. In Proceedings of the 19th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS '17). ACM, New York, NY, USA, 52-61.
44. Cara Wilson, Margot Brereton, and Bernd Ploderer. 2017b. MyWord: Supporting the Interest-based Learning of Words through a Personal Visual Dictionary. In Proceedings of the 2017 ACM Conference Companion Publication on Designing Interactive Systems (DIS '17 Companion). ACM, New York, NY, USA, 132-137.
45. Mary Ann Winter-Messiers. 2007. From tarantulas to toilet brushes: Understanding the special interest areas of children and youth with Asperger syndrome. Remedial and Special Education 28, 3: 140-152.
46. Jacob O. Wobbrock, Shaun K. Kane, Krzysztof Z. Gajos, Susumu Harada, and Jon Froehlich. 2011. Ability-Based Design: Concept, Principles and Examples. ACM Transactions on Accessible Computing 3, 3: 1-27.