Multimodal Application for the Perception of Spaces (MAPS)

Richard J. Adams
Barron Associates, Inc.
1410 Sachem Place, Suite 202
Charlottesville, Virginia 22901
001-434-973-1215
[email protected]

Dianne T.V. Pawluk
Virginia Commonwealth University
401 West Main Street
Richmond, Virginia 23284
001-804-828-9491
[email protected]

Margaret A. Fields
Virginia Department for the Blind and Vision Impaired
397 Azalea Avenue
Richmond, Virginia 23227
001-804-371-3776
[email protected]

Ryan Clingman
Virginia Commonwealth University
401 West Main Street
Richmond, Virginia 23284
001-804-828-7839
[email protected]

ABSTRACT
MAPS (Multimodal Application for the Perception of Spaces) is a tablet app, with associated hardware, for providing on-the-go access to audio-tactile maps of unfamiliar indoor venues to individuals who are blind or visually impaired. Performance of the system was assessed using a survey knowledge task in which participants were exposed to three different cue combinations: audio only, audio with haptic cues provided by the tablet's built-in vibration motor (built-in tactile), and audio with haptic feedback provided by special vibrating rings worn on two fingers (stereo-tactile). Of the three conditions, the combination of audio and built-in tactile feedback resulted in the best user performance in judging the relative direction to points of interest. Results indicate that the audio-tactile display improves survey knowledge both when used for a priori (pre-visit) map learning and when used on-the-go (within the environment) to provide just-in-time information.

Categories and Subject Descriptors
K.4.2 [Computers and Society]: Social Issues – Assistive technologies for persons with disabilities, Handicapped persons/special needs.

Keywords
Wayfinding, orientation, spatial perception, tactile maps, visually impaired

1. INTRODUCTION
The challenge of acquiring familiarity with new environments remains one of the greatest obstacles to functional independence for individuals who are blind or visually impaired. Effective independence requires survey knowledge of areas in which daily activities outside of the home take place. This knowledge is best acquired by accessing a map that reveals the spatial layout and content of a venue. A few groups have considered providing audio-tactile maps for mobility purposes on tablet computers (e.g., [1], [2]). Here we consider the effect of combining audio feedback with different modes of tactile feedback on users' performance in acquiring and demonstrating survey knowledge of unfamiliar interior spaces.

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author. Copyright is held by the owner/author(s).
ASSETS '15, October 26-28, 2015, Lisbon, Portugal
ACM 978-1-4503-3400-6/15/10.
http://dx.doi.org/10.1145/2700648.2811386

2. SYSTEM DESIGN
The system developed includes an Android app that renders map content through audio (tonal and speech) feedback and tactile cues. Users exploring the tablet's touchscreen receive feedback corresponding to the spatial feature immediately below each finger. Maps are represented graphically by a two-dimensional image (Figure 1) in which each color is keyed to unique audio and tactile content. When a finger passes over a feature, audio feedback (in the form of a tone, defined by frequency and amplitude) indicates the underlying feature type. When the user lifts a finger at a particular point, an audio clip is played with a short text narrative describing that map feature. Examples include hallways, walls, stairways, and exterior spaces, as well as special points of interest unique to each map (e.g., a door to a computer lab or a specific work of art in a museum). The audio feedback is rendered through the tablet's built-in speaker or via bone-conducting headphones.

Tactile feedback is provided either through the built-in vibrator in the tablet (built-in tactile) or as stereo-tactile information rendered through a specially developed pair of "tactile finger buds." Built-in tactile feedback functions by vibrating the entire mobile device using the tablet's internal motor; in this condition, tactile cues are generated when a finger touch point traverses designated points of interest on the map. Stereo-tactile feedback delivers independent vibrations directly to the pad of each of the user's fingers in the form of pulsed vibrations with different frequency, amplitude, and pulse characteristics (duration and inter-pulse delay). In the study described below, stereo-tactile feedback was provided to the index fingers of both hands; in this condition, tactile cues are generated both for points of interest and for other relevant map features.
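The color-keyed rendering scheme can be modeled as a simple lookup from pixel color to feedback parameters. The sketch below is illustrative only: the feature names, colors, and tone values are hypothetical (the paper does not publish its legend), and the real app drives Android audio and vibration APIs rather than returning values.

```python
# Illustrative model of MAPS-style color-keyed feedback (hypothetical values).
# Touching a map pixel yields a tone for its feature type; lifting the finger
# yields the short text narrative that would be spoken aloud.

# Hypothetical legend: RGB color -> (feature name, tone frequency Hz, amplitude 0-1)
LEGEND = {
    (200, 200, 200): ("hallway", 440.0, 0.5),
    (0, 0, 0):       ("wall", 220.0, 0.9),
    (255, 0, 0):     ("stairway", 660.0, 0.7),
    (0, 0, 255):     ("point of interest", 880.0, 0.8),
}

def tone_for_touch(pixel_rgb):
    """Return (frequency, amplitude) for the feature under a finger, or None."""
    entry = LEGEND.get(pixel_rgb)
    if entry is None:
        return None  # unmapped color: no audio feedback
    _, freq, amp = entry
    return (freq, amp)

def narration_on_lift(pixel_rgb):
    """Return the short text narrative played when the finger lifts."""
    entry = LEGEND.get(pixel_rgb)
    return entry[0] if entry else "no feature here"
```

A table-driven design like this keeps the map artwork itself as the single source of truth: authoring a new venue map only requires painting regions with legend colors.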
The tactile finger buds are a derivative of technology previously developed by Barron Associates under contract to NASA Glenn Research Center [3]. Each finger bud consists of a linear resonant actuator (LRA) vibration motor held within a specially fabricated flexible plastic ring worn on the user's finger. Driver electronics are housed in a small plastic enclosure that connects to the tablet via its micro-USB port. Figure 1 (right) shows the tactile finger buds in use during map exploration with the Phase I prototype.

Figure 1. Prototype without tactile finger buds (left). Prototype with tactile finger buds (right).

3. EXPERIMENTAL ASSESSMENT
An experimental assessment was conducted to investigate the effectiveness of the different map displays in conveying spatial layout information about unfamiliar indoor spaces. Experiments were conducted in the Library and Resource Center (LRC) of the Virginia Department for the Blind and Vision Impaired (DBVI), Richmond, VA. Fourteen (14) individuals (age 29-69; 7 female) completed the study under the supervision of the Virginia Commonwealth University Institutional Review Board (IRB). All participants had a diagnosis of blindness (light perception only or total blindness) and were assessed to possess at least a basic level of orientation and mobility (O&M) competency. Approximately half of the participants (6) became blind by age 2, and all but one read Braille. Participants had no prior familiarity with the LRC.

During an initial familiarization period, participants completed a calibration routine designed to ensure that audio and tactile "loudness" was optimized for each individual's perceptual threshold and preference. Participants then completed a training exercise dubbed the "Survey Knowledge Game," in which they explored a series of six different interior maps of buildings in the Richmond, VA area, two in each of the three tactile feedback configurations. For each map, after an exploration period, participants were sequentially tasked to locate each of five points of interest on the display.

Success in utilization of the map display was then assessed through a survey knowledge task dubbed the "Judgment of Direction Activity" (JDA). Audio-tactile maps were created for each of three areas of the LRC of approximately equivalent size and complexity. Participants spent three minutes outside the chosen area of the LRC using the display to study a map of that area (simulating pre-arrival map exploration) that included five unique points of interest. Individuals were then escorted by a test administrator to the actual location of one of these points of interest and asked to indicate the location of each of the remaining four. A special device, comprising an eye-safe laser line generator, a battery pack, and the handle from a white cane (shown in Figure 2), was used to accurately assess pointing direction. Participants were asked to align the handle with the direction of the target, projecting a red line of light onto a compass rose. The magnitude of the difference between this user-indicated direction and the true direction to each point of interest provided an absolute angular estimation error (in degrees), used in subsequent analysis.

Figure 2. Pointing device and tablet lying on compass rose (left). Use of the pointing device during a measurement (right).

At each location, a second procedure was then executed to assess the potential for on-the-go use of the map display (i.e., just-in-time map access in an unfamiliar environment). Participants were asked to use the map display to identify their current location and the location of one of the other points of interest, and then to immediately point at that point of interest, repeating this sequence for each of the four points. Each participant performed the JDA sequence three times, once in each of the three above-described cue conditions, each time in a different area of the LRC. The combination of cue condition, LRC area, and order of presentation was block randomized.
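The angular estimation error metric can be computed by wrapping the difference between the user-indicated bearing and the true bearing into the range [0, 180] degrees. A minimal sketch (the function name is ours):

```python
def angular_error(indicated_deg, true_deg):
    """Absolute angular estimation error in degrees, wrapped to [0, 180].

    Bearings may be given on any 0-360 compass scale; the result is the
    magnitude of the smallest rotation between the two directions.
    """
    diff = abs(indicated_deg - true_deg) % 360.0
    return 360.0 - diff if diff > 180.0 else diff
```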
4. RESULTS
To quantify the benefit of map access versus no map access, actual JDA performance was compared to random guessing of target direction. Monte Carlo simulation was employed to generate the predicted null distribution of angular estimation errors for uniform guessing over the angular range of 0 to 180 degrees (targets were not placed behind the subject). In all combinations of feedback mode and map access, the display provided a significant benefit in the form of improved judgment of direction. Overall, the best performance was observed in the built-in tactile condition, with a mean angular estimation error of 36.8 degrees with pre-arrival map access and 22.6 degrees for on-the-go access.
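The null distribution for random guessing can be approximated with a few lines of simulation. The sketch below reflects one plausible reading of the setup described above, under the assumption that both the guessed and the true directions fall uniformly in the frontal semicircle [0, 180] degrees; the paper's exact simulation details are not given.

```python
import random

def simulate_null_mean(n_trials=100_000, seed=1):
    """Monte Carlo estimate of the mean angular error under uniform guessing.

    Assumption (ours): guess and target are independent and uniform on
    [0, 180] degrees, so the error is |guess - target|, also in [0, 180].
    """
    rng = random.Random(seed)
    errors = [abs(rng.uniform(0.0, 180.0) - rng.uniform(0.0, 180.0))
              for _ in range(n_trials)]
    return sum(errors) / n_trials

# Analytically, E|U1 - U2| for independent uniforms on [0, 180] is 180/3 = 60
# degrees, well above the observed means of 36.8 (pre-arrival) and 22.6
# (on-the-go) degrees, consistent with the reported benefit of map access.
```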
5. DISCUSSION
Refreshable audio-tactile maps on tablets show promise for learning survey information about unfamiliar indoor spaces. In addition, on-the-go access to audio-tactile maps significantly improves performance, even when pre-visit map exploration has already taken place. Future analyses will examine these effects in greater detail.
6. ACKNOWLEDGMENTS The authors wish to thank the Virginia Department for the Blind and Vision Impaired for support throughout the project. This work was funded by the National Institutes of Health, National Eye Institute; grant 1R43EY021978-01A1.
7. REFERENCES
[1] C. Goncu and K. Marriott, "GraVVITAS: Generic Multi-touch Presentation of Accessible Graphics," in Proceedings of INTERACT, Lisbon, Portugal, 2011.
[2] N. Giudice, H. Palani, et al., "Learning Non-Visual Graphical Information Using a Touch-Based Vibro-Audio Interface," in Proceedings of ACM ASSETS, Boulder, CO, 2012.
[3] R. Adams, A. Olowin, et al., "Glove-Enabled Computer Operations (GECO): Design and Testing of an Extravehicular Activity Glove Adapted for Human-Computer Interface," in Proceedings of the AIAA International Conference on Environmental Systems, Vail, CO, 2013.