
A Prototype Haptic EBook System to Support Immersive Remote Reading in a Smart Space Abu Saleh Md Mahfujur Rahman, Kazi Masudul Alam and Abdulmotaleb El Saddik Multimedia Communications Research Lab University of Ottawa, Ottawa, ON, Canada Email:{kafi@mcrlab, malam@discover, abed@mcrlab}.uottawa.ca

Abstract—Interactive book reading contributes profoundly to language development in preschool children. Experimental research has shown that shared book reading between adults and children gives children the opportunity to acquire new vocabulary. Moreover, the time spent reading together provides clear evidence to a child of a parent's love and care [1]. Inspired by these findings, we propose a remote reading framework in which parents or grandparents participate in reading sessions from a distance. Within this framework, we present an intuitive, annotation-based approach to hapto-audio-visual interaction with traditional digital learning materials. We argue that book reading enhanced with picture and haptic modalities accelerates language development, since students can relate the text to known visual and tactile references. Hence, in the proposed Haptic E-Book system, by integrating the home entertainment system and haptic interfaces into the user's reading experience, we examine whether such augmentation of modalities influences the user's learning behaviour. The proposed Haptic E-Book (HE-Book) system leverages haptic jacket, haptic arm band, and haptic sofa interfaces to deliver haptic emotive signals to the remote story listener in the form of patterned vibrations of the actuators, and expresses the learning material through image-based augmented display, in order to pave the way for an intimate, shared, and immersive reading experience on the popular ebook platform.
Index Terms—Haptic Book; Annotation; Augmented Rendering; Haptic Rendering; Shared Reading

I. INTRODUCTION

Book reading sessions are particularly effective in helping preschool children increase their vocabularies [2][3]. Senechal et al. [4] reported that storybook exposure accounted for unique variance in preschool children's expressive and receptive vocabulary after controlling for parents' education, parents' own level of literacy, and children's analytic intelligence. Experimental research has shown that shared book reading between adults and children gives children the opportunity to acquire new vocabulary. The idea that storybook reading promotes language development is supported by correlational, experimental, and intervention studies. In general, children love storybook reading sessions, especially when they hear the stories from their grandparents or parents [5]. The time spent reading together provides clear evidence to a child of their love and care [1]. However, it is not always possible to share the reading experience with a child or grandchild because of geographical separation. In this paper, we address a remote reading system that engages the reader and the listener to participate in an

978-1-4577-0499-4/11/$26.00 ©2011 IEEE

entertaining reading experience and strengthens their emotional bonds. We propose a framework in which parents or grandparents participate in reading sessions remotely. Our ways of reading are continuously moulded by new technological innovations, devices, and platforms. For example, the e-book continuously challenges the existence of the print book and at times replaces it in schools [6]. In our daily life, we are increasingly exposed to electronic materials rather than printed books, as nowadays most advanced handheld mobile devices provide e-book reading facilities [7]. Also, today's user community is widely accepting haptic interfaces, and the effect of tactile feedback from reading materials has started to form a new genre of research interest. Moreover, picture book reading accelerates language development [8][9][10], as learners can relate the text to a known visual reference. Colourful illustrations provide the opportunity to narrate the story without complete reliance on the text [4]. Haptic feedback and visual images relating to the reading content also help the learner memorize words by associating viewing and emotional experiences with the learned words [11]. Hence, to leverage these modalities in the remote reading system, we incorporate an intuitive, annotation-based approach to hapto-audio-visual interaction with traditional digital learning materials. A complete overview of the components of the remote reading system, which supports learning-material-centric haptic and visual content rendering, is depicted in Figure 1. Our proposed remote Haptic E-Book (HE-Book) is a special type of e-book reader capable of delivering content-related visual images as well as vibrotactile feedback in the user's remote reading experience.
Specifically, we target a home entertainment scenario in which the grandparents read the content remotely using their customized eBook clients, while the children view the related visual images on their television screen and receive content-related haptic emotional feedback through their wearable haptic jackets [12]. We incorporate the haptic stimulations defined in our previous work [12]. Moreover, the haptic sofa (http://d-box.com/en/home-theatre/d-box-ready-seating/) provides content-dependent vibrotactile feedback such as boat or train movements. In order to deliver such experiences, we present a prototype remote HE-Book system exploring the suitability of


Fig. 1. High level architecture of the proposed HE-Book system

haptic feedback and multimedia content in various e-book platforms. Our contribution in this article is twofold. First, we propose annotation-based hapto-audio-visual remote feedback integration by authoring haptic as well as multimedia data in traditional digital text content. Second, we present a generalized framework to support remote reading and illustrate the various remote interaction mechanisms. We introduce support for both individual and remote group reading scenarios in which ubiquitous home multimedia devices are incorporated into the reading activities. The remainder of this paper is organized as follows. Section II describes the multimedia annotation scheme. Section III illustrates the two key components that allow remote rendering communication in our system. Section IV presents the components that facilitate annotation-based hapto-audio-visual delivery from digital reading materials. Section V gives implementation details, and Section VI concludes the paper and states some possible future work directions.

II. MULTIMEDIA ANNOTATION SCHEME

Fig. 2. Annotation editor for authoring hapto-audio-visual materials.

We developed an annotation tool to author the image and haptic feedback for e-book contents; the tool is shown in Figure 2. Using the annotation scheme, an author selects a paragraph of a given page of an e-book and tags it with various haptic and multimedia properties [13]. For image annotation, after selecting the text in the e-book, the annotator is presented with a list of candidate images from the available image providers, based on web search results. The images are shown in an image container for easy selection, and the author picks the images that describe the selected text most appropriately. The selected images are stored in the annotation file. For haptic annotation, we currently support three haptic delivery devices: the haptic jacket, the haptic arm-band, and the haptic sofa. For each haptic device we create custom vibration patterns, give the patterns appropriate names, and enlist them in the authoring tool. The annotation XML file (Fig. 3) contains the image list and the vibration types for the selected vibration devices for each authored e-book content. We keep the annotation file separate from the actual e-book contents. The advantage of a separate XML-based annotation file is that an HE-Book user can decide either to use the multimedia+haptic extension or to avoid it. A possible business model for the separate XML file is an HE-Book XML store from which users can obtain the annotation file for a specific e-book. For each annotated paragraph of an e-book, we create an XML element Page with an attribute ID that denotes the unique page number. Every Page element is divided into Para elements, each of which also has an ID denoting the paragraph number. The granularity of our annotation ends at Para, i.e., vibrotactile feedback and multimedia playback start and end at the paragraph level, which we term scene- or paragraph-based annotation. Under Para we have the elements Haptic and Image for our prototype purposes, which can be extended to support internal and external audio and video as well. Haptic is described using DeviceType, Pattern, Repeat, and Delay attributes, and the Image annotation contains Src and URL attributes. Motivated by our past work [12], we annotate the e-book content with Touch, Fun, Hug, Tickle, Kiss, and Poke type haptic feedbacks in our HE-Book system.
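The scheme just described can be illustrated with a short, hypothetical annotation fragment. The element names (Page, Para, Haptic, Image) and their attributes follow the description above; the root element name, the attribute spellings (e.g. DeviceType), and all values are illustrative assumptions rather than the exact file format of the prototype (see Fig. 3 for the actual file):

```xml
<!-- Hypothetical HE-Book annotation fragment -->
<Annotation book="example-storybook">
  <Page ID="4">
    <Para ID="2">
      <!-- Patterned vibration on the haptic jacket, repeated twice -->
      <Haptic DeviceType="jacket" Pattern="Hug" Repeat="2" Delay="500"/>
      <!-- Content-dependent motion effect on the haptic sofa -->
      <Haptic DeviceType="sofa" Pattern="BoatRide" Repeat="1" Delay="0"/>
      <!-- Image shown on the listener's television while the paragraph is read -->
      <Image Src="boat_trip.jpg" URL="http://example.com/images/boat_trip.jpg"/>
    </Para>
  </Page>
</Annotation>
```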
The Annotation Retriever is responsible for searching the annotation file when necessary and retrieving the requested XML block. In a nutshell, it is an XML parser: for a {Page, Para} pair request, it returns the corresponding Haptic and Image blocks from the XML file.

III. REMOTE EBOOK READER SYSTEM

Here we describe the reader server and the listener server and how these servers handle e-book interactions and provide


hapto-audio-visual feedbacks to the listener's smart devices. The two components of the system are the Reader and Listener modules. In a typical reading session, the Reader module lets the reader control the pace and content of the reading, while the Listener module renders the reading content through audio, visual, and haptic channels. While the computer voice reads the book content aloud, the listeners view the related pictures on a television and receive haptic feedback on their connected haptic devices.

A. Reader Interaction Controller

The Interaction Controller module, which resides in the Reader server, plays the central role in organizing and synchronizing the necessary workflow in the haptic e-book reading system. The module polls for user touch input at regular intervals. As soon as a touch interaction is performed, the Reader Interaction Controller coordinates the screen-location-based paragraph identification in the e-book document. The Annotation Manager module takes the page ID and the paragraph keys, retrieves the haptic and visual data associated with the paragraph, and returns the XML data to the Interaction Controller. This XML content, along with the page and paragraph keys, is then sent to the Listener module. The haptic e-book system can run on a touch-based mobile device; the user can flip pages and scroll through the paragraphs via touch interaction. The standard touch SDK is used to obtain the screen coordinates through the GUI Interaction Listener, which uses the Paragraph Locator component to match the touched screen coordinate against the displayed e-book page content and deduce the paragraph the user is currently pointing at. The e-book XML reader software continuously monitors the touch interaction; as soon as the user touches a paragraph of the page that has been previously annotated, the packaged XML annotation keys are sent promptly to the Listener module.
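The Paragraph Locator step can be sketched as a simple hit test over the bounding boxes of the paragraphs on the displayed page. The class below is an illustrative sketch, not the prototype's actual code: the rectangle layout data is assumed to come from the viewer's text-layout engine, and all names are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of the Paragraph Locator: map a touch coordinate
// to the paragraph key used for annotation lookup. A real reader would
// obtain the paragraph bounds from the e-book viewer's layout engine.
class ParagraphLocator {
    static class Rect {
        final int x, y, w, h, paraId;
        Rect(int x, int y, int w, int h, int paraId) {
            this.x = x; this.y = y; this.w = w; this.h = h; this.paraId = paraId;
        }
        boolean contains(int px, int py) {
            return px >= x && px < x + w && py >= y && py < y + h;
        }
    }

    private final List<Rect> layout = new ArrayList<>();

    // Register the on-screen bounds of one paragraph of the current page.
    void addParagraphBounds(int paraId, int x, int y, int w, int h) {
        layout.add(new Rect(x, y, w, h, paraId));
    }

    // Returns the paragraph id under the touch point, or -1 when the
    // touch falls outside every paragraph (e.g. on the page margin).
    int locate(int touchX, int touchY) {
        for (Rect r : layout) {
            if (r.contains(touchX, touchY)) return r.paraId;
        }
        return -1;
    }
}
```

On a touch event, the returned paragraph id together with the current page id forms the {Page, Para} key pair that is sent to the Listener module.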

B. Listener Ambient Media Service

The Listener module consists of the Ambient Media Services that connect to the haptic devices, the audio system, and the video system. The Listener module listens to the remote Interaction Controller module and provides the needed haptic, audio, and visual signals. The implemented Listener module is connected through a peer-to-peer architecture and transfers messages over a socket port. At the beginning, the Reader module sends the Listener module the name of the book and transfers the XML annotation file. The XML file is parsed and loaded into memory in a custom data-structure list. To access the individual elements of the list, the Listener module requires page and paragraph keys, so at all times it waits for the page and paragraph keys it receives from the Reader module. When the Listener module accepts the keys, it locates the annotation element in its loaded data structure and finds the needed hapto-audio-visual data. These data are then sent to the Multimedia Rendering Manager, which is responsible for producing the signals that drive the haptic, audio, and visual devices. The listener user thus obtains the haptic, auditory, and visual feedback associated with the learning content. The Listener module also polls the surrounding active devices; if a media device is discovered, it pairs with the device and performs preliminary message communication in order to maintain media synchronization. The Listener module additionally checks the status of each device to determine whether it is standalone or intermediate: a standalone device is itself capable of continuing media streaming, whereas for a setup such as group entertainment an intermediate computer plays the role of mediator controller. In addition to coordinating the aforementioned tasks, the Listener module makes sure that the system modules do not block or throttle the operation of the other modules; it employs a carrier-sensing algorithm to determine the active/idle states of the other modules.

Fig. 3. XML based annotation file for the e-book document.

IV. MULTIMEDIA DELIVERY SCHEME

The multimedia manager module resides in the mobile device client and handles the hapto-audio-visual rendering. The rendering schemes are described in the following.

A. Haptic Renderer

Based on the page and paragraph information from the touch-based GUI, the Interaction Controller obtains the haptic description from the Annotation Retriever sub-module and generates the described haptic signal for the targeted haptic device. The Haptic Signal Generator plays an important role in device-specific haptic signal generation; since the target haptic devices can be heterogeneous, we considered device-specific configuration. In our system architecture, we use Bluetooth as the method of communication between our system and the corresponding haptic devices; such communication can also be extended to other personal-area networking methods, depending on device support. In our prototype system, we have used a Bluetooth-enabled haptic jacket [12], a D-Box haptic sofa [11], and a Bluetooth-enabled haptic arm-band [14][15]. From our previous work [12] we have selected three basic emotions {Love, Joy, Fear} based on the haptic devices available for our prototype system. The above emotions are


tagged to the e-book content semi-automatically as described earlier. We have also annotated real-life scenarios such as riding a bike, driving a car, travelling by airplane or boat, sea storm and tide effects, holding hands, tickle, hug, poke, and touch. We define these haptic feedbacks empirically as follows.
• Touch: To mimic touch interaction we incorporate funnelling-illusion-based actuator pattern generation. The patterned vibrations of the set of actuators on the human skin produce the touch sensation.
• Poke: Haptic poke uses patterned vibrations similar to touch; however, the funnelling parameters of the motors are reduced and the vibration strength is higher in this type of haptic simulation.
• Tickle: Through the haptic jacket interface, we have empirically positioned the actuator motors around the belly, armpit, and neck areas and leveraged a set of patterned touch sensations to generate the tickle feedback.
• Holding hands: Using the haptic armband we achieve the holding-hands feedback; in this type of haptic rendering the haptic touch is focused on the armband area.
• Sea storm, travelling in a boat/airplane/car, bike riding: The haptic sofa is responsible for generating these feedbacks. The sea storm feedback is a special type of boat-riding feedback; to generate the boat-riding sensation the D-Box interface emits waves of signals that modulate the actuators accordingly. For the car-travelling feedback we modified the boat-travelling feedback and enhanced it with more frequent vibrations. To simulate the bike-riding experience, the haptic sofa reduces the actuator vibrations but increases the force of the actuators in the process.
• Fear, joy, love: We again capitalize on the haptic touch and haptic poke feedbacks to generate these emotional feedbacks.
For fear, we enable a periodic patterned actuator touch sensation along the backbone. The butterfly joy effect is produced by a series of mild poke and mild touch sensations around the stomach area, whereas the love effect is produced by enhancing the haptic touch around the left-chest area. The actuator motors were placed around these areas beforehand and are driven with varying intensity. After empirically defining the haptic sensations, we use them in the annotation step: the visual image contents and the haptic sensation type, frequency, and rendering type are all defined in the XML annotation file. The Listener module dynamically renders these haptic vibrations on the available haptic devices in the smart home environment.

B. Image Rendering Subsystem

In this section we describe the text-to-speech interaction modalities, the 2D image rendering approach, and the multimedia delivery scheme for the home entertainment system in more detail. The Image Viewer provides augmented visual feedback on the television screen so that a learner can 'see' images that are related to the learning content. The images

are statically defined during the annotation step for each paragraph of a book. A semi-automated process allows the use of a vast image database and makes the images available during the reading process. Seeing an image of a snow storm while reading about one can help users improve their reading and comprehension abilities and practice effective learning behaviour [16]. Additionally, the augmented visual display of the learning material can make the learning process more intuitive, interactive, and entertaining [17]. The home multimedia system is suited to the group listening scenario. When the remote reader interacts with the annotated eBook content using his/her eBook client, the remote listeners can enjoy the content-related information through the smart devices surrounding them, such as the haptic sofa, haptic jacket, haptic arm-band, stereo sound system, and HD television. For example, a father at a remote travel location reads a story about a trip to Venice using his eBook client. The reader's wife and young child are listener users: both are wearing haptic jackets, sitting on a haptic sofa, in front of a Samsung 46" 1080p LCD TV. While reading the book, the remote reader touches an annotated part of the content; a list of images is then processed by the Listener module and queued for display on the TV. At the same time, the wearable and surrounding haptic devices of the listener users start to play the commands sent to them. In our prototype system, haptic signalling and streaming are handled by an intermediate computer, which is connected to the eBook client through a personal communication network. The TTS support of the Listener module makes it possible for listener users to hear the reading content in a computer voice rendered on a computer-connected audio playback device.
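The Listener-side lookup driving this scenario (the annotation file parsed once into memory, then queried with {page, paragraph} keys received from the Reader) can be sketched as follows. The class, field, and key names are hypothetical illustrations under the paper's description, not the prototype's actual code:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of the Listener-side annotation store: entries parsed from the
// XML annotation file are held in a map keyed by "page:para". On
// receiving the keys from the Reader, the Listener fetches the image
// list and haptic pattern to hand to the Multimedia Rendering Manager.
class AnnotationStore {
    static class Entry {
        final List<String> imageUrls;   // images queued for the TV display
        final String hapticDevice;      // e.g. "jacket", "sofa", "armband"
        final String hapticPattern;     // e.g. "Hug", "BoatRide"
        Entry(List<String> imageUrls, String hapticDevice, String hapticPattern) {
            this.imageUrls = imageUrls;
            this.hapticDevice = hapticDevice;
            this.hapticPattern = hapticPattern;
        }
    }

    private final Map<String, Entry> entries = new HashMap<>();

    void put(int page, int para, Entry e) {
        entries.put(page + ":" + para, e);
    }

    // Returns null when the paragraph was never annotated, in which case
    // the Listener simply continues plain text-to-speech playback.
    Entry lookup(int page, int para) {
        return entries.get(page + ":" + para);
    }
}
```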

Fig. 4. HE-Book component architecture

V. IMPLEMENTATION DETAILS

In this section, we present the implementation details of the haptic e-book system; the implemented system components are shown in Figure 4. One of our prototypes was developed for desktop e-book readers using the Netbeans 6.5 IDE, with Java as the primary language. To develop the e-book reader, we locally built the ICEPdf (http://www.icepdf.org) open source Java PDF viewer. We used a Dell touch screen monitor for our system and added the touch processing part to ICEPdf. We also added XML retrieval facilities to ICEPdf


for annotation searching. For serial-port-based Bluetooth communication we used the GPL-licensed BlueCove library (http://www.bluecove.org), which is very handy for J2SE-based Bluetooth communication. Our prototype system communicates with a haptic jacket when any annotated e-book part is touched by the user. A Bluetooth device was connected to the PC's USB port and virtually configured as a COM port so that it could send signals to the haptic jacket. Our desktop-based test prototype was adequately responsive on a standard Pentium dual-core 32-bit machine with 2 GB of system RAM. To annotate an e-book we took PDF as the example format and developed a PDF annotator by modifying ICEPdf, which we also used as the viewer. We added a tagging window to the system offering various tagging options: an operator can select a paragraph of a PDF file and tag it using our prototype editor. For each tag an XML block is created, which the human operator edits for the various haptic attributes. This XML file is then stored to be used later by the ICEPdf viewer while reading the annotated e-book. We used RFCOMM-based Bluetooth Serial Port Profile communication with the end haptic devices, which were the haptic jacket and the haptic armband; the command format was suitable for sending and receiving Bluetooth signals to the haptic devices through the Parani ESD200 Bluetooth kit. For the text-to-speech version of the mobile system we used the java.speech package under the JSR 13 Java Speech API. We achieved remote communication with client-server-based message delivery mechanisms, designing the remote Reader module as a client to the Listener module. For our home multimedia prototype, we combined the two systems described above into one. When a reading event is initiated from the Reader module, a specific command is transmitted to the Listener server system.
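As a concrete illustration of the RFCOMM serial link mentioned above, the sketch below packages a vibration pattern into a small text frame that could be written to a serial-over-Bluetooth stream. The frame layout ("VIB:&lt;pattern&gt;:&lt;repeat&gt;:&lt;delay&gt;") is a made-up example, not the actual Parani ESD200 or haptic-jacket protocol, and the class name is hypothetical:

```java
import java.nio.charset.StandardCharsets;

// Hypothetical command frame for the serial-over-Bluetooth (RFCOMM/SPP)
// link to a haptic device. The real device byte format is not reproduced
// here; this only shows packaging a pattern name, repeat count, and delay
// into a newline-terminated frame that firmware could parse.
class HapticCommand {
    static byte[] frame(String pattern, int repeat, int delayMs) {
        String cmd = "VIB:" + pattern + ":" + repeat + ":" + delayMs + "\n";
        return cmd.getBytes(StandardCharsets.US_ASCII);
    }
    // With BlueCove, such a frame would be written to the OutputStream of a
    // StreamConnection opened via Connector.open("btspp://<address>:1");
    // omitted here because it requires a live Bluetooth stack and device.
}
```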
The corresponding desktop system receives the command and distributes the various media to the related devices. For the home multimedia version, we stream the annotated pictures from the Listener system: the picture data is received at the desktop end and displayed on the Samsung 46" 1080p LCD TV.

VI. CONCLUSION

Human-computer interaction research has gained pace with the advent of haptic devices. Our research goal is to explore the possibilities and opportunities this modality can bring to the remote shared reading experience. In this regard, we introduced an intuitive remote e-book reader system that works through Reader and Listener interaction modules: the remote reader's interaction is captured by the Reader module and the selected reading content is sent to the Listener module, which looks up the annotation of the current reading content and delivers hapto-audio-visual feedback to the listener users.


The current system can meaningfully synchronize all media across parallel devices to form a better orchestra of learning and entertainment. The remote reading system can benefit the learning experience of remote users and improve their vocabulary and language learning capabilities. In future work we want to perform a detailed usability study of the proposed approach to measure the impact of the remote delivery of hapto-audio-visual materials on the knowledge acquisition process.

REFERENCES

[1] M. Fox and J. Horacek, Reading Magic: Why Reading Aloud to Our Children Will Change Their Lives Forever. Mariner Books, 2008.
[2] C. J. Lonigan and G. J. Whitehurst, "Examination of the relative efficacy of parent and teacher involvement in a shared-reading intervention for preschool children from low-income backgrounds," Early Childhood Research Quarterly, vol. 13, no. 2, pp. 263–290, 1998.
[3] A. Hargrave and M. Sénéchal, "A book reading intervention with preschool children who have limited vocabularies: The benefits of regular reading and dialogic reading," Early Childhood Research Quarterly, vol. 15, no. 1, pp. 75–90, 2000.
[4] M. Sénéchal and J. LeFevre, "Long-term consequences of early home literacy experiences," in The Annual Meeting of the Canadian Psychological Conference, Edmonton, Canada, 1998.
[5] H. Janes and H. Kermani, "Caregivers' story reading to young children in family literacy programs: Pleasure or punishment?" Journal of Adolescent & Adult Literacy, vol. 44, no. 5, pp. 458–466, 2001.
[6] A. Mangen, "Hypertext fiction reading: haptics and immersion," Journal of Research in Reading, vol. 31, no. 4, pp. 404–419, Nov. 2008.
[7] B. Schilit, M. Price, G. Golovchinsky, K. Tanaka, and C. Marshall, "The reading appliance revolution," Computer, vol. 32, no. 1, pp. 65–73, Jan. 1999.
[8] D. H. Arnold, C. J. Lonigan, G. J. Whitehurst, and J. N. Epstein, "Accelerating language development through picture-book reading: Replication and extension to a videotape training format," Journal of Educational Psychology, vol. 86, pp. 235–243, 1994.
[9] L. Dunn and L. Dunn, Peabody Picture Vocabulary Test–Revised. Circle Pines, MN: American Guidance Services, 1981.
[10] M. F. Gardner, Expressive One-Word Picture Vocabulary Test–Revised. Novato, CA: Academic Therapy Publications, 1990.
[11] A. S. M. M. Rahman, K. M. Alam, and A. El Saddik, "Augmented HE-Book: A multimedia based extension to support immersive reading experience," in International Conference on Autonomous and Intelligent Systems (AIS), 2011, to be published.
[12] A. S. M. M. Rahman, S. A. Hossain, and A. El Saddik, "Bridging the gap between virtual and real world by bringing an interpersonal haptic communication system in Second Life," in IEEE International Symposium on Multimedia, Taiwan, Nov. 2010.
[13] A. S. M. M. Rahman, J. Cha, and A. El Saddik, "Authoring edutainment content through video annotations and 3D model augmentation," in IEEE International Conference on VECIMS, Hong Kong, China, May 2009, pp. 370–374.
[14] A. S. M. M. Rahman and A. El Saddik, "hKiss: Real world based haptic interaction with virtual 3D avatars," in IEEE International Conference on Multimedia and Expo, HFM2011, Barcelona, Spain, 2011.
[15] A. S. M. M. Rahman, M. Eid, and A. El Saddik, "KissMe: Bringing virtual events to the real world," in Virtual Environments, Human-Computer Interfaces and Measurement Systems (VECIMS), Istanbul, Turkey, July 2008, pp. 102–105.
[16] E. Dale, Audio-Visual Methods in Teaching. Holt, Rinehart and Winston, 1969.
[17] M. Billinghurst, H. Kato, and I. Poupyrev, "The MagicBook: moving seamlessly between reality and virtuality," IEEE Computer Graphics and Applications, vol. 21, no. 3, pp. 6–8, 2001.