Developing context-aware and personalised multimodal applications in the MobiLife EU project

Péter Pál Boda
Nokia Research Center, Helsinki, Finland
Tel.: +358-71800-8000
[email protected]

Ralf Kernchen, Klaus Moessner
The University of Surrey, Guildford, Surrey, UK
Tel.: +44-1483-300-800
[email protected], [email protected]

Michael Sutterer, Olaf Droegehorn
University of Kassel, Kassel, Germany
Tel.: +49-561-804-6287
[email protected]

Giovanni Giuliani
HP Italiana srl, Cernusco sul Naviglio (Milano), Italy
Tel.: +39 02 9212-1
[email protected]

Fabiola López Dávalos, Jan Storgårds
Suunto Oy, Vantaa, Finland
Tel.: +358 9 875 870
[email protected], [email protected]

ABSTRACT
The paper introduces three multimodal, context-aware mobile demonstrator applications designed and developed within the scope of the EU IST-511607 project MobiLife. The three family-oriented applications, Mobile Multimedia Infotainer, Wellness-Aware Multimodal Gaming System and FamilyMap, provide advanced multimodal user interaction supported by context-aware functionalities such as personalisation and profiling. The paper briefly explains the underlying architectural solutions and how the development work fits into the User-Centered Design process. The ultimate intention is to enhance the acceptance and usability of current mobile applications with beyond-state-of-the-art user interaction capabilities, by researching how contextual information can affect the functionality of the multimodal user interface and how to provide users with a seamless, habitable and non-intrusive interaction experience.
Categories and Subject Descriptors H.5.2 [Information Interfaces and Presentation (HCI)]: User Interfaces – prototyping, user-centered design, user interface management systems
General Terms
Design, Experimentation, Performance, Verification.
Keywords Context-awareness, Multimodal Interface, Mobile Applications, Personalisation
1. INTRODUCTION
MobiLife (IST-511607) [1] is an EU project, implemented as an Integrated Project within the Information Society Technologies (IST) Sixth Framework Programme (FP6) and as part of the Wireless World Initiative (WWI). The ultimate goal of the project is to advance the development, implementation and demonstration of novel mobile applications and services, to bring them within the reach of users, and to gain a better understanding of how these applications are accepted by end-users, how business models are built around them and how
operators benefit from the resulting value chain.
Work Package 2 of the project focuses on the development and demonstration of mobile applications that benefit from the synergetic use of contextual information, personalisation and the ease of use offered by advanced multimodal user interface technology. The main target group is families: it is MobiLife's interest to investigate how and in what form applications can be developed, provisioned and provided to families to support their everyday life.
The work presented here is part of a larger User-Centered Design process. Potential users are involved, via interviews, mock-up and probe tests, in the entire design and implementation process; at every stage their opinions, views and desires are taken into account in the step-wise, iterative implementation. This paper presents the three mock-up implementations of Work Package 2. The focus is on personalisation and on how contextual information gathered from the environment, from device capabilities and/or from learning processes can be utilised to enhance the ultimate user experience [2].
2. THE UNDERLYING ARCHITECTURE
In mobile situations, applications must be able to offer the user all available and sensibly accessible modalities in order to significantly enhance the user experience. These modalities may reside on a single device or be distributed over multiple devices. The architecture developed in MobiLife enables the integration of multiple input modalities that control the invoked application(s), and it orchestrates the delivery of application-specific media output to the selected or preferred devices. Figure 1 below describes the MobiLife architecture for user interface adaptation and device I/O utilisation at the functional level. The multimodal user interface consists of the following devices: a portal device carried by the user (mobile phone or PDA) and, possibly, several other wirelessly connected user interface devices (e.g. displays, speakers, cameras) providing additional modality input and output channels. The approach is based on the work described in [3].
Figure 1: Physical user interface mapping to system modules

On the system modules side, the functionality is assigned mainly to the Device and Modality Function (DeaMon), residing partly on the device, and to the Device and Gateway Function (DeGan), residing on the portal device (not shown).
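The paper describes DeaMon and DeGan only at the functional level; the short sketch below is an illustrative, hypothetical stand-in (not the MobiLife interfaces) for the core idea of a gateway that tracks wirelessly connected I/O devices and routes application output to a preferred channel.

# Illustrative sketch only: class and method names are hypothetical and are
# not the MobiLife DeaMon/DeGan APIs. It shows a gateway-style registry that
# tracks available I/O devices and picks an output device for a modality.

class DeviceRegistry:
    """Toy stand-in for the gateway's view of available I/O devices."""

    def __init__(self):
        self._devices = {}          # device name -> set of supported modalities

    def device_appeared(self, name, modalities):
        self._devices[name] = set(modalities)

    def device_disappeared(self, name):
        self._devices.pop(name, None)

    def route_output(self, modality, preferred_order):
        """Return the first preferred device that supports the modality."""
        for name in preferred_order:
            if modality in self._devices.get(name, set()):
                return name
        return None

registry = DeviceRegistry()
registry.device_appeared("phone", ["screen", "audio"])
registry.device_appeared("wall display", ["screen"])
print(registry.route_output("screen", ["wall display", "phone"]))  # -> "wall display"

A design point implied by the architecture is that the registry must be kept current as devices appear and disappear around the user, so that routing decisions always reflect the momentary environment.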
3. THE APPLICATIONS

3.1 MobiLife Multimedia Infotainer
The Infotainer is an advanced multi-functional application for mobile devices with the following capabilities:
o playing and downloading music and videos from multimedia resources (files, streams);
o providing user-profiled information from configurable resources (RSS, ATOM or web portals);
o letting the user communicate with other people via multiple channels (internet messengers, short messaging, etc.).
The system brings a new look to accessing multimedia devices. Its integration with the DeaMon platform allows applications to perform complementary presentation on available devices. The Infotainer always supplies the user with the interface that best fits the current environment and situation. This is made possible by DeGan, which constantly detects nearby devices and switches communication to the most efficient input/output channel. Additionally, learning capabilities update data about the user's preferences [2]. The following contextual aspects are important: in the car, the user receives personalised news items, e.g. traffic-related information, as audio messages; at lunch time, preferred politics and society news are presented on the user's portal device; back at home, news, films, music, etc. are presented on the home entertainment system, on whatever equipment is available in the current location within the apartment.
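As a minimal sketch of this kind of context-driven channel selection, and assuming hypothetical context labels and device names (the actual Infotainer rules are not specified in the paper), the car/lunch/home scenarios could be expressed as simple precedence rules:

# Illustrative sketch (not MobiLife code): rule-based selection of an output
# device and modality from the user's current context, in the spirit of the
# Infotainer scenarios above. All names and rules are hypothetical.

from dataclasses import dataclass

@dataclass
class Device:
    name: str
    modalities: set          # e.g. {"audio"} or {"screen", "audio"}

def select_output(context, available):
    """Pick (device, modality) for delivering a news item, or None."""
    rules = {
        "car":   ("audio",  ["car speakers", "phone"]),
        "lunch": ("screen", ["phone"]),
        "home":  ("screen", ["home entertainment system", "phone"]),
    }
    modality, preferred = rules.get(context, ("screen", ["phone"]))
    for name in preferred:
        for dev in available:
            if dev.name == name and modality in dev.modalities:
                return dev, modality
    return None

devices = [Device("phone", {"screen", "audio"}),
           Device("car speakers", {"audio"})]
print(select_output("car", devices))   # -> car speakers, audio

In the actual system such rules would be learned and personalised rather than hard-coded, as indicated by the learning capabilities mentioned above.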
3.2 Wellness-Aware Multimodal Gaming System
This application integrates fitness-monitoring features and game-racing capabilities in one. It supports family collaboration by sharing wellness and environment/location information during or after training, and adds playful gaming aspects. The main functionalities are listed below; a small sketch of the training-advice logic follows the list.
o Racing capabilities: while training, users can virtually race in real time against a member of their family, friends or a professional athlete, or even against themselves (vs. a previous training log);
o Fitness monitoring: while training, the application can retrieve sensory information such as vertical and horizontal speed, location, altitude, temperature, heart rate and oxygen consumption. Based on previously recorded data and profiles, the application can give advice on the training via a preferred modality, e.g. a voice message saying "training level is low, increase the pace";
o Sharing wellness and training profiles: training logs and the wellness profile can be sent to a Family content server for later download, to analyse or compare family wellness.
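The following is a minimal sketch of the kind of rule that could drive such advice, assuming hypothetical thresholds and function names; the paper does not specify the actual decision logic, and the resulting message would be rendered on whichever modality the user prefers (voice, text, tactile).

# Illustrative sketch (hypothetical thresholds and names): derive a short
# training-advice message from sensor readings and profile targets.

def training_advice(heart_rate, pace_min_per_km, target_hr, target_pace):
    """Return an advice string, or None if the training is on target."""
    if heart_rate < 0.9 * target_hr and pace_min_per_km > target_pace:
        return "Training level is low, increase the pace."
    if heart_rate > 1.1 * target_hr:
        return "Heart rate is high, slow down."
    return None

# Example: profile targets of 150 bpm and 6.0 min/km.
print(training_advice(heart_rate=120, pace_min_per_km=7.2,
                      target_hr=150, target_pace=6.0))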
3.3 FamilyMap
FamilyMap is an application that provides geographic and baby-care related information for families with babies, especially when they are on the move in unfamiliar cities. It helps families solve practical problems and find "logistic" locations: places where a parent can feed a baby (e.g. where a microwave is available), change a diaper, spend some relaxed time with the baby, or walk easily with the baby carriage, i.e. along paths with no stairs or ongoing construction. To promote social networking, parents can leave and receive virtual post-its at hotspots about preferred places, e.g. cafeterias. The mock-up is implemented using the following devices: a Nokia 9500 Communicator with a wireless GPS module and Wayfinder's navigation software; an iPaq PDA to show system messages and contextual information, as well as to give audible guidance; various sensors from Suunto Oy to detect speed, distance, temperature, air pressure, altitude, etc.; and a vibrator engine attached to the bar of the carriage to provide tactile feedback as a system warning.
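As a sketch of the "logistic location" search idea, and under the assumption of a simple, hypothetical place model (the paper does not describe FamilyMap's data model), nearby places could be filtered by the baby-care amenities a family needs:

# Illustrative sketch (hypothetical data model, not FamilyMap code): filter
# nearby places by required amenities. Distances use a flat-earth
# approximation, which is adequate for city-scale ranges.

import math

def nearby_places(places, lat, lon, radius_m, required_amenities):
    """Return places within radius_m metres offering all required amenities."""
    results = []
    for p in places:
        dlat = (p["lat"] - lat) * 111_000                       # metres per degree latitude
        dlon = (p["lon"] - lon) * 111_000 * math.cos(math.radians(lat))
        if math.hypot(dlat, dlon) <= radius_m and required_amenities <= p["amenities"]:
            results.append(p)
    return results

places = [
    {"name": "Cafe Aino", "lat": 60.170, "lon": 24.938,
     "amenities": {"microwave", "changing table", "step-free access"}},
    {"name": "Old Bookshop", "lat": 60.168, "lon": 24.941, "amenities": set()},
]
print(nearby_places(places, 60.169, 24.939, radius_m=500,
                    required_amenities={"microwave", "changing table"}))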
4. CONCLUSION
Three applications being developed in the MobiLife EU project have been introduced. The demonstrations illustrate the combination of contextual information and multimodal user interfaces, and show how the end-user can benefit from the synergetic and synchronised use of these resources. The scope is the individual and family sphere of MobiLife. Future work will focus on using the mock-up evaluation results to fine-tune the applications, as well as on increasing the multimodal interaction capabilities and the contextual learning of user behaviour patterns.
5. ACKNOWLEDGMENTS
This work has been performed in the framework of the IST project IST-2004-511607 MobiLife, which is partly funded by the European Union. The help and contribution of our colleagues in Work Package 2 of MobiLife are greatly appreciated.
6. REFERENCES
[1] MobiLife, 2005. https://www.ist-mobilife.org
[2] Kernchen, R., Moessner, K., Tafazolli, R. Adaptivity for Multimodal User Interfaces in Mobile Situations. 7th International Symposium on Autonomous Decentralized Systems (ISADS 2005), Chengdu, China, April 4-8, 2005.
[3] Kernchen, R., Boda, P., Mößner, K., Mrohs, B., Boussard, M., Giuliani, G. Multimodal user interfaces for context-aware mobile applications. Submitted to PIMRC 2005.