Tile Touch: Adaptive Touch Control for Distributed In-Vehicle Information Systems

Sebastian Osswald and Manfred Tscheligi

CDL on Contextual Interfaces, ICT&S Center, University of Salzburg,
Sigmund-Haffner-Gasse 18, 5020 Salzburg, Austria
{sebastian.osswald,manfred.tscheligi}@sbg.ac.at
http://www.icts.sbg.ac.at
Abstract. Touch-based interaction in the car is a promising approach for controlling functions of in-vehicle information systems (IVIS). In this paper, we propose the adaptation of tile menus in combination with two displays for more efficient and easier touch interaction in the car. We developed a first interaction concept for a distributed IVIS consisting of a display on the central console and a touch-sensitive display on the steering wheel. To compare our design to existing touch systems, we will run a user study with three different setups to ensure road safety in terms of a driver's visual demand and distraction.

Keywords: automotive UI, touch interaction, steering wheel, prototyping
1 Introduction
Increasing functionalities of in-vehicle information systems provide more and more information as well as entertainment opportunities while driving. Since the driver's attention is indispensable for the driving task, designing an IVIS inevitably imposes requirements on the interaction modality and the placement of the applied elements. A recent trend for tertiary input devices (e.g. for interaction with entertainment and information devices) is to place them in the available space on the steering wheel. In order to investigate design challenges that arise from multi-touch interaction on the steering wheel, Pfeiffer et al. examined how to deal with input and output on the steering wheel while driving [3]. Building on the proven general utility of this approach, Döring et al. identified a user-defined gesture set for interacting with a multi-touch steering wheel and compared its application to conventional user interaction with an IVIS [1]. Besides an overall reduction of participant distraction, a further result of the user study was that the majority of the participants did not like controlling the navigation system with a graphical user interface (UI) on the steering wheel. So what is it that deters drivers from using a touch screen UI on the steering wheel? While touch screens are less accurate for entering tokens than a multifunctional controller (e.g. the BMW iDrive), accuracy is not the primary
source of the high error rate. In a previous study, we identified the low position of the steering wheel display as the main reason for driver distraction: when looking down to select a UI element on the steering wheel screen, the driver starts having difficulties handling the driving task and IVIS control at the same time. Unfortunately, the low viewing angle diminishes the peripheral view and prevents drivers from perceiving driving-related events around them. To address this disadvantage, we propose Tile Touch. In addition to an interface concept that can be adapted to the driver's needs, Tile Touch uses a distributed IVIS (see Fig. 1a). The visual output-only display in the central console does not affect the driver's peripheral, road-oriented view, while the touch-sensitive display on the steering wheel is used to control the IVIS via tiles (see Fig. 1b) for an almost blind interaction; an illustrative sketch of such a tile mapping is given after Fig. 1.

(a) Central console display and touch surface on the steering wheel
(b) Tile Touch interaction: prototype UI design
Fig. 1: Prototype for a distributed in-vehicle information system: using the steering wheel and the central console as interaction space
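To make the tile mapping concrete, the following minimal Java sketch resolves touch positions on the steering wheel surface to the four tiles mirrored on the central console. The class name, the quadrant layout and the tile set are illustrative assumptions and not the prototype's actual implementation.

// Hypothetical sketch: mapping touch positions on the steering wheel
// surface to the four tiles shown on the central console display.
public class TileTouchController {

    /** The four tiles mirrored on both displays (assumed set). */
    public enum Tile { NAVIGATION, MUSIC, EMAIL, BROWSER }

    /** Minimal stand-in for the output-only central console renderer. */
    public interface ConsoleDisplay {
        void highlightTile(Tile tile);
    }

    /**
     * Resolves a touch event to one of the four tiles. Coordinates are
     * normalized to [0,1] relative to the steering wheel touch surface.
     */
    public Tile resolveTile(float x, float y) {
        boolean right = x >= 0.5f;
        boolean bottom = y >= 0.5f;
        if (!right && !bottom) return Tile.NAVIGATION; // top-left
        if (right && !bottom) return Tile.MUSIC;       // top-right
        if (!right && bottom) return Tile.EMAIL;       // bottom-left
        return Tile.BROWSER;                           // bottom-right
    }

    /** Forwards the selected tile to the central console display. */
    public void onTouch(float x, float y, ConsoleDisplay console) {
        console.highlightTile(resolveTile(x, y));
    }
}

Since the same four tiles are shown on both displays, the driver only needs to remember the quadrant of the desired tile, which is what enables the almost blind interaction described above.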
2 Tile Touch
Figures 2 and 3 show scenarios of the Tile Touch technique using both displays of the distributed system (a: steering wheel, b: central console). The tiles are input fields that react to driver-initiated touch events in relation to the content displayed on the central console. The four visible tiles on the steering wheel screen are equivalent to the four tiles displayed on the central console, which is why we assume that the driver's visual demand is significantly decreased. In the following, scenario 1 gives an example of almost blind interaction with the IVIS, allowing the driver to keep the eyes on the road, while scenario 2 shows the adaptivity of the Tile Touch interaction.

Scenario 1 (Fig. 2): the driver touches the screen intending to zoom out of the navigation map. By performing a common touch gesture for zooming (bringing thumb and index finger together) on the steering wheel screen, the map on the central console screen is downscaled; an illustrative sketch of this gesture handling is given after Fig. 2. In order to reduce distraction while driving, Tile Touch presents all visual output on the screen in the peripheral view of the driver.
(a) Zoom gesture on the steering wheel screen
(b) Zoomed navigation map in central console (downscaled)
Fig. 2: Scenario 1: map navigation
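The following sketch illustrates how the pinch gesture of Scenario 1 could be handled on an Android-based steering wheel screen using the platform's ScaleGestureDetector, forwarding the resulting scale to the central console. The MapRenderer interface and the wiring between the two displays are assumptions for illustration, not the prototype's implementation.

// Assumed Android view running on the steering wheel screen.
import android.content.Context;
import android.view.MotionEvent;
import android.view.ScaleGestureDetector;
import android.view.View;

public class SteeringWheelZoomView extends View {

    /** Stand-in for whatever renders the map on the central console. */
    public interface MapRenderer {
        void setScale(float scale);
    }

    private final MapRenderer consoleMap;
    private final ScaleGestureDetector scaleDetector;
    private float mapScale = 1.0f;

    public SteeringWheelZoomView(Context context, MapRenderer consoleMap) {
        super(context);
        this.consoleMap = consoleMap;
        // Bringing thumb and index finger together yields scale factors < 1,
        // so the map on the central console is downscaled (zoomed out).
        this.scaleDetector = new ScaleGestureDetector(context,
                new ScaleGestureDetector.SimpleOnScaleGestureListener() {
                    @Override
                    public boolean onScale(ScaleGestureDetector detector) {
                        mapScale *= detector.getScaleFactor();
                        consoleMap.setScale(mapScale);
                        return true;
                    }
                });
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // All visual feedback appears on the central console;
        // this view only consumes the touch input.
        return scaleDetector.onTouchEvent(event);
    }
}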
Scenario 2 (Fig. 3): when input modalities are needed that require increased visual demand from the driver, Tile Touch works differently. The driver is allowed to drag UI elements (e.g. keyboard, map) from the central console screen onto the steering wheel screen for interaction purposes. This function is intended to make high-demand interaction (similar to that of smartphones) available to the driver when the car is not moving; a sketch of such a rule is given after Fig. 3.

(a) Keyboard dragged onto the steering wheel screen
(b) Task overview in central console (Email context)
Fig. 3: Scenario 2: draggable keyboard for token input
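A minimal sketch of the adaptivity rule behind Scenario 2 is given below: high-demand elements such as the keyboard are only offered for dragging onto the steering wheel screen while the vehicle is stationary. The class, the speed source and the threshold are hypothetical and serve only to illustrate the rule.

// Hypothetical policy gating high-demand interaction on vehicle motion.
public class DragInPolicy {

    /** Speeds below this value are treated as standstill (assumed). */
    private static final float STANDSTILL_THRESHOLD_KMH = 1.0f;

    /** UI elements that can be moved between the two displays. */
    public enum Element { KEYBOARD, MAP }

    private float vehicleSpeedKmh;

    /** Updated from the vehicle bus or, in the study, from the simulator. */
    public void onSpeedChanged(float speedKmh) {
        this.vehicleSpeedKmh = speedKmh;
    }

    /** In this sketch, all draggable elements follow the same rule. */
    public boolean mayDragToSteeringWheel(Element element) {
        return vehicleSpeedKmh < STANDSTILL_THRESHOLD_KMH;
    }
}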
3 User Study
For the user study, we chose three different setups, as presented in Fig. 4 (a-c), to verify the assumption that visual demand and distraction are lowered. All systems used in the setups provide comparable input and output functionalities based on a capacitive touch screen; only in setup (a) is an additional display used for visual output. A comparison with existing steering wheel buttons was not taken into account, as only a minority of the functionalities can be controlled by standard interaction elements on the steering wheel (e.g. state-of-the-art steering wheel concepts offer no controls for map navigation or browsing). Regarding the prototype UI, an interface based on the Android
operating system (http://developer.android.com) is used. The proposed prototype provides applications for navigation, music playback, e-mail and browsing. For the driving setup, we use a fixed-base driving simulator consisting of a driving seat, a steering wheel, a capacitive touch screen, pedals and a 50" monitor.

(a) distributed IVIS
(b) touch steering wheel
(c) central console
Fig. 4: Proposed user study setups (a-c)

To analyze the driver's gaze behavior and to support the comparison of results, we will add an eye tracker (Dikablis, http://www.ergoneers.com). We will measure driver distraction with the standardized Lane Change Test (LCT), an assessment methodology that is easy to implement and quick to conduct [2]. The main task of the LCT is to drive and change lanes on a simulated straight three-lane road with a track length of three kilometers at a constant speed of 60 km/h. Frequently appearing signs (18 in total) mark the correct lane to use. The subjects are instructed to change lanes as soon as they can recognize the designated sign. Simultaneously, they perform IVIS-related tasks on the steering wheel. Following the standardized LCT setup requirements, the driving performance under dual-task conditions (driving and interaction) is calculated against a normative model of primary task performance to measure distraction; an illustrative computation of this deviation metric is sketched below. Due to the necessary changes in setup, a between-subjects design is employed, with each subject performing a set of predefined tasks in randomized order in one setup.
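As an illustration of the LCT distraction measure, the sketch below computes the mean deviation between the driven lateral position and a normative lane change path sampled along the track. It is a simplified stand-in under our own assumptions, not the standardized LCT analysis software.

// Illustrative computation of an LCT-style deviation metric (assumed sampling).
public class LaneChangeTestMetric {

    /**
     * @param normativeLateral lateral positions (m) of the normative model,
     *                         sampled at fixed distances along the 3 km track
     * @param drivenLateral    lateral positions (m) actually driven,
     *                         sampled at the same distances
     * @return mean absolute deviation in metres (higher = more distraction)
     */
    public static double meanDeviation(double[] normativeLateral,
                                       double[] drivenLateral) {
        if (normativeLateral.length != drivenLateral.length
                || normativeLateral.length == 0) {
            throw new IllegalArgumentException("samples must be aligned");
        }
        double sum = 0.0;
        for (int i = 0; i < normativeLateral.length; i++) {
            sum += Math.abs(drivenLateral[i] - normativeLateral[i]);
        }
        return sum / normativeLateral.length;
    }
}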
Acknowledgments

The financial support by the Federal Ministry of Economy, Family and Youth and the National Foundation for Research, Technology and Development is gratefully acknowledged (Christian Doppler Laboratory for Contextual Interfaces).
References

1. Döring, T., Kern, D., Marshall, P., Pfeiffer, M., Schöning, J., Gruhn, V., Schmidt, A.: Gestural interaction on the steering wheel: reducing the visual demand. In: Proc. of Human Factors in Computing Systems 2011, pp. 483-492. CHI '11, ACM (2011)
2. Mattes, S.: The lane change task as a tool for driver distraction evaluation. In: Strasser, H., Rausch, H., Bubb, H. (eds.) Quality of Work and Products in Enterprises of the Future, pp. 57-60. Ergonomia Verlag, Stuttgart (2003)
3. Pfeiffer, M., Kern, D., Schöning, J., Döring, T., Krüger, A., Schmidt, A.: A multi-touch enabled steering wheel: exploring the design space. In: Proc. of Human Factors in Computing Systems 2010, pp. 3355-3360. CHI '10, ACM (2010)