ISWC '14 ADJUNCT, SEPTEMBER 13 - 17, 2014, SEATTLE, WA, USA
Designing New Input Modalities for Wearables & Digitized Home
Sang Ho Yoon
C-Design Lab, Purdue University
West Lafayette, IN 47907
[email protected]
Abstract
Wearable computers and digitized home systems share a common phenomenon: their computing capabilities have advanced rapidly, yet they still rely on conventional input methods. Wearable computers use touchscreens or buttons, while in-home digital objects incorporate switches or remote controls. Although vision-based input methods have been proposed, they cannot replace these practices because of occlusion and space constraints. We propose new input modalities that reduce the physical and cognitive cost of interacting with these upcoming technologies and improve social acceptability, lowering the barrier to adopting the proposed approach.
Author Keywords
Piezoresistive; Mobile Interaction; Wearable Input Device; Unobtrusive Interaction; Eyes-free Interaction; Private Interaction.
ACM Classification Keywords
H.5.2 [Information interfaces and presentation]: User Interfaces - Input devices and strategies; User-centered design.
General Terms
Design; Human Factors; Experimentation.
Introduction
With the growth of computing power, wearable computers are now commercially available [1]. However, the utilization of this technology through different input modalities is still under development. Previous studies show that introducing new input modalities can reduce physical and cognitive load, improve social acceptability, and preserve privacy during mobile interaction [3, 9]. Moreover, various home-automation products have appeared on the market that provide prompt interaction with smartphones over wireless communication [2]. To lower the barrier to end-user interaction with wearables and the digitized home, introducing new input devices that exploit diverse sensing capabilities is the likely next step.

Previous research has proposed always-available input methods that exploit the human body for eyes-free, private interaction with low cognitive load [4]. Among the various regions of the body, the finger has been of particular interest because it supports subtle and accurate interaction. Magnetic field tracking, optical sensing, and physical buttons have been applied to finger-worn devices to enable new interaction techniques. For example, magnetic field sensing tracks the fingertip in 3D space and supports subtle spatial interaction [5], and a wrist-worn infrared camera can capture full finger motions [7]. These approaches, however, suffer from motion artifacts and obtrusiveness. With this in mind, there is still room for new input modalities with better performance and an unobtrusive appearance.

To generate rich inputs from the user, prior work has merged multiple sensors in a single input device [10].
The recent development of vision and magnetic tracking sensors, however, has led to input devices that contain only a single sensing element. Although such devices perform well, the lack of multiple sensing elements makes the resulting interactions sparse and less intuitive. For instance, magnetic tracking used as the only input resource struggles to provide enough functions for everyday mobile applications, and vision-based input methods contain no sensor for contact properties such as applied pressure. New techniques for merging sensors are needed in order to design a user-friendly input device with rich interactions.

We propose a new design approach for interacting with newly developed fields including wearables and the digitized home. With the aid of printing-based fabrication, the new design can be unobtrusive and expressive while keeping physical and cognitive cost low. In cooperation with wearable computers and the digitized home, whole new application areas will open up. The proposed input modalities could improve the end-user experience as wearable computers and the digitized home become prevalent, much as the smartphone triggered the spread of ubiquitous computing.
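To make the argument for multiple sensing elements concrete, the sketch below (not from the described system; all names and thresholds are illustrative assumptions) shows how combining a pressure channel with a bending channel yields a larger input vocabulary than either channel alone.

# Illustrative sketch: combining two sensing channels (pressure, bending)
# yields a richer input vocabulary than a single channel alone.
# All names and thresholds are hypothetical, not from the described system.

PRESS_THRESHOLD = 0.5   # normalized pressure above which we call it a "press"
BEND_THRESHOLD = 0.5    # normalized bend above which we call the finger "bent"

def classify_input(pressure, bending):
    """Map normalized (0..1) pressure and bending readings to an input event."""
    pressed = pressure > PRESS_THRESHOLD
    bent = bending > BEND_THRESHOLD
    if pressed and bent:
        return "select"   # e.g., confirm / click
    if pressed:
        return "tap"      # light touch on a straight finger
    if bent:
        return "scroll"   # flexing without touching
    return "idle"

# With one channel there are only two distinguishable states;
# with two channels the same hardware exposes four.
print(classify_input(0.8, 0.7))   # -> "select"
print(classify_input(0.8, 0.1))   # -> "tap"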
Expected Contribution
The expected contributions of the current research are the following:
• Proposing new input modalities based on the human's natural tactile feedback and proprioception.
• Lowering the barrier to interacting with upcoming technologies, including wearable computers and digitized home systems.
• Exploring end-user feedback by evaluating the social acceptability and physical/cognitive cost of different input modalities.
Completed Research
Figure 1: Piezoresistive sensor with matrix design topology
Figure 2: Data glove with a simultaneous bending and pressure sensing capability
Figure 3: Single layer textile sensor for proposed input device
We first focused on finding suitable sensors for the new input modalities. Among various materials, we chose piezoresistive materials for sensor fabrication, mainly because they can directly represent physical properties of the human body. With help from the DIY community, we fabricated simple bending and pressure sensors using 3M's Velostat. By adding a matrix design topology, we implemented a sensor that measures pressure and bending simultaneously (Figure 1), and by encapsulating this prototype we developed a data glove with both pressure and bending sensing capabilities (Figure 2). However, the materials had to be changed, since the metallic wire and Velostat limited our design space.

We then continued developing piezoresistive sensors. Carbon elastomers have been widely used to fabricate pressure and strain sensors [8], but previous studies mostly implement only a single sensing element per system because of possible physical crosstalk between elements. We instead approach the new input modality with multiple sensing elements, exploiting natural, human-oriented actions such as bending and pressing. With a conductive elastomer and conductive thread, we implemented a single-layer textile sensor designed to separate bending and pressure sensing when worn on the fingers. Figure 3 shows this sensor, which was fabricated with a manual stencil-painting process.

As mentioned above, the proposed textile sensors enable a new input modality when used with the fingers. The aim of the proposed system is to interact with mobile or wearable devices while supporting eyes-free and private interaction.
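As a rough illustration of how a matrix-topology piezoresistive sensor can report pressure and bending at the same time, the sketch below scans rows and columns through a hypothetical ADC interface. The helper functions, grid size, and the simple bend/press decomposition are assumptions for illustration, not the actual firmware of the described prototypes.

# Illustrative scan of a matrix-topology piezoresistive sensor.
# select_row() and read_column_adc() are hypothetical hardware helpers
# (e.g., driving one row and reading each column through an ADC).

ROWS, COLS = 4, 4

def select_row(r):
    pass  # placeholder: drive row r, leave other rows high-impedance

def read_column_adc(c):
    return 0.0  # placeholder: normalized 0..1 conductance at column c

def scan_matrix():
    """Return a ROWS x COLS grid of normalized piezoresistive readings."""
    frame = []
    for r in range(ROWS):
        select_row(r)
        frame.append([read_column_adc(c) for c in range(COLS)])
    return frame

def decompose(frame):
    """Very rough decomposition: a broad, uniform rise across the whole
    matrix is treated as bending, while a localized peak above that
    background is treated as pressure."""
    cells = [v for row in frame for v in row]
    background = sum(cells) / len(cells)      # bend estimate
    pressure = max(0.0, max(cells) - background)   # press estimate
    return background, pressure

bend, press = decompose(scan_matrix())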
We implemented several applications using this finger-worn device (Figure 4). For example, we successfully controlled a smartphone's music player during daily activities such as walking, running, and even driving. Another example was private control of smart eyewear using the proposed system. We analyzed the perceived workload of the proposed system using the NASA Task Load Index (NASA-TLX) [6].
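For reference, the overall NASA-TLX workload score is the weighted average of six subscale ratings, with weights obtained from 15 pairwise comparisons [6]. The sketch below computes this score with purely illustrative numbers, not data from our study.

# NASA-TLX overall workload: weighted average of six subscale ratings,
# where each weight counts how often that subscale "won" one of the
# 15 pairwise comparisons. The ratings and weights below are illustrative only.

ratings = {            # 0..100 rating per subscale
    "mental": 40, "physical": 25, "temporal": 30,
    "performance": 20, "effort": 35, "frustration": 15,
}
weights = {            # tallies from 15 pairwise comparisons (sum = 15)
    "mental": 4, "physical": 2, "temporal": 3,
    "performance": 2, "effort": 3, "frustration": 1,
}

overall = sum(ratings[s] * weights[s] for s in ratings) / 15.0
print(f"Overall workload: {overall:.1f}")   # -> 30.7 with these numbers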
Figure 4: Proposed finger-worn input device
Work In Progress and Future Plans
We are currently improving the hardware and conducting various user studies. We plan to develop a platform for connecting the input device with home objects, since input devices for home objects will become crucial; developing an input modality that works well for both mobile use and the digitized home is therefore our next step. We are also improving the fabrication process, because the current setup relies on manual operations; adopting ink-jet printing technology should benefit both the quality and the speed of fabrication. Our future plans are summarized in the following items:
• Improve the resolution and durability of the proposed system by utilizing printing technology.
• Analyze the eyes-free and private aspects of the new input modality with both quantitative and qualitative measures.
• Conduct an extensive user study of the finger-worn input device to analyze its impact on social acceptability as well as cognitive/physical cost.
• Develop an input platform for the digitized home system (e.g., controlling home appliances); a minimal sketch of such a platform follows below.
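The following sketch is only an assumption about how such a platform might look: finger-worn input events are routed to home-appliance commands through a small dispatch table. The event names, Appliance interface, and transport are all hypothetical, not part of the described system.

# Hypothetical input-platform sketch: route finger-worn input events
# to home appliances. Event names and the Appliance class are
# illustrative assumptions only.

class Appliance:
    def __init__(self, name):
        self.name = name
    def send(self, command):
        # placeholder for the real transport (e.g., Wi-Fi or BLE)
        print(f"{self.name} <- {command}")

lamp = Appliance("living-room lamp")
thermostat = Appliance("thermostat")

# Map (gesture, context) pairs from the finger-worn device to commands.
dispatch = {
    ("tap", "lamp"): lambda: lamp.send("toggle"),
    ("scroll", "thermostat"): lambda: thermostat.send("temperature +1"),
    ("select", "thermostat"): lambda: thermostat.send("confirm"),
}

def handle_event(gesture, context):
    action = dispatch.get((gesture, context))
    if action:
        action()

handle_event("tap", "lamp")            # living-room lamp <- toggle
handle_event("scroll", "thermostat")   # thermostat <- temperature +1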
Objective for attending the Doctoral School
By presenting at the Doctoral School, I want to get feedback from professional researchers working in the same field, and I expect this to sharpen the context and direction of my research. I also hope to foresee possible problems in my research by discussing it with the other attendants. The Doctoral School will thus enrich my insight into the current research.
Brief biographical sketch
Sang Ho Yoon received his BA and MS degrees in mechanical engineering from Carnegie Mellon University. After five years of industrial experience, he started a PhD program in mechanical engineering at Purdue University. He has broad research interests in wearable computers, tangible interaction, and embedded systems, and is especially interested in proposing user-friendly input modalities for a digitized world comprising wearables and smart home systems. He entered the program in August 2013 and tentatively plans to graduate in 2017.

Dr. Karthik Ramani is the Donald W. Feddersen Professor in the School of Mechanical Engineering at Purdue University and the Director of the C-Design Lab. His research lies at the crossroads of mechanical engineering and computer science, driven by geometry- and design-inspired areas. His core areas are machine learning, human-computer natural interaction, computational geometry/modeling, and design.
References
[1] Google Glass. http://www.google.com/glass.
[2] WeMo home automation. http://www.belkin.com.
[3] Ashbrook, D. L. Enabling mobile microinteractions. PhD thesis, Georgia Institute of Technology, 2010.
[4] Chan, L., Liang, R.-H., Tsai, M.-C., Cheng, K.-Y., Su, C.-H., Chen, M. Y., Cheng, W.-H., and Chen, B.-Y. FingerPad: private and subtle interaction using fingertips. In Proceedings of the 26th annual ACM symposium on User interface software and technology, ACM (2013), 255–260.
[5] Chen, K.-Y., Lyons, K., White, S., and Patel, S. uTrack: 3D input using two magnetic sensors. In Proceedings of the 26th annual ACM symposium on User interface software and technology, ACM (2013), 237–244.
[6] Hart, S. G., and Staveland, L. E. Development of NASA-TLX (Task Load Index): Results of empirical and theoretical research. Advances in Psychology 52 (1988), 139–183.
[7] Kim, D., Hilliges, O., Izadi, S., Butler, A. D., Chen, J., Oikonomidis, I., and Olivier, P. Digits: freehand 3D interactions anywhere using a wrist-worn gloveless sensor. In Proceedings of the 25th annual ACM symposium on User interface software and technology, ACM (2012), 167–176.
[8] Lorussi, F., Rocchia, W., Scilingo, E. P., Tognetti, A., and De Rossi, D. Wearable, redundant fabric-based sensor arrays for reconstruction of body segment posture. IEEE Sensors Journal 4, 6 (2004), 807–818.
[9] Profita, H. P., Clawson, J., Gilliland, S., Zeagler, C., Starner, T., Budd, J., and Do, E. Y.-L. Don't mind me touching my wrist: a case study of interacting with on-body technology in public. In Proceedings of the 17th annual International Symposium on Wearable Computers, ACM (2013), 89–96.
[10] Tsukada, K., and Yasumura, M. Ubi-Finger: Gesture input device for mobile use. In Proceedings of APCHI, vol. 1 (2002), 388–400.