Gaze-Based Drawing Assistant
Mon-Chu Chen, University of Madeira, Funchal, PORTUGAL
Yi-Ching Huang, National Taiwan University, Taipei, TAIWAN
Kuan-Ying Wu, National Chiao-Tung University, Hsinchu, TAIWAN
Figure 1. Left: The system setup consists of an eye tracker and a pico projector. Middle: Gaze patterns are projected onto the canvas in green. Right: Comparison of sketches drawn without gaze guidance (top row) and with gaze guidance (bottom row).
1. Introduction

Recent work shows that digital tools can effectively help users create drawings that correspond more accurately to reference images. iCanDraw uses face recognition to understand a given face image, compares it with a sketch analysis of the user's drawing, and then provides corrective feedback [Dixon et al. 2010]. The Drawing Assistant provides further corrective guidance by extracting construction lines from arbitrary photographs [Iarussi et al. 2013]. Instead of relying on a given photo, ShadowDraw offers freeform drawing guidance by providing shadows derived from blends of relevant images drawn from a large image database [Lee et al. 2011]. As the user continues to sketch, ShadowDraw adapts to the user's drawing in real time and dynamically generates suggestions accordingly. Although these systems provide useful drawing instructions, their guidance focuses on the fit between the user's drawing and the reference image; in the ideal case, two users following the guidance exactly would produce two very similar sketches. In this work, we aim to provide drawing guidance that accommodates different ways of observing, interpreting, and drawing. Rather than providing guidance on how to draw, we project the user's eye gaze patterns onto the canvas as a visual memory aid for what to draw. Users keep their usual way of observing and drawing while improving proportion and perspective, which are typical problems for beginners.
2. Implementation

The whole system consists of an eye gaze server, a projection server, a Tobii REX eye tracker, a pico projector, the objects and photos to be drawn, and a canvas. The eye tracker was positioned in front of the photo or the live models to be sketched. On the eye gaze server, custom software was developed to track eye gaze positions on live objects instead of on a computer screen. The gaze positions were converted and forwarded to the projection server over an OSC connection.
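As an illustration of this forwarding step, the following minimal Python sketch shows how converted gaze samples could be sent from the eye gaze server to the projection server over OSC. The python-osc package, the host and port values, and the "/gaze" address are assumptions made for this sketch; the paper does not specify the actual message format.

from pythonosc.udp_client import SimpleUDPClient

# Assumed address and port of the projection server (placeholders).
PROJECTION_HOST = "192.168.0.2"
PROJECTION_PORT = 9000

client = SimpleUDPClient(PROJECTION_HOST, PROJECTION_PORT)

def forward_gaze(x, y, timestamp):
    """Send one converted gaze sample (canvas coordinates) to the projection server."""
    # "/gaze" is a hypothetical OSC address chosen for this sketch.
    client.send_message("/gaze", [float(x), float(y), float(timestamp)])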
The projection was calibrated so that the gaze patterns of the observed objects/photos are projected onto the center of the canvas. All equipment, the objects, and the user were carefully positioned so that neither the tracking nor the projection would be blocked. With an artificial decay applied to the gaze projection, recently visited gaze points are weighted more heavily and highlighted in the visualization (both the calibration mapping and the decay weighting are sketched below).

To test the system, we conducted a preliminary user study with both a photo of Lenna and a few real objects. Eleven users with different levels of drawing skill participated in the study. Each user drew under both conditions, with and without the guidance. From a general observation of the results, users appear to draw better in terms of proportion and perspective. Most importantly, we found that users developed different guiding strategies, such as using eye gaze to trace contours or to lay out bounding boxes and skeletons.
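One plausible way to realize the calibration described above is a planar homography that maps tracker coordinates on the observed photo or object to projector pixels on the canvas. The sketch below assumes OpenCV and four manually chosen correspondence points with placeholder values; the actual calibration procedure used in the system is not detailed here.

import numpy as np
import cv2

# Four reference points in tracker (gaze) coordinates on the observed photo,
# and the canvas/projector pixels where they should land (placeholder values).
src = np.float32([[0.1, 0.1], [0.9, 0.1], [0.9, 0.9], [0.1, 0.9]])
dst = np.float32([[200, 150], [1080, 150], [1080, 620], [200, 620]])

H = cv2.getPerspectiveTransform(src, dst)  # 3x3 homography

def to_canvas(x, y):
    """Map a gaze point from tracker space onto the projected canvas."""
    q = H @ np.array([x, y, 1.0])
    return q[0] / q[2], q[1] / q[2]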
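The artificial decay of the gaze projection can be illustrated with a simple exponential weighting over the gaze history, as in the sketch below. The decay constant and the drawing callback are invented for illustration; the paper does not state the weighting actually used.

import math
import time

DECAY_SECONDS = 2.0   # assumed time constant; the paper gives no value
gaze_history = []     # list of (x, y, arrival_time) in canvas coordinates

def add_sample(x, y):
    gaze_history.append((x, y, time.time()))

def render(draw_point):
    """Draw the gaze trail; recently visited points get higher opacity."""
    now = time.time()
    for x, y, t in gaze_history:
        w = math.exp(-(now - t) / DECAY_SECONDS)   # weight in (0, 1]
        if w > 0.01:                               # skip fully faded samples
            draw_point(x, y, alpha=w)              # e.g. a green dot with opacity w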
3. Conclusions

We present a gaze-based drawing assistant system. The eye gaze patterns of the observation are projected onto the canvas to provide a visual aid for what to draw, which allows for different interpretations and drawing styles. Further quantitative study is required to analyze the effectiveness of the system.
References

DIXON, D., PRASAD, M., AND HAMMOND, T. 2010. iCanDraw? Using sketch recognition and corrective feedback to assist a user in drawing human faces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '10), ACM, New York, NY, USA, 897–906.

IARUSSI, E., BOUSSEAU, A., AND TSANDILAS, T. 2013. The Drawing Assistant: Automated drawing guidance and feedback from photographs. In Proceedings of the ACM Symposium on User Interface Software and Technology (UIST '13), ACM.

LEE, Y. J., ZITNICK, C. L., AND COHEN, M. F. 2011. ShadowDraw: Real-time user guidance for freehand drawing. ACM Transactions on Graphics 30, 4, Article 27.