Particle-Flow Interactive Animation for Painting Image
Nutchaphon Rewik, Kittipop Peuwnuan, Kuntpong Woraratpanya, Kitsuchart Pasupa
Faculty of Information Technology, King Mongkut's Institute of Technology Ladkrabang, Bangkok 10520, Thailand
Email: [email protected], [email protected], {kuntpong, kitsuchart}@it.kmitl.ac.th

Abstract—In this paper, we present an interactive multimedia artwork that awakens motionless images into interactive animations. The animation simulates colour-flow movement from painting images; particle movement and interaction present the rhythm of the brushstrokes, conveying the feeling of a lively image. Human colour perception is the main idea of this work: we use image-processing techniques to extract colour distinctions from painting images, and then render numerous particles from the distinct colour-area information. Steering behaviour directs the particle movement to flow smoothly on a touch-screen monitor. The artwork invites audiences to admire paintings in a new way, playing with them rather than just watching them. Unlike previous work, which is tied to a particular image, this work is designed for arbitrary painting images.
I. INTRODUCTION
Painting is one of the arts through which artists convey emotions and ideas. When an artist paints an image, the process creates the rhythm of the brushstrokes. However, once the masterpiece is finished, the result is a motionless image, and people can hardly feel the moments of the painting process. Therefore, we created an animation effect that simulates the movement of colour in painting images. Using image processing and graphic rendering, we propose an artwork called "Particle-Flow Interactive Animation for Painting Image". The principal concept behind this work is colour segmentation: with this technique, a painting can be partitioned into multiple areas of colour shades. The technique matches human visual perception, because humans recognise and distinguish images by colour.
II. RELATED WORKS
Recently, Starry Night Interactive Animation appeared as an application on the App Store [1]. It animates the colour flow of the Starry Night painting by Vincent van Gogh. However, it works only for that specific image; it cannot animate others. To animate arbitrary images, hard-coding every image could achieve the task, but it would be laborious. Our artwork therefore differs from that app, because any painting image can be turned into a similar kind of animation. It uses the k-means clustering algorithm to cluster an image into multiple colour segments [2], [3], together with inspiration from Van Der Burg's work [4], which describes the steps to build a particle system such as fire, smoke, and snow. That work serves as a foundation for designing more advanced particle systems, making particle design more dynamic, flexible, and maintainable.
III. IMPLEMENTATION
A. Framework
The framework consists of two main parts, image processing and graphic rendering, as illustrated in Figure 1. After colour areas are extracted, a graphic engine renders many particles that perform smooth movement defined by steering behaviour, creating intuitive particle interaction.
The artwork requires a touch-screen monitor. Audiences choose one of the painting images we provide; the animation is then rendered, and audiences can move their fingers on the screen to create their own particle-flow movement.
B. Image Processing
The purpose of this part is to extract colour areas from painting images, so partitioning the image by colour is the focal technique. Many clustering techniques could achieve this task; we chose k-means clustering because it is simple and well suited to this work. Contiguous pixels with similar colours are clustered into a colour-area group.
The steps of colour segmentation are as follows: (i) convert the painting image from RGB to HSV colour space, (ii) cluster the HSV image with the k-means clustering algorithm, and (iii) assign colour areas by labelling contiguous pixels in the same cluster.
1) RGB-to-HSV Image Conversion: This step converts the RGB image to HSV in order to decompose the intensity component from the colour information. The colour vectors of the HSV space are then clustered with the k-means clustering technique, which leads to a better colour segmentation result [2], [3].
2) k-means Clustering: In this step, we set k = 2, so the HSV image is clustered into two groups by the k-means clustering algorithm. In fact, k could be any value greater than one, but for animation refinement, k = 2 yields better colour-area detection in the next step. The result of this step is a clustered image, illustrated in Figure 2b.
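As an illustration, steps (i) and (ii) of the colour segmentation can be sketched in Python. This is a minimal sketch, not the authors' implementation: the tiny hand-rolled k-means, the deterministic farthest-point initialisation, and the toy two-colour "image" are our own assumptions introduced so the example is self-contained.

```python
import colorsys

def kmeans(points, k=2, iters=10):
    """Tiny k-means with deterministic farthest-point initialisation
    (an assumption for reproducibility; real engines seed randomly)."""
    def d2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    # init: first point, then repeatedly the point farthest from all centres
    centers = [points[0]]
    while len(centers) < k:
        centers.append(max(points, key=lambda p: min(d2(p, c) for c in centers)))
    labels = [0] * len(points)
    for _ in range(iters):
        # assignment step: nearest centre by squared Euclidean distance
        labels = [min(range(k), key=lambda c: d2(p, centers[c])) for p in points]
        # update step: move each centre to the mean of its members
        for c in range(k):
            members = [p for p, lbl in zip(points, labels) if lbl == c]
            if members:
                centers[c] = tuple(sum(vs) / len(members) for vs in zip(*members))
    return labels

# Toy 2x4 "image": left half blue shades, right half yellow shades (RGB in [0, 1]).
image = [[(0.1, 0.1, 0.8), (0.1, 0.1, 0.7), (0.9, 0.9, 0.1), (0.8, 0.8, 0.2)],
         [(0.1, 0.2, 0.8), (0.2, 0.1, 0.7), (0.9, 0.8, 0.1), (0.9, 0.9, 0.2)]]
hsv = [colorsys.rgb_to_hsv(*px) for row in image for px in row]  # step (i)
labels = kmeans(hsv, k=2)                                        # step (ii)
print(labels)  # → [0, 0, 1, 1, 0, 0, 1, 1]
```

The blue and yellow shades separate into two clusters because their HSV hues differ strongly even though their intensities are similar, which is the motivation for clustering in HSV rather than RGB.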
Fig. 1: Particle-Flow Interactive Animation Framework
3) Colour Area Labelling: The final step is to label colour areas in the clustered image. Contiguous pixels in the same cluster are labelled as the same colour area, using the connected-component labelling technique [5]. The result of this step is a colour map image with multiple colour areas, as illustrated in Figure 2c.
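Step (iii) can be sketched as a flood-fill connected-component labelling. Note the hedge: this BFS flood fill is a simplified stand-in for the general algorithm of [5], and the 4-connectivity and toy cluster map are our own assumptions.

```python
from collections import deque

def label_components(cluster_map):
    """4-connected component labelling: contiguous pixels sharing a
    cluster id receive the same colour-area label (BFS flood fill)."""
    h, w = len(cluster_map), len(cluster_map[0])
    labels = [[-1] * w for _ in range(h)]
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if labels[sy][sx] != -1:
                continue  # already part of a labelled colour area
            labels[sy][sx] = next_label
            queue = deque([(sy, sx)])
            while queue:
                y, x = queue.popleft()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < h and 0 <= nx < w
                            and labels[ny][nx] == -1
                            and cluster_map[ny][nx] == cluster_map[sy][sx]):
                        labels[ny][nx] = next_label
                        queue.append((ny, nx))
            next_label += 1
    return labels

clusters = [[0, 0, 1, 1],
            [0, 1, 1, 0]]
areas = label_components(clusters)
print(areas)  # → [[0, 0, 1, 1], [0, 1, 1, 2]]
```

Observe that the two pixels of cluster 0 in the bottom-right are not contiguous with the top-left group, so they become a separate colour area (label 2): clusters describe colour, areas describe connected regions.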
Fig. 2: The illustration of the colour segmentation result: (a) original image, (b) clustered image, and (c) colour map image.
C. Graphic Rendering
After colour areas are extracted from the image, a graphic engine is used to render the animation, which is performed by moving particles. This section therefore explains the particle structure and movement.
1) Particle System: A particle is a small rectangular shape. The animation contains numerous particles, depending on the size of the image, and the particles can vary in their attributes; therefore, we implement a particle system to manage the particles' states.
2) Particle Emitter: In this work, a particle emitter is a vertical line in a colour area; it emits particles periodically. The particle system creates the emitters from the colour-area information. The emitters are spaced equally across the horizontal extent of each colour area, and they fill the entire animation by generating particles repeatedly.
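The equal horizontal spacing of emitters might be sketched as follows. The helper `make_emitters` is hypothetical, and the paper does not state the exact spacing rule; we assume emitters span from the leftmost to the rightmost column of a colour area.

```python
def make_emitters(area_pixels, n_emitters):
    """Place vertical-line emitters at equal horizontal spacing across a
    colour area, given the area's pixels as (y, x) coordinates.
    Hypothetical helper: the spacing rule is an assumption."""
    xs = [x for _, x in area_pixels]
    x_min, x_max = min(xs), max(xs)
    step = (x_max - x_min) / max(n_emitters - 1, 1)
    return [x_min + i * step for i in range(n_emitters)]

# A colour area spanning columns 0..10 gets three emitters at x = 0, 5, 10.
emitter_xs = make_emitters([(0, 0), (0, 10), (1, 5)], 3)
print(emitter_xs)  # → [0.0, 5.0, 10.0]
```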
A particle has two emitter attributes: a parent emitter and a target emitter. When a particle's age expires, it is re-spawned at the parent emitter. The target emitter tells the particle where to move next; the particle moves toward its target emitter until there is no target emitter left, in which case it is outside the colour area and is re-spawned. Figure 3 illustrates the particle-emitter structure.
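The parent/target-emitter bookkeeping described above can be sketched as follows. This is a minimal sketch under stated assumptions: the class names, lifetime, speed, and the straight-line stepping are our own (the real engine moves particles with steering behaviour, covered next).

```python
from dataclasses import dataclass

@dataclass
class Emitter:
    x: float
    y: float
    target: "Emitter | None" = None   # next emitter in the colour area; None at the edge

@dataclass
class Particle:
    x: float
    y: float
    parent: Emitter                   # where the particle re-spawns
    target: "Emitter | None" = None   # where the particle should move next
    age: float = 0.0
    lifetime: float = 3.0             # assumed lifetime in seconds

    def update(self, dt, speed=1.0):
        self.age += dt
        if self.target is not None:
            # step toward the target emitter (straight line for brevity)
            dx, dy = self.target.x - self.x, self.target.y - self.y
            dist = (dx * dx + dy * dy) ** 0.5
            if dist <= speed * dt:
                # target reached: advance to the chain's next emitter
                self.x, self.y = self.target.x, self.target.y
                self.target = self.target.target
            else:
                self.x += dx / dist * speed * dt
                self.y += dy / dist * speed * dt
        if self.age > self.lifetime or self.target is None:
            # age expired, or no target left (outside the colour area):
            # re-spawn at the parent emitter
            self.x, self.y = self.parent.x, self.parent.y
            self.target = self.parent.target
            self.age = 0.0

e2 = Emitter(1.0, 0.0)                # last emitter: no further target
e1 = Emitter(0.0, 0.0, target=e2)
p = Particle(0.0, 0.0, parent=e1, target=e2)
p.update(0.5)                         # moves halfway toward e2
p.update(0.6)                         # reaches e2, runs out of targets, re-spawns at e1
```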
3) Particle Movement and Interaction: We use steering behaviour to direct particles in a life-like manner [6]. Particles move smoothly toward their target positions, creating colour-flow movement reminiscent of brushstrokes. Audiences can move their fingers to create a path, composing the rhythm of their actions. A particle movement is illustrated in Figure 4.
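The smooth movement follows from Reynolds's "seek" steering behaviour [6]: the steering force is the desired velocity toward the target minus the current velocity, truncated to a maximum force. The sketch below illustrates that rule; the parameter values are illustrative assumptions, not those of the artwork.

```python
def seek(pos, vel, target, max_speed=2.0, max_force=0.5):
    """Reynolds 'seek': steer = desired velocity - current velocity,
    truncated to max_force; illustrative parameter values."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = (dx * dx + dy * dy) ** 0.5 or 1.0   # avoid division by zero at the target
    desired = (dx / dist * max_speed, dy / dist * max_speed)
    steer = (desired[0] - vel[0], desired[1] - vel[1])
    mag = (steer[0] ** 2 + steer[1] ** 2) ** 0.5
    if mag > max_force:
        steer = (steer[0] / mag * max_force, steer[1] / mag * max_force)
    return steer

# A particle at rest is nudged toward the target by at most max_force.
force = seek((0.0, 0.0), (0.0, 0.0), (10.0, 0.0))
print(force)  # → (0.5, 0.0)
```

Each frame, the engine would add the returned force to the particle's velocity and the velocity to its position; because the force is capped, the particle turns gradually instead of snapping toward the target, which produces the brushstroke-like flow.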
Fig. 3: The illustration of the particle emitter
Fig. 4: The illustration of a particle movement.
IV. FEEDBACK
We collected evaluation feedback with a questionnaire focused on the aesthetics of the artwork. The artwork was installed on a 55-inch LCD touch-screen monitor during the senior project exhibition held at the Faculty of Information Technology, King Mongkut's Institute of Technology Ladkrabang, Thailand. Many visitors interacted with the artwork; twenty of them agreed to answer the questionnaire. We gathered scores on a 1–5 Likert scale, with 5 being strongly agree and 1 being strongly disagree. The average scores are shown in Table I. Participants gave the highest score to the second item (X̄ = 4.33), so we can infer that the animation reminds audiences of the original image. Participants also made comments as further suggestions, for instance:
• "The animation could be better if we can make line detection."
• "Particles look harsh. I think it could be gentler by using some nice shader."
• "This is interesting. It looks like a still image is revived."
A video documentation of this work can be found at https://www.youtube.com/watch?v=XnjTxXrHpCc.
V. CONCLUSION
Particle-flow animation is created by image-processing and graphic-rendering techniques; it awakens painting images. The artwork is designed for arbitrary painting images, unlike the previous related work, which is tied to a particular image. However, the result relies significantly on the input image: if the image has distinct colour areas, the animation looks fine, whereas vague colour areas produce artefacts in the animation, because the system cannot predict the artist's intent with one-hundred-percent precision. This problem might be overcome with advanced image-processing or machine-learning techniques. The artwork attracts audiences to admire painting in a new way: they can play with it rather than just watch it, creating a new dimension of aesthetics. We hope that this artwork can aid the study of art by tempting people with colour-flow movement and interaction.

TABLE I: Summary of user's satisfaction.

Questions                                                  | Mean | S.D.
1) The animation ran smoothly.                             | 3.93 | 0.70
2) It reminded me of the original image.                   | 4.33 | 0.82
3) I liked the interaction.                                | 3.80 | 0.77
4) The animation was like a painting image being painted.  | 3.27 | 0.88
5) I liked the aesthetics.                                 | 3.73 | 0.88
Average                                                    | 3.81 | 0.81

REFERENCES
[1] artof01. (2014) Starry Night Interactive Animation. [Online]. Available: https://itunes.apple.com/us/app/starry-night-interactiveanimation/id511943282
[2] T.-W. Chen, Y.-L. Chen, and S.-Y. Chien, "Fast image segmentation based on K-means clustering with histograms in HSV color space," in Proceedings of the IEEE 10th Workshop on Multimedia Signal Processing, Cairns, Australia, October 2008, pp. 322–325.
[3] A. Irani and B. Belaton, "A K-means based generic segmentation system," in Proceedings of the 6th International Conference on Computer Graphics, Imaging and Visualization (CGIV'09), Tianjin, China, August 2009, pp. 300–307.
[4] J. Van Der Burg, "Building an advanced particle system," Game Developer Magazine, pp. 44–50, 2000.
[5] M. B. Dillencourt, H. Samet, and M. Tamminen, "A general approach to connected-component labeling for arbitrary image representations," Journal of the ACM, vol. 39, no. 2, pp. 253–280, 1992.
[6] C. W. Reynolds, "Steering behaviors for autonomous characters," in Game Developers Conference, California, 1999, pp. 763–782.