Capturing the Invisible: Designing Context-Aware Photography

Maria Håkansson, Future Applications Lab, Viktoria Institute, Box 620, SE-405 30 Göteborg, Sweden, [email protected]
Sara Ljungblad, Future Applications Lab, Viktoria Institute, Box 620, SE-405 30 Göteborg, Sweden, [email protected]
Lars Erik Holmquist, Future Applications Lab, Viktoria Institute, Box 620, SE-405 30 Göteborg, Sweden, [email protected]

Abstract

Taking a photograph with a digital camera is today still basically the same as with its analog counterpart. We are designing a digital camera that senses its context, to explore new possibilities for digital photography. The sensor data produces real-time visual effects on the image displayed in the viewfinder and enables the user to take unique pictures whose visual qualities reflect the context. Our first prototype is based on a digital camera mounted on a handheld computer. Our development process involves participatory design sessions with possible end users, including a panel of enthusiastic amateur photographers.

Keywords

Digital photography, context awareness, participatory design, mobile users, Lomographers, digital media, alternative photography, environmental photography.

Industry/category

Entertainment, photography, mobile users

Project statement

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, or to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

© 2003 ACM 1-58113-728-1/03/0006 $5.00

Digital cameras are becoming increasingly popular among everyday users. Our goal is to make digital photography more exciting for amateur photographers, in particular those with artistic ambitions. Even though digital technology has enabled many new possibilities for sharing, storing, copying, editing, and publishing pictures, we believe that more could be done to enhance the user experience. Instead of focusing on what can be done digitally with a picture after it has been taken, we believe that the actual moment of capturing an image can be made more fun.

Figure 1. Light, speed, and focus are parameters in traditional photography.

Examples of things that can be sensed:
- Movements
- Sound
- Temperature
- Pollution
- Humidity
- Smell
- Electromagnetic fields

StartleCam is a wearable video camera that detects changes in the user's emotional state with a skin conductivity sensor (galvanic skin response, GSR). The changes trigger the system to store digital images of the user's immediate surroundings at the moment of arousal. LAFCam is a video camera that uses GSR and laughter detection to simplify the editing of videos by marking the most interesting parts according to sensor values.

When taking an analog photograph, one can play with parameters such as light, speed, and focus. But what about letting other parameters influence the image, and viewing the effects in real time? Would it be possible to capture something in addition to the visuals of the scenery? Sounds in the background, pollution in the air, smell: can such contextual information somehow be reflected in a picture? And if so, would there be users who find this interesting? These were the initial questions when the project was initiated. Together with explorative amateur photographers, so-called Lomographers [5], we are creating a camera with real-time visual effects. The effects will be based on sensors on the camera that collect contextual information such as sound, pollution, temperature, and smell (see sidebar). Sensors have been combined with digital video cameras in other research projects, e.g., StartleCam [3] and LAFCam [4] (see sidebar). However, their use of sensors is quite different from ours, since we want to use context awareness to create visual effects. Our approach is more similar to the way music is produced by the context in Sonic City [2], which uses multiple sensors to sense the surroundings and affect the musical output.
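The core idea, letting contextual parameters influence the image, can be illustrated as a normalization step from raw sensor readings to effect strengths. This is our own minimal sketch, not the prototype's code; the sensor names, value ranges, and effect parameters are all illustrative assumptions.

```python
def normalize(value, lo, hi):
    """Clamp a raw sensor reading into the 0..1 range."""
    return min(1.0, max(0.0, (value - lo) / (hi - lo)))

def map_context_to_effects(readings):
    """Map raw context readings (hypothetical sensors and ranges)
    to visual effect strengths between 0 and 1."""
    return {
        "hue_shift": normalize(readings["sound_db"], 30, 90),
        "saturation": normalize(readings["temperature_c"], -10, 35),
        "brightness": normalize(readings["pollution_ppm"], 0, 400),
    }

# Example: a moderately loud, mild, lightly polluted street scene.
effects = map_context_to_effects(
    {"sound_db": 60, "temperature_c": 20, "pollution_ppm": 100}
)
```

In an actual camera, such effect strengths would be recomputed continuously and applied to the viewfinder image, so that the same scenery looks different in different contexts.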

Project participants

"Context-Aware Photography" is an ongoing research project at the Future Applications Lab in Göteborg, Sweden. It is funded by the Swedish Foundation for Strategic Research, through the three-year "Mobile Services" project, which focuses on new applications in mobile media. The project described here was initiated in October 2002. The current prototype is part of a B.Sc. thesis in software engineering.

Process

The project includes conceptual design and participatory design, as well as building and testing a camera prototype. The conceptual work included gaining insight into the vast area of photography, studying several trends and photographers. We found ourselves inspired by spontaneous and explorative photography, such as the experimental desire and humor in the photos taken by the Lomographic Society [5]. The Lomographers take photographs with a "don't think, just shoot" mentality, using special cameras that create unpredictable color and light effects. Another source of inspiration is photographers and artists who use various techniques to "capture the moment," for instance the body of work documenting Andy Warhol's Factory in the '60s [6]. Our design method involves workshops with two contrasting groups: a group of traditional photography students, and a group of Lomographers. The first group consisted of seven photography students who participated in a short interview and exercise where they presented some of their pictures. This was done to find out about their experience and interest in photography, as well as what motivates them to take pictures. The second group consisted of three Lomographers. We interviewed them about their interest in photography, and, using props and scenarios, held a discussion about the conceptual ideas in the project. After the meeting, new ideas and directions emerged from the participants' input.

Figure 2: Participatory design with Lomographers.

Figure 3: The current prototype is based on a handheld computer with a digital camera.

Research details

We were particularly interested in their views on if and how context and picture could be correlated. Was this considered an interesting feature of photography? If so, should each sensor value be readable in a picture? By readable, we mean that the user should be able to draw some conclusions about the sensor conditions under which the image was taken. Or would it be enough to consider it an exploration of visual effects? Three output alternatives were highlighted: readable image effects; "artistic" image effects rather than readable ones; and "raw" context data included as a supplementary file. In the first alternative, the current temperature could, for example, be readable in the image through the amount of red or blue, reflecting a warmer or colder environment. In the second alternative, the visual effect caused by the temperature would be considered merely artistic rather than informative. In the raw data alternative, sensor values would be presented as figures, e.g., "15°C." According to the Lomographers, contextual information (such as sound and pollution) might be hard to express using strictly visual effects. They were not interested in having the contextual information available as raw data supplementing the image file, but preferred to explore the visual effects. This led us to take an approach that strives for "artistic" effects rather than readable ones. The Lomographers could imagine themselves as end users of such a camera and suggested that it would be suitable for recreational pursuits. Rather than appealing to professional photographers, it could allow anyone to take artistic photos anywhere.

The meeting with the two contrasting groups helped us define the potential end users. The photography students had a strong interest in the traditional photographic process, and some were openly negative towards digital photography. The Lomographers also seemed skeptical towards digital technology in general, but were nevertheless interested in applications that could make digital photography fun. We found it useful to involve the Lomographers in the design process, since they were prepared to use their cameras in many different contexts and situations, and were open-minded in exploring new means of photography.

Results

The current prototype is based on a handheld computer with a digital camera, the NexiCam [7]. It uses the GapiDraw software platform for creating advanced graphics on handheld computers [1]. The prototype currently uses simulated sensor values, which we considered the best approach for rapidly testing our conceptual ideas of real-time visual effects. The simulated sensor values affect hue, saturation, and value in the image shown in the viewfinder (see sidebar, Figure 8). These parameters were chosen as a starting point, since they render the image without destroying the scenery (unless they reflect an extreme situation).
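The "readable" temperature alternative mentioned earlier (more red in a warm environment, more blue in a cold one) could be sketched as a simple per-pixel tint. This is an illustrative sketch of ours, not the prototype's implementation; the temperature range and tint strength are assumed values.

```python
def temperature_tint(pixel, temp_c, cold=-10.0, warm=35.0, strength=60):
    """Shift an (r, g, b) pixel toward red for warm readings and toward
    blue for cold ones, so the temperature stays 'readable' in the image.
    The range [cold, warm] and the tint strength are assumptions."""
    # t is -1.0 at the cold end and +1.0 at the warm end of the range.
    t = max(-1.0, min(1.0, 2 * (temp_c - cold) / (warm - cold) - 1.0))
    shift = int(t * strength)
    r, g, b = pixel
    # Add the shift to red and subtract it from blue, clamped to 0..255.
    return (max(0, min(255, r + shift)), g, max(0, min(255, b - shift)))
```

A mid-gray pixel would thus come out reddish on a hot day, bluish on a freezing one, and unchanged at the middle of the assumed range.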


Hue rotates the color scale, causing every color to switch, e.g., making a face blue (for an example, see Figure 7). Saturation makes the colors either faded or extremely saturated (see Figure 5). Value affects the light in the image: if the value is low, the image turns black, and if it is high, the image gets brighter (see Figure 6).
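These three effects can be illustrated with a per-pixel hue/saturation/value transform. A minimal sketch in Python, assuming effects are applied pixel by pixel; the actual prototype renders the viewfinder image through GapiDraw, not this code.

```python
import colorsys

def apply_hsv_effect(pixel, hue_shift=0.0, sat_scale=1.0, val_scale=1.0):
    """Apply the three viewfinder effects to one (r, g, b) pixel:
    rotate the hue, scale the saturation, scale the value."""
    r, g, b = (c / 255.0 for c in pixel)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    h = (h + hue_shift) % 1.0        # switch colors (hue rotation)
    s = min(1.0, s * sat_scale)      # fade or saturate the colors
    v = min(1.0, v * val_scale)      # darken or brighten the image
    return tuple(round(c * 255) for c in colorsys.hsv_to_rgb(h, s, v))
```

For instance, a hue shift of two thirds of the color circle turns a pure red pixel blue, matching the "making a face blue" effect described above.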

Figures 4-7: Images taken with the current prototype. From left to right: an image without any effects; simulated sensor values affecting the saturation; affecting the value; and finally, an effect where hues are switched.

Hue, saturation, and value are correlated and affect each other, similar to how light, speed, and focus work in an analog camera. However, we will also explore parameters that are tangential, i.e., that do not affect each other. An example would be to manipulate the resolution of an image according to given sensor values, which would not directly affect, for example, the saturation. The Lomographers will be involved in the decision-making about which sensors and which visual effects to use, and how to correlate them. This is still work in progress. Future work involves further participatory design with the Lomographers, including a user test of the first prototype to guide the continuing design process.

Figure 8: Trying out the visual effects in the current prototype.

We believe that the potential of digital photography has not been fully explored. Working closely with experimental amateur photographers, we are discovering new, exciting possibilities. We strive for a new, inspiring user experience in digital photography that will let anyone, at any time, get an artistic view of the world and capture the moment.

Acknowledgements We thank the Lomographers: Katja Andersson, Andreas Carlsson, and Johan Åberg. We also thank our colleagues at the Future Applications Lab and especially Pontus Munck for developing the prototype.

References

[1] GapiDraw. www.gapidraw.com (last visited 21 Jan. 2003).
[2] Gaye, L., Holmquist, L.E., and Mazé, R. Sonic City: Merging Urban Walkabouts with Electronic Music Making. In Companion of UIST 2002.
[3] Healey, J. and Picard, R.W. StartleCam: A Cybernetic Wearable Camera. In Proceedings of the Second International Symposium on Wearable Computers, 1998. Perceptual Computing Technical Report no. 468.
[4] Lockerd, A. and Mueller, F. LAFCam: Leveraging Affective Feedback Camcorder. In Proceedings of CHI 2002, ACM Press, pp. 574-575.
[5] The Lomographic Society. www.lomography.com (last visited 21 Jan. 2003).
[6] Name, B. All Tomorrow's Parties: Billy Name's Photographs of Andy Warhol's Factory. London: Frieze, 1997.
[7] NexiCam. www.nexian.com (last visited 21 Jan. 2003).