Object, function, action for tangible interface design

Marissa Diaz*
ITESM – CEM, Department of Computer Science

Isaac Rudomin‡
ITESM – CEM, Department of Computer Science

Abstract


It is now possible to turn almost any object into an interface. What one must do, however, is explore the user's needs and perceptions in order to create the metaphors necessary for improved interaction. We have created a group of applications that allow us to study the effects of different interfaces on user interaction and perception. The applications involve different user actions, such as virtual reference, and different sensations, such as tactile stimuli. The interfaces explored are tangible (virtual and augmented reality) and intangible (real virtuality). The interfaces were used in different situations (technical and art exhibits), and this context also affects user reaction. In this paper we describe the applications and the interfaces involved, as well as the reactions of users. We explore the issues of immersion and suspension of disbelief through spatial reference and tangibility.


CR Categories: I.3.6 [Computer Graphics]: Methodology and Techniques - Interaction Techniques; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism - Virtual Reality; J.5.0 [Arts and Humanities]: Arts, fine and performing.

Keywords: human-computer interaction, augmented reality hardware, immersive systems, tangible user interfaces.

1 Introduction

When a user interacts with a computer-generated (virtual) or computer-enhanced world through an interface, the user develops biofeedback. This allows him/her to correctly "connect" the actions performed in the real world to the virtual application. If the interface is good, the user can establish a "virtual connection", forgetting that the object he/she is moving is real and assuming that he/she is in effect moving the virtual object. Mixed reality applications strive to combine the real and virtual worlds and create the appearance of immersion (Harada [1999]). Most work in this area has focused on the visual aspect of the interface, while other important aspects have been somewhat neglected. Tangible interfaces have been shown to make the user feel more connected to the virtual world (Ishii [1999]). The use of such interfaces in even simple applications is a powerful way of creating a strong impression on the user and making him/her believe that he/she is really interacting with the virtual world. But what are the best ways in which we can induce the user to suspend disbelief? Which tools have more impact? To correctly perceive this virtual connection between the real (tangible) world and the virtual one, the user needs to have an idea of the relationship between the real objects, their function in the virtual world, and how his/her actions affect them (Ogden [2001]), particularly when the goal is to represent an object of the virtual world with a specific object or allusion.

* e-mail: [email protected]
‡ e-mail: [email protected]

We want the user to experience a more natural, friendly interaction. As shown by Ishii [1999], tangible interfaces are adequate to make the user feel more connected to the virtual world and achieve biofeedback. These interfaces can also resort to natural elements such as sand, air and water as reinforcing elements for interaction (Ishii [1999]). Another interesting idea is that of transforming ordinary objects into interaction tools, such as cards, lamps, pencils (Takagi [2003]), gloves, etc. The perception of a user when using these interfaces is strongly connected to actions previously learned and performed by the user in the real world, so it is easier to understand the consequences of using them in the virtual world as well as their primary functions. Finally, to develop appropriate interfaces we sometimes need to introduce new interaction hardware that is both accurate and cheap. It must be easy to use, intuitive and self-explanatory in order to sustain the application and to give the user the idea that the virtual enhancement to reality is the part of the application that delivers immersion. The main idea in applications that add specific interaction devices to the real world (Ogden [2001]) is that those devices are mostly used for a single, simple task, but the virtual application interprets the resulting signals as a complex set of instructions sent by the user.

We developed four applications with different interfaces that expose different aspects of tangible interfaces. All of them use proxy objects in the real world to affect virtual objects, but the interaction differs, so we can classify the applications by the type of interaction:

a) Tactile: touching real water to affect virtual water on the computer screen.

b) Indirect: moving a virtual tree on the computer screen by blowing on a proxy real (artificial) tree. The user assumes the role of the wind.

c) Direct: using a real set of cards that, when flipped, change the state of virtual cards on the computer screen.

d) Virtual reference: a natural environment that includes a real (natural) plant is connected to sensing devices. The application sends e-mail messages to the user depending on the sensor state. The user thinks he/she is communicating with the plant. This application is not visual and does not use the screen; it can be called "real virtuality" for lack of a better term.



In the following sections we describe these applications in more detail, including the hardware developed specifically for this purpose. We discuss user reactions to these applications. Finally, we discuss the lessons learned and their implications for interface design.

2 Tactile Interfaces

The sense of touch is complex: it includes the perception of pressure, temperature, and haptics. Applications that use some aspects of this sense are a powerful way of creating an illusory link between what the user is seeing and touching. This tactile reference expands the imagination of the user. We want to explore how touching real water (one could also use substances such as clay or sand) to change the condition of a virtual equivalent (virtual water) differs from making the same changes in the traditional way with a point-and-click interface. As we explain below, it turns out that touching real water is rather dramatic as a tactile interface, since it is not only the liquidness of the water that affects the perception of the user: the temperature of the water, the pressure of the water and the amount of water can all change the user's reactions. Even simple applications become very complex and immersive.

2.1 Description of the application

Seeing changes in the virtual world when touching and feeling natural elements is a powerful way of achieving biofeedback (Yonezawa [2000] and Marti [2000]) and has already been proven to be an effective interface. Because of this, we developed an application where touching real water (figure 1) seems to affect virtual water in a pond (figure 2). That is, our application makes the user believe and feel that he/she is modifying a 3D virtual pond through interaction with a small water receptacle in the real world, using a special-purpose wave-sensing device that we describe in more detail later.

Figure 1: An installation containing a receptacle with water used to interact with the virtual pond. When users move the water in the receptacle, they get a real-time response in the virtual pond.

Figure 2: Virtual pond.

The purpose of this application is to obtain biofeedback by using real water. We think that this can give the user the perception of being near the virtual pond. This kind of experience can give the user a very complex tactile reference for the actions performed in the virtual world.

2.2 Implementation

A wave sensor (figure 3) attached above the real water receptacle governs the physics of the virtual water in the pond. This wave sensor is a differential sensing device that responds to alterations of the water surface and returns a frequency proportional to the frequency of the waves in the water. The schematic diagram of this application can be seen in figure 4, which shows how the user indirectly generates the waves used to render the virtual pond. Touching the water is immediately related to the function of moving the virtual pond, giving instant biofeedback and the impression of immersion.

Figure 3: wave sensor
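As a concrete illustration of the mapping just described, the sketch below (not the authors' code; class and parameter names are ours) shows how a frequency reported by the wave sensor could drive a standard two-buffer height-field ripple simulation of the virtual pond.

```java
// Minimal sketch: mapping the frequency reported by the wave sensor onto a
// two-buffer height-field ripple simulation. The sensor is abstracted as a
// plain double; grid size, damping and injection point are assumptions.
public class PondSketch {
    static final int N = 64;                       // grid resolution of the virtual pond
    static double[][] prev = new double[N][N];
    static double[][] curr = new double[N][N];
    static double phase = 0.0;

    // Inject a disturbance oscillating at the sensed wave frequency, then
    // propagate ripples with the classic averaging/damping update.
    static void step(double sensedFrequencyHz, double dt) {
        phase += 2.0 * Math.PI * sensedFrequencyHz * dt;
        curr[N / 2][N / 2] += 0.5 * Math.sin(phase);        // driven "touch" point
        for (int i = 1; i < N - 1; i++)
            for (int j = 1; j < N - 1; j++) {
                double neighbors = curr[i - 1][j] + curr[i + 1][j]
                                 + curr[i][j - 1] + curr[i][j + 1];
                prev[i][j] = (neighbors / 2.0 - prev[i][j]) * 0.98; // damped wave update
            }
        double[][] t = prev; prev = curr; curr = t;          // swap buffers
    }

    public static void main(String[] args) {
        for (int frame = 0; frame < 120; frame++) {
            double f = 1.5;                 // pretend the sensor reports 1.5 Hz waves
            step(f, 1.0 / 30.0);
            System.out.printf("center height: %+.3f%n", curr[N / 2][N / 2]);
        }
    }
}
```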

Figure 4: Wave sensor block diagram.

The application (as well as the one described in Section 3) was part of an installation shown at the demo session of ISMAR 2003 in Japan. For this reason the installation, designed by Daniel Rivera, the artist working with us, was made lightweight and transportable while still suggesting how it could be implemented in museums and art galleries.

Figure 5: Demo stand at ISMAR 2003.

2.4 User reactions

Feeling the water and seeing the instantaneous reaction in the projected virtual world gives the user a strong sensation of immersion; one must try it to believe it. It is a powerful experience because of the complex sensations associated with touching water.

This installation (Diaz [2003]) was tested at our university and also shown as a demo at ISMAR 2003 (figure 5). At our university we applied a poll to determine how users perceive the interface; the topics included were ease of use, the context of the application, the application itself, whether a virtual connection was achieved, and overall performance. At ISMAR we received informal feedback from attendees. The poll results are summarized in the following graph. From this we can conclude that the first reaction of users touching the real water is surprise; the users then begin to play with the water and enjoy the experience.

3 Indirect Interaction Interfaces

The use of reference in interface design helps make the virtual world coherent with the experience the user has when interacting with a real-world proxy. An abstract representation gives the user a complete idea of the function of the object and of the reaction to expect from the objects in the virtual world.

3.1 Description of the application

Moving a virtual tree by blowing on a real (artificial) tree has the user assume the role of the wind (figure 6).

Figure 6: Real and virtual tree.

The user, by blowing on a proxy (a tree that is really a microphone), can interact with a virtual tree, making it move according to his/her wishes. The physics of the tree are computed so that the user's action is applied as a wind force. The microphone is disguised as a tree to create visual concordance between the real and virtual worlds and to let the user experience the sensation of blowing as the wind in a higher conceptual way. In this virtual world the normal physical rules apply, but the forces the user commands by blowing are also followed. High and low frequencies are separated to create the movements of the branches and of the base of the tree, respectively, and thus achieve a more natural behavior.

This application illustrates an easy, natural way to interact with a virtual element in a higher conceptual manner. It can make the user believe that he/she can do extraordinary things and really has power over the virtual world.

3.2 Implementation

The movement of the tree is calculated by adding to the wind force the noise created by a frequency generated by a circuit that relates the duration and force of the sound to the output frequency.

The sensing device is made from a piezoelectric element attached to a "floating" array of bent iron strips that simulate the tree branches. Vibrations caused by the user blowing on this "tree" run through it and are registered at its base, much as a microphone registers sound (figure 7 shows an image of the sensor).

Figure 7: Tree sensing device.

Figure 8 describes how the sensor is implemented: the sound waves arrive at the microphone, are separated into high and low frequencies, and are converted into digital information that the PC application reads to drive the simulation.

Figure 8: Tree sensing diagram.
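The frequency separation described above can be sketched as two complementary one-pole filters, with the slow component swaying the trunk and the fast residual shaking the branches. The code below is only illustrative; the filter constant and names are assumptions, not the authors' implementation.

```java
// Minimal sketch: splitting the "blowing" signal into a low-frequency component
// (trunk sway) and a high-frequency component (branch flutter).
public class TreeWindSketch {
    double lowPass = 0.0;            // smoothed (low-frequency) part of the signal
    final double alpha = 0.05;       // smoothing factor; assumed value

    // Returns {trunkForce, branchForce} for one sample of the microphone signal.
    double[] split(double sample) {
        lowPass += alpha * (sample - lowPass);   // one-pole low-pass filter
        double highPass = sample - lowPass;      // residual = high frequencies
        return new double[] { lowPass, highPass };
    }

    public static void main(String[] args) {
        TreeWindSketch tree = new TreeWindSketch();
        for (int n = 0; n < 200; n++) {
            // Synthetic "blow": a slow gust plus fast turbulence.
            double sample = 0.8 * Math.sin(n * 0.02) + 0.2 * Math.sin(n * 0.9);
            double[] f = tree.split(sample);
            // f[0] would be applied at the base of the virtual tree, f[1] at the branches.
            if (n % 50 == 0)
                System.out.printf("trunk=%+.3f branch=%+.3f%n", f[0], f[1]);
        }
    }
}
```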

Since both the tree sensor and the water sensor were used in the same installation for ISMAR 2003, we decided that both would use the same circuit, shown in figure 9, to communicate with any PC through a serial port. This circuit was used to build a portable module that could be assembled and taken anywhere for presentations. With this module the serial-port connection is practical and easy to implement, because only two connectors are needed for the sensors' input data streams. Each sensor connector has four lines: one for ground, one for Vcc and two for signal processing.

Figure 9: PC interface using a BS2.
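To make the data path concrete, the following sketch shows how the PC application might parse the two sensor channels arriving from the portable module. The line format used here is purely an assumption; the paper does not document the actual wire protocol, and the serial port is replaced by a canned in-memory stream so the sketch runs on its own.

```java
// Minimal sketch of the PC-side reader for the two sensor channels.
// "W:<freq> T:<level>" is an assumed, illustrative line format.
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.InputStreamReader;

public class SensorReaderSketch {
    public static void main(String[] args) throws Exception {
        // In the real installation this stream would come from the serial port.
        String sample = "W:1.50 T:0.12\nW:1.75 T:0.30\n";
        BufferedReader in = new BufferedReader(
                new InputStreamReader(new ByteArrayInputStream(sample.getBytes())));

        String line;
        while ((line = in.readLine()) != null) {
            String[] fields = line.trim().split("\\s+");
            double waveFreq = Double.parseDouble(fields[0].substring(2));  // after "W:"
            double blowLevel = Double.parseDouble(fields[1].substring(2)); // after "T:"
            System.out.printf("pond wave %.2f Hz, tree blow %.2f%n", waveFreq, blowLevel);
        }
    }
}
```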

3.4 User reactions

This interface was tested at ITESM and shown as a demo at ISMAR 2003 (Diaz [2003]), where we received informal feedback from attendees. From this we can conclude that users tend to blow hard at first, expecting a strong response, but because the response of the tree is relatively smooth they receive instead a subtle, natural bouncing of the branches. This makes users touch the proxy (real) tree and move it by hand in order to feel true power over the virtual tree. This leads us to believe that there was a mismatch between a realistic virtual tree and a somewhat unrealistic tree sensor. Perhaps if the virtual tree were less realistic, unrealistic physics could let the interface deliver the expected behavior.

4 Direct Interaction Interfaces

Touching, moving and discovering the function of the objects around us is important in virtual games. In augmented reality games it is important to add something extra, delivering something beyond the normal game in standard reality.

4.1 Description of the application

We use a set of cards that, when flipped, change the virtual world in a game of memory (figure 10). The set of cards is in front of the user; each card has a sensing device that lets the application know when the user flips it.

The user must flip pairs of cards as in the game called "memoria", but when a card is flipped, its corresponding image, represented by a 3D model, is displayed on a virtual board. The goal is to find the pairs of 3D models hidden behind the virtual cards. By manually turning each card, the user interacts with the virtual cards in the game (figure 11).

Figure 10: Cards

Figure 11: Card game

The objective of this application is to provide the user with a tangible interface to a virtual board. The application makes the user relate the visual stimuli to the physical sense of turning the cards so that, after a short time of use, the interaction becomes natural and easy. After a while, the user does not even notice that the 3D images do not exist in the real world.

4.2 Implementation

The cards have a magnet attached at their center. Because of this, a Hall effect sensor can detect when the polarity of the magnet is inverted by flipping the card. The specially designed board lets us distribute the cards evenly in a way that preserves the nature of the game and allows good readings by the Hall effect sensors. The Hall effect arises from the relationship between a current in a conductor and the magnetic field: the field deflects the current, causing a potential difference in a plane perpendicular to the current that is proportional to the vector product of the current and the magnetic field. If the direction and magnitude of the current are known, the magnitude and direction of the magnetic field can be calculated, since the variation of the output voltage is representative of the vector that gives the magnitude and direction of the field.
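A minimal sketch of the resulting game logic is shown below, with the Hall-effect reading abstracted to a boolean polarity per card; names and structure are illustrative rather than the authors' implementation.

```java
// Minimal sketch: polling the Hall-effect sensor of each card and toggling the
// corresponding virtual card when the magnet's polarity changes.
import java.util.Arrays;

public class CardGameSketch {
    boolean[] lastPolarity;   // last polarity read from each Hall-effect sensor
    boolean[] faceUp;         // current state of the corresponding virtual card

    CardGameSketch(int cards) {
        lastPolarity = new boolean[cards];
        faceUp = new boolean[cards];
    }

    // Called with one fresh reading per card; a change of polarity means the
    // magnet (and therefore the card) was turned over.
    void update(boolean[] polarity) {
        for (int i = 0; i < polarity.length; i++) {
            if (polarity[i] != lastPolarity[i]) {
                faceUp[i] = !faceUp[i];
                System.out.println("card " + i + (faceUp[i] ? " revealed" : " hidden")
                        + " -> show/hide its 3D model on the virtual board");
            }
            lastPolarity[i] = polarity[i];
        }
    }

    public static void main(String[] args) {
        CardGameSketch game = new CardGameSketch(4);
        boolean[] reading = new boolean[4];
        game.update(reading);                   // initial reading, no changes
        reading[2] = true;                      // user flips card 2
        game.update(reading);
        System.out.println("face up: " + Arrays.toString(game.faceUp));
    }
}
```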

4.3 User reactions

This interface was tested at our university. We asked the users for their impressions, and we can conclude that the overall reaction was one of amazement. People expect to see a relationship between the positions of the real cards and those of the virtual cards, despite the fact that part of the memory game is to search spatially for the virtual card that corresponds to each real card. After a while the users get used to this and play naturally. Nevertheless, we are now working on a prototype that has the same visual distribution in the virtual world and on the reference board, to see whether this makes it easier for the user to understand and use (figure 12 shows a user interacting with the game).

Figure 12: Virtual Card game

5 Virtual reference interface

This kind of interface links real objects to a new action in the virtual world. There is no screen with a graphical representation of the virtual world, so the user may have only a reference that does not really exist but that nevertheless relates the application to a real object. The concept could be called "real virtuality". New actions are performed by common objects and the user is surprised. Even if the user does not have contact with these objects, he/she sees how the real object affects the virtual world and realizes how the action is performed, and thus manages to construct a more direct metaphor.

5.1 Description of the application

A real plant is used as a virtual reference for the action of sending an e-mail. A simple humidity sensor is attached to the plant; when the software determines that the humidity is low, and therefore that the plant needs to be watered, it sends an e-mail to the user or to a list of users (figure 10).

Figure 10: Plant with sensor

Figure 11. Plant web page and application

Figure 11 shows the plant connected to a PC that sends the e-mails contained in the list. As shown above, the interface evolved because the plant required more power to communicate its current status to the remote user.

The system sends one or several messages noting that the water is running low. This possibility widens the range of uses of the application, reaching into territory bounded by conceptual art. We explored this in different gallery shows (Mexico and Colombia), which led us to the following conclusions:

o The daily interaction of users with e-mail services makes most users overlook the computer screen used to send the e-mails.

o The love and care for the plant overcame the barriers that machines still generate.

o The application attracts a wide sector of people who normally withdraw from scientific and technological developments. The reason is that the computer can be ignored when the object is not displayed on the screen; in other words, there is no overload or abuse of virtual images.

o The plant interface is used without images of the virtual world, generating a mental image free from the on-screen conditioning of a normal pixel image.

Figure 11: Plant with computer interface.

5.2 Implementation

With this simple application we wanted to make the user think that there was communication with the plant, giving the user the power to interact with another biological entity. Despite the fact that the user knows that this kind of direct interaction is not possible, he/she establishes the necessary metaphor, creating stronger emotional bonds with the plant. In more detail, the interface consists of a humidity sensor that runs from the pot to a circuit that converts the analog signal into discrete values and sends them to a PC through a serial port. From there, a Java application sends e-mails to a list of users and attaches a text file describing the status of the plant. The audience/user types his/her e-mail address into the PC and adds it to the mailing list, so the plant can send messages to every address on it (figure 11).

This application was developed using a very simple humidity sensor and a black box that interprets the sensed status of the plant. Different plants require specific humidity, minerals and other factors such as pH levels, so more complex sensors could be used and the status of these variables could be stored in the microcontroller's memory, letting the user adapt the application specifically to his/her plant. However, this application was developed for the context of an art exhibit, and thus the intended effects and metaphors could be achieved even with the simplest of sensors.
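The watering alert can be sketched as follows, assuming the JavaMail API for transport; the paper only states that a Java application sends the e-mails, so the threshold, addresses and SMTP host below are illustrative placeholders.

```java
// Minimal sketch: notify a mailing list when the sensed humidity drops below a
// threshold. JavaMail (javax.mail) is an assumed dependency; values are placeholders.
import java.util.List;
import java.util.Properties;
import javax.mail.Message;
import javax.mail.Session;
import javax.mail.Transport;
import javax.mail.internet.InternetAddress;
import javax.mail.internet.MimeMessage;

public class PlantMailerSketch {
    static final double HUMIDITY_THRESHOLD = 0.30;   // assumed "needs water" level

    static void checkAndNotify(double humidity, List<String> mailingList) throws Exception {
        if (humidity >= HUMIDITY_THRESHOLD) return;   // plant is fine, stay silent

        Properties props = new Properties();
        props.put("mail.smtp.host", "localhost");     // placeholder SMTP host
        Session session = Session.getInstance(props);

        for (String address : mailingList) {
            Message msg = new MimeMessage(session);
            msg.setFrom(new InternetAddress("plant@example.org"));  // illustrative sender
            msg.setRecipients(Message.RecipientType.TO, InternetAddress.parse(address));
            msg.setSubject("I need water");
            msg.setText("Current humidity reading: " + humidity
                    + " (below " + HUMIDITY_THRESHOLD + "). Please water me.");
            Transport.send(msg);
        }
    }

    public static void main(String[] args) throws Exception {
        // The serial-port circuit would supply this value; here it is hard-coded.
        checkAndNotify(0.22, List.of("visitor@example.org"));
    }
}
```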

5.4 User reactions

The application has been exhibited in several different settings and manners:

1. An art festival in Mexico City (March 19, 2003).
2. A trendy art gallery in Mexico City (September 12, 2003).
3. A web exhibit, with the plant in Mexico City and a trendy art gallery in Bogotá (November 21, 2003).
4. A government "house of culture" in Mexico City (December 3, 2003 – January 20, 2004).

In cases 1, 2 and 4 the real plant was present. In case 1 there was no computer, just a sensor and an LED that lit when the sensor detected low humidity, so one could tell that water was needed. In cases 2 and 4 we attached a computer interface, and on the computer screen one could see a Java application telling the viewer the status of the plant as well as the address to which the mail was being sent. In case 3, since the real plant was elsewhere, a webcam was added to show that the real plant existed. In all cases we received informal feedback from attendees, and we could see that people were genuinely surprised by the application. When pertinent, people immediately asked us to use their e-mail addresses: they really wanted to see whether they would "really" get e-mail from a plant, even though they knew that the mail was not "really" from the plant. The users seemed to respond to the virtual affection bond and cared about the status of the plant. More sophisticated viewers with knowledge of conceptual art reacted intellectually, but everybody became emotionally attached to the plant.

6 Conclusion

Different interaction techniques and tricks must be developed for each kind of virtual environment, because always resorting to the already standard interfaces is causing a dangerous homogenization in a creative field. This is not good for visual artists and virtual reality specialists who want to create reactions, cause emotional responses and suspend disbelief. We tested these interfaces with users, both at art and technical exhibits, where we received informal feedback, and at our university, where we applied a poll. From this we could see the reaction of users to the interfaces and the applications. To improve how people interact with virtual reality or augmented reality applications, one needs to break the barriers that prevent the user from getting an experience as close to reality as possible. To achieve virtual biofeedback it is important that the user feels and thinks that his/her actions really affect the virtual world. Including tactile references in the application increases this biofeedback, since by using tangible media as an interface one can reproduce an environment as it is "seen" by the brain of the user and give him/her the sense of being immersed in the virtual environment.

Acknowledgements

We would like to thank Leonardo Escalona, Eduardo Hernández and Erik Millán for their help in developing the applications, and Daniel Rivera for developing the creative concept of the applications and realizing them as art installations. This research was supported by an ITESM-CEM grant (IVE-2003).

References

ISHII, H. 1999. Tangible interfaces. In Proceedings of SIGGRAPH 1999, ACM Press / ACM SIGGRAPH, Annual Conference Series, 127.

ISHII, H., AND ULLMER, B. 1997. Tangible bits: Towards seamless interfaces between people, bits and atoms. In Proceedings of CHI 97, ACM, March 1997.

YONEZAWA, T., AND MASE, K. 2000. Interaction of musical instruments using fluid (also in Japanese). VR Society of Japan, 2000.

MARTI, S., DEVA, S., AND ISHII, H. 2000. WeatherTank: Tangible interface using weather metaphors. http://web.media.mit.edu/~nitin/papers/sand.html.

CALDWELL, KOCAK, AND ANDERSEN. 1995. Multi-armed dexterous manipulator operation using glove/exoskeleton control and sensory feedback. In International Conference on Intelligent Robots and Systems, Volume 2, IEEE, 2567.

TAKAGI, SAEKO, NORIYUKI M., MASATO S., AND HIROKAZU T. 2003. An educational tool for basic techniques in beginner's pencil drawing. In Proceedings of Computer Graphics International, IEEE, 288.

HARADA, Y., NOSU, AND OKUDE. 1999. Collaborative interactive and collaborative learning environment using 3D virtual reality content, multi-screen display and PCs. In Proceedings of WET ICE '99, IEEE 8th International Workshops, 238-244.

KIYOKAWA, K., TAKEMURA, AND YOKOYA. 1999. A collaboration support technique by integrating a shared virtual reality and a shared augmented reality. In Proceedings of IEEE SMC '99, Volume 6, 48-53.

OGDEN, P.G. 2001. Human computer interaction in complex process control: Developing structured mental models that allow operators to perform effectively. In People in Control, IEE Conf. Publ. No. 481, 120-125.

DÍAZ, M., HERNÁNDEZ, E., ESCALONA, L., RUDOMIN, I., AND RIVERA, D. 2003. Capturing water and sound waves to interact with virtual nature. In Proceedings of ISMAR 2003.