Proceedings of the 8th WSEAS International Conference on APPLIED ELECTROMAGNETICS, WIRELESS and OPTICAL COMMUNICATIONS (ISSN: 1790-2769, ISBN: 978-960-474-167-0)

A Wireless Glove to Perform Music in Real Time

GIOVANNI COSTANTINI 1,2, MASSIMILIANO TODISCO 1, GIOVANNI SAGGIO 1
1 Department of Electronic Engineering, University of Rome "Tor Vergata", Via del Politecnico 1, 00133 Rome, Italy
2 Institute of Acoustics "O. M. Corbino", Via del Fosso del Cavaliere 100, 00133 Rome, Italy

Abstract: In this paper, we present a "wireless glove" that we propose as a new musical instrument, able to control a real-time granular sound synthesis process. The glove detects intrinsic and extrinsic hand movements, thanks to sensors placed on the back of the hand, the palm and the fingers. The sensors, realized with piezoresistive materials, change their electrical resistivity when deformed; the piezoresistive coefficient is defined as the ratio of the relative change in resistivity to the relative change in length of the resistor. In addition to the piezoresistive sensors, we also use a kinematic transducer, composed of three mono-axial accelerometers, that measures the acceleration of the hand's motion. Finally, the musical synthesis process is realized by means of a sound synthesizer based on a granular additive synthesis algorithm.

Key-Words: Wireless Sensor Interface, Control, Musical Synthesis Process

1 Introduction

Traditional musical sound is the direct result of the interaction between a performer and a musical instrument, based on complex phenomena such as creativity, feeling, skill, actions of the muscular and nervous systems, and movement of the limbs, all of which are the foundation of musical expressivity. In essence, musical instruments transduce the movements of a performer into sound. Moreover, they require two or more control inputs to generate a single sound. For example, the loudness of the sound can be controlled by means of a bow, a mouthpiece, or by plucking a string, while the pitch is controlled separately, for example by fingering that changes the length of an air column or of a string. The sound produced is characteristic of the musical instrument itself and depends on a multitude of time-varying physical quantities, such as the frequencies, amplitudes, and phases of its sinusoidal partials [1]. The way music is composed and performed changes dramatically [2] when, to control the synthesis parameters of a sound generator, we use human-computer interfaces such as the mouse, keyboard or touch screen, input devices such as kinematic and electromagnetic sensors, or gestural control interfaces [3,4]. In this paper, we discuss a gestural sensor interface that measures the signals related to hand movements by means of piezoresistive sensors and kinematic transducers. The paper is organized as follows: first we describe the glove, the gestural transducer and the system architecture; then we describe and illustrate our wireless glove sensor interface; finally, we show a real-time musical application using our interface.

2 The Glove and the Gestural Transducer

The first stage of our system consists of an instrumented glove realized with a Lycra-based material supporting 19 bend sensors, one for each degree of freedom of the fingers, covering both flexion-extension and adduction-abduction movements. Each sensor consists of a film resistor that changes its resistance when bent. The second stage of the system is the circuitry: a voltage divider applied to each sensor yields a corresponding voltage, properly scaled between 0 and 5 V. The voltage values are passed to a multiplexer; one analog value at a time is converted into a digital one and supplied to a PIC microcontroller via the SPI protocol. The PIC provides the read values to a wireless network based on the ZigBee protocol. The wireless network, considered as the third stage, has a classic star-shaped topology, in which the sensors represent the so-called edge nodes, while the set of MUX, ADC, PIC and wireless transmitter represents the central node. As a final stage, the information is transferred to a personal computer that interprets, classifies and utilizes it. The entire system is schematically reported in figure 1.
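The per-sensor acquisition chain described above (bend sensor, voltage divider, 0-5 V signal, ADC code) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the fixed divider resistor value and the 10-bit ADC resolution are assumptions, not values taken from the paper.

```python
# Sketch of the acquisition chain: bend sensor -> voltage divider
# -> 0-5 V signal -> ADC code. R_FIXED and ADC_BITS are assumed values.

V_SUPPLY = 5.0      # divider supply voltage (the 0-5 V range from the text)
R_FIXED = 10_000.0  # fixed divider resistor in ohms (illustrative assumption)
ADC_BITS = 10       # assumed ADC resolution

def divider_voltage(r_sensor: float) -> float:
    """Voltage across the bend sensor in a simple voltage divider."""
    return V_SUPPLY * r_sensor / (r_sensor + R_FIXED)

def adc_code(voltage: float, bits: int = ADC_BITS) -> int:
    """Quantize a 0-5 V signal to an ADC code, as the MUX/ADC stage would."""
    full_scale = (1 << bits) - 1
    return round(max(0.0, min(voltage, V_SUPPLY)) / V_SUPPLY * full_scale)

# Bending the sensor raises its resistance, so the divider voltage
# (and hence the digital code sent to the PIC) rises with flexion.
flat = adc_code(divider_voltage(10_000.0))   # sensor relaxed
bent = adc_code(divider_voltage(25_000.0))   # sensor flexed
assert bent > flat
```

In the real system the PIC would then frame these codes and hand them to the ZigBee transmitter; here the chain stops at the digital code for clarity.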

2.1 System Architecture

Figure 2 shows a high-level model of a generic system for the acquisition and manipulation of biological signals. The acquisition block reads the bio-sensors' signals, optionally formats or encrypts them, and sends them to the transmitter. Data transmission can be wired or wireless; a receiver passes the signals to an interpreter, which checks their syntactical correctness and, optionally, performs semantic interpretation (for example, extracting key features and classifying them). The data can then be forwarded to external networks, used to control virtual-reality applications or mechanical devices or, simply, stored.

The sensors were placed onto the glove over the dorsal part of all the finger joints, in order to trace the finger movements, as illustrated in figure 3.

Figure 1. Block diagram of the hand-joint movement data acquisition system

Figure 2. High-level model of a generic BAN system

Figure 3. (a) Front and (b) lateral view of the sensors mounted onto the glove

3 Glove Musical Instrument

The glove musical instrument that we propose was developed in the Max/MSP [5] environment. It consists of two components: the control unit and the synthesizer unit. The control unit allows the performer to control seventeen synthesis parameters; Fig. 1 shows the Max/MSP patch that constitutes it. The patch processes the signals supplied by the glove sensor unit, which realizes the interface between the performer and the system. The chosen sensors, by means of which the synthesis parameters are controlled, all influence the way the musician approaches the composition process. In Figure 4, the structure of the virtual musical instrument is shown.
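The control-unit idea of driving seventeen synthesis parameters from glove signals can be sketched in a few lines. In the paper this mapping lives in a Max/MSP patch; the sketch below is a hypothetical stand-in, and the normalization range and smoothing factor are assumptions introduced for illustration.

```python
# Sketch of a control unit mapping raw glove readings to synthesis
# parameters. The 0-5 V input range comes from the text; the one-pole
# smoothing (to tame sensor jitter) is an assumed design choice.

NUM_PARAMS = 17   # number of synthesis parameters controlled, per the paper
SMOOTHING = 0.2   # one-pole low-pass coefficient (illustrative assumption)

class ControlUnit:
    def __init__(self, num_params: int = NUM_PARAMS):
        self.values = [0.0] * num_params  # current normalized parameters

    def update(self, sensor_volts: list) -> list:
        """Map raw 0-5 V sensor readings to smoothed 0-1 parameters."""
        for i, v in enumerate(sensor_volts[:len(self.values)]):
            target = max(0.0, min(v, 5.0)) / 5.0          # normalize to 0-1
            self.values[i] += SMOOTHING * (target - self.values[i])
        return list(self.values)

ctl = ControlUnit()
params = ctl.update([2.5] * NUM_PARAMS)  # all sensors at mid-range
```

Each call to `update` would feed the seventeen smoothed values to the synthesizer unit; the smoothing trades a little responsiveness for stability of the sound parameters.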


Figure 4: The glove musical instrument

4 Sound Synthesizer

We tested our interface, developed under the Max/MSP environment, by writing a real-time musical composition. The synthesis process was realized by means of the sound synthesizer "Textures 2.1" [6], a standard VST [7] (Virtual Studio Technology, Steinberg) audio plug-in shown in Fig. 5. Sound synthesis in "Textures 2.1" is based on a granular additive synthesis algorithm. There are seventeen sound synthesis parameters [6], shown as knobs in Fig. 5, through which we can shape the sound waveform.

5 Conclusion

We have developed a wireless glove sensor interface for composing and performing expressive musical sound. We direct our attention to common musical aesthetics as a determining factor in musical expressivity. The sensor interface we have presented is built around a sensor unit that supplies kinematic physical parameters, in particular intrinsic and extrinsic hand movements. Our experience working with the wireless glove sensor interface has shown that the mapping strategy is a key element in providing musical sounds with expressivity.

References:
[1] Neville H. Fletcher, Thomas D. Rossing, The Physics of Musical Instruments, Springer, 2nd edition, 2005.
[2] Curtis Roads, The Computer Music Tutorial, The MIT Press, 1996.
[3] Bongers, B., "Physical Interfaces in the Electronic Arts. Interaction Theory and Interfacing Techniques for Real-time Performance", in M. Wanderley and M. Battier, eds., Trends in Gestural Control of Music, Ircam - Centre Pompidou, 2000.
[4] Orio, N., "A Model for Human-Computer Interaction Based on the Recognition of Musical Gestures", Proceedings of the 1999 IEEE International Conference on Systems, Man and Cybernetics, 1999, pp. 333-338.
[5] Cycling '74 Max/MSP, documentation available at http://www.cycling74.com/products/maxmsp
[6] Giorgio Nottoli, "A sound texture synthesizer based on algorithmic generation of micro-polyphonies", Proc. of 3rd International Conference "Understanding and Creating Music", Caserta, December 2003, 11-15.
[7] Steinberg VST Audio Plug-Ins SDK, third-party developer support site at http://www.steinberg.net/324_1.html

Figure 5: "Textures 2.1" sound synthesizer
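As a rough illustration of the granular additive approach the synthesizer is based on, the sketch below builds each grain as a short, Hann-windowed sum of sinusoidal partials and overlap-adds a stream of grains into a texture. This is not the "Textures 2.1" algorithm itself; all numeric parameters (grain length, hop, partial frequencies) are assumptions chosen for the example.

```python
# Minimal granular additive synthesis sketch: a grain is a windowed
# additive sum of partials; grains are overlap-added over time.
# Parameter values are illustrative, not taken from "Textures 2.1".
import math

SR = 44100  # sample rate in Hz

def grain(freqs, amps, dur=0.05, sr=SR):
    """One grain: additive sum of sinusoidal partials under a Hann window."""
    n = int(dur * sr)
    out = []
    for i in range(n):
        t = i / sr
        window = 0.5 * (1.0 - math.cos(2.0 * math.pi * i / (n - 1)))
        sample = sum(a * math.sin(2.0 * math.pi * f * t)
                     for f, a in zip(freqs, amps))
        out.append(window * sample)
    return out

def texture(num_grains=4, hop=0.02, sr=SR):
    """Overlap-add a stream of identical grains into a short texture."""
    g = grain([440.0, 880.0], [0.5, 0.25])  # two partials: A4 and its octave
    hop_n = int(hop * sr)
    out = [0.0] * (hop_n * (num_grains - 1) + len(g))
    for k in range(num_grains):
        for i, s in enumerate(g):
            out[k * hop_n + i] += s
    return out

buf = texture()
```

In a full synthesizer the per-grain partials, durations and densities would be the knobs the glove's control parameters drive; here they are fixed to keep the sketch self-contained.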
