The Development of a Computer-based, Physically Modelled Musical Instrument With Haptic Feedback, for the Performance and Composition of Electroacoustic Music.

S. Rimell, D.M. Howard, A.D. Hunt, P.R. Kirk and A.M. Tyrrell

ABSTRACT

Conventional computer-based instruments that make use of a piano-style electronic keyboard are creatively limiting for a number of reasons. Expressive compositional and performance potential in such instruments is restricted due to the indirect relationship between the musician's gesture and the sound produced. Tactile feedback cues that are so important to the performer of acoustic instruments are also lacking, furthering the effect of physically isolating the musician from their instrument. In addition, electronically produced timbres commonly sound unnatural, sterile and lacking in physical causality.

This paper describes the development of a computer-based musical instrument called 'Cymatic' that addresses these problems.

Cymatic uses techniques of physical modelling to synthesise various musical components, including strings, membranes and volumes (and even shapes comprising more than three dimensions!), which can be joined together to form complex instruments. Various excitations can be applied to the virtual instrument to set it resonating, including bowing, plucking, microphone input, wave oscillators and random forces, producing a stereo output in real time. The graphical user interface displays an animation of the resonating structure as it is being played, providing the user with the impression of working with real physical musical components.

User input is currently achieved via a force feedback joystick, the axes and buttons of which can be mapped to any of the instrument's parameters, providing real-time, multi-parametric gestural control. Tactile feedback is transmitted back to the joystick controller, allowing the user to feel and interact with the vibrations of the instrument. This reconnects the tactile feedback path that is missing from most computer-based musical instruments, increasing the sense of physical connection and causality between the musician and instrument. Musical examples will demonstrate the natural and organic sound of Cymatic's physical model, and its creative potential as a performance and compositional tool will be discussed.

1. INTRODUCTION

Compared to traditional acoustic instruments, the computer in principle offers a virtual infinity of musical potential, representing a form of tabula rasa which can be configured to suit the individual's needs and aesthetic interests. In practice, however, computer-based instruments have yet to attain creative equality with their physical counterparts. Despite the musical and creative freedom it offers, current technology places serious limitations on the expressive potential of computer-based instruments, particularly in the realm of live performance. Computer instruments are often criticised as being cold, lifeless or lacking in expression, whereas physical instruments may be described as warm, intimate or organic.

Techniques of physical modelling have gone some way to addressing these criticisms by modelling the sound creation mechanism itself rather than attempting to imitate the waveform it produces. By intrinsically linking the sound creation mechanism with the sound produced, physically modelled instruments more closely resemble their acoustic cousins, accurately and naturally recreating the aperiodic and transient sonorities associated with timbres of physical causality. Manipulating physical parameters (such as string length, tension and mass) is also more intuitive for musicians than dealing with arbitrary and unpredictable low-level parameters such as amplitude modulation, oscillator frequency and phasing (as with more conventional synthesis methods).

However, the fact remains that however accurately an instrument can be modelled inside a computer, it remains a separate and untouchable entity for the musician, who is physically divorced from the instrument by the physical-virtual divide. The most common method of attempting to bridge this void is to employ the piano-style MIDI keyboard to control the synthesised instrument’s parameters. The keyboard is a familiar and flexible interface for the control of sound but falls short in a number of respects when it comes to permitting total creative control over multiple instrumental parameters.

A typical keyboard offers only one degree of freedom and its underlying MIDI protocol treats notes as discrete events bound by an onset and an offset. While this may be sufficient to control synthesis models that share its interface (such as the piano, organ or harpsichord), timbres that evolve between the note onset and offset (e.g. stringed and wind instruments) are less well served. MIDI functions such as aftertouch, modulation and pitch bend have gone some way to addressing this problem but still do not offer anything like the expressive potential of many acoustic instruments.
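For readers unfamiliar with the protocol, the limitation is visible in the raw messages themselves. The byte values below are standard MIDI; the snippet is purely illustrative:

# A MIDI note is two discrete events: onset and offset.
note_on  = [0x90, 60, 100]   # status (note-on, channel 1), key, velocity
note_off = [0x80, 60, 0]     # status (note-off, channel 1), key, release velocity

# Everything that happens between them must be squeezed into separate
# controller messages, most with only 7 bits of resolution:
aftertouch = [0xD0, 72]      # channel pressure, value 0-127
mod_wheel  = [0xB0, 1, 90]   # controller 1 (modulation), value 0-127
pitch_bend = [0xE0, 0, 72]   # 14-bit bend split across two data bytes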

Also, regardless of the expressive potential of a particular computer interface, the musician still remains physically detached from the sound source, which resides in the virtual domain. After audition itself, the haptic senses provide the most important means for observing the behaviour of musical instruments [1] but as computers have evolved to prioritise visual stimuli over tactile control, the haptic senses have been left sadly undernourished.

Both tactile (vibrational and textural) and proprioceptive (awareness of one's body state and the forces upon it) cues are vital in combination with aural feedback to create a complex and realistic musical impression [2]. Haptic feedback has also been shown to improve the playability of computer instruments [3], helping the user to gain internal models of the instrument's functionality.

Furthermore, computers release the musician from the 'one gesture, one acoustic event' paradigm [4] which characterises acoustic instruments, alienating the musician's gesture from the sound produced. These factors lead to an overall isolation of the musician from the instrument, exaggerating perceptions of computer instruments as cold, lifeless and lacking in expression. Figure 1 shows a simplified diagrammatical representation of the contrasting input and feedback paths of typical acoustic and computer instruments.

Many attempts have been made to incorporate haptic feedback into computer instrument controllers. Electronic keyboards have been developed to more closely replicate the feel of real pianos [5] and to provide tactile feedback [6]. Haptic feedback bows have been made to simulate the feel and forces involved with playing bowed instruments [7] and finger fitted vibrational devices have been utilised in open air gestural musical instruments [8]. However, haptic control devices have so far been generally restrictive in their potential for application across different computer instruments and inaccessible to the musical masses.

[Figure 1 omitted: three-panel diagram showing the performer, interface and instrument, with gesture, control data and feedback paths between them.]

Figure 1: Diagrammatic representation showing input and feedback paths for a) acoustic instruments, b) typical computer instruments and c) the proposed computer instrument incorporating tactile feedback (i.e. Cymatic).

2. CYMATIC: AN OVERVIEW

Cymatic is a new computer-based musical instrument under development which attempts to address the problems described above. Taking inspiration from sound synthesis environments such as TAO [9], Mosaic [10] and CORDIS-ANIMA [11], Cymatic makes use of a mass-spring physical modelling paradigm, which is implemented as a library of individual components in 1, 2, 3 or even more dimensions, providing strings, membranes, volumes and structures of higher dimensionality that could never be physically realised. These structures can be joined together to form a complex instrument by arbitrarily connecting individual components. The instrument can be excited at any desired mass in the model by plucking, bowing, arbitrary synthesised waveforms or via an external audio signal.

The size, shape, tension, mass and damping characteristics of each component can be user-defined and varied dynamically in real time during synthesis, one way of creating an instrument that could never exist in the real physical world.

The resulting sound can be heard by placing any number of virtual microphones at user-defined points on the instrument. Each microphone's output is the sampled displacement of the mass-spring cell to which it is attached.
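The core of such a model can be illustrated with a minimal sketch. The following is not Cymatic's actual implementation: it is a one-dimensional mass-spring string under assumed parameter values, plucked at one mass, with a 'virtual microphone' reading the displacement of another.

import numpy as np

def pluck_string(n_masses=60, tension=4000.0, mass=0.01, damping=0.9995,
                 pluck_point=15, mic_point=40, sr=44100, seconds=1.0):
    """Illustrative 1-D mass-spring string: each interior mass is pulled
    toward its neighbours by spring forces and lightly damped."""
    disp = np.zeros(n_masses)      # displacement of each mass
    vel = np.zeros(n_masses)       # velocity of each mass
    disp[pluck_point] = 1.0        # simple pluck: displace one mass
    dt = 1.0 / sr
    out = np.empty(int(sr * seconds))
    for i in range(len(out)):
        # spring force on each interior mass from its two neighbours
        force = np.zeros(n_masses)
        force[1:-1] = tension * (disp[:-2] - 2.0 * disp[1:-1] + disp[2:])
        vel = damping * (vel + (force / mass) * dt)
        disp = disp + vel * dt
        disp[0] = disp[-1] = 0.0   # fixed ends
        out[i] = disp[mic_point]   # virtual microphone: sampled displacement
    return out

Cymatic's actual engine goes well beyond this basic update loop, supporting bowing and other excitations, membranes and higher-dimensional structures, dynamic parameter changes and multiple microphones.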

Cymatic currently runs on a Silicon Graphics O2 workstation running Irix 6.5. The acoustic output can be produced in real time for a modest-sized structure of around 60 masses (at a 44.1 kHz sampling frequency), or off-line for larger structures. The user has some degree of control over the trade-off between real-time operation and instrument size by selecting the sampling frequency, which can be set at various values between 8 kHz and 48 kHz.
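As a back-of-envelope illustration of this trade-off (the compute budget below is an assumption derived from the quoted figures, not a measured cost):

# Every mass-spring cell must be updated once per output sample, so the
# work grows with (masses x sampling rate).
masses, sr = 60, 44100
updates_per_second = masses * sr             # ~2.65 million cell updates/s

# Holding that budget fixed, lowering the sampling rate buys more masses:
for rate in (8000, 22050, 44100, 48000):
    print(rate, updates_per_second // rate)  # max masses at that rate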

Cymatic can display real-time OpenGL animations of the resonating structures. These animations are a useful form of visual feedback which, when combined with auditory and haptic feedback, creates a complex and realistic impression of working with real physical instruments. Animations also provide useful feedback on the authenticity of the physically modelled excitations and can be used to demonstrate acoustic principles visually.
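Cymatic's animations are OpenGL-based; purely to illustrate the idea of redrawing the displacement field once per frame, here is a minimal stand-in using matplotlib, animating a synthetic superposition of two string modes (all values are illustrative):

import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

n = 60                                   # masses along an illustrative string
x = np.arange(n)
fig, ax = plt.subplots()
line, = ax.plot(x, np.zeros(n))
ax.set_ylim(-1.5, 1.5)
ax.set_xlabel("mass index")
ax.set_ylabel("displacement")

def frame(t):
    # superpose two standing-wave modes oscillating at different rates
    disp = (np.sin(np.pi * x / (n - 1)) * np.cos(0.2 * t)
            + 0.5 * np.sin(2 * np.pi * x / (n - 1)) * np.cos(0.45 * t))
    line.set_ydata(disp)
    return (line,)

anim = FuncAnimation(fig, frame, frames=400, interval=30, blit=True)
plt.show()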

3. THE CONTROL INTERFACE

MIDI controller inputs offer real-time control of Cymatic, allowing variation of each of the physical parameters of the instrument, i.e. mass, tension, damping, excitation (force, velocity and excitation point) and virtual microphone point. Any device capable of transmitting MIDI controller messages can be employed to control Cymatic. Currently, multi-parametric MIDI input is provided via a Microsoft SideWinder Force Feedback Pro joystick [12] and a Logitech iFeel mouse [13]. The combination of these controllers not only allows simultaneous control of up to six physical parameters but also provides tactile and proprioceptive feedback to the user. The iFeel mouse is a conventional optical mouse containing a vibrotactile device that can create numerous tactile sensations. The Microsoft joystick can output up to six forces simultaneously, enabling the programmer to create realistic force effects such as vibration, recoil, damping and friction that characterise real instruments.

Both control devices are interfaced with a Windows PC (as they are not SGI-compatible), where their movement data is converted by software to MIDI in order to remotely control Cymatic. The iFeel mouse produces tactile sensations using Immersion's software (Immersion TouchSense Entertainment [14]), which converts Cymatic's audio output directly into frequency- and amplitude-related tactile sensations. The haptic capabilities of the Microsoft joystick can be programmed via MIDI: Cymatic feeds the appropriate haptic commands to the joystick while the instrument is playing in order to recreate the feel and tactile response of a real instrument.
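The general approach of deriving haptic commands from the synthesis output can be sketched as follows. The envelope and peak-frequency analysis is generic signal processing; the controller numbers and the send_midi_cc function are hypothetical stand-ins, not Immersion's or Microsoft's actual interfaces.

import numpy as np

def haptic_frame(audio_block, sr=44100):
    """Reduce one block of synthesis output to an (amplitude, frequency)
    pair suitable for driving a vibrotactile effect."""
    amplitude = float(np.sqrt(np.mean(audio_block ** 2)))   # RMS level
    spectrum = np.abs(np.fft.rfft(audio_block))
    peak_bin = int(np.argmax(spectrum[1:])) + 1             # skip DC
    frequency = peak_bin * sr / len(audio_block)            # dominant Hz
    return amplitude, frequency

def send_vibration(amplitude, frequency, send_midi_cc):
    """Map the pair onto two 7-bit controllers (hypothetical numbering)."""
    level = min(127, int(amplitude * 127))
    pitch = min(127, int(frequency / 4))       # crude Hz -> 0-127 squeeze
    send_midi_cc(controller=20, value=level)   # hypothetical: vibration depth
    send_midi_cc(controller=21, value=pitch)   # hypothetical: vibration rate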

The effect of each controller is user-defined, so the control interface can be configured to suit the instrument and excitation that Cymatic is running. For example, the velocity of the mouse could be used to control the bow velocity of a Cymatic bowed string instrument while the joystick controls the force of the bow, the excitation point, the tension and the mass of the string. Vibrational forces corresponding to the frequency and amplitude of the sound output would also be felt by both hands, and forces corresponding to the bow force and position could be felt by the joystick hand. The joystick can also be used as a virtual drum stick for exciting a Cymatic membrane instrument, with the velocity of the y-axis mapped to the force of the drum stick. Recoil forces from the joystick allow the virtual membrane to be felt by the user.
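Such a mapping layer amounts to a table from controller inputs to instrument parameters. The sketch below expresses the bowed-string example in this form; the axis and parameter names are illustrative assumptions, not Cymatic's actual configuration syntax.

# Hypothetical mapping table for the bowed-string example in the text:
# each entry pairs a controller input with the physical parameter it drives.
bowed_string_mapping = {
    "mouse.velocity":  "bow.velocity",
    "joystick.x":      "bow.force",
    "joystick.y":      "bow.excitation_point",
    "joystick.slider": "string.tension",
    "joystick.twist":  "string.mass",
}

def apply_mapping(inputs, instrument, mapping=bowed_string_mapping):
    """Copy each live controller value onto its target parameter."""
    for source, target in mapping.items():
        if source in inputs:
            instrument[target] = inputs[source]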

Cymatic’s real-time audio input also offers interesting creative possibilities. An audio input from any sound source can be used to excite a virtual instrument while the joystick and mouse manipulate its physical characteristics.

4. CONCLUSIONS AND FUTURE WORK

Cymatic is not intended to accurately model real instruments; rather, its intention is to allow the musician to intuitively and physically interact with new instruments which may not be practical or possible to realise in the physical world. Its tactile and gestural interfaces reconnect the haptic feedback path that is missing from most computer instruments, immersing the musician more fully in the playing process and enhancing the perception of manipulating actual physical instruments. It is intended that the synthesis method and control interface will make Cymatic a warmer, more organic and more expressive instrument than conventional computer-based instruments. Empirical research will be carried out in the near future to guide Cymatic's development towards these goals.

Currently, Cymatic's only limiting factor is processing power. Due to the heavy demands of the mass-spring physical modelling technique, Cymatic is currently restricted to running a maximum of around 60 mass-spring cells in real time at 44.1 kHz (though many more at lower sampling rates). Work is underway to adapt the code to run on an 8-node Origin parallel computer, which should increase the speed of the model by up to 16 times, leaving the musician's imagination as the main limit on Cymatic's complexity and creative capacity.

5. ACKNOWLEDGEMENTS

The authors thank the Engineering and Physical Sciences Research Council for its support of this work under grant number GR/M94137.

6. REFERENCES

[1] Cook, P.R. (1999). Music, Cognition, and Computerized Sound: An Introduction to Psychoacoustics. MIT Press, London. p. 229.

[2] MacLean, K.E. (2000). Designing With Haptic Feedback. www.cs.ubc.ca/~maclean/publics/icra00-DesignWithHaptic-reprint.PDF

[3] O'Modhrain, M.S. and Chafe, C. (2000). Incorporating Haptic Feedback Into Interfaces For Music Applications. Proceedings of ISORA, World Automation Conference 2000.

[4] Wessel, D. and Wright, M. (2000). Problems and Prospects for Intimate Musical Control of Computers. cnmat.cnmat.berkeley.edu/Research/CHI2000/wessel.pdf

[5] Gillespie, B. (1992). Proc. ICMC, San Jose, CA. pp. 447-448.

[6] Cadoz, C., Luciani, A. and Florens, J.L. (1984). Responsive Input Devices and Sound Synthesis by Simulation of Instrumental Mechanisms: The Cordis System. Computer Music Journal 8(3): 60-73.

[7] Nichols, C. (2001). The vBow: Haptic Feedback and Sound Synthesis of a Virtual Violin Bow Controller. http://charlesnichols.com/vBow.html

[8] Rovan, J. (2000). Typology of Tactile Sounds and their Synthesis in Gesture-Driven Computer Music Performance. In: Wanderley, M. and Battier, M. (eds), Trends in Gestural Control of Music. Editions IRCAM, Paris.

[9] Pearson, M. and Howard, D.M. (1995). A Musician's Approach to Physical Modelling. Proc. ICMC, pp. 578-580.

[10] Morrison, J.D. and Adrien, J.-M. (1993). MOSAIC: A Framework for Modal Synthesis. Computer Music Journal 17(1): 45-56.

[11] Cadoz, C., Luciani, A. and Florens, J.L. (1993). CORDIS-ANIMA: A Modelling System for Sound and Image Synthesis, the General Formalism. Computer Music Journal 17(1): 19-29.

[12] http://www.microsoft.com/products/hardware/sidewinder/forcefeedback/default.htm

[13] http://www.logitech.com

[14] http://www.immersion.com

Stuart Rimell: [email protected]
David Howard: [email protected]
Andy Hunt: [email protected]
Ross Kirk: [email protected]
Andy Tyrrell: [email protected]

All authors are affiliated to the Music Technology Research Group, Department of Electronics, University of York, York YO10 5DD, UK.
