Crosscale: A 3D Virtual Musical Instrument Interface

Marcio Cabral†, Andre Montes†, Gabriel Roque†, Olavo Belloc†, Mario Nagamura†, Regis R. A. Faria^α, Fernando Teubl‡, Celso Kurashima‡, Roseli Lopes†, Marcelo Zuffo†∗

†Polytechnic School - Univ. of São Paulo & ‡Fed. Univ. of ABC & ^αMusic Department / FFCLRP - Univ. of São Paulo
ABSTRACT
Inspired by principles for designing musical instruments, we implemented a new 3D virtual instrument with a particular mapping of touchable virtual spheres to notes and chords of a given musical scale. The objects are metaphors for note keys organized in multiple lines, forming a playable spatial instrument where the player can perform certain sequences of notes and chords across the scale employing short gestures that minimize jump distances. The idea of different arrangements of notes over the playable space is not new, having been pursued in alternative keyboards, for instance. Our implementation employs an Oculus Rift and a Razer Hydra for gesture input and showed that customization of instrumental mappings using 3D tools can ease the performance of complex songs by allowing fast execution of specific note combinations.

Index Terms: I.3.6 [Computer Graphics]: Methodology and Techniques—Interaction Techniques
1 INTRODUCTION
One of the main challenges when learning to play a musical instrument is to master the mapping between interface patterns and the intended musical sounds. A clear and intuitive mapping is likely to accelerate learning in most situations, though it does not imply ease of playing. The piano, for instance, has a clear and straightforward layout but is still one of the most difficult instruments to master [3], owing both to its limitations and to its complexity in producing sound combinations. The mechanical implementation and spatial arrangement of a mapping influence how easy and feasible it is to perform a given chord or scale, as do the limitations of one's body and the gesture range associated with the instrument itself.

Departing from the expected role of traditional instruments, a novel instrument - particularly a virtual one - must provide a way to perform single isolated notes, chords, modulation and expression. Usually notes are obtained by pressing keys (e.g. piano and organ keyboards), plucking or bowing strings (e.g. acoustic guitars and violins), or blowing reeds (e.g. harmonicas and flutes). Beyond that, we believe a virtual instrument could solve some of the mechanical constraints and spatial logic challenges involved in playing music in a more direct and intuitive way, freeing cognitive resources for creativity and musical expression. Mechanical and spatial plasticity can also mitigate physical limitations that restrict which instruments students can learn: individuals with relatively small hands, for instance, cannot fully perform certain compositions on a piano. Considering note sequences, musical expressiveness depends directly on the mechanical execution of several patterns, from long, expressive single notes to fast scale sequences, intervals and arpeggios.
Most current popular and orchestral instruments operate with the tones and semitones of a 12-note scale.

∗ contact email: [email protected]
IEEE Symposium on 3D User Interfaces 2015, 23-24 March, Arles, France. 978-1-4673-6886-5/15/$31.00 ©2015 IEEE
However, many songs use only part of the scale range, which means some notes will rarely or never be used in that song. A scale with more notes than necessary for a specific passage often implies frequent jumping: on a piano keyboard, for instance, spatial jumps across many keys might be necessary, requiring musicians to learn and practice a psychomotor skill in order to accurately play a given song. A virtual instrument, on the other hand, could offer a dynamic note table, allowing the player to choose a given scale and the distance between the notes, thus minimizing jump complexity. We believe this could sharpen the focus on performing the mechanical gestures that access the notes required for a given musical piece, reducing the complexity of playing it. Furthermore, one could learn to play the instrument by learning a specific set of gesture patterns, since most intervals can be mapped onto a fixed position.

For the sake of flexibility and ease of operation, and interested in investigating the potential advantages of allowing the player to choose a scale and the distance between the notes, we propose a virtual music instrument where the notes of a given musical key are arranged and replicated on a grid that minimizes jumps across the scale. Our concept encapsulates flexibility in distributing notes on the playable space, and is likely to deliver a fluid way for musicians to perform specific gesture patterns that would otherwise require non-trivial motor skills for the spatial jumps.
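The idea of a dynamic note table restricted to a chosen scale can be sketched as follows. This is a hypothetical Python illustration of the concept, not the authors' Unity implementation; the note names, the choice of a major scale, and the simplified octave labeling are our assumptions.

```python
# Hypothetical sketch: build a playable note row restricted to a chosen
# scale, so that adjacent positions are always scale members and common
# intervals map to fixed spatial offsets.

CHROMATIC = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
MAJOR_STEPS = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets of a major scale

def scale_notes(root: str, octaves: int = 2) -> list[str]:
    """Return the note names of `octaves` octaves of a major scale.

    The octave suffix is simplified for illustration (it does not
    increment when the scale wraps past B).
    """
    start = CHROMATIC.index(root)
    row = []
    for octave in range(octaves):
        for step in MAJOR_STEPS:
            row.append(CHROMATIC[(start + step) % 12] + str(octave + 4))
    return row

print(scale_notes("C")[:8])  # -> ['C4', 'D4', 'E4', 'F4', 'G4', 'A4', 'B4', 'C5']
```

On such a stripped row, a diatonic third is always two positions away and a fifth always four, regardless of the root; this is why a fixed gesture can cover an interval anywhere in the scale, instead of the variable key spans a piano requires.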
2 RELATED WORK
Fels [4] discusses several strategies for designing new instruments and investigates many practical designs, especially considering intimacy as a guideline for matching device behavior and operation [3]. New interfaces should suit human minds and bodies in order to permit musical expression. Instruments packed with features are easy to build but may be unstable and difficult to learn or control. Chadabe [1], investigating mapping in electronic instruments, observes that interactive instruments allow anyone to participate in a musical process, from the most skilled and talented to the most unskilled of the general public. Wessel [6] claims that new instruments should have a low entry fee with no ceiling on virtuosity, because some players have spare bandwidth and some do not.

Mapping instrument operation to sonic behaviour is at the heart of the effort to learn and play. An ideal mapping should be complex enough to be engaging, while preserving flow and fun [5]. However, conceiving a good mapping is hard because it is a multi-dimensional problem; it is no surprise that would-be smart instruments often turn out not that smart in the end. Instead, one should leverage expert technique for progressive success. Furthermore, total freedom is not a virtue, especially for novices: concerning individual differences and variations, customizations should be controlled, according to Cook [2].
3 SYSTEM DESCRIPTION
The proposed instrument investigates ways to take advantage of virtual mappings to improve playability. Goals include providing a less steep learning curve, easing song execution, and reducing the need for special physical characteristics such as large hands and fingers. The adopted strategy is smart mapping coupled with controlled customization: our scheme matches gesture patterns with musical patterns.

Figure 1: (a) System setup: Oculus Rift for visualization and Razer Hydra for gesture input; (b) first-person view of the instrument's interface: the user sees the instrument floating in front of him while immersed in a living-room scenario.

Musical notes were mapped onto a virtual grid to ease the execution of gesture patterns with fluid movements, minimizing jumping between distant notes. The resulting layout can be pictured as a stack of displaced piano keyboards, stripped of the notes unused by a specific scale. The displacement defines the tuning of the instrument and how easy it is to play certain intervals. To maximize the intervals available at any given note, we propose a specific tuning. This layout allows one to seamlessly play straight sequences (Figure 2 (a)), third and fourth interval sequences (Figure 2 (b)) and alternate (third to seventh) intervals (Figure 2 (d)-(h)). However, the user may choose a different tuning (by sliding the lines), depending on the frequency of specific intervals in a given song. Additionally, the user may add as many lines as desired, limited only by the available space and by the interaction and visualization range of the 3D devices.

Figure 2: Comparison of hand movements for a piano (top row) and our instrument (bottom row). The middle row shows the music sheet.

To support this mapping, notes are represented not by keys but by equally spaced small spheres, which allow deviation and ease the execution of larger intervals without exaggerated spatial jumps. Notes are arranged in mid-air in front of the player, covering approximately 160 degrees of the field of view, so that every note is accessible at arm's reach (Figure 1 (b)).

The second concept is interaction simplicity. Players use both hands to interact with the system, controlling only two virtual cursors (one per hand) with Razer Hydra controllers (magnetic tracking and several buttons); see Figure 1 (a). An HMD (Oculus Rift) provides stereoscopic visualization, integrated in a Unity 3D scene. To play a note, the user hovers the 3D cursor over it and presses the trigger. To play a continuous sequence, the player triggers a note and then traces a path over the following notes, visualizing a short trace; whenever a jump or stop is necessary, one simply releases the trigger. Finally, chords are implemented in 3D space using the same layout, making use of depth and gestures. To play a chord, the user chooses a note and selects one of the six available chord types by moving the hand towards or away from the reference note-board plane and tilting it left or right before pressing the trigger. Currently, the available chords are major, minor, major 7th, minor 7th, augmented and diminished; the intent is to implement at least 12 fundamental chords. Each hand can independently activate chords or single notes, which allows for interactivity similar to a piano: the left hand normally plays chords while the right hand activates notes individually, in sequence.
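The six chord types can be summarized by their semitone intervals above the root. The following is a minimal, hypothetical Python sketch (the authors' system runs in Unity; the use of MIDI note numbers here is our assumption for illustration):

```python
# Hypothetical sketch: map the six chord types mentioned in the text
# to semitone intervals above a root note, using MIDI note numbers.

CHORD_INTERVALS = {
    "major":      (0, 4, 7),
    "minor":      (0, 3, 7),
    "major7":     (0, 4, 7, 11),
    "minor7":     (0, 3, 7, 10),
    "augmented":  (0, 4, 8),
    "diminished": (0, 3, 6),
}

def chord_midi(root: int, chord_type: str) -> list[int]:
    """Return the MIDI note numbers of `chord_type` built on `root`."""
    return [root + i for i in CHORD_INTERVALS[chord_type]]

print(chord_midi(60, "minor7"))  # C minor 7th on middle C -> [60, 63, 67, 70]
```

In the described interaction, the trigger press would resolve the selected sphere to the root and the hand's depth and tilt to one of these table entries, so a single gesture sounds all chord tones at once.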
4 RESULTS AND CONCLUSION
The proposed instrument tackles relevant issues in instrument design and musical performance, focusing on learning speed, accessibility and the joy of playing. The resulting solution was motivated by constraints found in traditional instruments and implemented with off-the-shelf equipment. It improves the match between musical and gestural patterns by restricting the playable notes to a specific user-defined scale, using note redundancy and customizable tuning. This concept is supported by the fact that most popular songs adopt compositional patterns, and some parts do not use all 12 notes of the tempered scale. If an instrument can capture such patterns, execution may be greatly simplified and song structure grasped more easily.

Preliminary tests suggest that the playing advantages and simple interaction outweigh the losses in flexibility and completeness of the current design. Inexperienced subjects reported fast learning and enjoyment while experimenting with the interface. Musicians were able to perform some complex note sequences quickly, and they also enjoyed the 3D trace of the cursor, arguing that it induces better fluency and more gracious gestures. Interaction simplicity also allows seamless porting to other devices (e.g. Kinect). Finally, one could point out that several songs may, for a moment, call for intermittent or persistent modifications of the scale, such as sharps and flats, lowering or raising one or more notes by a half tone; this can easily be handled in our system, for instance with gestures or by exploring depth. Still, the great advantages of virtual instruments over traditional ones lie in flexibility and customization, reducing the load imposed on the player.

ACKNOWLEDGEMENTS
The authors wish to thank Luiz Paulucci and Rose Sacashima for video editing and 2D/3D animation. This work has been partially funded by the Brazilian agencies CNPq, FAPESP and FINEP.

REFERENCES
[1] J. Chadabe. The limitations of mapping as a structural descriptive in electronic instruments. In Proceedings of the 2002 Conference on New Interfaces for Musical Expression. Nat. Univ. Singapore, 2002.
[2] P. R. Cook. Laptop orchestras, robotic drummers, singing machines, and musical kitchenware: Learning programming, algorithms, user interface design, and science through the arts. J. Comput. Sci. Coll., 2012.
[3] S. Fels. Designing for intimacy: Creating new interfaces for musical expression. Proceedings of the IEEE, 92(4):672–685, Apr. 2004.
[4] S. Fels and M. Lyons. Creating new interfaces for musical expression: Introduction to NIME. In ACM SIGGRAPH 2009 Courses, SIGGRAPH '09, New York, NY, USA, 2009. ACM.
[5] A. Hunt, M. M. Wanderley, and M. Paradis. The importance of parameter mapping in electronic instrument design. In NIME '02, Singapore, 2002.
[6] D. Wessel and M. Wright. Problems and prospects for intimate musical control of computers. Comput. Music J., 26(3), Sept. 2002.