Using contemporary multi-touch devices such as Apple's iPad as musical instruments requires an interface designed around the device's lack of kinesthetic feedback. To form a conclusive instrument, this interface must then be linked directly to the sound generation to create an intuitive feel of control. The Orphion uses virtual pads in configurable layouts that represent individual voices. Pitch and timbre of each voice depend on the initial point of touch, the size and variation in size of the touch point, and the movement of the finger after the initial touch. These parameters control a physical model for sound generation.
The development of musical instruments over the last centuries has mainly aimed at improving sound generation rather than at finding new interfaces. For electronic instruments in particular, apart from exceptional inventions like the Theremin in 1920, the most commonly used interface is still the piano keyboard, essentially unchanged since the 18th century.
For the Orphion, the goal was to develop an instrument that deliberately departs from that concept and takes advantage of the haptic and technological possibilities of multi-touch devices; its sound should be a direct representation of the actions the fingers perform on the pads. My aim was also to create an instrument that feels “natural” to someone familiar with the behavior of existing acoustic instruments such as drums and string instruments.
Interfaces for musical instruments
Acoustic instruments have a natural coupling between interface and sound generation, defined by the materials used in their construction. For most electronic instruments these two components are separated. The requirements for the interface of any kind of musical instrument, however, can be stated generally:
- allow virtuosity/expression
- enable intuitive playing
- be traceable (for the audience)
- be predictable (for the player)
- give feedback (acoustic/visual)
Instrument vs. Controller
Since the mechanical structure and enclosure materials of electronic instruments normally do not contribute to their sound while played (this is especially true for software-based instruments), these instruments need a strong logical link between the action of the player and the generated sound. This strong link between a very specific interface and the sound generation is what defines the structure as an instrument.
Alternatively, the interface may be an open structure with different functions or sounds for different situations, and can then be seen as a controller rather than an instrument. Such an interface can work very well in a variety of situations, but the properties of an instrument listed above are hard or impossible to achieve. Additional layers of complexity arise if the interface does not control the sound generation directly but drives, for example, a sequencer. Such a “sequencing instrument” points in a new direction but also marks the strongest possible departure from a traditional musical instrument. For my purposes I decided to focus on the idea of a strong coupling between interface and sound generation.
How does a touchscreen sound, and which gestures allow expressive playing while keeping control? Most touchscreen applications use the finger or stylus as a replacement for the mouse to control knobs and buttons. Since multi-touch became established, new gestures have appeared, for example pinching two fingers to resize objects, but expressing musical ideas requires more specific gestures and input models.
Finding a logical interaction model, and thus a suggested way of playing, is the central task in developing a new instrument. The Orphion should allow polyphonic playing of defined pitches with different articulations (staccato, legato) and timbres for each individual voice. When developing the Orphion I had to take into account:
- haptic properties of touchscreens (size and tactile or kinesthetic ways of interaction)
- musical playability (recognition of initial touchpoint and matching of pitches), musical expression (dynamics, intonation, vibrato, timbre)
- intuitive and natural feel
- technical possibilities (precision of control data, processing power)
As guiding models for the behavior of the Orphion I looked at two types of instruments: Drums and string instruments.
- drums: a round playing area with different timbres; release time and damping depend on the velocity and duration of the touch
- string instruments: multiple individually tuned strings, the ability to sound the tuned pitch by tapping the strings, and control of tone and articulation during the sustain phase (intonation/vibrato, damping)
Figure 1 shows how different parameters connect the interface to the sound generation:
Figure 1. Simplified diagram of interface and sound generation
Interface and sound
The interface of the Orphion presents virtual pads, which can sound either plucked like a guitar string or closer to a slap on a conga drum, depending on the size of the touch point. The timbre changes when a pad is hit closer to the rim, as on a real drum, and its pitch is a function of the distance from the initial touch point, modeling something close to pulling a string (the pitch range varies with the size of the touch point). Every parameter is controlled by a single finger per note. The iPad currently supports up to eleven touch points; the internal polyphony of the instrument, however, is defined by the number of pads present on the touch screen.
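As a rough illustration of this mapping, the following sketch derives per-voice parameters from a single touch on a circular pad. All names, ranges, and scaling factors are assumptions for illustration, not the Orphion's actual code:

```python
import math

def voice_params(pad_radius, touch_x, touch_y, touch_size,
                 base_pitch, max_bend_semitones=1.0):
    """Map one touch on a pad (centered at 0, 0) to synthesis parameters.

    touch_size is assumed normalized to 0..1; pitches are MIDI note numbers.
    """
    # Distance from the pad center, normalized to 1.0 at the rim.
    dist = min(math.hypot(touch_x, touch_y) / pad_radius, 1.0)
    # Assumption: a larger touch point widens the available bend range.
    bend_range = max_bend_semitones * touch_size
    pitch = base_pitch + bend_range * dist   # slight detuning toward the rim
    brightness = 1.0 - 0.5 * dist            # timbre darkens toward the rim
    damping = touch_size                     # bigger touch = more damping
    return {"pitch": pitch, "brightness": brightness, "damping": damping}
```

A touch dead-center leaves the pitch at the pad's base note; sliding the finger outward bends it by up to `bend_range` semitones while dulling the tone.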
Visually, the interface is a defined set of pads drawn as circles of variable size and position. Different layouts allow the instrument to be adapted to multiple musical situations and genres and to match the virtuosity of the player. Layouts arranged with symmetrical intervals along each axis (e.g. fig. 2) let advanced musicians find new harmonic structures, while pad layouts with only pentatonic tone material or other simplified musical concepts (e.g. fig. 3) also make the instrument interesting for beginners. Layouts with fewer pads can give the feel of a percussion instrument played with a fixed assignment between finger and pad (e.g. fig. 4).
Figure 2. Symmetrical major 3rds horizontally,
minor 3rds and semitones vertically, 4ths and 5ths diagonally
Figure 4. Blues-scale layout
Figure 5. Five-finger layout, e.g. as tuned bass drums
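The interval grid of fig. 2 can be sketched as follows (a hypothetical reconstruction assuming MIDI note numbers, with +4 semitones per column and +3 per row, so that one diagonal yields fifths (+7) and the other semitones (+1); the actual Orphion spacing may differ):

```python
def pad_grid(rows, cols, origin=60):
    """Build a grid of MIDI pitches: each step right adds a major 3rd
    (+4 semitones), each step up adds a minor 3rd (+3 semitones).
    Diagonals then move by a 5th (+7) or a semitone (+1)."""
    return [[origin + 4 * c + 3 * r for c in range(cols)] for r in range(rows)]
```

Because the intervals are the same everywhere on the grid, any chord or melodic shape keeps its fingering when transposed, which is what makes such isomorphic layouts attractive for finding new harmonic structures.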
The sound synthesis is based on a physical model that simulates a string (the Karplus-Strong algorithm). I use a combination of a pulse of filtered noise and a sustained excitation sound created by a two-operator FM synthesis structure. The low-pass filtering of the feedback path is controlled in real time for lively articulation of the sound after the initial touch. This complex excitation model allows a wide spectrum of sounds, from gently plucked strings to xylophone-like hits or the damped attack of muted drums. As long as a finger touches the surface of a pad, its distance from the pad's center controls slight detuning (intonation) and a variation of timbre towards the rim.
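The core of the Karplus-Strong algorithm can be illustrated with a minimal sketch (plain noise-burst excitation only; the Orphion's actual implementation adds the FM excitation and real-time filter control described above):

```python
import random

def karplus_strong(freq, sample_rate=44100, duration=0.5, decay=0.996):
    """Minimal Karplus-Strong string: a noise burst circulates in a delay
    line, low-pass filtered (two-sample average) and attenuated each pass."""
    n = max(2, int(sample_rate / freq))   # delay-line length sets the pitch
    line = [random.uniform(-1.0, 1.0) for _ in range(n)]  # noise excitation
    samples = []
    for _ in range(int(sample_rate * duration)):
        first = line.pop(0)
        # Averaging adjacent samples is a simple low-pass filter; `decay`
        # scales how fast the tone dies away.
        line.append(decay * 0.5 * (first + line[0]))
        samples.append(first)
    return samples
```

Each pass through the loop filters the signal a little more, so high partials fade first and the tone mellows as it decays, much like a plucked string.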
We achieve a very natural feel by dynamically adjusting parameters within a single synthesis model, rather than by switching between different models for different playing situations, as would be the case with sample-based instruments.
The visual representation of the pads on the iPad is straightforward and functional: a pad is an outlined blue circle with its note name written in the center. When a circle is touched, it fills with a color ranging from red to yellow depending on the touch size, indicating the amount of damping.
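The fill color can be pictured as a simple linear interpolation; the RGB endpoints and the direction of the mapping (small touch = red) are assumptions for illustration:

```python
def fill_color(touch_size):
    """Interpolate the pad fill from red to yellow as the touch grows.

    touch_size is assumed normalized to 0..1; returns an (r, g, b) tuple.
    The direction of the mapping is an assumption for illustration.
    """
    t = max(0.0, min(1.0, touch_size))
    return (255, int(255 * t), 0)  # red -> yellow by raising the green channel
```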