In broad terms, AriVibes is designed as a portable augmented reality system. As such, it “adds virtual information to a user’s sensory perception by merging real images, sounds, and haptic sensation with virtual ones.”
The core design goal of AriVibes is to offer a portable, self-contained musical augmenter that enables a wide range of users to use any object as a musical instrument. While remaining accessible to the general public as well as to amateur and professional musicians, it should offer each user the possibility of building a wide and personal vocabulary. In other words, it should have a “low entry fee” (in every sense of the term) together with “no ceiling on virtuosity”.
Below is an analysis of the app’s core design goals.
The app must musically augment an object’s sound as the user plays it. Because opinions on which sounds count as musically viable inevitably vary, the app will offer a set of presets (i.e. prefabricated augmentation parameters) that a user can select, augment through, and take inspiration from when building their own. A user must have enough control over the augmentation’s parameters to obtain a personally satisfying timbre from the object played.
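As a minimal sketch of this idea, a preset can be modeled as an immutable bundle of augmentation parameters that a user copies and tweaks to derive a personal variant. The parameter names below (pitch shift, reverb mix, attack time) are illustrative assumptions, not the app’s actual parameter set:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Preset:
    """A prefabricated set of augmentation parameters (hypothetical fields)."""
    name: str
    pitch_shift: float = 0.0   # semitones added to the detected pitch
    reverb_mix: float = 0.2    # 0.0 (fully dry) .. 1.0 (fully wet)
    attack_ms: float = 5.0     # envelope attack time in milliseconds

# A factory preset the user starts from...
PERCUSSIVE = Preset(name="Percussive", attack_ms=1.0, reverb_mix=0.05)

# ...and a personal variant derived from it: only the changed
# fields are overridden, the rest carry over from the original.
my_preset = replace(PERCUSSIVE, name="My kit", reverb_mix=0.3)
```

Making presets immutable and deriving variants by copy keeps the factory presets intact while still letting each user build a personal vocabulary on top of them.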
The augmentation must be expressively controllable. Changing its parameters shall allow the user to create perceptible expressive variations and contrasts in the timbres produced, enabling the construction of musical phrases. Among the essential properties of sound that should be controllable are, as Paine, Stevenson, and Pierce suggest in their requirements capture for a new digital musical instrument: timbre, dynamics, pitch, vibrato, articulation, and note envelope.
The repetition of musical phrases is common in music, and serves as the bedrock of many musical pieces. Looping users’ sounds will facilitate improvisation and composition: one can record and loop a beat made with a preset that augments percussively, then overlay sounds with a pronounced bass, and so on.
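The record-and-overlay behavior described above can be sketched as a fixed-length overdub looper: each pass through the buffer mixes new input onto whatever was recorded on previous passes. This is a generic looper sketch, not AriVibes’ actual audio engine:

```python
class Looper:
    """Fixed-length loop buffer: each pass overdubs onto earlier passes."""

    def __init__(self, length):
        self.buffer = [0.0] * length  # one loop's worth of samples
        self.pos = 0                  # current playhead position

    def process(self, sample):
        # Mix the incoming sample into the loop, output the running mix,
        # then advance the playhead (wrapping at the loop boundary).
        self.buffer[self.pos] += sample
        out = self.buffer[self.pos]
        self.pos = (self.pos + 1) % len(self.buffer)
        return out

loop = Looper(4)
first = [loop.process(s) for s in [1.0, 0.0, 0.0, 0.0]]   # record a beat
second = [loop.process(s) for s in [0.0, 0.0, 0.5, 0.0]]  # overdub a layer
# On the second pass, the output contains both layers.
```

Real loopers add quantization, undo, and level control per layer, but the core mechanism, summing new input into a circular buffer, is this simple.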
Latency, in an audio context, is the delay between the time an audio signal enters a system and the time it emerges from it. The importance of avoiding perceptible latency cannot be overstated. Digital musician Tim Perkis eloquently explains this concern: “there does need to be an appealing kinesthetic quality, the sense that the sounds, like the sounds made by an acoustic instrument, are somehow a trace of a plausible physical process”. Echoing this concern, Paine and his team found that “successful instruments have a direct, discernable relationship between the excitation moment and the difference that gesture has on the sound”. Digital instrument designers strive for latencies under 5 ms. Latency is inevitable in DSP, but for the purposes of this project it is a concern only when it is perceptible.
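To make the 5 ms budget concrete: the audio buffer alone contributes a delay of buffer length divided by sample rate, before any processing or output latency is added. A quick calculation under typical assumptions (44.1 kHz sample rate, common mobile buffer sizes):

```python
def buffer_latency_ms(frames, sample_rate):
    """Delay contributed by one audio buffer, in milliseconds."""
    return 1000.0 * frames / sample_rate

# At 44.1 kHz, a 256-frame buffer alone costs about 5.8 ms, already
# over the ~5 ms target; 128 frames brings it down to about 2.9 ms,
# leaving headroom for processing and output latency.
large = buffer_latency_ms(256, 44100)
small = buffer_latency_ms(128, 44100)
```

This is why low-latency audio stacks push for the smallest buffer the hardware can sustain without dropouts: the buffer size sets a hard floor under the total latency.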
The app must seamlessly meld with any object in a user’s environment, and must be portable so that a user can augment any object whenever musical inspiration strikes. This also means that it should be self-contained, requiring no external peripherals such as audio interfaces or speakers.
Signal processing environments such as Max/MSP and Pure Data are second nature to few people outside the NIME community, DSP practitioners, and experimental musicians. Their power and versatility are offset by a very steep learning curve and, in Max/MSP’s case, a financial cost that neither musicians nor the general public are willing to incur.
Hence the requirement that the app be readily usable by people with no specialized knowledge of DSP theory, tools, or practices.
New digital instruments made by Korg reflect this concern: little to no DSP knowledge is required to operate them, and their user interfaces are noted for their user-friendliness. But they tend to command high prices against which AriVibes must compete, and therefore the app and the device it runs on must not have a prohibitive cost.
A player of AriVibes should be able to take part in collective music-making. The app shall therefore produce sound loud enough to be heard over drums, bass, and other instruments. At the same time, the sensors should pick up every nuance of the performer’s gestures without letting fellow musicians’ acoustic output undesirably bleed in.
While AriVibes may be a toy to many of its users, it remains a musical instrument at heart and should be usable for performing music. Musicians perform on instruments in front of audiences. Technical failures of equipment horribly mar performances and give computers a bad name. Successful instruments are dependable. AriVibes must not crash during performance. Period.