Realtime Audio on iOS Tutorial: Making a Mandolin

Update

As the MoMu toolkit has become outdated, I have rewritten this tutorial using The Amazing Audio Engine as the audio engine and the latest Synthesis Toolkit in C++. Please visit the updated tutorial here:

Wouldn’t it be great to pull an all-powerful musical instrument out of your pocket whenever inspiration strikes? This is what attracts musically minded programmers to the iPhone, as it has all the computing power needed to become that instrument. We soon find out that getting the thing to produce a sound isn’t difficult: just insert an audio file into your project and tell an AVAudioPlayer object to play it. Five lines of code, at most.

But relying on premade audio files as the sound sources for your mobile instrument has serious disadvantages. Firstly, you don’t have the flexibility to control the sound in realtime: you can’t, for instance, change its pitch or put a reverb on it while it plays. Modifying the audio file in the background takes time, meaning your users will have to put up with latency between their input and the resulting sound. There are also performance issues, as good-quality audio files take up a lot of disk space and memory.

What you want is for your app to generate and control its own sounds, and for that you need to be able to process the audio at the sample level. But it turns out that setting up sample-level access on iOS takes a considerable amount of time (and an in-depth knowledge of the mushrooms Apple engineers were eating when they came up with the API for the iPhone’s audio hardware).

This tutorial will show you how to access, generate, and control audio samples on an iOS app, using two freely available open-source libraries, one that will set up our low-latency Audio Session (the MoMu Toolkit), and another that will generate (i.e. synthesize) the sounds (the Synthesis Toolkit in C++).

Setting up the Xcode Project

1)   Open up Xcode and create a new project.

2)   In the column on the left, under the heading “iOS”, pick “Application”, and then choose “Single View Application”. Click “Next”.

3)   In the next screen:

  1. Enter “Mandolin” as the “Product Name”
  2. “Company Identifier” can be your name.
  3. No need to enter something for “Class Prefix”
  4. Pick “iPhone” as “Device Family”.
  5. Tick “Use Automatic Reference Counting”.

4)   Click “Next” and save the project somewhere dear to you.

5)   We need to link iOS’s audio framework for the MoMu code to work. Click on the new project’s icon (top left), scroll down to where it says “Linked Frameworks and Libraries”, and click the ‘+’ button.

6)   Click on AudioToolbox.framework and then “Add”.

7)   The MoMu Toolkit is a hybrid of C, C++, and Objective-C (also known as Objective-C++), so we need to tell Xcode that our project’s source files are in that language too. To do this, rename AppDelegate.m to AppDelegate.mm and rename ViewController.m to ViewController.mm.

8)   Download the MoMu Toolkit from http://momu.stanford.edu/toolkit/. Unzip it.

9)   Drag the following files from the folder you’ve unzipped into the Xcode project.

  1. momu.h
  2. mo_def.h
  3. mo_audio.mm
  4. mo_audio.h

10)   Download the MoMu release of the Synthesis Toolkit from http://momu.stanford.edu/stk/. Unzip it. Drag the whole MoMu-STK-1.0.0 folder into your Xcode project and tick “Copy items into destination group’s folder”.

11)   Open ViewController.h, under #import <UIKit/UIKit.h> write

#import <AudioToolbox/AudioToolbox.h>
 
#include "Stk.h"
#include "Mandolin.h"

12)   Open ViewController.mm, under #import “ViewController.h” write

#import "mo_audio.h"

13)   Build (Command + B) and make sure the build succeeds. If the build fails, try cleaning (Command + Shift + K) and building again. (The STK files are likely to generate a series of compiler warnings, but it’s safe to ignore them.)

Setting up the Audio Callback

14)   Our app will make use of the Remote I/O Audio Unit, a native iOS audio plugin that allows realtime audio input and output, as well as access to its render callback, i.e. the function that feeds the actual audio samples to the device’s headphone jack.
In ViewController.h, under #include “Mandolin.h” type the following:

using namespace stk;
struct AudioData{
	Mandolin *myMandolin;
};

Then add an opening curly brace (“{”) after the line @interface ViewController : UIViewController

Skip a line and add:

struct AudioData audioData;

Then close the brace (“}”).

We’ll want a button that strikes the mandolin, so let’s get a method ready for that: type

-(IBAction)pluckMyMandolin;

after the closing brace and before @end.

Your ViewController.h file should now look like this:

#import <UIKit/UIKit.h>
#import <AudioToolbox/AudioToolbox.h>
 
#include "Stk.h"
#include "Mandolin.h"
 
using namespace stk;
struct AudioData{
    Mandolin *myMandolin;
};
 
@interface ViewController : UIViewController{
    struct AudioData audioData;
}
-(IBAction)pluckMyMandolin;
 
@end

The AudioData struct will contain all the sound-generating or sound-modifying objects active in the callback function. For now, it contains a Mandolin object that will generate the mandolin sounds.
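If you later want effects in the chain, you can grow this struct and tick each new member inside the callback. A hypothetical extension (PitShift is the STK pitch shifter; the member name is illustrative):

struct AudioData{
    Mandolin *myMandolin;
    PitShift *pitShift;   // hypothetical extra effect; needs #include "PitShift.h"
};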

15)   Time to set up MoMu. MoMu needs us to define our sample rate, the size of the audio processing buffer, and whether or not we’re dealing with stereo. Go to ViewController.mm and type the following under #import “mo_audio.h”:

#define SRATE 44100      // sample rate, in Hz
#define FRAMESIZE 128    // frames per audio buffer
#define NUMCHANNELS 2    // stereo
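A quick sanity check on latency: each buffer holds 128 frames at 44100 frames per second, so every callback covers 128 / 44100 ≈ 2.9 milliseconds of audio. That’s the kind of headroom that keeps the mandolin feeling responsive under your fingers.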

16)   And just below that, paste the declaration of our Audio Callback function:

void audioCallback( Float32 * buffer, UInt32 framesize, void* userData )
{}

17)   Now let’s create our mandolin and fire up MoMu. In your viewDidLoad method, type the following:

audioData.myMandolin = new Mandolin(400);   // 400 Hz = lowest frequency the instrument needs to play
audioData.myMandolin->setFrequency(400);    // tune it to 400 Hz
 
// init audio
NSLog(@"Initializing Audio");
 
// init the MoAudio layer
bool result = MoAudio::init(SRATE, FRAMESIZE, NUMCHANNELS);
 
if (!result)
{
    NSLog(@"cannot initialize real-time audio!");
    return;
}
 
// start the audio layer, registering a callback method
result = MoAudio::start( audioCallback, &audioData);
if (!result)
{
    NSLog(@"cannot start real-time audio!");
    return;
}
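One housekeeping note: Automatic Reference Counting manages Objective-C objects, not C++ objects created with new, so the Mandolin we just allocated should eventually be freed by hand. A minimal sketch, assuming you add it to this view controller:

- (void)dealloc
{
    // ARC won't delete C++ heap objects for us
    delete audioData.myMandolin;
    audioData.myMandolin = NULL;
}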

18)   Now that MoMu has set up our audio processing and we have the Mandolin object ready, let’s fill in the callback function we declared above:

void audioCallback( Float32 * buffer, UInt32 framesize, void* userData)
{
    AudioData * data = (AudioData*) userData;
 
    for(int i=0; i<framesize; i++)
    {
        // pull the next sample out of the mandolin model
        SAMPLE out = data->myMandolin->tick();
 
        // write it to the left and right channels of the interleaved buffer
        buffer[2*i] = buffer[2*i+1] = out;
    }
}

This function is our render callback. It runs whenever the hardware needs a fresh buffer, roughly 44100 / 128 ≈ 345 times a second at our settings, and fills that buffer with samples in realtime. The magic happens inside the for loop: we assign the output of the mandolin to the samples that will be sent to the output (our headphones). We’ll thus hear silence while the mandolin sits untouched, and the mandolin’s sound when it’s plucked.
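Because every sample now passes through our hands, simple realtime tweaks are one line away. For example, a crude master volume, sketched as a drop-in change to the loop above (the gain value is purely illustrative):

// inside the for loop of audioCallback:
const SAMPLE gain = 0.5;                        // illustrative: halve the output level
SAMPLE out = gain * data->myMandolin->tick();   // scale every sample before output
buffer[2*i] = buffer[2*i+1] = out;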

19)   It’s now time to pluck the mandolin! We’ll strike it by pressing a button we’ll insert in our nib file or storyboard, so let’s define the code for that IBAction:

-(IBAction)pluckMyMandolin{
    audioData.myMandolin->pluck(1);
}


We’re calling the function pluck() on our instance of the STK’s Mandolin. As you can see from its documentation, this method takes at least one parameter, in our case a float defining the amplitude with which we want to pluck the strings. Get familiar with the docs for the STK classes if you want to be a good instrument-maker, son.
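For instance, a gentler strum is just a smaller amplitude (the value here is illustrative):

audioData.myMandolin->pluck(0.4);   // softer than pluck(1)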

20)  Now open ViewController.xib (or MainStoryBoard.storyboard), drag a simple Round Rect Button into the view. Ctrl+drag from the button to the File’s Owner, let go, and select the pluckMyMandolin method.

21)  Run the app and pluck away.

If you run the app on the Simulator, the console may, on some older versions of Xcode, print out a long and menacing Error loading /System/Library/Extensions/AudioIPCDriver.kext/Contents/Resources/AudioIPCPlugIn.bundle/Contents/MacOS/AudioIPCPlugIn:. This sometimes happens when the Simulator can’t find a framework that’s only included in iOS (in our case it’s the AudioToolbox framework we imported earlier). The Simulator will then use its Mac-based counterpart, so the mandolin sound should play nonetheless.


Thanks to Jieun Oh for her kind help.

Has this tutorial been helpful? Please leave a comment below (send suggestions for improvements to ariel@arivibes.com)

21 Responses so far.

  1. Kevin Irlen says:

    Anyone else notice the mandolin doesn’t seem to be in tune? E.g. new Mandolin(440) yields tones that are, according to my functioning instrument tuners, many cents flat. Is there some calibration required?

    • Ariel says:

      Indeed, you also have to call audioData.myMandolin->setFrequency(400); after initializing it. Also look at the parameters you have for audioData.myMandolin->setDetune().
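      For instance (the values are illustrative; check the Mandolin docs for the exact meaning of each parameter):

      audioData.myMandolin->setFrequency(440.0);  // tune to A4
      audioData.myMandolin->setDetune(1.0);       // detune factor for the doubled strings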

  2. Demi says:

    hey, many thanks for this!

    one question: when it loads, the first thing it does is pluck the mandolin. how can i prevent this? i tried it with other instruments and it’s even worse, because it just runs continuously…

    and another one: do you have any other examples? :)

    cheers,
    demi

    • Ariel says:

      I’ve noticed the initial plucking too, it’s probably due to the constructor of the STK’s Instrmnt class. One workaround is to do data->myMandolin->tick() * someInt, where someInt is initially zero and then set to 1 on the first pluck.
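      A minimal sketch of that workaround (gate is an illustrative name):

      // in AudioData:          int gate;   // start it at 0
      // in the callback:       SAMPLE out = data->myMandolin->tick() * data->gate;
      // in pluckMyMandolin:    audioData.gate = 1;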

  3. Ted says:

    Thank you for the tutorial, a G tone of the mandolin comes out!
    How can we change parameters, duration of the play, etc?

  4. Frank says:

    How would I go about creating a class for another instrument, say a guitar or ukulele? It looks like all I’d need to do would be to create another class derived from PluckTwo but is there any info available as to what the contents to the corresponding raw sound files should be?

  5. Tim Johnson says:

    How would I go about adding an effect to this class? For example, if I wanted to add a pitch shift, how would that interact with the AudioBuffer and what not?

    • Ariel says:

      That’s very simple, but I’ll leave it for another tutorial. Stay posted.

      • Ariel says:

        Ah, whatever, here it is:

        1) Instantiate the effect (after adding a PitShift *pitShift member to AudioData and #including "PitShift.h"):
        audioData.pitShift = new PitShift();
        2) Send each sample through the pitch shifter inside the callback:
        SAMPLE out = data->myMandolin->tick();
        out = data->pitShift->tick(out);

        But wait for the more complete tutorial!

  6. Paul says:

    Some of the methods in the MoMu Toolkit have been deprecated in iOS 7. Could you please update them?

  7. Matt says:

    Thanks for the tutorial! Its really great :)

    Where are you getting the frequencies for the instruments? Is there a resource online?

  8. Paul says:

    amazing tutorial. Could you post a tutorial on something like a basic version of arivibes using an open source engine like “The Amazing Audio Engine” or “Novocaine” ?
