Receiving Audio

So far we've covered creating and processing audio, but what if you want to work with the audio coming in from the microphone or another device audio input, or capture the audio coming from your app?

The Amazing Audio Engine supports receiving audio from a number of sources:

  • The device's audio input (the microphone, or an attached compatible audio device).
  • Your app's audio output.
  • One particular channel.
  • A channel group.

To begin receiving audio, you can either create an Objective-C class that implements the AEAudioReceiver protocol:

@interface MyAudioReceiver : NSObject <AEAudioReceiver>
@end

@implementation MyAudioReceiver

static void receiverCallback(__unsafe_unretained MyAudioReceiver *THIS,
                             __unsafe_unretained AEAudioController *audioController,
                             void *source,
                             const AudioTimeStamp *time,
                             UInt32 frames,
                             AudioBufferList *audio) {
    // Do something with 'audio'
}

- (AEAudioReceiverCallback)receiverCallback {
    return receiverCallback;
}

@end

...then create an instance of it:

id<AEAudioReceiver> receiver = [[MyAudioReceiver alloc] init];

...or you can use the AEBlockAudioReceiver class to specify a block to receive audio:

id<AEAudioReceiver> receiver = [AEBlockAudioReceiver audioReceiverWithBlock:
    ^(void *source,
      const AudioTimeStamp *time,
      UInt32 frames,
      AudioBufferList *audio) {
        // Do something with 'audio'
    }];

In both cases, your callback or block will be passed:

  • An opaque identifier indicating the audio source,
  • A timestamp corresponding to the time the audio arrived at the device audio input. This timestamp is automatically offset to account for system latency when AEAudioController's automaticLatencyManagement property is YES (the default). If you disable that setting and latency compensation is important, offset the timestamp yourself by the value returned from AEAudioControllerInputLatency,
  • The number of audio frames available, and
  • An AudioBufferList containing the audio.
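For illustration, a receiver might scan the incoming buffers to compute a peak level for metering. This is a sketch, not part of the library: it assumes the audio controller was set up with a non-interleaved float client format (for example, AEAudioController's nonInterleavedFloatStereoAudioDescription); with other sample formats the buffer layout and sample type differ:

id<AEAudioReceiver> meter = [AEBlockAudioReceiver audioReceiverWithBlock:
    ^(void *source,
      const AudioTimeStamp *time,
      UInt32 frames,
      AudioBufferList *audio) {
        // Find the loudest sample across all buffers
        // (assumes non-interleaved float samples)
        float peak = 0.0f;
        for ( UInt32 bufferIndex = 0; bufferIndex < audio->mNumberBuffers; bufferIndex++ ) {
            float *samples = (float *)audio->mBuffers[bufferIndex].mData;
            for ( UInt32 i = 0; i < frames; i++ ) {
                float magnitude = fabsf(samples[i]);
                if ( magnitude > peak ) peak = magnitude;
            }
        }
        // 'peak' now holds the largest absolute sample value for this buffer.
        // This block runs on the Core Audio thread, so store the value in an
        // atomic variable for the UI rather than touching UIKit here.
    }];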

Then, add the receiver to the source of your choice, using AEAudioController's addInputReceiver: for the device audio input, addOutputReceiver: for your app's audio output, addOutputReceiver:forChannel: for one particular channel, or addOutputReceiver:forChannelGroup: for a channel group.
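For example, a receiver can be attached to the device input or the app's output like so (assuming _audioController is your AEAudioController instance):

// Receive audio from the device input (microphone or attached audio device)
[_audioController addInputReceiver:receiver];

// ...or receive the app's overall audio output instead
[_audioController addOutputReceiver:receiver];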

Playthrough/Audio Monitoring

For some applications it might be necessary to provide audio monitoring, where the audio coming in through the microphone or other device audio input is played out of the speaker.

The AEPlaythroughChannel class, located within the "Modules" directory, takes care of this. It implements both the AEAudioPlayable and AEAudioReceiver protocols, so it acts as both an audio receiver and an audio source.

To use it, initialize it, then add it as an input receiver using AEAudioController's addInputReceiver: and add it as a channel using addChannels:.
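Putting that together, a minimal setup might look like this (a sketch; assumes _audioController is an already-running AEAudioController with input enabled, and that self has a playthrough property retaining the channel):

// Create the playthrough channel
self.playthrough = [[AEPlaythroughChannel alloc] initWithAudioController:_audioController];

// Receive the device's audio input...
[_audioController addInputReceiver:_playthrough];

// ...and play it back out as a channel
[_audioController addChannels:@[_playthrough]];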


Recording

Included within the "Modules" directory is the AERecorder class, which implements the AEAudioReceiver protocol and provides simple but sophisticated audio recording.

To use AERecorder, initialize it using initWithAudioController:.

Then, when you're ready to begin recording, use beginRecordingToFileAtPath:fileType:error:, passing in the path to the file you'd like to record to, and the file type to use. Common file types include kAudioFileAIFFType, kAudioFileWAVEType, kAudioFileM4AType (using AAC audio encoding), and kAudioFileCAFType.

Finally, add the AERecorder instance as a receiver using the methods listed above.

Note that you can add the instance as a receiver of more than one source, and these will be mixed together automatically.

For example, you might have a karaoke app with a record function, and you want to record both the backing music and the microphone audio at the same time:

- (void)beginRecording {
    // Init recorder
    self.recorder = [[AERecorder alloc] initWithAudioController:_audioController];

    NSString *documentsFolder = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)
                                 objectAtIndex:0];
    NSString *filePath = [documentsFolder stringByAppendingPathComponent:@"Recording.aiff"];

    // Start the recording process
    NSError *error = NULL;
    if ( ![_recorder beginRecordingToFileAtPath:filePath
                                       fileType:kAudioFileAIFFType
                                          error:&error] ) {
        // Report error
        return;
    }

    // Receive both audio input and audio output. Note that if you're using
    // AEPlaythroughChannel, mentioned above, you may not need to receive the input again.
    [_audioController addInputReceiver:_recorder];
    [_audioController addOutputReceiver:_recorder];
}

To complete the recording, call finishRecording.

- (void)endRecording {
    [_audioController removeInputReceiver:_recorder];
    [_audioController removeOutputReceiver:_recorder];
    [_recorder finishRecording];
    self.recorder = nil;
}

Multi-Channel Input Support

The Amazing Audio Engine provides the ability to select a set of input channels when a multi-channel input device is connected.

You can assign an array of NSNumbers to the inputChannelSelection property of AEAudioController in order to select which channels of the input device should be used.

For example, for a four-channel input device, the following will select the last two channels as a stereo stream:

_audioController.inputChannelSelection = [NSArray arrayWithObjects:
                                          [NSNumber numberWithInt:2],
                                          [NSNumber numberWithInt:3],
                                          nil];

You can also assign audio input receivers or filters for different selections of channels. For example, you can have one AEAudioReceiver object receiving from the first channel of a stereo input device, and a different object receiving from the second channel.

Use the addInputReceiver:forChannels: and addInputFilter:forChannels: methods to do this:

[_audioController addInputReceiver:
    [AEBlockAudioReceiver audioReceiverWithBlock:^(void *source,
                                                   const AudioTimeStamp *time,
                                                   UInt32 frames,
                                                   AudioBufferList *audio) {
        // Receiving left channel
    }]
                       forChannels:[NSArray arrayWithObject:[NSNumber numberWithInt:0]]];

[_audioController addInputReceiver:
    [AEBlockAudioReceiver audioReceiverWithBlock:^(void *source,
                                                   const AudioTimeStamp *time,
                                                   UInt32 frames,
                                                   AudioBufferList *audio) {
        // Receiving right channel
    }]
                       forChannels:[NSArray arrayWithObject:[NSNumber numberWithInt:1]]];

Note that the numberOfInputChannels property is key-value observable, so you can use this to be notified when to display appropriate UI, etc.
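As a sketch, observing the channel count with standard key-value observing might look like the following (the observing class here is hypothetical; only the numberOfInputChannels property name comes from the library):

// Start observing, e.g. in viewDidLoad
[_audioController addObserver:self
                   forKeyPath:@"numberOfInputChannels"
                      options:NSKeyValueObservingOptionNew
                      context:NULL];

// Respond to changes in the number of input channels
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if ( [keyPath isEqualToString:@"numberOfInputChannels"] ) {
        // Update channel-selection UI here
    }
}

Remember to call removeObserver:forKeyPath: before the observing object is deallocated.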

Next, read on to find out how to interact with other audio apps, sending, receiving or filtering audio with Audiobus.