Filtering

The Amazing Audio Engine includes a sophisticated and flexible audio processing architecture, allowing you to apply effects to audio throughout your application.

The Engine gives you three ways to apply effects to audio:

  • You can process audio with blocks, via the AEBlockFilter class.
  • You can create Objective-C classes that conform to the AEAudioFilter protocol.
  • You can use Audio Units.

Block Filters

To filter audio using a block, create an instance of AEBlockFilter using filterWithBlock:, passing in a block implementation that takes the form defined by AEBlockFilterBlock.

The block will be passed a function pointer, producer, which is used to pull audio from the system. Your implementation block must invoke this function when audio is needed, passing as the first argument the opaque producerToken pointer also passed to the block.

self.filter = [AEBlockFilter filterWithBlock:^(AEAudioFilterProducer producer,
                                               void *producerToken,
                                               const AudioTimeStamp *time,
                                               UInt32 frames,
                                               AudioBufferList *audio) {
    // Pull audio
    OSStatus status = producer(producerToken, audio, &frames);
    if ( status != noErr ) return;

    // Now filter audio in 'audio'
}];

Objective-C Object Filters

The AEAudioFilter protocol defines an interface that you can conform to in order to create Objective-C classes that can filter audio.

The protocol requires that you define a method that returns a pointer to a C function that takes the form defined by AEAudioFilterCallback. This C function will be called when audio is to be filtered.

If you put this C function within the @implementation block, you will be able to access instance variables via the C struct dereference operator, "->". Note that you should never make any Objective-C calls from within a Core Audio realtime thread, as this will cause performance problems and audio glitches. This includes accessing properties via the "." operator.

As with block filters, above, the callback you provide will be passed a function pointer, producer, which is used to pull audio from the system. Your implementation must invoke this function when audio is needed, passing as the first argument the opaque producerToken pointer also passed to the callback.

@interface MyFilterClass : NSObject <AEAudioFilter>
@end
@implementation MyFilterClass
...
static OSStatus filterCallback(__unsafe_unretained MyFilterClass *THIS,
                               __unsafe_unretained AEAudioController *audioController,
                               AEAudioFilterProducer producer,
                               void *producerToken,
                               const AudioTimeStamp *time,
                               UInt32 frames,
                               AudioBufferList *audio) {
    // Pull audio
    OSStatus status = producer(producerToken, audio, &frames);
    if ( status != noErr ) return status;

    // Now filter audio in 'audio' (access instance variables via
    // THIS->_variable, never via properties)

    return noErr;
}
-(AEAudioFilterCallback)filterCallback {
    return filterCallback;
}
@end
...
self.filter = [[MyFilterClass alloc] init];

Audio Unit Filters

The AEAudioUnitFilter class allows you to use audio units to apply effects to audio.

To use it, call initWithComponentDescription:, passing in an AudioComponentDescription structure (you can use the utility function AEAudioComponentDescriptionMake for this):

AudioComponentDescription component
    = AEAudioComponentDescriptionMake(kAudioUnitManufacturer_Apple,
                                      kAudioUnitType_Effect,
                                      kAudioUnitSubType_Reverb2);
self.reverb = [[AEAudioUnitFilter alloc] initWithComponentDescription:component];

Once you have added the filter to a channel, channel group or main output, you can then access the audio unit directly via the audioUnit property. You can also add your own initialization step via the initWithComponentDescription:preInitializeBlock: initializer.

AudioUnitSetParameter(_reverb.audioUnit,
                      kReverb2Param_DryWetMix,
                      kAudioUnitScope_Global,
                      0,
                      100.f,
                      0);
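
For configuration that must be in place before the audio unit is initialized, you can use the initWithComponentDescription:preInitializeBlock: initializer mentioned above. A sketch, assuming the block is passed the AudioUnit instance (check the AEAudioUnitFilter class reference for the exact block signature):

self.reverb = [[AEAudioUnitFilter alloc]
               initWithComponentDescription:component
               preInitializeBlock:^(AudioUnit audioUnit) {
    // Set pre-initialization properties here; for example, render quality
    UInt32 quality = 127;
    AudioUnitSetProperty(audioUnit,
                         kAudioUnitProperty_RenderQuality,
                         kAudioUnitScope_Global,
                         0,
                         &quality,
                         sizeof(quality));
}];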

Adding Filters

Once you've got a filter, you can apply it to a variety of different audio sources: individual channels, channel groups, audio input, or the main output.

You can add and remove filters at any time, using addFilter: and removeFilter:, and the other channel, group and input equivalents.
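
For example, applying and removing a filter might look like the following (the addFilter:toChannel:, addFilter:toChannelGroup: and addInputFilter: method names are assumed here; consult the AEAudioController class reference for the exact signatures):

// Apply to the main output
[_audioController addFilter:_filter];

// Apply to a single channel (assumed method name)
[_audioController addFilter:_filter toChannel:_channel];

// Apply to a channel group (assumed method name)
[_audioController addFilter:_filter toChannelGroup:_group];

// Apply to audio input (assumed method name)
[_audioController addInputFilter:_filter];

// Remove later
[_audioController removeFilter:_filter];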


Now you're producing audio and applying effects to it. But what if you want to record, process audio input, or do something else with the audio? Read on: Receiving Audio.