Wish iOS audiounit

daniel69

Hi Erel, thank you for your answer!

As far as I know:
>> AVAudioPlayer has metering functions for the output signal (such as peakPowerForChannel: and averagePowerForChannel:)
>> AVAudioRecorder has metering functions for the input signal (such as peakPowerForChannel: and averagePowerForChannel:)
AVAudioPlayer and AVAudioRecorder are the "high level" ways to do this.
>> If you want the "low level" way, you need to install a tap on the engine's inputNode:

...
let inputNode = audioEngine.inputNode
inputNode.installTapOnBus(....
...

Maybe there are some more hints in
http://stackoverflow.com/questions/...ponbusbuffersizeformatblock/27343266#27343266

Some interesting Objective-C sample code:
...

AVAudioEngine *sEngine = nil;

- (void)applicationDidBecomeActive:(UIApplication *)application
{
    /*
     Restart any tasks that were paused (or not yet started) while the application
     was inactive. If the application was previously in the background, optionally
     refresh the user interface.
     */

    [glView startAnimation];

    AVAudioSession *audioSession = [AVAudioSession sharedInstance];

    NSError *error = nil;
    if (audioSession.isInputAvailable) {
        [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    }
    if (error) {
        return;
    }

    [audioSession setActive:YES error:&error];
    if (error) {
        return;
    }

    sEngine = [[AVAudioEngine alloc] init];

    AVAudioMixerNode *mixer = [sEngine mainMixerNode];
    AVAudioInputNode *input = [sEngine inputNode];
    [sEngine connect:input to:mixer format:[input inputFormatForBus:0]];

    __block NSTimeInterval start = 0.0;

    // The tap fires about once every 16537 frames;
    // this does not change even if you change bufferSize.
    [input installTapOnBus:0 bufferSize:1024 format:[input inputFormatForBus:0] block:^(AVAudioPCMBuffer *buffer, AVAudioTime *when) {

        if (start == 0.0)
            start = [AVAudioTime secondsForHostTime:[when hostTime]];

        // Why does this work? Perhaps the engine reuses the smaller buffer and only
        // copies frameLength frames of new data into it each time.
        // I am not sure this is supported by Apple.
        NSLog(@"buffer frame length %d", (int)buffer.frameLength);
        buffer.frameLength = 1024;
        UInt32 frames = 0;
        for (UInt32 i = 0; i < buffer.audioBufferList->mNumberBuffers; i++) {
            Float32 *data = (Float32 *)buffer.audioBufferList->mBuffers[i].mData;
            frames = buffer.audioBufferList->mBuffers[i].mDataByteSize / sizeof(Float32);
            // create waveform from data here
        }
        NSLog(@"%d frames are sent at %lf", (int)frames, [AVAudioTime secondsForHostTime:[when hostTime]] - start);
    }];

    [sEngine startAndReturnError:&error];
    if (error) {
        return;
    }
}

...