I have been trying to record from a RemoteIO unit directly to AAC in a renderCallback on iOS 5 on an iPad 2. I have seen conflicting info saying it is not possible & that it is possible (in the comments here). My reason for wanting to do it is that recording to PCM requires so much disk space for a recording of any length, even if the file is later converted to AAC.
I'm about ready to give up, though. I've scraped through Google, SO, the Core Audio book and the Apple Core Audio mailing list & forums, and have reached the point where I am not getting any errors and am recording something to disk, but the resulting file is unplayable. This is the case both in the Simulator and on the device.
So... if anyone has experience with this, I'd really appreciate a nudge in the right direction. The setup is that the RemoteIO is playing output from AUSamplers & that is working fine.
Here is what I am doing in the code below:

1. Specify the AudioStreamBasicDescription formats for the RemoteIO unit as kAudioFormatLinearPCM.
2. Create and specify the destination format for the ExtAudioFileRef.
3. Specify the client format by getting it from the RemoteIO unit.
4. Specify the renderCallback for the RemoteIO unit.
5. In the renderCallback, write data in the kAudioUnitRenderAction_PostRender phase.
As I said, I am not getting any errors, and the resulting audio file sizes show something is being written, but the file is unplayable. Perhaps I have my formats screwed up?
Anyway, this is my message in a bottle and/or "Here Be Dragons" flag to anyone else braving the dark waters of Core Audio.
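In case the problem really is a format mismatch, here is the small debugging helper (just a sketch, nothing beyond the ASBD's own fields) that I use to dump the remoteIO, client and destination formats so they can be compared side by side:

// Debugging aid only: print every field of an AudioStreamBasicDescription
// so the remoteIO, client and destination formats can be compared.
static void printASBD(const char *label, const AudioStreamBasicDescription *asbd)
{
    char formatID[5];
    *(UInt32 *)formatID = CFSwapInt32HostToBig(asbd->mFormatID);
    formatID[4] = '\0';

    printf("%s: rate %.0f  '%s'  flags 0x%lx  bytes/packet %lu  frames/packet %lu  "
           "bytes/frame %lu  channels %lu  bits %lu\n",
           label, asbd->mSampleRate, formatID,
           (unsigned long)asbd->mFormatFlags,
           (unsigned long)asbd->mBytesPerPacket,
           (unsigned long)asbd->mFramesPerPacket,
           (unsigned long)asbd->mBytesPerFrame,
           (unsigned long)asbd->mChannelsPerFrame,
           (unsigned long)asbd->mBitsPerChannel);
}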
// part of remoteIO setup

// Enable IO for recording
UInt32 flag = 1;
result = AudioUnitSetProperty(ioUnit,
                              kAudioOutputUnitProperty_EnableIO,
                              kAudioUnitScope_Input,
                              kInputBus, // == 1
                              &flag,
                              sizeof(flag));
if (noErr != result) {[self printErrorMessage: @"Enable IO for recording" withStatus: result]; return;}
// Describe format - - - - - - - - - -
size_t bytesPerSample = sizeof (AudioUnitSampleType);

AudioStreamBasicDescription audioFormat;
memset(&audioFormat, 0, sizeof(audioFormat));
audioFormat.mSampleRate       = 44100.00;
audioFormat.mFormatID         = kAudioFormatLinearPCM;
audioFormat.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
audioFormat.mFramesPerPacket  = 1;
audioFormat.mChannelsPerFrame = 1;
audioFormat.mBitsPerChannel   = 16;
audioFormat.mBytesPerPacket   = 2;
audioFormat.mBytesPerFrame    = 2;

// Format of the data coming out of the input (mic) bus
result = AudioUnitSetProperty(ioUnit,
                              kAudioUnitProperty_StreamFormat,
                              kAudioUnitScope_Output,
                              kInputBus, // == 1
                              &audioFormat,
                              sizeof(audioFormat));
if (noErr != result) {[self printErrorMessage: @"Set stream format on input bus" withStatus: result]; return;}

// Format of the data going into the output (speaker) bus
result = AudioUnitSetProperty(ioUnit,
                              kAudioUnitProperty_StreamFormat,
                              kAudioUnitScope_Input,
                              kOutputBus, // == 0
                              &audioFormat,
                              sizeof(audioFormat));
if (noErr != result) {[self printErrorMessage: @"Set stream format on output bus" withStatus: result]; return;}
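To be sure the unit actually accepted that format, I've also been reading it back and dumping it with the printASBD sketch above, roughly like this:

// Read back the format the remoteIO unit actually has on the input bus
// (output scope) and dump it with the printASBD sketch from above.
AudioStreamBasicDescription checkFormat;
memset(&checkFormat, 0, sizeof(checkFormat));
UInt32 checkSize = sizeof(checkFormat);
result = AudioUnitGetProperty(ioUnit,
                              kAudioUnitProperty_StreamFormat,
                              kAudioUnitScope_Output,
                              kInputBus,
                              &checkFormat,
                              &checkSize);
if (noErr == result) printASBD("remoteIO input bus (output scope)", &checkFormat);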
// Function that sets up file & rendercallback

- (void)startRecordingAAC
{
    OSStatus result;

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *recordFile = [documentsDirectory stringByAppendingPathComponent: @"audio.m4a"];

    CFURLRef destinationURL = CFURLCreateWithFileSystemPath(kCFAllocatorDefault,
                                                            (__bridge CFStringRef)recordFile,
                                                            kCFURLPOSIXPathStyle,
                                                            false);

    AudioStreamBasicDescription destinationFormat;
    memset(&destinationFormat, 0, sizeof(destinationFormat));
    destinationFormat.mChannelsPerFrame = 2;
    destinationFormat.mFormatID = kAudioFormatMPEG4AAC;

    UInt32 size = sizeof(destinationFormat);
    result = AudioFormatGetProperty(kAudioFormatProperty_FormatInfo, 0, NULL, &size, &destinationFormat);
    if (result) printf("AudioFormatGetProperty %ld\n", result);

    result = ExtAudioFileCreateWithURL(destinationURL,
                                       kAudioFileM4AType,
                                       &destinationFormat,
                                       NULL,
                                       kAudioFileFlags_EraseFile,
                                       &extAudioFileRef);
    if (result) printf("ExtAudioFileCreateWithURL %ld\n", result);

    AudioStreamBasicDescription clientFormat;
    memset(&clientFormat, 0, sizeof(clientFormat));
    size = sizeof(clientFormat);
    result = AudioUnitGetProperty(ioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 0, &clientFormat, &size);
    if (result) printf("AudioUnitGetProperty %ld\n", result);

    result = ExtAudioFileSetProperty(extAudioFileRef, kExtAudioFileProperty_ClientDataFormat, sizeof(clientFormat), &clientFormat);
    if (result) printf("ExtAudioFileSetProperty %ld\n", result);

    // Prime ExtAudioFileWriteAsync so the real writes can happen on the render thread
    result = ExtAudioFileWriteAsync(extAudioFileRef, 0, NULL);
    if (result) {[self printErrorMessage: @"ExtAudioFileWriteAsync error" withStatus: result];}

    result = AudioUnitAddRenderNotify(ioUnit, renderCallback, (__bridge void *)self);
    if (result) {[self printErrorMessage: @"AudioUnitAddRenderNotify" withStatus: result];}
}
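And this is roughly how I stop the recording (the method name is just mine); I mention it because my understanding is that the file isn't finalized until ExtAudioFileDispose is called:

// Tear-down sketch: stop feeding the file, then let ExtAudioFile flush and
// finalize it. As I understand it, without ExtAudioFileDispose the M4A
// never gets its header written out properly.
- (void)stopRecordingAAC
{
    OSStatus result;

    result = AudioUnitRemoveRenderNotify(ioUnit, renderCallback, (__bridge void *)self);
    if (result) {[self printErrorMessage: @"AudioUnitRemoveRenderNotify" withStatus: result];}

    result = ExtAudioFileDispose(extAudioFileRef);
    if (result) {[self printErrorMessage: @"ExtAudioFileDispose" withStatus: result];}

    extAudioFileRef = NULL;
}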
// And finally, the rendercallback

static OSStatus renderCallback (void *                       inRefCon,
                                AudioUnitRenderActionFlags * ioActionFlags,
                                const AudioTimeStamp *       inTimeStamp,
                                UInt32                       inBusNumber,
                                UInt32                       inNumberFrames,
                                AudioBufferList *            ioData)
{
    OSStatus result;

    // The notify fires both pre- and post-render; only write once the unit has
    // rendered. (Testing the flag with & rather than ==, since other flags may be set too.)
    if (*ioActionFlags & kAudioUnitRenderAction_PostRender) {

        MusicPlayerController *THIS = (__bridge MusicPlayerController *)inRefCon;

        result = ExtAudioFileWriteAsync(THIS->extAudioFileRef, inNumberFrames, ioData);
        if (result) printf("ExtAudioFileWriteAsync %ld\n", result);
    }
    return noErr;
}
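printErrorMessage: isn't shown above; it's nothing more than a small wrapper along these lines (sketched here, with the usual four-character-code formatting for an OSStatus):

// Sketch of the error helper referenced above: log the OSStatus, and also as a
// four-character code when it happens to be printable.
- (void)printErrorMessage:(NSString *)errorString withStatus:(OSStatus)result
{
    char code[5];
    *(UInt32 *)code = CFSwapInt32HostToBig((UInt32)result);
    code[4] = '\0';

    BOOL printable = YES;
    for (int i = 0; i < 4; i++) {
        if (code[i] < 32 || code[i] > 126) { printable = NO; break; }
    }

    if (printable) {
        NSLog(@"%@: '%s' (%ld)", errorString, code, (long)result);
    } else {
        NSLog(@"%@: %ld", errorString, (long)result);
    }
}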