
avfoundation - How to convert AudioBufferList to CMSampleBuffer?

I have an audio tap processor attached to an AVPlayerItem, which calls the following callback while processing:

static void tap_ProcessCallback(MTAudioProcessingTapRef tap, CMItemCount numberFrames, MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut, CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut)

I need to convert the AudioBufferList to a CMSampleBuffer so that I can use AVAssetWriterAudioInput.appendSampleBuffer to write it into a movie file.

So how do I convert an AudioBufferList to a CMSampleBuffer? I tried the code below, but CMSampleBufferSetDataBufferFromAudioBufferList fails with a -12731 error: Error CMSampleBufferSetDataBufferFromAudioBufferList: Optional("-12731")

func processAudioData(audioData: UnsafeMutablePointer<AudioBufferList>, framesNumber: UInt32) {
    var sbuf: Unmanaged<CMSampleBuffer>?
    var status: OSStatus?
    var format: Unmanaged<CMFormatDescription>?

    // Describe the audio format for the sample buffer.
    var formatId = UInt32(kAudioFormatLinearPCM)
    var formatFlags = UInt32(kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked)
    var audioFormat = AudioStreamBasicDescription(mSampleRate: 44100.00, mFormatID: formatId, mFormatFlags: formatFlags, mBytesPerPacket: 1, mFramesPerPacket: 1, mBytesPerFrame: 16, mChannelsPerFrame: 2, mBitsPerChannel: 2, mReserved: 0)

    status = CMAudioFormatDescriptionCreate(kCFAllocatorDefault, &audioFormat, 0, nil, 0, nil, nil, &format)
    if status != noErr {
        println("Error CMAudioFormatDescriptionCreate: \(status?.description)")
        return
    }

    // One-frame duration at 44.1 kHz.
    var timing = CMSampleTimingInfo(duration: CMTimeMake(1, 44100), presentationTimeStamp: kCMTimeZero, decodeTimeStamp: kCMTimeInvalid)

    // Create an empty sample buffer, then attach the AudioBufferList data to it.
    status = CMSampleBufferCreate(kCFAllocatorDefault, nil, Boolean(0), nil, nil, format?.takeRetainedValue(), CMItemCount(framesNumber), 1, &timing, 0, nil, &sbuf)
    if status != noErr {
        println("Error CMSampleBufferCreate: \(status?.description)")
        return
    }

    status = CMSampleBufferSetDataBufferFromAudioBufferList(sbuf?.takeRetainedValue(), kCFAllocatorDefault, kCFAllocatorDefault, 0, audioData)
    if status != noErr {
        // This is the call that fails with -12731.
        println("Error CMSampleBufferSetDataBufferFromAudioBufferList: \(status?.description)")
        return
    }

    var currentSampleTime = CMSampleBufferGetOutputPresentationTimeStamp(sbuf?.takeRetainedValue())
    println("audio buffer at time: \(CMTimeCopyDescription(kCFAllocatorDefault, currentSampleTime))")

    // Append the sample buffer to the asset writer input.
    if !assetWriterAudioInput!.readyForMoreMediaData {
        return
    } else if assetWriter.status == .Writing {
        if !assetWriterAudioInput!.appendSampleBuffer(sbuf?.takeRetainedValue()) {
            println("Problem appending audio buffer at time: \(CMTimeCopyDescription(kCFAllocatorDefault, currentSampleTime))")
        }
    } else {
        println("assetWriterStatus: \(assetWriter.status.rawValue), Error: \(assetWriter.error.localizedDescription)")
        println("Could not write a frame")
    }
}


1 Reply


OK, I've successfully resolved this problem.

The problem is that I should not construct the AudioStreamBasicDescription struct myself, but should instead use the one provided by the prepare callback of the MTAudioProcessingTap:

static void tap_PrepareCallback(MTAudioProcessingTapRef tap, CMItemCount maxFrames, const AudioStreamBasicDescription *processingFormat) //retain this one
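Roughly, the shape is as follows. This is only a minimal C sketch of the idea, not my exact code: the savedASBD/savedFormatDesc globals, using the source time range as the presentation timestamp, and the thin error handling are all illustrative.

#include <CoreMedia/CoreMedia.h>
#include <MediaToolbox/MediaToolbox.h>

// Format captured from the tap's prepare callback (illustrative globals).
static AudioStreamBasicDescription savedASBD;
static CMAudioFormatDescriptionRef savedFormatDesc = NULL;

static void tap_PrepareCallback(MTAudioProcessingTapRef tap,
                                CMItemCount maxFrames,
                                const AudioStreamBasicDescription *processingFormat)
{
    // Keep the exact format the tap will deliver instead of guessing one,
    // and build the CMFormatDescription from it once.
    savedASBD = *processingFormat;
    OSStatus status = CMAudioFormatDescriptionCreate(kCFAllocatorDefault, &savedASBD,
                                                     0, NULL, 0, NULL, NULL,
                                                     &savedFormatDesc);
    if (status != noErr) { /* handle error in real code */ }
}

static void tap_ProcessCallback(MTAudioProcessingTapRef tap,
                                CMItemCount numberFrames,
                                MTAudioProcessingTapFlags flags,
                                AudioBufferList *bufferListInOut,
                                CMItemCount *numberFramesOut,
                                MTAudioProcessingTapFlags *flagsOut)
{
    // Pull the source audio for this callback; the time range gives us a PTS.
    CMTimeRange sourceRange;
    MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                       flagsOut, &sourceRange, numberFramesOut);

    // Wrap the AudioBufferList in a sample buffer whose format matches
    // what the tap actually produced.
    CMSampleTimingInfo timing = {
        CMTimeMake(1, (int32_t)savedASBD.mSampleRate),  // one frame duration
        sourceRange.start,                              // presentation timestamp
        kCMTimeInvalid
    };
    CMSampleBufferRef sbuf = NULL;
    OSStatus status = CMSampleBufferCreate(kCFAllocatorDefault, NULL, false, NULL, NULL,
                                           savedFormatDesc, *numberFramesOut, 1, &timing,
                                           0, NULL, &sbuf);
    if (status == noErr) {
        status = CMSampleBufferSetDataBufferFromAudioBufferList(sbuf,
                                                                kCFAllocatorDefault,
                                                                kCFAllocatorDefault,
                                                                0,
                                                                bufferListInOut);
    }
    // ... hand sbuf to the AVAssetWriterInput, then CFRelease(sbuf).
}

The same applies to the Swift processAudioData above: feed it the AudioStreamBasicDescription captured in the prepare callback (or a CMFormatDescription built from it once) instead of hard-coding one.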

