Setting the scene
I am working on a video processing app that runs from the command line to read in, process, and then export video. I'm working with four tracks:
- Lots of clips that I append into a single track to make one video. Let's call this the ugcVideoComposition.
- Clips with alpha, which get positioned on a second track and, using layer instructions, are composited on export to play back over the top of the ugcVideoComposition.
- A music audio track.
- An audio track for the ugcVideoComposition containing the audio from the clips appended into the single track.
I have this all working, and I can composite it and export it correctly using AVAssetExportSession.
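For reference, the composition and export setup looks roughly like this (a trimmed-down sketch rather than my actual code; outputURL, the render size, and the clip insertion are placeholders):

AVMutableComposition *composition = [AVMutableComposition composition];

// Track 1: the appended UGC clips (the ugcVideoComposition).
AVMutableCompositionTrack *ugcTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];
// ... clips appended here with insertTimeRange:ofTrack:atTime:error: ...

// Track 2: the alpha overlay clips (the two audio tracks are added the same way).
AVMutableCompositionTrack *overlayTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];

// Layer instructions stack the overlay on top of the UGC track
// (the first entry in layerInstructions is the topmost layer).
AVMutableVideoCompositionInstruction *instruction =
    [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
instruction.layerInstructions = @[
    [AVMutableVideoCompositionLayerInstruction layerInstructionWithAssetTrack:overlayTrack],
    [AVMutableVideoCompositionLayerInstruction layerInstructionWithAssetTrack:ugcTrack]
];

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.instructions = @[instruction];
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderSize = CGSizeMake(1280, 720);

AVAssetExportSession *exporter =
    [[AVAssetExportSession alloc] initWithAsset:composition
                                     presetName:AVAssetExportPresetHighestQuality];
exporter.videoComposition = videoComposition;
exporter.outputURL = outputURL;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^{ /* check exporter.status */ }];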
The problem
What I now want to do is apply filters and gradients to the ugcVideoComposition.
My research so far suggests that this is done by using AVAssetReader and AVAssetWriter, extracting a CIImage from each frame, manipulating it with filters, and then writing it back out.
I haven't yet got all the functionality I had above working, but I have managed to get the ugcVideoComposition read in and written back out to disk using AVAssetReader and AVAssetWriter.
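For context, the reader and writer are configured roughly like this (a minimal sketch; the output settings and names such as videoCompositionOutput are assumptions rather than my exact code):

NSError *error = nil;

// Reader: decode the composited frames as BGRA so Core Image can work on them.
AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:composition error:&error];
AVAssetReaderVideoCompositionOutput *videoCompositionOutput =
    [AVAssetReaderVideoCompositionOutput
        assetReaderVideoCompositionOutputWithVideoTracks:[composition tracksWithMediaType:AVMediaTypeVideo]
                                           videoSettings:@{ (id)kCVPixelBufferPixelFormatTypeKey :
                                                                @(kCVPixelFormatType_32BGRA) }];
videoCompositionOutput.videoComposition = videoComposition;
[assetReader addOutput:videoCompositionOutput];

// Writer: re-encode the (eventually filtered) frames as H.264.
AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:outputURL
                                                      fileType:AVFileTypeQuickTimeMovie
                                                         error:&error];
AVAssetWriterInput *assetWriterVideoInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:@{ AVVideoCodecKey  : AVVideoCodecH264,
                                                         AVVideoWidthKey  : @1280,
                                                         AVVideoHeightKey : @720 }];
[assetWriter addInput:assetWriterVideoInput];

[assetReader startReading];
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];

The read/write loop itself looks like this: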
BOOL done = NO;
while (!done)
{
    while ([assetWriterVideoInput isReadyForMoreMediaData] && !done)
    {
        CMSampleBufferRef sampleBuffer = [videoCompositionOutput copyNextSampleBuffer];
        if (sampleBuffer)
        {
            // Let's try to create an image....
            CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
            CIImage *inputImage = [CIImage imageWithCVImageBuffer:imageBuffer];
            // < Apply filters and transformations to the CIImage here >
            // < HOW TO GET THE TRANSFORMED IMAGE BACK INTO THE SAMPLE BUFFER??? >

            // Write things back out.
            [assetWriterVideoInput appendSampleBuffer:sampleBuffer];
            CFRelease(sampleBuffer);
            sampleBuffer = NULL;
        }
        else
        {
            // Find out why we couldn't get another sample buffer....
            if (assetReader.status == AVAssetReaderStatusFailed)
            {
                NSError *failureError = assetReader.error;
                // Do something with this error.
                done = YES; // bail out rather than spinning forever on a dead reader
            }
            else
            {
                // The reader has delivered all of its samples.
                done = YES;
                [assetWriterVideoInput markAsFinished];
                [assetWriter finishWriting]; // finishWritingWithCompletionHandler: on newer SDKs
            }
        }
    }
}
As you can see, I can even get a CIImage from the CMSampleBuffer, and I'm confident I can work out how to manipulate the image and apply any effects I need. What I don't know how to do is put the resulting manipulated image BACK into the sample buffer so I can write it out again.
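For the manipulation step itself, I'm assuming the standard CIFilter pattern will do, something like this (sepia purely as a placeholder effect):

CIFilter *sepiaFilter = [CIFilter filterWithName:@"CISepiaTone"];
[sepiaFilter setValue:inputImage forKey:kCIInputImageKey];
[sepiaFilter setValue:@0.8 forKey:kCIInputIntensityKey];
CIImage *filteredImage = sepiaFilter.outputImage;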
The question
Given a CIImage, how can I put that into a sampleBuffer to append it with the assetWriter?
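The closest approach I've pieced together so far (untested, so treat it as a sketch of what I think might work rather than a known-good answer) is to bypass appendSampleBuffer: for the video frames entirely: create an AVAssetWriterInputPixelBufferAdaptor up front, render the filtered CIImage into a CVPixelBuffer drawn from the adaptor's pool via a CIContext, and append that buffer with the source frame's original timestamp:

// Created once, alongside the writer input (before startWriting).
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor
        assetWriterInputPixelBufferAdaptorWithAssetWriterInput:assetWriterVideoInput
                                   sourcePixelBufferAttributes:@{ (id)kCVPixelBufferPixelFormatTypeKey :
                                                                      @(kCVPixelFormatType_32BGRA) }];
CIContext *ciContext = [CIContext contextWithOptions:nil];

// Inside the loop, in place of appendSampleBuffer::
CVPixelBufferRef renderedBuffer = NULL;
CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                   adaptor.pixelBufferPool,
                                   &renderedBuffer);
if (renderedBuffer)
{
    // Rasterise the filtered CIImage into the freshly vended pixel buffer.
    [ciContext render:filteredImage toCVPixelBuffer:renderedBuffer];

    // Reuse the source frame's timestamp so the timing is preserved.
    CMTime presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    [adaptor appendPixelBuffer:renderedBuffer withPresentationTime:presentationTime];
    CVPixelBufferRelease(renderedBuffer);
}

From what I've read, the adaptor's pixelBufferPool is only available once startWriting has been called, so the adaptor has to be created before writing starts but the buffers requested inside the loop. Is this the right direction, or is there a way to rebuild an actual CMSampleBuffer from the CIImage?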
Any help appreciated - the AVFoundation documentation is terrible: it either misses crucial points (like how to put an image back after you've extracted it) or is focused on rendering images to the iPhone screen, which is not what I want to do.
Much appreciated and thanks!