In our iOS app, we use custom filters built on Metal (CIKernel/CIColorKernel wrappers).
Let's assume we have a 4K video and a custom video composition with a 1080p output size, which applies an advanced filter to the video buffers.
Obviously, we don't need to filter the video at its original size; doing so will probably get the app killed with a memory warning (true story).
This is the video-filtering pipeline:
Get the buffer in 4K (as a CIImage) --> apply the filter to the CIImage --> the filter runs its CIKernel Metal function on the CIImage --> return the filtered CIImage to the composition.
The only two places I can think of to apply the resize are before the image is sent into the filter, or within the Metal kernel function itself.
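For the first option, one way to sketch it (this is an assumption, not code from the project: the helper name `downscaled(_:to:)` and the 1080p target are hypothetical) is to run the 4K CIImage through Core Image's built-in CILanczosScaleTransform before it ever reaches the custom filter:

```swift
import CoreImage

// Sketch: downscale a 4K CIImage to the composition's render size before
// filtering. `targetSize` is an assumed 1080p target; adjust to match
// the composition's renderSize.
func downscaled(_ image: CIImage, to targetSize: CGSize) -> CIImage {
    let scale = targetSize.height / image.extent.height
    // Lanczos resampling generally gives better quality than a plain
    // affine scale for large downscales.
    guard let filter = CIFilter(name: "CILanczosScaleTransform") else {
        return image
    }
    filter.setValue(image, forKey: kCIInputImageKey)
    filter.setValue(scale, forKey: kCIInputScaleKey)
    filter.setValue(1.0, forKey: kCIInputAspectRatioKey)
    return filter.outputImage ?? image
}
```

Because CIImage is lazy, this scale step is just another node in the recipe; Core Image can concatenate it with the custom kernel so the full-resolution intermediate is never materialized.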
public class VHSFilter: CIFilter {
    @objc dynamic var inputImage: CIImage?

    public override var outputImage: CIImage? {
        // inputImage arrives here at 4K
        guard let inputImage = self.inputImage else { return nil }

        // Option 1: manipulate (resize) the image here
        let roiCallback: CIKernelROICallback = { _, rect -> CGRect in
            return inputImage.extent
        }

        // Option 2: resize inside the kernel's Metal function
        let outputImage = self.kernel.apply(extent: inputImage.extent,
                                            roiCallback: roiCallback,
                                            arguments: [inputImage])
        return outputImage
    }
}
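For comparison, the resize could also live inside the filter itself, applied to the CIImage just before the kernel runs. This is only a sketch under assumptions: the hard-coded 1080p target and the way `kernel` is obtained are hypothetical, not from the original project.

```swift
import CoreImage

public class VHSFilter: CIFilter {
    @objc dynamic var inputImage: CIImage?
    // Assumed to be loaded elsewhere from the compiled .metallib.
    private var kernel: CIKernel!

    public override var outputImage: CIImage? {
        guard let inputImage = self.inputImage else { return nil }

        // Hypothetical target height matching the composition's renderSize.
        let targetHeight: CGFloat = 1080
        let scale = targetHeight / inputImage.extent.height

        // Scale down first so the kernel only ever sees 1080p pixels.
        let scaled = inputImage.transformed(
            by: CGAffineTransform(scaleX: scale, y: scale))

        // The kernel samples only within the scaled image's extent.
        let roiCallback: CIKernelROICallback = { _, _ in scaled.extent }

        return kernel.apply(extent: scaled.extent,
                            roiCallback: roiCallback,
                            arguments: [scaled])
    }
}
```

Note that the ROI callback now reports the scaled extent rather than the 4K one, which also shrinks the tiles Core Image requests upstream.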
I'm sure I'm not the first to encounter this issue. What does one do when the incoming video buffers are too large (memory-wise) to filter, and they need to be resized on the fly efficiently, without re-encoding the video first?