I'm recording the screen from my iPhone to my Mac. As a preview layer, I am collecting sample buffers directly from an AVCaptureVideoDataOutput, from which I'm creating textures and rendering them with Metal. The problem I'm having is that code which worked on macOS prior to 10.13 stopped working after updating to 10.13. Namely,
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(_currentSampleBuffer);
if (!imageBuffer) return;

CVPixelBufferLockBaseAddress(imageBuffer, 0);
size_t width  = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);

CVMetalTextureRef metalTexture = NULL;
CVReturn result = CVMetalTextureCacheCreateTextureFromImage(nil,
                                                            self.textureCache,
                                                            imageBuffer,
                                                            nil,
                                                            self.pixelFormat,
                                                            width,
                                                            height,
                                                            0,
                                                            &metalTexture);
if (result == kCVReturnSuccess) {
    self.texture = CVMetalTextureGetTexture(metalTexture);
}
This returns result = -6660, which translates to a generic kCVReturnError (as can be seen in the official Apple docs), and metalTexture stays NULL.
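For context, self.textureCache and _currentSampleBuffer come from a setup that looks roughly like this (a simplified sketch; the method names are mine, only the AVFoundation/CoreVideo/Metal calls are the real APIs):

#import <AVFoundation/AVFoundation.h>
#import <CoreVideo/CoreVideo.h>
#import <Metal/Metal.h>

// Created once, tied to the same MTLDevice used for rendering.
- (void)setUpTextureCache {
    id<MTLDevice> device = MTLCreateSystemDefaultDevice();
    CVMetalTextureCacheRef cache = NULL;
    CVReturn result = CVMetalTextureCacheCreate(kCFAllocatorDefault,
                                                nil,        // cache attributes
                                                device,
                                                nil,        // texture attributes
                                                &cache);
    if (result == kCVReturnSuccess) {
        self.textureCache = cache;
    }
}

// AVCaptureVideoDataOutputSampleBufferDelegate callback; the failing snippet above
// runs later on the render path against the most recent buffer stored here.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    @synchronized (self) {
        if (_currentSampleBuffer) CFRelease(_currentSampleBuffer);
        _currentSampleBuffer = (CMSampleBufferRef)CFRetain(sampleBuffer);
    }
}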
The pixel format I'm using is MTLPixelFormatBGRG422, since the samples coming from the camera are 2vuy.
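For what it's worth, that assumption can be sanity-checked per frame; kCVPixelFormatType_422YpCbCr8 is the FourCC '2vuy', which Metal exposes as MTLPixelFormatBGRG422 (the check below is just a sketch, and the BGRA branch is a hypothetical fallback):

OSType format = CVPixelBufferGetPixelFormatType(imageBuffer);
if (format == kCVPixelFormatType_422YpCbCr8) {        // '2vuy', packed 4:2:2
    self.pixelFormat = MTLPixelFormatBGRG422;
} else if (format == kCVPixelFormatType_32BGRA) {     // plain BGRA fallback
    self.pixelFormat = MTLPixelFormatBGRA8Unorm;
} else {
    NSLog(@"Unexpected pixel format: %u", (unsigned)format);
}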
As a workaround to creating metalTexture from sampleBuffer, I am now creating an intermediate NSImage like so:
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(_currentSampleBuffer);
NSCIImageRep *imageRep = [NSCIImageRep imageRepWithCIImage:[CIImage imageWithCVImageBuffer:imageBuffer]];
NSImage *image = [[NSImage alloc] initWithSize:[imageRep size]];
[image addRepresentation:imageRep];
and creating an MTLTexture from that. That is obviously a subpar solution compared to using CVMetalTextureCacheCreateTextureFromImage directly.
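(For reference, the same workaround can skip the NSImage detour by rendering the CIImage straight into a Metal texture with CIContext; this is only a sketch, with self.device and self.commandQueue assumed to exist, and it is still a copy rather than the zero-copy texture-cache path:)

CIImage *ciImage = [CIImage imageWithCVImageBuffer:imageBuffer];
MTLTextureDescriptor *desc =
    [MTLTextureDescriptor texture2DDescriptorWithPixelFormat:MTLPixelFormatBGRA8Unorm
                                                       width:CVPixelBufferGetWidth(imageBuffer)
                                                      height:CVPixelBufferGetHeight(imageBuffer)
                                                   mipmapped:NO];
desc.usage = MTLTextureUsageShaderRead | MTLTextureUsageShaderWrite | MTLTextureUsageRenderTarget;
id<MTLTexture> texture = [self.device newTextureWithDescriptor:desc];

// In practice the CIContext would be created once and reused across frames.
CIContext *ciContext = [CIContext contextWithMTLDevice:self.device];
id<MTLCommandBuffer> commandBuffer = [self.commandQueue commandBuffer];
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
[ciContext render:ciImage
     toMTLTexture:texture
    commandBuffer:commandBuffer
           bounds:ciImage.extent
       colorSpace:colorSpace];
CGColorSpaceRelease(colorSpace);
[commandBuffer commit];
self.texture = texture;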
Once again, the code in question works perfectly fine on macOS < 10.13. Has anyone run into similar issues, and if so, do you have any ideas on how to overcome this?