
objective c - Can I use AVFoundation to stream downloaded video frames into an OpenGL ES texture?

I've been able to use AVFoundation's AVAssetReader class to upload video frames into an OpenGL ES texture. However, it fails when used with an AVURLAsset that points to remote media. This failure isn't well documented, and I'm wondering whether there's any way around the shortcoming.


1 Reply


There's an API that was released with iOS 6 that I've been able to use to make the process a breeze. It doesn't use AVAssetReader at all; instead it relies on a class called AVPlayerItemVideoOutput. An instance of this class can be added to any AVPlayerItem via the new -addOutput: method.

Unlike AVAssetReader, this class works fine for AVPlayerItems that are backed by a remote AVURLAsset, and it also allows for a more sophisticated playback interface that supports non-linear playback via -copyPixelBufferForItemTime:itemTimeForDisplay: (instead of AVAssetReader's severely limiting -copyNextSampleBuffer method).


SAMPLE CODE

#import <AVFoundation/AVFoundation.h>

// Initialize the AVFoundation state
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:someUrl options:nil];
[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:^{

    NSError* error = nil;
    AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
    if (status == AVKeyValueStatusLoaded)
    {
        NSDictionary* settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] };
        AVPlayerItemVideoOutput* output = [[[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings] autorelease]; // MRC-era code; drop the -autorelease under ARC
        AVPlayerItem* playerItem = [AVPlayerItem playerItemWithAsset:asset];
        [playerItem addOutput:output]; // attach the local output created above
        AVPlayer* player = [AVPlayer playerWithPlayerItem:playerItem];

        // Assume some instance variable exist here. You'll need them to control the
        // playback of the video (via the AVPlayer), and to copy sample buffers (via the AVPlayerItemVideoOutput).
        [self setPlayer:player];
        [self setPlayerItem:playerItem];
        [self setOutput:output];
    }
    else
    {
        NSLog(@"%@ Failed to load the tracks.", self);
    }
}];

// Now at any later point in time, you can get a pixel buffer
// that corresponds to the current AVPlayer state like this.
// Note that the method returns a retained buffer; release it with
// CVBufferRelease() once you've uploaded it:
CVPixelBufferRef buffer = [[self output] copyPixelBufferForItemTime:[[self playerItem] currentTime] itemTimeForDisplay:nil];
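In practice you'll want to poll from a render loop rather than grab buffers at arbitrary times. Here's a minimal sketch (not from the original answer) of driving that with a CADisplayLink, assuming the player, playerItem, and output properties set above; -uploadPixelBufferToTexture: is a hypothetical helper standing in for your own upload code. AVPlayerItemVideoOutput won't have a fresh frame on every tick, so check -hasNewPixelBufferForItemTime: first.

#import <QuartzCore/QuartzCore.h>

// Create the display link once, e.g. right after -setOutput: above:
//   CADisplayLink* link = [CADisplayLink displayLinkWithTarget:self
//                                                     selector:@selector(displayLinkFired:)];
//   [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];

- (void)displayLinkFired:(CADisplayLink *)link
{
    // Map the display link's host time onto the item's timeline.
    CMTime itemTime = [[self output] itemTimeForHostTime:CACurrentMediaTime()];

    // Only copy a buffer when a new frame is actually available.
    if ([[self output] hasNewPixelBufferForItemTime:itemTime])
    {
        CVPixelBufferRef buffer = [[self output] copyPixelBufferForItemTime:itemTime itemTimeForDisplay:nil];
        if (buffer != NULL)
        {
            [self uploadPixelBufferToTexture:buffer]; // hypothetical helper
            CVBufferRelease(buffer); // the copy... method returns a retained buffer
        }
    }
}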

Once you've got your buffer, you can upload it to OpenGL however you want. I recommend the horribly documented CVOpenGLESTextureCacheCreateTextureFromImage() function, because you'll get hardware acceleration on all the newer devices, which is much faster than glTexSubImage2D(). See Apple's GLCameraRipple and RosyWriter demos for examples.
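For reference, here's a rough sketch of that texture-cache path, modeled on the RosyWriter approach rather than copied from it. It assumes an existing EAGLContext named glContext and the BGRA pixel buffer copied above, and it omits error checking:

#import <CoreVideo/CoreVideo.h>
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>

// Create the cache once, tied to your GL context (glContext is assumed):
CVOpenGLESTextureCacheRef textureCache = NULL;
CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, glContext, NULL, &textureCache);

// Per frame, wrap the pixel buffer in a GL texture. GL_BGRA matches the
// kCVPixelFormatType_32BGRA attribute requested from the video output:
CVOpenGLESTextureRef texture = NULL;
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                             textureCache,
                                             buffer,
                                             NULL,
                                             GL_TEXTURE_2D,
                                             GL_RGBA,
                                             (GLsizei)CVPixelBufferGetWidth(buffer),
                                             (GLsizei)CVPixelBufferGetHeight(buffer),
                                             GL_BGRA,
                                             GL_UNSIGNED_BYTE,
                                             0,
                                             &texture);

glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

// ... draw with the texture, then release the per-frame objects once
// the GPU is finished with them:
CFRelease(texture);
CVOpenGLESTextureCacheFlush(textureCache, 0);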

