ios - OpenGL ES 2.0 to Video on iPad/iPhone

I am at my wit's end here, despite the good information on StackOverflow...

I am trying to write an OpenGL renderbuffer to a video on the iPad 2 (using iOS 4.3). More precisely, this is what I am attempting:

A) set up an AVAssetWriterInputPixelBufferAdaptor

  1. create an AVAssetWriter that points to a video file

  2. set up an AVAssetWriterInput with appropriate settings

  3. set up an AVAssetWriterInputPixelBufferAdaptor to add data to the video file

B) write data to a video file using that AVAssetWriterInputPixelBufferAdaptor

  1. render OpenGL code to the screen

  2. get the OpenGL buffer via glReadPixels

  3. create a CVPixelBufferRef from the OpenGL data

  4. append that PixelBuffer to the AVAssetWriterInputPixelBufferAdaptor using the appendPixelBuffer method

However, I am having problems doing this. My strategy right now is to set up the AVAssetWriterInputPixelBufferAdaptor when a button is pressed. Once the AVAssetWriterInputPixelBufferAdaptor is valid, I set a flag to signal the EAGLView to create a pixel buffer and append it to the video file via appendPixelBuffer for a given number of frames.

Right now my code is crashing as it tries to append the second pixel buffer, giving me the following error:

-[__NSCFDictionary appendPixelBuffer:withPresentationTime:]: unrecognized selector sent to instance 0x131db0

Here is my AVAsset setup code (a lot of it was based on Rudy Aramayo's code, which does work on normal images but is not set up for textures):

- (void) testVideoWriter {

  //initialize global info
  MOVIE_NAME = @"Documents/Movie.mov";
  CGSize size = CGSizeMake(480, 320);
  frameLength = CMTimeMake(1, 5);
  currentTime = kCMTimeZero;
  currentFrame = 0;

  NSString *MOVIE_PATH = [NSHomeDirectory() stringByAppendingPathComponent:MOVIE_NAME];
  NSError *error = nil;

  //remove any leftover movie file from a previous run
  unlink([MOVIE_PATH UTF8String]);

  videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:MOVIE_PATH] fileType:AVFileTypeQuickTimeMovie error:&error];

  NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecH264, AVVideoCodecKey,
                                 [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                 [NSNumber numberWithInt:size.height], AVVideoHeightKey, nil];
  writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];

  //writerInput.expectsMediaDataInRealTime = NO;

  NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                                 [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey, nil];

  adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                    sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
  [adaptor retain];

  [videoWriter addInput:writerInput];

  [videoWriter startWriting];
  [videoWriter startSessionAtSourceTime:kCMTimeZero];

  VIDEO_WRITER_IS_READY = true;
}

Ok, now that my videoWriter and adaptor are set up, I tell my OpenGL renderer to create a pixel buffer for every frame:

- (void) captureScreenVideo {

  if (!writerInput.readyForMoreMediaData) {
    return;
  }

  CGSize esize = CGSizeMake(eagl.backingWidth, eagl.backingHeight);
  NSInteger myDataLength = esize.width * esize.height * 4;
  GLuint *buffer = (GLuint *) malloc(myDataLength);
  glReadPixels(0, 0, esize.width, esize.height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
  CVPixelBufferRef pixel_buffer = NULL;
  CVPixelBufferCreateWithBytes(NULL, esize.width, esize.height, kCVPixelFormatType_32BGRA, buffer, 4 * esize.width, NULL, 0, NULL, &pixel_buffer);

  /* DON'T FREE buffer BEFORE USING pixel_buffer! */
  //free(buffer);

  if (![adaptor appendPixelBuffer:pixel_buffer withPresentationTime:currentTime]) {
    NSLog(@"FAIL");
  } else {
    NSLog(@"Success:%d", currentFrame);
    currentTime = CMTimeAdd(currentTime, frameLength);
  }

  free(buffer);
  CVPixelBufferRelease(pixel_buffer);

  currentFrame++;

  if (currentFrame > MAX_FRAMES) {
    VIDEO_WRITER_IS_READY = false;
    [writerInput markAsFinished];
    [videoWriter finishWriting];
    [videoWriter release];

    [self moveVideoToSavedPhotos];
  }
}
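
As an aside, CVPixelBufferCreateWithBytes can also be handed a release callback, so Core Video frees the malloc'd bytes itself once the buffer is done with them, instead of freeing them by hand after the append. A minimal sketch (the callback name here is just illustrative):

// Called by Core Video once the pixel buffer no longer needs the bytes.
static void releaseGLPixels(void *releaseRefCon, const void *baseAddress) {
  free((void *)baseAddress);
}

// Create the buffer with the callback, and drop the manual free(buffer):
CVPixelBufferCreateWithBytes(NULL, esize.width, esize.height, kCVPixelFormatType_32BGRA,
                             buffer, 4 * esize.width, releaseGLPixels, NULL, NULL, &pixel_buffer);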

And finally, I move the video to the camera roll:

- (void) moveVideoToSavedPhotos {
  ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
  NSString *localVid = [NSHomeDirectory() stringByAppendingPathComponent:MOVIE_NAME];    
  NSURL* fileURL = [NSURL fileURLWithPath:localVid];

  [library writeVideoAtPathToSavedPhotosAlbum:fileURL
                              completionBlock:^(NSURL *assetURL, NSError *error) {
                                if (error) {   
                                  NSLog(@"%@: Error saving context: %@", [self class], [error localizedDescription]);
                                }
                              }];
  [library release];
}

However, as I said, I am crashing in the call to appendPixelBuffer.

Sorry for sending so much code, but I really don't know what I am doing wrong. It seemed like it would be trivial to update a project that writes images to a video, but I am unable to take the pixel buffer I create via glReadPixels and append it. It's driving me crazy! If anyone has any advice or a working code example of OpenGL --> video, that would be amazing... Thanks!

1 Reply


I just got something similar to this working in my open source GPUImage framework, based on the above code, so I thought I'd provide my working solution to this. In my case, I was able to use a pixel buffer pool, as suggested by Srikumar, instead of the manually created pixel buffers for each frame.

I first configure the movie to be recorded:

NSError *error = nil;

assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL fileType:AVFileTypeAppleM4V error:&error];
if (error != nil)
{
    NSLog(@"Error: %@", error);
}


NSMutableDictionary * outputSettings = [[NSMutableDictionary alloc] init];
[outputSettings setObject: AVVideoCodecH264 forKey: AVVideoCodecKey];
[outputSettings setObject: [NSNumber numberWithInt: videoSize.width] forKey: AVVideoWidthKey];
[outputSettings setObject: [NSNumber numberWithInt: videoSize.height] forKey: AVVideoHeightKey];


assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
assetWriterVideoInput.expectsMediaDataInRealTime = YES;

// You need to use BGRA for the video in order to get realtime encoding. I use a color-swizzling shader to line up glReadPixels' normal RGBA output with the movie input's BGRA.
NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
                                                       [NSNumber numberWithInt:videoSize.width], kCVPixelBufferWidthKey,
                                                       [NSNumber numberWithInt:videoSize.height], kCVPixelBufferHeightKey,
                                                       nil];

assetWriterPixelBufferInput = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:assetWriterVideoInput sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];

[assetWriter addInput:assetWriterVideoInput];
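
Before the first frame is appended, the writer also has to be started and a session begun at the source time. That happens once when recording begins, with something like the following (startTime here is the NSDate used as the reference point for the frame timestamps below):

[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];
// Reference date for computing each frame's presentation time
startTime = [[NSDate date] retain];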

I then use this code to grab each rendered frame using glReadPixels():

CVPixelBufferRef pixel_buffer = NULL;

CVReturn status = CVPixelBufferPoolCreatePixelBuffer (NULL, [assetWriterPixelBufferInput pixelBufferPool], &pixel_buffer);
if ((pixel_buffer == NULL) || (status != kCVReturnSuccess))
{
    return;
}
else
{
    CVPixelBufferLockBaseAddress(pixel_buffer, 0);
    GLubyte *pixelBufferData = (GLubyte *)CVPixelBufferGetBaseAddress(pixel_buffer);
    glReadPixels(0, 0, videoSize.width, videoSize.height, GL_RGBA, GL_UNSIGNED_BYTE, pixelBufferData);
}

// May need to add a check here, because if two consecutive times with the same value are added to the movie, it aborts recording
CMTime currentTime = CMTimeMakeWithSeconds([[NSDate date] timeIntervalSinceDate:startTime],120);

if(![assetWriterPixelBufferInput appendPixelBuffer:pixel_buffer withPresentationTime:currentTime]) 
{
    NSLog(@"Problem appending pixel buffer at time: %lld", currentTime.value);
} 
else 
{
//        NSLog(@"Recorded pixel buffer at time: %lld", currentTime.value);
}
CVPixelBufferUnlockBaseAddress(pixel_buffer, 0);

CVPixelBufferRelease(pixel_buffer);

One thing I noticed is that if I tried to append two pixel buffers with the same integer time value (in the basis provided), the entire recording would fail and the input would never take another pixel buffer. Similarly, if I tried to append a pixel buffer after retrieval from the pool failed, it would abort the recording. Thus, the early bailout in the code above.
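
A minimal sketch of guarding against the duplicate-timestamp case (previousFrameTime is an illustrative ivar here, not part of the code above; assume it starts at kCMTimeNegativeInfinity):

CMTime frameTime = CMTimeMakeWithSeconds([[NSDate date] timeIntervalSinceDate:startTime], 120);
// Appending two buffers with the same time value aborts the recording,
// so drop this frame rather than append at a repeated timestamp.
if (CMTimeCompare(frameTime, previousFrameTime) == 0)
{
    CVPixelBufferRelease(pixel_buffer);
    return;
}
previousFrameTime = frameTime;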

In addition to the above code, I use a color-swizzling shader to convert the RGBA rendering in my OpenGL ES scene to BGRA for fast encoding by the AVAssetWriter. With this, I'm able to record 640x480 video at 30 FPS on an iPhone 4.
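
The swizzle itself comes down to reordering the channels when the scene is drawn into the texture that gets read back. A minimal sketch of such a fragment shader (the varying/uniform names are illustrative, not the exact GPUImage source):

static NSString *const kColorSwizzlingFragmentShader = @"\
varying highp vec2 textureCoordinate;\n\
uniform sampler2D inputImageTexture;\n\
\n\
void main()\n\
{\n\
    // Output the sampled RGBA color with channels reordered to BGRA, so that\n\
    // glReadPixels' GL_RGBA output lines up with the 32BGRA pixel buffer.\n\
    gl_FragColor = texture2D(inputImageTexture, textureCoordinate).bgra;\n\
}";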

Again, all of the code for this can be found within the GPUImage repository, under the GPUImageMovieWriter class.

