android - Using MediaCodec to save series of images as Video

I am trying to use MediaCodec to save a series of images, saved as byte arrays in a file, to a video file. I have tested these images on a SurfaceView (playing them in series) and I can see them fine. I have looked at many examples using MediaCodec, and here is what I understand (please correct me if I am wrong):

Get the input buffers from the MediaCodec object -> fill one with your frame's image data -> queue the input buffer -> get the encoded output buffer -> write it to a file -> increase the presentation time and repeat

However, I have tested this a lot and I end up with one of two cases:

  • All the sample projects I tried to imitate caused the media server to die when queueInputBuffer was called for the second time.
  • I tried calling codec.flush() at the end (after saving the output buffer to the file, although none of the examples I saw did this) and the media server did not die; however, I am not able to open the output video file with any media player, so something is wrong.

Here is my code:

    MediaCodec codec = MediaCodec.createEncoderByType(MIMETYPE);
    MediaFormat mediaFormat = null;
    if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_720P)) {
        mediaFormat = MediaFormat.createVideoFormat(MIMETYPE, 1280, 720);
    } else {
        mediaFormat = MediaFormat.createVideoFormat(MIMETYPE, 720, 480);
    }

    mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 700000);
    mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 10);
    mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
    mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
    codec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

    codec.start();

    ByteBuffer[] inputBuffers = codec.getInputBuffers();
    ByteBuffer[] outputBuffers = codec.getOutputBuffers();
    boolean sawInputEOS = false;
    int inputBufferIndex = -1, outputBufferIndex = -1;
    BufferInfo info = null;

    // loop to read YUV byte arrays from the file: each pass reads one frame into
    // `dat` and sets `bytesread` (<= 0 at end of file)
    while (true) {

        inputBufferIndex = codec.dequeueInputBuffer(WAITTIME);
        if (bytesread <= 0) sawInputEOS = true;

        if (inputBufferIndex >= 0) {
            if (!sawInputEOS) {
                int samplesiz = dat.length;
                inputBuffers[inputBufferIndex].put(dat);
                codec.queueInputBuffer(inputBufferIndex, 0, samplesiz, presentationTime, 0);
                presentationTime += 100;

                info = new BufferInfo();
                outputBufferIndex = codec.dequeueOutputBuffer(info, WAITTIME);
                Log.i("BATA", "outputBufferIndex=" + outputBufferIndex);
                if (outputBufferIndex >= 0) {
                    byte[] array = new byte[info.size];
                    outputBuffers[outputBufferIndex].get(array);

                    if (array != null) {
                        try {
                            dos.write(array);
                        } catch (IOException e) {
                            e.printStackTrace();
                        }
                    }

                    codec.releaseOutputBuffer(outputBufferIndex, false);
                    inputBuffers[inputBufferIndex].clear();
                    outputBuffers[outputBufferIndex].clear();

                    if (sawInputEOS) break;
                }
            } else {
                // end of input: queue an empty buffer flagged with EOS
                codec.queueInputBuffer(inputBufferIndex, 0, 0, presentationTime,
                        MediaCodec.BUFFER_FLAG_END_OF_STREAM);

                info = new BufferInfo();
                outputBufferIndex = codec.dequeueOutputBuffer(info, WAITTIME);

                if (outputBufferIndex >= 0) {
                    byte[] array = new byte[info.size];
                    outputBuffers[outputBufferIndex].get(array);

                    if (array != null) {
                        try {
                            dos.write(array);
                        } catch (IOException e) {
                            e.printStackTrace();
                        }
                    }

                    codec.releaseOutputBuffer(outputBufferIndex, false);
                    inputBuffers[inputBufferIndex].clear();
                    outputBuffers[outputBufferIndex].clear();
                    break;
                }
            }
        }
    }

    codec.flush();

    try {
        fstream2.close();
        dos.flush();
        dos.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
    codec.stop();
    codec.release();
    codec = null;

    return true;
}

My question is: how can I get a working video from a stream of images using MediaCodec? What am I doing wrong?

Another question (if I am not being too greedy): I would like to add an audio track to this video. Can that be done with MediaCodec as well, or must I use FFmpeg?

Note: I know about MediaMuxer in Android 4.3; however, it is not an option for me because my app must work on Android 4.1+.

Update: Thanks to fadden's answer, I was able to reach EOS without the media server dying (the code above is after this modification). However, the file I am getting produces gibberish. Here is a snapshot of the video I get (it only plays as a raw .h264 file).

[screenshot: Video Output]

My input image format is YUV (NV21 from the camera preview). I can't get the output into any playable format; I tried all of the COLOR_FormatYUV420 formats and got the same gibberish output. And I still can't find a way (using MediaCodec) to add audio.


1 Reply

I think you have the right general idea. Some things to be aware of:

  • Not all devices support COLOR_FormatYUV420SemiPlanar. Some only accept planar. (Android 4.3 introduced CTS tests to ensure that the AVC codec supports one or the other.)
  • It's not the case that queueing an input buffer will immediately result in the generation of one output buffer. Some codecs may accumulate several frames of input before producing output, and may produce output after your input has finished. Make sure your loops take that into account (e.g. your inputBuffers[].clear() will blow up if it's still -1).
  • Don't try to submit data and send EOS with the same queueInputBuffer call. The data in that frame may be discarded. Always send EOS with a zero-length buffer (a minimal loop along these lines is sketched after this list).
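
Roughly, a loop that takes those points into account might look like the sketch below (untested, ByteBuffer-array style since you're targeting 4.1; readFrame(), WAITTIME and FRAME_RATE are placeholders for your own frame source, timeout and frame rate):

    // Sketch only (API 16 ByteBuffer-array style). Feed input and drain output
    // independently, and send EOS as a zero-length buffer.
    private void encodeAllFrames(MediaCodec codec, OutputStream dos) throws IOException {
        ByteBuffer[] inputBuffers = codec.getInputBuffers();
        ByteBuffer[] outputBuffers = codec.getOutputBuffers();
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        long presentationTimeUs = 0;
        boolean inputDone = false;
        boolean outputDone = false;

        while (!outputDone) {
            if (!inputDone) {
                int inIndex = codec.dequeueInputBuffer(WAITTIME);
                if (inIndex >= 0) {
                    byte[] frame = readFrame();  // placeholder: next YUV frame, null at end of file
                    if (frame == null) {
                        // EOS goes in an empty buffer, never together with real frame data
                        codec.queueInputBuffer(inIndex, 0, 0, presentationTimeUs,
                                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        inputDone = true;
                    } else {
                        inputBuffers[inIndex].clear();
                        inputBuffers[inIndex].put(frame);
                        codec.queueInputBuffer(inIndex, 0, frame.length, presentationTimeUs, 0);
                        presentationTimeUs += 1000000L / FRAME_RATE;  // timestamps are in microseconds
                    }
                }
            }

            // Drain whatever is ready; the encoder may return nothing for the first
            // few frames, and may keep producing output after the last input frame.
            int outIndex = codec.dequeueOutputBuffer(info, WAITTIME);
            if (outIndex >= 0) {
                byte[] chunk = new byte[info.size];
                outputBuffers[outIndex].position(info.offset);
                outputBuffers[outIndex].limit(info.offset + info.size);
                outputBuffers[outIndex].get(chunk);
                dos.write(chunk);  // still a raw H.264 elementary stream, not an .mp4
                codec.releaseOutputBuffer(outIndex, false);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    outputDone = true;
                }
            } else if (outIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                outputBuffers = codec.getOutputBuffers();
            }
            // INFO_TRY_AGAIN_LATER / INFO_OUTPUT_FORMAT_CHANGED: nothing to write this pass
        }
    }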

The output of the codecs is generally pretty "raw", e.g. the AVC codec emits an H.264 elementary stream rather than a "cooked" .mp4 file. Many players won't accept this format. If you can't rely on the presence of MediaMuxer you will need to find another way to cook the data (search around on stackoverflow for ideas).
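
For reference only, since it isn't available on 4.1: where MediaMuxer does exist (API 18+), the "cooking" step is just a thin wrapper around the same drain loop. A fragmentary sketch, with outputPath, info, outputBuffers and the surrounding loop assumed from the sketch above:

    // Fragmentary sketch, API 18+ only: wrap the encoder output in an .mp4
    // container with MediaMuxer. On 4.1/4.2 a third-party muxer is needed instead.
    MediaMuxer muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    int trackIndex = -1;
    boolean muxerStarted = false;

    // ...inside the drain loop, replacing the raw dos.write():
    int outIndex = codec.dequeueOutputBuffer(info, WAITTIME);
    if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
        trackIndex = muxer.addTrack(codec.getOutputFormat());  // carries csd-0/csd-1 (SPS/PPS)
        muxer.start();
        muxerStarted = true;
    } else if (outIndex >= 0) {
        if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
            info.size = 0;  // config data is already in the track format
        }
        if (info.size > 0 && muxerStarted) {
            outputBuffers[outIndex].position(info.offset);
            outputBuffers[outIndex].limit(info.offset + info.size);
            muxer.writeSampleData(trackIndex, outputBuffers[outIndex], info);
        }
        codec.releaseOutputBuffer(outIndex, false);
    }

    // ...after the EOS flag has been seen on the output side:
    muxer.stop();
    muxer.release();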

It's certainly not expected that the mediaserver process would crash.

You can find some examples and links to the 4.3 CTS tests here.

Update: As of Android 4.3, MediaCodec and Camera have no ByteBuffer formats in common, so at the very least you will need to fiddle with the chroma planes. However, that sort of problem manifests very differently (as shown in the images for this question).

The image you added looks like video, but with stride and/or alignment issues. Make sure your pixels are laid out correctly. In the CTS EncodeDecodeTest, the generateFrame() method (line 906) shows how to encode both planar and semi-planar YUV420 for MediaCodec.
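
As a concrete illustration: if the camera hands you NV21 and the encoder wants COLOR_FormatYUV420SemiPlanar (essentially NV12), the Y plane is unchanged and only the interleaved chroma bytes swap places. A rough sketch, assuming no row padding or stride (exactly the kind of per-device detail that tends to break this):

    // Sketch: NV21 (Y plane + interleaved V/U chroma) -> NV12 (Y plane + interleaved U/V),
    // i.e. what COLOR_FormatYUV420SemiPlanar usually expects. Assumes no row padding;
    // real devices may add stride/alignment that also has to be handled.
    static byte[] nv21ToNv12(byte[] nv21, int width, int height) {
        byte[] nv12 = new byte[nv21.length];
        int ySize = width * height;
        System.arraycopy(nv21, 0, nv12, 0, ySize);   // Y plane is identical
        for (int i = ySize; i + 1 < nv21.length; i += 2) {
            nv12[i] = nv21[i + 1];                   // U (stored after V in NV21)
            nv12[i + 1] = nv21[i];                   // V
        }
        return nv12;
    }

For the fully planar formats the chroma samples have to be de-interleaved into separate U and V planes instead; the two code paths in generateFrame() show both layouts.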

The easiest way to avoid the format issues is to move the frames through a Surface (like the CameraToMpegTest sample), but unfortunately that's not possible in Android 4.1.

