
ios - Does GPUImageMovie not support an alpha channel?

I create a video effect with GPUImage like this:

self.overlayerView = [[GPUImageView alloc] init];
self.overlayerView.frame = self.view.frame;

dispatch_queue_t queue = dispatch_queue_create("queue", NULL);
dispatch_async(queue, ^{
    
    // Source (background) video.
    NSURL *sourceURL = [[NSBundle mainBundle] URLForResource:@"212121" withExtension:@"mp4"];
    GPUImageMovie *sourceMovie = [[GPUImageMovie alloc] initWithURL:sourceURL];
    sourceMovie.playAtActualSpeed = YES;
    sourceMovie.shouldRepeat = YES;
    
    sourceMovie.shouldIgnoreUpdatesToThisTarget = YES;
    
    // Rose effect video (RGB only, rendered over black).
    NSURL *maskURL = [[NSBundle mainBundle] URLForResource:@"rose" withExtension:@"mp4"];
    GPUImageMovie *maskMovie = [[GPUImageMovie alloc] initWithURL:maskURL];
    maskMovie.playAtActualSpeed = YES;
    maskMovie.shouldRepeat = YES;
    
    // Separate grayscale matte video that carries the rose's alpha.
    NSURL *alphaURL = [[NSBundle mainBundle] URLForResource:@"rose_alpha" withExtension:@"mp4"];
    GPUImageMovie *alphaMovie = [[GPUImageMovie alloc] initWithURL:alphaURL];
    alphaMovie.playAtActualSpeed = YES;
    alphaMovie.shouldRepeat = YES;
    
    // "Screen" overlay video blended on top at the end.
    NSURL *topURL = [[NSBundle mainBundle] URLForResource:@"screen" withExtension:@"mp4"];
    GPUImageMovie *topMovie = [[GPUImageMovie alloc] initWithURL:topURL];
    topMovie.playAtActualSpeed = YES;
    topMovie.shouldRepeat = YES;
    
    // filter0: composite the rose video over the source video, using the
    // separate alpha-matte video as the mask.
    filter0 = [[GPUImageThreeInputFilter alloc] initWithFragmentShaderFromString:
        @"precision highp float;\n"
         "uniform sampler2D inputImageTexture;   // source video\n"
         "uniform sampler2D inputImageTexture2;  // rose (mask) video\n"
         "uniform sampler2D inputImageTexture3;  // alpha-matte video\n"
         "varying vec2 textureCoordinate;\n"
         "void main()\n"
         "{\n"
         "    vec4 video = texture2D(inputImageTexture, textureCoordinate);\n"
         "    vec4 mv    = texture2D(inputImageTexture2, textureCoordinate);\n"
         "    vec4 alpha = texture2D(inputImageTexture3, textureCoordinate);\n"
         "    gl_FragColor = video * (1.0 - alpha.r) + mv;\n"
         "}"];

    // filter1: screen-blend the composited result with the "screen" overlay video.
    filter1 = [[GPUImageTwoInputFilter alloc] initWithFragmentShaderFromString:
        @"precision highp float;\n"
         "uniform sampler2D inputImageTexture;   // composited video\n"
         "uniform sampler2D inputImageTexture2;  // screen overlay\n"
         "varying vec2 textureCoordinate;\n"
         "void main()\n"
         "{\n"
         "    vec4 video  = texture2D(inputImageTexture, textureCoordinate);\n"
         "    vec4 screen = texture2D(inputImageTexture2, textureCoordinate);\n"
         "    mediump vec4 whiteColor = vec4(1.0);\n"
         "    gl_FragColor = whiteColor - ((whiteColor - screen) * (whiteColor - video));\n"
         "}"];


    [sourceMovie addTarget:filter0];
    [maskMovie addTarget:filter0];
    [alphaMovie addTarget:filter0];
    [filter0 addTarget:filter1];
    [topMovie addTarget:filter1];

    [sourceMovie startProcessing];
    [alphaMovie startProcessing];
    [maskMovie startProcessing];
    [topMovie startProcessing];

    [filter0 forceProcessingAtSize:CGSizeMake(480,480)];
    [filter1 forceProcessingAtSize:CGSizeMake(480,480)];
    
    dispatch_async(dispatch_get_main_queue(), ^{
        [filter1 addTarget:self.overlayerView];
    });
});

The code runs and I get the video effect shown in the first screenshot.

The video has a black background. Is that because alphaMovie does not play at exactly the same time as maskMovie?

This is what I want to create (second screenshot): the same effect, but without the black background.

Questions:

1. How can I remove the black background?

2. Why does the effect video have a black background?



1 Reply


The GPUImage framework does not contain support for an alpha-channel feature like that. There is a green-screen feature, so if you pre-produce your video against a green screen it is possible to separate the video from the green background. But what you describe here is an alpha-channel video plus a second video, and that is not going to work properly, because you are pulling from two different video sources at the same time and they will not stay in sync.

Note that even with the green-screen feature there are issues at the exact edges, as described in this blog post (includes source code). The basic problem is that edges that are near green, but not exactly green, can be treated in odd ways by the filter ramp.

Another approach you might want to think about is to precompose the N layers down to a single video before attempting to play it back with the iOS video logic.
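For the green-screen route, here is a minimal sketch using GPUImageChromaKeyBlendFilter, which is documented as selectively replacing a color in the first input with the second input. It assumes the rose effect has been re-rendered over solid green as a hypothetical green_overlay.mp4, and it reuses the source movie and the overlayerView from the question; the key color, thresholdSensitivity and smoothing values are placeholders to tune for your footage.

// Background: the original source video from the question.
NSURL *bgURL = [[NSBundle mainBundle] URLForResource:@"212121" withExtension:@"mp4"];
GPUImageMovie *backgroundMovie = [[GPUImageMovie alloc] initWithURL:bgURL];
backgroundMovie.playAtActualSpeed = YES;
backgroundMovie.shouldRepeat = YES;

// Hypothetical asset: the rose effect pre-rendered on a solid green background.
NSURL *overlayURL = [[NSBundle mainBundle] URLForResource:@"green_overlay" withExtension:@"mp4"];
GPUImageMovie *overlayMovie = [[GPUImageMovie alloc] initWithURL:overlayURL];
overlayMovie.playAtActualSpeed = YES;
overlayMovie.shouldRepeat = YES;

GPUImageChromaKeyBlendFilter *chromaBlend = [[GPUImageChromaKeyBlendFilter alloc] init];
[chromaBlend setColorToReplaceRed:0.0 green:1.0 blue:0.0]; // key out pure green
chromaBlend.thresholdSensitivity = 0.4;                    // placeholder; tune per footage
chromaBlend.smoothing = 0.1;                               // placeholder; softens the edge ramp

// The first target supplies the keyed (green) footage; the second shows through where the key matches.
[overlayMovie addTarget:chromaBlend];
[backgroundMovie addTarget:chromaBlend];
[chromaBlend addTarget:self.overlayerView];

[backgroundMovie startProcessing];
[overlayMovie startProcessing];

Even with this setup, the near-green edge pixels mentioned above are the reason thresholdSensitivity and smoothing need per-clip tuning.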

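For the precompose route, one sketch is to run the filter0/filter1 chain from the question offline into a GPUImageMovieWriter and then play the single flattened file back normally. The output path and size below are placeholders, and the source movies would need shouldRepeat = NO (and, for faster-than-realtime rendering, playAtActualSpeed = NO) so that they actually finish.

// Write the final blend to one flattened movie instead of the on-screen view.
NSString *outputPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"composited.mp4"];
[[NSFileManager defaultManager] removeItemAtPath:outputPath error:nil];

GPUImageMovieWriter *movieWriter =
    [[GPUImageMovieWriter alloc] initWithMovieURL:[NSURL fileURLWithPath:outputPath]
                                             size:CGSizeMake(480.0, 480.0)];
[filter1 addTarget:movieWriter]; // filter1 retains the writer while it is a target

[movieWriter startRecording];
[sourceMovie startProcessing];
[maskMovie startProcessing];
[alphaMovie startProcessing];
[topMovie startProcessing];

__weak GPUImageMovieWriter *weakWriter = movieWriter;
[movieWriter setCompletionBlock:^{
    [filter1 removeTarget:weakWriter];
    [weakWriter finishRecording];
    // composited.mp4 now holds a single pre-rendered video you can play with AVPlayer
    // or one GPUImageMovie, with no multi-source synchronization involved at playback.
}];

Driving one writer from four separate readers still has the synchronization caveat described above, so pre-rendering the overlay in a desktop video editor may end up being the more robust option.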
