
ios - How to apply a filter to video in real time using Swift

Is it possible to apply a filter to an AVLayer and add it to the view with addSublayer? I want to change the colors and add some noise to the video from the camera using Swift, and I don't know how.

I thought it might be possible to add a filterLayer and a previewLayer like this:

self.view.layer.addSublayer(previewLayer)
self.view.layer.addSublayer(filterLayer)

and that this might produce video with my custom filter, but I think it can be done more effectively using AVComposition.

So what I need to know:

  1. What is the simplest way to apply a filter to the camera's video output in real time?
  2. Is it possible to merge an AVCaptureVideoPreviewLayer and a CALayer?

Thanks for any suggestions.

1 Reply


There's another alternative: use an AVCaptureSession to create instances of CIImage to which you can apply CIFilters (of which there are loads, from blurs to color correction to VFX).
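For context, here's a rough sketch of the surrounding class this walkthrough assumes: a view controller conforming to AVCaptureVideoDataOutputSampleBufferDelegate that owns the image view the filtered frames are displayed in. The class and property names here are placeholders rather than anything from the original project:

import UIKit
import AVFoundation
import CoreImage

// Hypothetical host class (Swift 2 syntax). The capture session setup described
// below would live in viewDidLoad, and the captureOutput delegate method receives the frames.
class CameraFilterViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate
{
    let imageView = UIImageView()

    override func viewDidLoad()
    {
        super.viewDidLoad()

        imageView.frame = view.bounds
        view.addSubview(imageView)

        // ... capture session setup steps go here ...
    }
}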

Here's an example using the ComicBook effect. In a nutshell, create an AVCaptureSession:

let captureSession = AVCaptureSession()
captureSession.sessionPreset = AVCaptureSessionPresetPhoto

Create an AVCaptureDevice to represent the camera; here I'm using the default video device, which is the back camera:

let backCamera = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
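Note that defaultDeviceWithMediaType returns an implicitly unwrapped optional and can be nil (in the simulator, for example). A slightly more defensive variant, not in the original answer, would guard against that:

// Guard against running where no video camera is available (e.g. the simulator).
guard let backCamera = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo) else
{
    print("no video camera available")
    return
}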

Then create an AVCaptureDeviceInput from that device and attach it to the session. In Swift 2, instantiating AVCaptureDeviceInput can throw an error, so we need to catch that:

do
{
    let input = try AVCaptureDeviceInput(device: backCamera)

    captureSession.addInput(input)
}
catch
{
    print("can't access camera")
    return
}

Now, here's a little 'gotcha': although we don't actually use the AVCaptureVideoPreviewLayer, it's required to get the sample buffer delegate working, so we create one of those:

// although we don't use this, it's required to get captureOutput invoked
let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)

view.layer.addSublayer(previewLayer)

Next, we create a video output, an AVCaptureVideoDataOutput, which we'll use to access the video feed:

let videoOutput = AVCaptureVideoDataOutput()

Ensuring that self implements AVCaptureVideoDataOutputSampleBufferDelegate, we can set the sample buffer delegate on the video output:

videoOutput.setSampleBufferDelegate(self,
    queue: dispatch_queue_create("sample buffer delegate", DISPATCH_QUEUE_SERIAL))

The video output is then attached to the capture session:

captureSession.addOutput(videoOutput)

...and, finally, we start the capture session:

captureSession.startRunning()
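Not covered in the original answer, but if the capture session is stored as a property of the view controller, you'd typically balance this by stopping it when the screen goes away, along these lines:

override func viewDidDisappear(animated: Bool)
{
    super.viewDidDisappear(animated)

    // Stop the camera when the view is no longer visible.
    captureSession.stopRunning()
}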

Because we've set the delegate, captureOutput will be invoked with each frame capture. It's passed a sample buffer of type CMSampleBuffer, and it takes just two lines of code to convert that data into a CIImage for Core Image to handle:

let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
let cameraImage = CIImage(CVPixelBuffer: pixelBuffer!)

...and that image data is passed to our Comic Book effect which, in turn, is used to populate an image view:

let comicEffect = CIFilter(name: "CIComicEffect")

comicEffect!.setValue(cameraImage, forKey: kCIInputImageKey)

let filteredImage = UIImage(CIImage: comicEffect!.valueForKey(kCIOutputImageKey) as! CIImage)

dispatch_async(dispatch_get_main_queue())
{
    self.imageView.image = filteredImage
}
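Putting those per-frame lines together, here's a sketch of the full delegate callback (Swift 2 signature; imageView is the property assumed in the class sketch above):

func captureOutput(captureOutput: AVCaptureOutput!,
    didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
    fromConnection connection: AVCaptureConnection!)
{
    // Convert the incoming frame to a CIImage...
    let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
    let cameraImage = CIImage(CVPixelBuffer: pixelBuffer!)

    // ...run it through the Core Image filter...
    let comicEffect = CIFilter(name: "CIComicEffect")
    comicEffect!.setValue(cameraImage, forKey: kCIInputImageKey)

    let filteredImage = UIImage(CIImage: comicEffect!.valueForKey(kCIOutputImageKey) as! CIImage)

    // ...and update the UI on the main queue (the delegate runs on our serial queue).
    dispatch_async(dispatch_get_main_queue())
    {
        self.imageView.image = filteredImage
    }
}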

I have the source code for this project available in my GitHub repo here.

