
ios - AVCapture Session To Capture Image SWIFT

I have created an AVCaptureSession to capture video output and display it to the user via a UIView. Now I want to be able to tap a button (the takePhoto method) and display the image from the session in a UIImageView. I have tried iterating through each device's connections and saving the output, but that hasn't worked. The code I have is below:

let captureSession = AVCaptureSession()
var stillImageOutput: AVCaptureStillImageOutput!

@IBOutlet var imageView: UIImageView!
@IBOutlet var cameraView: UIView!


// If we find a device we'll store it here for later use
var captureDevice : AVCaptureDevice?

override func viewDidLoad() {
    // Do any additional setup after loading the view, typically from a nib.
    super.viewDidLoad()
    println("I AM AT THE CAMERA")
    captureSession.sessionPreset = AVCaptureSessionPresetLow
    self.captureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
    if(captureDevice != nil){
        beginSession()
    }
}
func beginSession() {

    self.stillImageOutput = AVCaptureStillImageOutput()
    self.captureSession.addOutput(self.stillImageOutput)
    var err : NSError? = nil
    self.captureSession.addInput(AVCaptureDeviceInput(device: self.captureDevice, error: &err))

    if err != nil {
        println("error: (err?.localizedDescription)")
    }

    var previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
    self.cameraView.layer.addSublayer(previewLayer)
    previewLayer?.frame = self.cameraView.layer.frame
    captureSession.startRunning()
}

@IBAction func takePhoto(sender: UIButton) {
    self.stillImageOutput.captureStillImageAsynchronouslyFromConnection(self.stillImageOutput.connectionWithMediaType(AVMediaTypeVideo)) { (buffer:CMSampleBuffer!, error:NSError!) -> Void in
        var image = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
        var data_image = UIImage(data: image)
        self.imageView.image = data_image
    }
}
}
See Question&Answers more detail:os

与恶龙缠斗过久,自身亦成为恶龙;凝视深渊过久,深渊将回以凝视…
Welcome To Ask or Share your Answers For Others

1 Reply


You should try adding the inputs and outputs to the session on a separate thread before starting it. Apple's documentation states:

Important: The startRunning method is a blocking call which can take some time, therefore you should perform session setup on a serial queue so that the main queue isn't blocked (which keeps the UI responsive). See AVCam for iOS for the canonical implementation example.
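For reference, the "serial queue" the documentation mentions can be created with plain GCD. This is only a minimal sketch: the queue label and the sessionQueue name are illustrative, not something from the question, and passing nil for the attributes is what makes the queue serial.

// Hypothetical dedicated serial queue for all capture-session work.
// A nil attribute argument creates a serial queue.
let sessionQueue = dispatch_queue_create("session queue", nil)

dispatch_async(sessionQueue, {
    // Configure inputs/outputs and call startRunning() here;
    // startRunning() can block, so keeping it off the main queue keeps the UI responsive.
    self.captureSession.startRunning()
})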

Try using dispatch_async in your session setup method, as shown below:

var err: NSError? = nil

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), {
    // 1. Configure the session off the main queue so the UI stays responsive.
    self.captureSession.addOutput(self.stillImageOutput)
    self.captureSession.addInput(AVCaptureDeviceInput(device: self.captureDevice, error: &err))
    self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto
    if err != nil {
        println("error: \(err?.localizedDescription)")
    }
    var previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
    previewLayer?.frame = self.cameraView.layer.bounds
    previewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill

    dispatch_async(dispatch_get_main_queue(), {
        // 2. Hop back to the main queue for the UI work.
        // 3. Add the preview layer, then start the session.
        self.cameraView.layer.addSublayer(previewLayer)
        self.captureSession.startRunning()
    })
})
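With the session set up this way, the takePhoto handler from the question can stay essentially the same; the one change worth making is dispatching the UIImageView update back to the main queue, since UIKit work belongs there. A sketch reusing the question's stillImageOutput and imageView:

@IBAction func takePhoto(sender: UIButton) {
    let connection = self.stillImageOutput.connectionWithMediaType(AVMediaTypeVideo)
    self.stillImageOutput.captureStillImageAsynchronouslyFromConnection(connection) { (buffer: CMSampleBuffer!, error: NSError!) -> Void in
        if buffer == nil {
            println("capture error: \(error)")
            return
        }
        let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
        let image = UIImage(data: imageData)
        // UIKit must be touched on the main queue, so hop back before updating the image view.
        dispatch_async(dispatch_get_main_queue(), {
            self.imageView.image = image
        })
    }
}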
