[Illustration of what I'm trying to do]
I'm trying to do the following:
- Play music
- Record a square video (I have a container in the view which shows what you are recording)
- Add a label at the top and the app's icon & name in the bottom left of the square video.
Up to this point I've managed to play the music, show the AVCaptureVideoPreviewLayer in a square container in a different view, and save the video to the camera roll.
The thing is, I can barely find even a few vague tutorials on AVFoundation, and this being my first app, that makes things quite hard.
I managed to do these things, but I still don't understand how AVFoundation works. The documentation is vague for a beginner, I haven't found a tutorial for what I specifically want, and stitching together multiple tutorials (most of them written in Obj-C) is making this impossible. My problems are the following:
- The video doesn't get saved as a square. (Note that the app doesn't support landscape orientation.)
- The video has no audio. (I think I need to add some sort of audio input in addition to the video one.)
- How do I add the watermarks to the video?
- I have a bug: I created a view (messageView; see the code) with a text & image letting the user know that the video was saved to the camera roll. But if I start recording a second time, the view appears WHILE the video is recording, not AFTER it was recorded. I suspect it's related to every video being named the same.
So I make the preparations:
```swift
override func viewDidLoad() {
    super.viewDidLoad()

    // Preset for high quality
    captureSession.sessionPreset = AVCaptureSessionPresetHigh

    // Get available devices capable of recording video
    let devices = AVCaptureDevice.devicesWithMediaType(AVMediaTypeVideo) as! [AVCaptureDevice]

    // Get the back camera
    for device in devices {
        if device.position == AVCaptureDevicePosition.Back {
            currentDevice = device
        }
    }

    // Set input
    let captureDeviceInput: AVCaptureDeviceInput
    do {
        captureDeviceInput = try AVCaptureDeviceInput(device: currentDevice)
    } catch {
        print(error)
        return
    }

    // Set output
    videoFileOutput = AVCaptureMovieFileOutput()

    // Configure session with input & output devices
    captureSession.addInput(captureDeviceInput)
    captureSession.addOutput(videoFileOutput)

    // Show camera preview
    cameraPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    view.layer.addSublayer(cameraPreviewLayer!)
    cameraPreviewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
    let width = view.bounds.width * 0.85
    cameraPreviewLayer?.frame = CGRectMake(0, 0, width, width)

    // Bring record button to front
    view.bringSubviewToFront(recordButton)
    captureSession.startRunning()

    // // Bring message to front
    // view.bringSubviewToFront(messageView)
    // view.bringSubviewToFront(messageText)
    // view.bringSubviewToFront(messageImage)
}
```
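For the missing audio, my guess (untested, pieced together from the Obj-C tutorials) is that the session needs a second AVCaptureDeviceInput for the microphone, added right after the video input in viewDidLoad. Something like this, in the same Swift 2 style as the rest of my code:

```swift
// Untested sketch: add the microphone as a second capture input.
// `captureSession` is the same session configured above.
let audioDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeAudio)
do {
    let audioInput = try AVCaptureDeviceInput(device: audioDevice)
    if captureSession.canAddInput(audioInput) {
        captureSession.addInput(audioInput)
    }
} catch {
    print(error)
}
```

If I understand correctly, AVCaptureMovieFileOutput should then record the audio track automatically, but I haven't been able to confirm this.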
Then when I press the record button:
```swift
@IBAction func capture(sender: AnyObject) {
    if !isRecording {
        isRecording = true
        UIView.animateWithDuration(0.5, delay: 0.0, options: [.Repeat, .Autoreverse, .AllowUserInteraction], animations: { () -> Void in
            self.recordButton.transform = CGAffineTransformMakeScale(0.5, 0.5)
        }, completion: nil)

        let outputPath = NSTemporaryDirectory() + "output.mov"
        let outputFileURL = NSURL(fileURLWithPath: outputPath)
        videoFileOutput?.startRecordingToOutputFileURL(outputFileURL, recordingDelegate: self)
    } else {
        isRecording = false
        UIView.animateWithDuration(0.5, delay: 0, options: [], animations: { () -> Void in
            self.recordButton.transform = CGAffineTransformMakeScale(1.0, 1.0)
        }, completion: nil)
        recordButton.layer.removeAllAnimations()
        videoFileOutput?.stopRecording()
    }
}
```
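For the message-appearing-too-early bug, my suspicion is that reusing the same "output.mov" name confuses things, so I was thinking of giving each recording a unique name instead (just an idea of mine, not verified):

```swift
// Untested idea: build a unique file URL per recording, so the previous
// recording's file can't interfere with the current one.
let fileName = "output-\(NSUUID().UUIDString).mov"
let outputPath = (NSTemporaryDirectory() as NSString).stringByAppendingPathComponent(fileName)
let outputFileURL = NSURL(fileURLWithPath: outputPath)
videoFileOutput?.startRecordingToOutputFileURL(outputFileURL, recordingDelegate: self)
```

The delegate callback would then have to use the `outputFileURL` parameter it receives instead of rebuilding the "output.mov" path by hand, I assume.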
And after the video was recorded:
```swift
func captureOutput(captureOutput: AVCaptureFileOutput!, didFinishRecordingToOutputFileAtURL outputFileURL: NSURL!, fromConnections connections: [AnyObject]!, error: NSError!) {
    let outputPath = NSTemporaryDirectory() + "output.mov"
    if UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(outputPath) {
        UISaveVideoAtPathToSavedPhotosAlbum(outputPath, self, nil, nil)

        // Show success message
        UIView.animateWithDuration(0.4, delay: 0, options: [], animations: {
            self.messageView.alpha = 0.8
        }, completion: nil)
        UIView.animateWithDuration(0.4, delay: 0, options: [], animations: {
            self.messageText.alpha = 1.0
        }, completion: nil)
        UIView.animateWithDuration(0.4, delay: 0, options: [], animations: {
            self.messageImage.alpha = 1.0
        }, completion: nil)

        // Hide message
        UIView.animateWithDuration(0.4, delay: 1, options: [], animations: {
            self.messageView.alpha = 0
        }, completion: nil)
        UIView.animateWithDuration(0.4, delay: 1, options: [], animations: {
            self.messageText.alpha = 0
        }, completion: nil)
        UIView.animateWithDuration(0.4, delay: 1, options: [], animations: {
            self.messageImage.alpha = 0
        }, completion: nil)
    }
}
```
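On the square problem, from what I've pieced together the capture always records at the sensor's aspect ratio, and the cropping happens afterwards with an AVMutableVideoComposition plus an export session. This is the rough (untested) Swift 2 sketch I've arrived at; the function name is mine, and I'm not at all sure the transform is right for every orientation:

```swift
// Untested sketch: crop a recorded video to a centered square and export it.
// `sourceURL` is the file AVCaptureMovieFileOutput wrote.
func exportSquareVideo(sourceURL: NSURL, completion: (NSURL?) -> Void) {
    let asset = AVAsset(URL: sourceURL)
    guard let track = asset.tracksWithMediaType(AVMediaTypeVideo).first else {
        completion(nil)
        return
    }

    // The square's side is the smaller dimension of the footage.
    let side = min(track.naturalSize.width, track.naturalSize.height)

    let composition = AVMutableVideoComposition()
    composition.renderSize = CGSizeMake(side, side)
    composition.frameDuration = CMTimeMake(1, 30)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration)

    // Portrait footage is stored rotated; preferredTransform turns it upright,
    // then the translation shifts it so the centre lands inside the square.
    // (This assumes portrait-only recording, like my app.)
    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    let transform = CGAffineTransformConcat(track.preferredTransform,
        CGAffineTransformMakeTranslation(0, -(track.naturalSize.width - side) / 2))
    layerInstruction.setTransform(transform, atTime: kCMTimeZero)

    instruction.layerInstructions = [layerInstruction]
    composition.instructions = [instruction]

    // Export the cropped version next to the original.
    let exportURL = NSURL(fileURLWithPath: NSTemporaryDirectory() + "square.mov")
    let exporter = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetHighestQuality)
    exporter?.videoComposition = composition
    exporter?.outputURL = exportURL
    exporter?.outputFileType = AVFileTypeQuickTimeMovie
    exporter?.exportAsynchronouslyWithCompletionHandler {
        completion(exporter?.status == .Completed ? exportURL : nil)
    }
}
```

If something like this is the right direction, I'd save the exported `square.mov` to the camera roll instead of the raw recording.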
So what do I need to do to fix this? I kept searching and going through tutorials, but I can't figure it out... I read about adding watermarks and saw that it has something to do with adding CALayers on top of the video. But obviously I can't get to that before I even know how to make the video square and add audio.
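For reference, this is as far as I got with the CALayer watermark idea before getting stuck on the other problems. It's an untested sketch: `videoComposition` stands for whatever AVMutableVideoComposition the export would use, `side` for the square's edge length, and the label text and icon name are placeholders of mine:

```swift
// Untested sketch: bake a top label and a bottom-left icon into the export
// via AVVideoCompositionCoreAnimationTool. Layer coordinates here are
// Core Animation style, i.e. y = 0 is the BOTTOM of the video.
let side: CGFloat = 1080
let videoComposition = AVMutableVideoComposition() // placeholder for the real one

let videoLayer = CALayer()
videoLayer.frame = CGRectMake(0, 0, side, side)

let labelLayer = CATextLayer()                 // the label at the top
labelLayer.string = "My caption"               // placeholder text
labelLayer.fontSize = 36
labelLayer.alignmentMode = kCAAlignmentCenter
labelLayer.frame = CGRectMake(0, side - 60, side, 50)

let iconLayer = CALayer()                      // app icon, bottom left
iconLayer.contents = UIImage(named: "AppIcon")?.CGImage
iconLayer.frame = CGRectMake(16, 16, 40, 40)

let parentLayer = CALayer()
parentLayer.frame = CGRectMake(0, 0, side, side)
parentLayer.addSublayer(videoLayer)
parentLayer.addSublayer(labelLayer)
parentLayer.addSublayer(iconLayer)

videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
    postProcessingAsVideoLayer: videoLayer, inLayer: parentLayer)
```

Even if this compiles, I have no idea whether it interacts correctly with the square cropping, which is why I'm asking.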