You'll need at least the iOS 11.1 SDK in Xcode 9.1 (both in beta as of this writing). With that, `builtInTrueDepthCamera` becomes one of the camera types you use to select a capture device:
```swift
import AVFoundation

let device = AVCaptureDevice.default(.builtInTrueDepthCamera, for: .video, position: .front)
```
Then you can go on to set up an `AVCaptureSession` with the TrueDepth camera device, and use that capture session to capture depth information much as you can with the back dual camera on iPhone 7 Plus and 8 Plus.
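Here's a minimal sketch of that setup, continuing from the `device` selected above (force-unwrapping and error handling are omitted for brevity):

```swift
// Minimal session setup with the TrueDepth camera device from above.
// (Force-unwrapped for brevity; real code should handle nil and throws.)
let session = AVCaptureSession()
session.beginConfiguration()
let input = try! AVCaptureDeviceInput(device: device!)
session.addInput(input)
session.commitConfiguration()
session.startRunning()
```

From there, you have two main ways to get depth information: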
- Turn on depth capture for photos with `AVCapturePhotoOutput.isDepthDataDeliveryEnabled`, then snap a picture with `AVCapturePhotoSettings.isDepthDataDeliveryEnabled`. You can read the `depthData` from the `AVCapturePhoto` object you receive after the capture, or turn on `embedsDepthDataInPhoto` if you just want to fire and forget (and read the data from the captured image file later). There's a sketch of this flow after the list.
- Get a live feed of depth maps with `AVCaptureDepthDataOutput`. That one is like the video data output; instead of recording directly to a movie file, it gives your delegate a timed sequence of image (or in this case, depth) buffers. If you're also capturing video at the same time, `AVCaptureDataOutputSynchronizer` might be handy for making sure you get coordinated depth maps and color frames together; the second sketch after the list shows that.
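Here's a sketch of the photo route. The session setup is repeated so the snippet stands alone, and names like `PhotoDepthCapture` are mine for illustration, not Apple's:

```swift
import AVFoundation

class PhotoDepthCapture: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()

    func configure() {
        session.beginConfiguration()
        guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                                   for: .video, position: .front),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input), session.canAddOutput(photoOutput)
        else { return }
        session.addInput(input)
        session.addOutput(photoOutput)
        // Depth delivery must be enabled on the output before you can
        // request it in per-capture settings.
        photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported
        session.commitConfiguration()
        session.startRunning()
    }

    func takePhoto() {
        let settings = AVCapturePhotoSettings()
        settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
        settings.embedsDepthDataInPhoto = true // optional: also write depth into the file
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        if let depthData = photo.depthData {
            // AVDepthData: a depth (or disparity) map plus calibration info.
            print(depthData.depthDataMap)
        }
    }
}
```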
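And a sketch of the live-feed route, pairing each depth map with its video frame via the synchronizer (again, the class name is illustrative):

```swift
import AVFoundation

class DepthStreamCapture: NSObject, AVCaptureDataOutputSynchronizerDelegate {
    let session = AVCaptureSession()
    let videoOutput = AVCaptureVideoDataOutput()
    let depthOutput = AVCaptureDepthDataOutput()
    var synchronizer: AVCaptureDataOutputSynchronizer?

    func configure() {
        session.beginConfiguration()
        guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                                   for: .video, position: .front),
              let input = try? AVCaptureDeviceInput(device: device)
        else { return }
        // Assumes a TrueDepth-equipped device; production code should
        // check canAddInput/canAddOutput before adding.
        session.addInput(input)
        session.addOutput(videoOutput)
        session.addOutput(depthOutput)
        session.commitConfiguration()

        // The synchronizer collects matching video frames and depth maps
        // and delivers them together in one delegate callback.
        let sync = AVCaptureDataOutputSynchronizer(dataOutputs: [videoOutput, depthOutput])
        sync.setDelegate(self, queue: DispatchQueue(label: "depth.sync"))
        synchronizer = sync
        session.startRunning()
    }

    func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer,
                                didOutput collection: AVCaptureSynchronizedDataCollection) {
        if let synced = collection.synchronizedData(for: depthOutput)
            as? AVCaptureSynchronizedDepthData, !synced.depthDataWasDropped {
            // synced.depthData is the AVDepthData for this timestamp.
            print(synced.timestamp, synced.depthData.depthDataMap)
        }
    }
}
```

Without the synchronizer you'd implement each output's own delegate and match timestamps yourself; the synchronizer does that matching for you.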
As Apple's Device Compatibility documentation notes, you need to select the `builtInTrueDepthCamera` device to get any of these depth capture options. If you select the front-facing `builtInWideAngleCamera` instead, it behaves like any other selfie camera, capturing only photo and video.
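If you want to check at runtime whether the TrueDepth camera is available and fall back gracefully on other devices, a discovery session works; this is a sketch, not something from Apple's docs:

```swift
import AVFoundation

// Sketch: prefer the TrueDepth camera, fall back to the ordinary
// front-facing wide-angle camera (which delivers no depth).
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInTrueDepthCamera, .builtInWideAngleCamera],
    mediaType: .video,
    position: .front)
let frontCamera = discovery.devices.first // ordered like the deviceTypes list
let depthCapable = frontCamera?.deviceType == .builtInTrueDepthCamera
```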
Just to emphasize: from an API point of view, capturing depth with the front-facing TrueDepth camera on iPhone X is a lot like capturing depth with the back-facing dual cameras on iPhone 7 Plus and 8 Plus. So if you want a deep dive on how all this depth capture business works in general, and what you can do with captured depth information, check out the WWDC17 Session 507: Capturing Depth in iPhone Photography talk.