Integrate a camera experience into your SwiftUI apps.
- iOS 17+
- macOS 14+
Important
Make sure you have added `NSCameraUsageDescription` to your app's Info.plist to access the camera.
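Beyond the Info.plist key, it is often worth verifying authorization before starting the session. Here is a minimal sketch using the standard AVFoundation permission APIs (this is plain AVFoundation, not part of Aperture):

```swift
import AVFoundation

/// Returns `true` once the user has granted camera access,
/// prompting them if they haven't decided yet.
func ensureCameraAccess() async -> Bool {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        return true
    case .notDetermined:
        // Presents the system prompt, which displays your
        // NSCameraUsageDescription string.
        return await AVCaptureDevice.requestAccess(for: .video)
    default:
        // .denied or .restricted — capture isn't available.
        return false
    }
}
```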
Add Aperture as a dependency in your Swift Package Manager manifest.
```swift
.package(url: "https://github.com/LiYanan2004/Aperture.git", branch: "main"),
```

Include Aperture in any targets that need it.
```swift
.target(
    name: "MyTarget",
    dependencies: [
        .product(name: "Aperture", package: "Aperture"),
    ]
),
```

Create a camera, show the view finder, and place a shutter button at the bottom of the view.
```swift
import Aperture
import SwiftUI

struct CameraScreen: View {
    @State private var camera = Camera(device: .builtInCamera, profile: .photo())

    var body: some View {
        ZStack {
            CameraViewFinder(camera: camera)
                .ignoresSafeArea()
            CameraShutterButton(camera: camera) { photo in
                // Handle the captured photo here.
            }
            .padding()
        }
        .task {
            try? await camera.startRunning()
        }
        .onDisappear { camera.stopRunning() }
    }
}
```

Define the view in portrait mode and delegate layout adjustments for other interface orientations to the adaptive stack.
```swift
CameraAdaptiveStack(camera: camera, spacing: 20) { proxy in
    CameraViewFinder(camera: camera, videoGravity: .fill)
        .ignoresSafeArea()
    CameraAccessoryContainer(proxy: proxy, spacing: 0) {
        CameraShutterButton(camera: camera) { photo in
            // Handle the captured photo here.
        }
    } trailingAccessories: {
        CameraFlipButton(camera: camera)
            .padding(.vertical, 12)
    }
}
```

This produces the user interface shown in the poster image.
`CameraCaptureProfile` defines the session preset and which output services are attached.
```swift
let profile = CameraCaptureProfile(sessionPreset: .photo) {
    PhotoCaptureService(options: [.zeroShutterLag, .responsiveCapture])
}
let camera = Camera(device: .builtInCamera, profile: profile)
```

You can update output services at any time, but keep in mind that this triggers a lengthy capture pipeline reconfiguration.
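For example, you might attach a different set of output services when the user switches capture modes. The sketch below assumes a settable `profile` property on `Camera`; that property is a hypothetical stand-in for whatever mutation API the library actually exposes:

```swift
// Hypothetical sketch: assumes `Camera` exposes a settable `profile` property.
// Swapping the profile rebuilds the capture pipeline, which is slow, so avoid
// doing it more often than necessary.
func switchToPlainPhotoCapture(on camera: Camera) {
    camera.profile = CameraCaptureProfile(sessionPreset: .photo) {
        PhotoCaptureService()
    }
}
```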
Use `PhotoCaptureConfiguration` for per-shot settings such as Live Photo and resolution.
Using `CameraShutterButton` is highly recommended, but if you need more customization, you can call `takePhoto(configuration:)` yourself whenever you need.
```swift
let configuration = PhotoCaptureConfiguration(
    capturesLivePhoto: true
)

CameraShutterButton(camera: camera, configuration: configuration) { photo in
    // Handle the captured photo here.
}

// or...
let photo = try await camera.takePhoto(configuration: configuration)
```

Opt in to `PhotoCaptureOptions.deliversDepthData` to get embedded depth data and a Portrait Effects Matte in the photo data.
```swift
let camera = Camera(device: .builtInCamera, profile: .photo(options: .deliversDepthData))
let photo = try await camera.takePhoto(configuration: PhotoCaptureConfiguration())
```

Enable ProRAW on the output and opt in per shot.
```swift
let camera = Camera(device: .builtInCamera, profile: .photo(options: .appleProRAW))
let configuration = PhotoCaptureConfiguration(dataFormat: .raw)
let photo = try await camera.takePhoto(configuration: configuration)
```

Use `.rawPlusHEIF` or `.rawPlusJPEG` if you want a processed companion image.
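If you want both at once, the same per-shot configuration applies; this sketch just substitutes the `.rawPlusHEIF` case named above into the `dataFormat` parameter from the ProRAW example:

```swift
// Capture ProRAW alongside a processed HEIF companion image.
// `.rawPlusHEIF` and `dataFormat:` are taken from this guide; everything
// else mirrors the ProRAW example above.
let configuration = PhotoCaptureConfiguration(dataFormat: .rawPlusHEIF)
let photo = try await camera.takePhoto(configuration: configuration)
```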
Implement `OutputService` when you need a custom capture output.
```swift
import AVFoundation
import Aperture

struct MetadataOutputService: OutputService {
    func makeOutput(context: Context) -> AVCaptureMetadataOutput {
        AVCaptureMetadataOutput()
    }

    func updateOutput(output: AVCaptureMetadataOutput, context: Context) {
        // Configure the metadata output here, e.g. its metadata object types.
    }
}
```
```swift
let profile = CameraCaptureProfile(sessionPreset: .photo) {
    PhotoCaptureService()
    MetadataOutputService()
}
```
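To use the custom service, hand the profile to a `Camera` exactly as with the built-in services; this reuses the `Camera(device:profile:)` initializer shown earlier:

```swift
// Build a camera whose pipeline includes both photo capture and the
// custom metadata output declared above.
let camera = Camera(device: .builtInCamera, profile: profile)
```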