NextLevel is a Swift camera system designed for easy integration, customized media capture, and image streaming in iOS.
| Features | |
|---|---|
| 🎬 | "Vine-like" video clip recording and editing |
| 🖼 | photo capture (RAW, JPEG, and video frame) |
| 👆 | customizable gestural interaction and interface |
| 💠 | ARKit integration (beta) |
| 📷 | dual, wide angle, telephoto, & true depth support |
| 🐢 | adjustable frame rate on supported hardware (i.e., fast/slow motion capture) |
| 🎢 | depth data capture support & portrait effects matte support |
| 🔍 | video zoom |
| ⚖ | white balance, focus, and exposure adjustment |
| 🔦 | flash and torch support |
| 👯 | mirroring support |
| ☀ | low light boost |
| 🕶 | smooth auto-focus |
| ⚙ | configurable encoding and compression settings |
| 🛠 | simple media capture and editing API |
| 🌀 | extensible API for image processing and CV |
| 🐈 | animated GIF creator |
| 😎 | face, QR code, and barcode recognition |
| 🐦 | Swift 6 |
| ⚡ | async/await and modern concurrency support |
| 📖 | structured logging with OSLog |
The library provides powerful camera controls and features for capturing photos and videos, including multi-clip "Vine-like" recording, custom buffer processing, ARKit integration, and extensive device control – all with a simple, intuitive API.
- 🚀 Modern Async/Await API - Native Swift concurrency support with `async`/`await` and `AsyncStream` events
- 🔒 Swift 6 Strict Concurrency - Full thread-safety with Sendable conformance and actor isolation
- 🛡️ Critical Bug Fixes - Fixed AudioChannelLayout crash (#286, #271), photo capture crash (#280), audio interruption handling (#281), and video timing issues (#278)
- 📝 Enhanced Error Messages - Contextual error descriptions with LocalizedError and recovery suggestions
- ⚡ Better Performance - Proper state management and memory handling for long recordings
- 📐 Multi-Clip Recording Improvements - Fixed timestamp offset bugs for seamless clip merging
- 🎯 Configurable Network Optimization - Control shouldOptimizeForNetworkUse for faster local recording (#257)
- 📱 iOS 15+ AsyncStream Events - Modern reactive event system for camera state changes
- 🔙 Backwards Compatible - Legacy delegate-based API still works
- iOS 15.0+ for async/await APIs and modern concurrency features
- Swift 6.0
- Xcode 16.0+
- Looking for a video exporter? Check out NextLevelSessionExporter.
- Looking for a video player? Check out Player.
Add the following to your Package.swift:
```swift
dependencies: [
    .package(url: "https://github.com/NextLevel/NextLevel", from: "0.19.0")
]
```

Or add it directly in Xcode: File → Add Package Dependencies...
pod "NextLevel", "~> 0.19.0"Alternatively, drop the source files into your Xcode project.
The ARKit and True Depth Camera software features are enabled with the inclusion of the Swift compiler flags USE_ARKIT and USE_TRUE_DEPTH, respectively.
Apple will reject apps that link against the ARKit or True Depth Camera APIs without using them.
If you use CocoaPods, you can include -D USE_ARKIT or -D USE_TRUE_DEPTH with the following Podfile addition, or add the flag to your Xcode build settings.
```ruby
installer.pods_project.targets.each do |target|
  # setup NextLevel for ARKit use
  if target.name == 'NextLevel'
    target.build_configurations.each do |config|
      config.build_settings['OTHER_SWIFT_FLAGS'] = ['$(inherited)', '-DUSE_ARKIT']
    end
  end
end
```

Before starting, ensure that permission keys have been added to your app's Info.plist:
```xml
<key>NSCameraUsageDescription</key>
<string>Allowing access to the camera lets you take photos and videos.</string>
<key>NSMicrophoneUsageDescription</key>
<string>Allowing access to the microphone lets you record audio.</string>
```

Import the library:
```swift
import NextLevel
```

Set up the camera preview:
```swift
let screenBounds = UIScreen.main.bounds
self.previewView = UIView(frame: screenBounds)
if let previewView = self.previewView {
    previewView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    previewView.backgroundColor = UIColor.black
    NextLevel.shared.previewLayer.frame = previewView.bounds
    previewView.layer.addSublayer(NextLevel.shared.previewLayer)
    self.view.addSubview(previewView)
}
```

Configure the capture session:
```swift
override func viewDidLoad() {
    super.viewDidLoad()

    // Set delegates
    NextLevel.shared.delegate = self
    NextLevel.shared.deviceDelegate = self
    NextLevel.shared.videoDelegate = self
    NextLevel.shared.photoDelegate = self

    // Configure video settings
    NextLevel.shared.videoConfiguration.bitRate = 6_000_000 // 6 Mbps
    NextLevel.shared.videoConfiguration.preset = .hd1920x1080
    NextLevel.shared.videoConfiguration.maximumCaptureDuration = CMTime(seconds: 10, preferredTimescale: 600)

    // Configure audio settings
    NextLevel.shared.audioConfiguration.bitRate = 128_000 // 128 kbps
}
```

Start/stop the session:
```swift
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    NextLevel.shared.start()
}

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    NextLevel.shared.stop()
}
```

Record and pause:
```swift
// Start recording
NextLevel.shared.record()

// Pause recording (creates a clip)
NextLevel.shared.pause()

// Resume recording (starts a new clip)
NextLevel.shared.record()
```

The modern API provides clean async/await support for session operations:
```swift
// Merge clips with async/await
do {
    if let session = NextLevel.shared.session {
        let url = try await session.mergeClips(usingPreset: AVAssetExportPresetHighestQuality)
        print("Video saved to: \(url)")

        // Save to the photo library
        try await PHPhotoLibrary.shared().performChanges {
            PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: url)
        }
    }
} catch {
    print("Merge failed: \(error.localizedDescription)")
}
```

Subscribe to camera events using AsyncStream for reactive programming:
```swift
Task {
    for await event in NextLevel.shared.sessionEvents {
        switch event {
        case .didStart:
            print("Camera session started")
        case .didStop:
            print("Camera session stopped")
        case .sessionDidStart:
            print("Recording session started")
        case .sessionDidStop:
            print("Recording session stopped")
        case .wasInterrupted:
            print("Session interrupted (e.g., phone call)")
        case .interruptionEnded:
            print("Interruption ended")
        }
    }
}
```

NextLevel makes it easy to record multiple clips and merge them into a single video:
```swift
// Record first clip
NextLevel.shared.record()
// ... wait ...
NextLevel.shared.pause() // creates the first clip

// Record second clip
NextLevel.shared.record()
// ... wait ...
NextLevel.shared.pause() // creates the second clip

// Access all clips
if let session = NextLevel.shared.session {
    print("Total clips: \(session.clips.count)")
    print("Total duration: \(session.totalDuration.seconds)s")

    // Remove the last clip (undo)
    session.removeLastClip()

    // Remove a specific clip
    if let firstClip = session.clips.first {
        session.remove(clip: firstClip)
    }

    // Remove all clips
    session.removeAllClips()

    // Merge all clips into a single video
    session.mergeClips(usingPreset: AVAssetExportPresetHighestQuality) { url, error in
        if let outputURL = url {
            print("Merged video: \(outputURL)")
        } else if let error = error {
            print("Merge failed: \(error.localizedDescription)")
        }
    }
}
```

Capture high-quality photos with extensive configuration options:
```swift
// Configure photo settings
NextLevel.shared.photoConfiguration.codec = .hevc // HEVC for better compression
NextLevel.shared.photoConfiguration.isHighResolutionEnabled = true
NextLevel.shared.photoConfiguration.flashMode = .auto

// Set photo resolution/aspect ratio
// By default photos use the .high preset (16:9)
// Available presets:
NextLevel.shared.photoConfiguration.preset = .photo // 4:3 aspect ratio (default camera)
// NextLevel.shared.photoConfiguration.preset = .high // 16:9 aspect ratio
// NextLevel.shared.photoConfiguration.preset = .hd1280x720 // 720p
// NextLevel.shared.photoConfiguration.preset = .hd1920x1080 // 1080p
// NextLevel.shared.photoConfiguration.preset = .hd4K3840x2160 // 4K

// Capture photo
NextLevel.shared.capturePhoto()

// Handle the result in the delegate
extension CameraViewController: NextLevelPhotoDelegate {

    func nextLevel(_ nextLevel: NextLevel, didCompletePhotoCaptureFromVideoFrame: Bool) {
        print("Photo capture completed")
    }

    func nextLevel(_ nextLevel: NextLevel, didFinishProcessingPhoto photo: AVCapturePhoto, photoDict: [String: Any], photoConfiguration: NextLevelPhotoConfiguration) {
        // Get JPEG data
        if let jpegData = photoDict[NextLevelPhotoJPEGKey] as? Data {
            // Save the photo
            if let image = UIImage(data: jpegData) {
                UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
            }
        }
        // Get HEVC data (if configured)
        if let hevcData = photoDict[NextLevelPhotoHEVCKey] as? Data {
            // Process the HEVC photo
        }
    }
}
```

Photo Configuration Options:
- Codec: `.jpeg`, `.hevc` - Choose the compression format
- Preset: `.photo` (4:3), `.high` (16:9), `.hd1920x1080`, `.hd4K3840x2160` - Controls resolution and aspect ratio
- High Resolution: Enable `isHighResolutionEnabled` for maximum quality
- Flash Mode: `.on`, `.off`, `.auto`
- Portrait Effects Matte: Enable `isPortraitEffectsMatteEnabled` for depth effects
- Quality Prioritization: `.speed`, `.balanced`, `.quality` - Balance between capture speed and quality
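As an illustrative sketch, several of these options can be combined before a capture; the property names are taken from the list above (quality prioritization is omitted because its property name isn't documented here):

```swift
// Sketch: combining the photo configuration options listed above.
NextLevel.shared.photoConfiguration.codec = .hevc                        // better compression than JPEG
NextLevel.shared.photoConfiguration.preset = .photo                      // 4:3, like the system camera
NextLevel.shared.photoConfiguration.isHighResolutionEnabled = true       // maximum quality
NextLevel.shared.photoConfiguration.flashMode = .auto
NextLevel.shared.photoConfiguration.isPortraitEffectsMatteEnabled = true // requires a depth-capable device

NextLevel.shared.capturePhoto()
```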
NextLevel provides comprehensive camera control:
```swift
// Focus
try? NextLevel.shared.focusAtAdjustedPoint(CGPoint(x: 0.5, y: 0.5))
NextLevel.shared.focusMode = .continuousAutoFocus

// Exposure
try? NextLevel.shared.exposeAtAdjustedPoint(CGPoint(x: 0.5, y: 0.5))
NextLevel.shared.exposureMode = .continuousAutoExposure

// Zoom
NextLevel.shared.videoZoomFactor = 2.0

// Flash
NextLevel.shared.flashMode = .on

// Torch
NextLevel.shared.torchMode = .on

// Device position (front/back camera)
NextLevel.shared.devicePosition = .front

// Orientation
NextLevel.shared.deviceOrientation = .portrait

// Frame rate
NextLevel.shared.frameRate = 60 // 60 fps for slow motion

// Mirroring
NextLevel.shared.isMirroringEnabled = true

// Stabilization
NextLevel.shared.videoStabilizationMode = .cinematic
```

To use Bluetooth headsets or external microphones, configure the audio session before starting NextLevel:
```swift
override func viewDidLoad() {
    super.viewDidLoad()

    // Disable automatic audio session configuration
    NextLevel.shared.automaticallyConfiguresApplicationAudioSession = false

    // Configure the audio session for Bluetooth support
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setCategory(
            .playAndRecord,
            mode: .videoRecording,
            options: [.allowBluetooth, .allowBluetoothA2DP, .defaultToSpeaker]
        )
        try audioSession.setActive(true)
    } catch {
        print("Failed to configure audio session: \(error)")
    }

    // Now configure NextLevel
    NextLevel.shared.delegate = self
    NextLevel.shared.videoDelegate = self
    // ... rest of configuration
}
```

Audio Session Options:
- `.allowBluetooth` - Enable Bluetooth HFP (hands-free profile) for voice
- `.allowBluetoothA2DP` - Enable Bluetooth A2DP for high-quality audio
- `.defaultToSpeaker` - Use the speaker when no Bluetooth device is connected
- `.mixWithOthers` - Allow mixing with other audio (e.g., music apps)
Note: Choose the options that match your app's requirements. For example, video recording typically uses .videoRecording mode with .allowBluetoothA2DP for better audio quality.
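If you need to verify which input is actually in use (for example, that a Bluetooth microphone was selected), you can inspect the audio route; a minimal sketch using standard AVAudioSession APIs:

```swift
import AVFoundation

// Inspect the active audio route to confirm the expected input is in use.
let audioSession = AVAudioSession.sharedInstance()
for input in audioSession.currentRoute.inputs {
    print("Active input: \(input.portName) (\(input.portType.rawValue))")
}

// Optionally prefer a Bluetooth HFP input when one is available.
if let bluetoothInput = audioSession.availableInputs?.first(where: { $0.portType == .bluetoothHFP }) {
    try? audioSession.setPreferredInput(bluetoothInput)
}
```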
For compatibility with older iOS versions or existing codebases:
```swift
extension CameraViewController: NextLevelDelegate {

    func nextLevelSessionWillStart(_ nextLevel: NextLevel) {
        print("Session will start")
    }

    func nextLevelSessionDidStart(_ nextLevel: NextLevel) {
        print("Session started")
    }

    func nextLevelSessionDidStop(_ nextLevel: NextLevel) {
        print("Session stopped")
    }

    func nextLevelSessionWasInterrupted(_ nextLevel: NextLevel) {
        print("Session interrupted")
    }

    func nextLevelSessionInterruptionEnded(_ nextLevel: NextLevel) {
        print("Interruption ended")
    }
}

extension CameraViewController: NextLevelVideoDelegate {

    func nextLevel(_ nextLevel: NextLevel, didUpdateVideoConfiguration videoConfiguration: NextLevelVideoConfiguration) {
        print("Video configuration updated")
    }

    func nextLevel(_ nextLevel: NextLevel, didUpdateVideoZoomFactor videoZoomFactor: Float) {
        print("Zoom: \(videoZoomFactor)x")
    }
}
```

Videos can also be processed using NextLevelSessionExporter, a powerful media transcoding library in Swift.
NextLevel was designed for sample buffer analysis and custom modification in real time, alongside a rich set of camera features.
Note that modifications performed on a buffer and provided back to NextLevel may affect the frame rate.
Enable custom rendering:
```swift
NextLevel.shared.isVideoCustomContextRenderingEnabled = true
```

An optional hook allows reading the `sampleBuffer` for analysis:
```swift
extension CameraViewController: NextLevelVideoDelegate {

    // ...

    // video frame processing
    public func nextLevel(_ nextLevel: NextLevel, willProcessRawVideoSampleBuffer sampleBuffer: CMSampleBuffer) {
        // use the sampleBuffer parameter in your system for continual analysis
    }
}
```
Another optional hook provides buffers for modification via `imageBuffer`. This is also the recommended place to provide the modified buffer back to NextLevel for recording.
```swift
extension CameraViewController: NextLevelVideoDelegate {

    // ...

    // enabled by isVideoCustomContextRenderingEnabled
    public func nextLevel(_ nextLevel: NextLevel, renderToCustomContextWithImageBuffer imageBuffer: CVPixelBuffer, onQueue queue: DispatchQueue) {
        // provide the frame back to NextLevel for recording
        if let frame = self._availableFrameBuffer {
            nextLevel.videoCustomContextImageBuffer = frame
        }
    }
}
```
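For a concrete picture of what a modification might look like, here is a sketch that filters the incoming buffer with Core Image before handing it back. The rendering setup is an illustrative assumption; a production pipeline would typically render into its own buffer from a CVPixelBufferPool rather than in place:

```swift
import CoreImage

// Shared Core Image context; creating one per frame would be wasteful.
let ciContext = CIContext()

// Sketch: apply a filter to a camera frame (sepia is just an example).
func filtered(_ imageBuffer: CVPixelBuffer) -> CVPixelBuffer {
    let sourceImage = CIImage(cvPixelBuffer: imageBuffer)

    let filter = CIFilter(name: "CISepiaTone")!
    filter.setValue(sourceImage, forKey: kCIInputImageKey)
    filter.setValue(0.8, forKey: kCIInputIntensityKey)

    // Render the result back into the buffer before returning it.
    if let outputImage = filter.outputImage {
        ciContext.render(outputImage, to: imageBuffer)
    }
    return imageBuffer
}
```

The returned buffer could then be assigned to `nextLevel.videoCustomContextImageBuffer` inside the delegate method above.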
NextLevel will check this property when writing buffers to a destination file. This works for both video and photos with capturePhotoFromVideo.
```swift
nextLevel.videoCustomContextImageBuffer = modifiedFrame
```

The 0.19.0 release introduces Swift 6 with modern async/await APIs while maintaining full backward compatibility. Here's how to migrate:
- Minimum iOS 15.0 (was iOS 14.0)
- Swift 6.0 required (was Swift 5.x)
- Xcode 16.0+ required
All existing delegate-based APIs continue to work. You can adopt new features incrementally:
Before (0.x):
```swift
// Legacy completion handler
session.mergeClips(usingPreset: AVAssetExportPresetHighestQuality) { url, error in
    if let url = url {
        print("Merged: \(url)")
    } else if let error = error {
        print("Error: \(error)")
    }
}
```

After (0.19.0):
```swift
// Modern async/await
do {
    let url = try await session.mergeClips(usingPreset: AVAssetExportPresetHighestQuality)
    print("Merged: \(url)")
} catch {
    print("Error: \(error.localizedDescription)")
}
```

Before:
```swift
extension CameraViewController: NextLevelDelegate {

    func nextLevelSessionDidStart(_ nextLevel: NextLevel) {
        print("Session started")
    }

    func nextLevelSessionWasInterrupted(_ nextLevel: NextLevel) {
        print("Session interrupted")
    }
}
```

After (iOS 15+):
```swift
Task {
    for await event in NextLevel.shared.sessionEvents {
        switch event {
        case .didStart:
            print("Session started")
        case .wasInterrupted:
            print("Session interrupted")
        default:
            break
        }
    }
}
```

When you update to 0.19.0, these critical bugs are automatically fixed:
- AudioChannelLayout crash (#286, #271) - No longer crashes when the audio channel layout doesn't match the channel count
- Photo capture crash (#280) - Fixed when `generateThumbnail = true`
- Missing audio after interruption (#281) - Audio now properly resumes after phone calls
- Video time skips (#278) - Fixed a timestamp offset accumulation bug
- Network optimization (#257) - Now configurable via `shouldOptimizeForNetworkUse`
No code changes required - just update your dependency version!
Errors now provide more context:
```swift
do {
    try NextLevel.shared.focusAtAdjustedPoint(point)
} catch let error as LocalizedError {
    print(error.localizedDescription)     // user-friendly message
    print(error.recoverySuggestion ?? "") // how to fix it
} catch {
    print(error.localizedDescription)
}
```

Need Swift 5? Target the swift5 branch:
pod "NextLevel", :git => 'https://github.com/NextLevel/NextLevel.git', :branch => 'swift5'Need Swift 4.2? Target the swift4.2 branch:
pod "NextLevel", :git => 'https://github.com/NextLevel/NextLevel.git', :branch => 'swift4.2'Problem: App crashes with "AudioChannelLayout channel count does not match AVNumberOfChannelsKey channel count"
Solution: Update to NextLevel 0.19.0 or later. This issue has been fixed.
Root Cause: Audio channel layout validation now ensures the layout matches the declared channel count before configuring AVAssetWriterInput.
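For context, here is roughly what a channel layout that agrees with the declared channel count looks like when building audio writer settings. This is a sketch of the general AVFoundation pattern, not NextLevel's internal code:

```swift
import AVFoundation

// A stereo channel layout that matches AVNumberOfChannelsKey: 2.
var channelLayout = AudioChannelLayout()
channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo

let audioSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,
    AVNumberOfChannelsKey: 2, // must agree with the layout tag above
    AVSampleRateKey: 44_100,
    AVEncoderBitRateKey: 128_000,
    AVChannelLayoutKey: Data(bytes: &channelLayout, count: MemoryLayout<AudioChannelLayout>.size)
]
```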
Problem: Setting generateThumbnail = true causes app crash
Solution: Update to NextLevel 0.19.0 or later. The issue has been fixed.
Root Cause: kCVPixelBufferPixelFormatTypeKey and AVVideoCodecKey are mutually exclusive in AVFoundation. The fix ensures only the appropriate key is set based on thumbnail configuration.
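In AVFoundation terms, the conflict looks like this; a sketch of the two mutually exclusive AVCapturePhotoSettings formats:

```swift
import AVFoundation

// A photo settings format may request a codec...
let codecSettings = AVCapturePhotoSettings(
    format: [AVVideoCodecKey: AVVideoCodecType.hevc]
)

// ...or a raw pixel format for uncompressed output, but never both
// keys in the same format dictionary.
let pixelFormatSettings = AVCapturePhotoSettings(
    format: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
)
```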
Problem: Video recordings have no audio after receiving a phone call or other interruption
Solution: Update to NextLevel 0.19.0 or later. The library now properly pauses and resumes recording during interruptions.
Root Cause: Audio session interruptions weren't properly handled, causing audio track initialization to fail after resuming.
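If your app needs to react to interruptions itself, in addition to what the library now handles, the standard observation pattern looks like this; a sketch using public AVAudioSession APIs:

```swift
import AVFoundation

// Observe audio session interruptions (phone calls, Siri, alarms).
// Keep a reference to the token so the observer can be removed later.
let interruptionObserver = NotificationCenter.default.addObserver(
    forName: AVAudioSession.interruptionNotification,
    object: AVAudioSession.sharedInstance(),
    queue: .main
) { notification in
    guard let rawType = notification.userInfo?[AVAudioSessionInterruptionTypeKey] as? UInt,
          let type = AVAudioSession.InterruptionType(rawValue: rawType) else { return }

    switch type {
    case .began:
        print("Interruption began")
    case .ended:
        print("Interruption ended; safe to resume recording")
    @unknown default:
        break
    }
}
```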
Problem: Video playback shows unexpected time skips or jumps between clips
Solution: Update to NextLevel 0.19.0 or later. The timestamp offset calculation has been fixed.
Root Cause: Cumulative timestamp offset was being incorrectly accumulated every frame instead of only adjusting clip boundaries.
Problem: Camera preview is black or session doesn't start
Solutions:
- Check permissions in Info.plist:
```xml
<key>NSCameraUsageDescription</key>
<string>Allowing access to the camera lets you take photos and videos.</string>
<key>NSMicrophoneUsageDescription</key>
<string>Allowing access to the microphone lets you record audio.</string>
```

- Verify you're calling `start()` on the main thread:
```swift
DispatchQueue.main.async {
    NextLevel.shared.start()
}
```

- Check authorization status:
```swift
let authStatus = AVCaptureDevice.authorizationStatus(for: .video)
if authStatus == .authorized {
    NextLevel.shared.start()
} else {
    AVCaptureDevice.requestAccess(for: .video) { granted in
        if granted {
            DispatchQueue.main.async {
                NextLevel.shared.start()
            }
        }
    }
}
```

Problem: Recording stops on its own without calling `pause()`
Possible Causes:
- Maximum duration reached - Check `videoConfiguration.maximumCaptureDuration`
- Disk space full - Monitor available storage (see the sketch after this list)
- Memory pressure - Lower the resolution or bitrate for long recordings
- Interruption - Phone call, Siri, or other system interruption
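For the disk-space cause, a minimal sketch for checking available capacity with standard FileManager/URL APIs (the 500 MB threshold is an arbitrary example):

```swift
import Foundation

// Query available disk capacity before starting a long recording.
func availableDiskCapacity() -> Int64? {
    let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let values = try? documentsURL.resourceValues(forKeys: [.volumeAvailableCapacityForImportantUsageKey])
    return values?.volumeAvailableCapacityForImportantUsage
}

if let freeBytes = availableDiskCapacity(), freeBytes < 500_000_000 {
    print("Warning: less than 500 MB free; long recordings may fail")
}
```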
Solutions:
```swift
// Increase max duration
NextLevel.shared.videoConfiguration.maximumCaptureDuration = CMTime.positiveInfinity

// Monitor session state
extension YourViewController: NextLevelDelegate {
    func nextLevelCaptureDurationDidChange(_ nextLevel: NextLevel) {
        if let session = nextLevel.session {
            print("Duration: \(session.totalDuration.seconds)s")
        }
    }
}
```

Solutions:
- Lower the resolution:

```swift
NextLevel.shared.videoConfiguration.preset = .hd1280x720 // instead of 1920x1080
```

- Reduce the bitrate:

```swift
NextLevel.shared.videoConfiguration.bitRate = 3_000_000 // 3 Mbps instead of 6 Mbps
```

- Disable custom buffer processing if it's not needed:

```swift
NextLevel.shared.isVideoCustomContextRenderingEnabled = false
```

- Test on a physical device (simulators have different performance characteristics)
Solutions:
- Use the HEVC codec for better compression:

```swift
NextLevel.shared.videoConfiguration.codec = .hevc
```

- Enable network optimization for faster writing (the default):

```swift
if let session = NextLevel.shared.session {
    session.shouldOptimizeForNetworkUse = true
}
```

- Remove clips you no longer need:

```swift
session.removeLastClip()
session.removeAllClips(removeFiles: true) // also deletes the files from disk
```

Problem: App rejected by the App Store for linking ARKit without using it
Solution: Only include ARKit compiler flags when you're actually using ARKit features:
```ruby
# In Podfile - only add if using ARKit
installer.pods_project.targets.each do |target|
  if target.name == 'NextLevel'
    target.build_configurations.each do |config|
      config.build_settings['OTHER_SWIFT_FLAGS'] = ['$(inherited)', '-DUSE_ARKIT']
    end
  end
end
```

Don't add the -DUSE_ARKIT or -DUSE_TRUE_DEPTH flags unless you're actually using those features.
Problem: Concurrency warnings or errors after upgrading
Solutions:
- Clean build folder: Product → Clean Build Folder
- Delete DerivedData: `rm -rf ~/Library/Developer/Xcode/DerivedData`
- Update all dependencies to Swift 6 compatible versions
- Enable strict concurrency checking in your project if needed (see the sketch below)
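For the last item, one way to opt a SwiftPM target into the stricter checking is sketched below; the target name is a placeholder. In an Xcode project, the equivalent is the SWIFT_STRICT_CONCURRENCY build setting:

```swift
// Package.swift fragment (swift-tools-version: 6.0); "YourApp" is a placeholder.
.target(
    name: "YourApp",
    swiftSettings: [
        .swiftLanguageMode(.v6) // Swift 6 mode enables strict concurrency by default
    ]
)
```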
- Issues: Open an issue with device model, iOS version, and NextLevel version
- Questions: Use Stack Overflow with the tag `nextlevel`
- Discussions: Check GitHub Discussions for community help
NextLevel was initially a weekend project that has grown into an open community of camera platform enthusiasts. The software provides foundational components for managing media recording, camera interface customization, gestural interaction customization, and image streaming on iOS. The same capabilities can also be found in apps such as Snapchat, Instagram, and Vine.
The goal is to continue to provide a good foundation for quick integration (enabling projects to be taken to the next level), allowing focus to be placed on the functionality that matters most, whether it's realtime image processing, computer vision methods, augmented reality, or computational photography.
NextLevel provides components for capturing ARKit video and photo. This enables a variety of new camera features while leveraging the existing recording capabilities and media management of NextLevel.
If you are trying to capture frames from SceneKit for ARKit recording, check out the examples project.
You can find the docs here. Documentation is generated with jazzy and hosted on GitHub-Pages.
NextLevel is a community – contributions and discussions are welcome!
- Found a bug? Open an issue.
- Feature idea? Open an issue.
- Want to contribute? Submit a pull request.
- iOS Device Camera Summary
- AV Foundation Programming Guide
- AV Foundation Framework Reference
- ARKit Framework Reference
- Swift Evolution
- objc.io Camera and Photos
- objc.io Video
- objc.io Core Image and Video
- Cameras, ecommerce and machine learning
- Again, iPhone is the default camera
NextLevel is available under the MIT license, see the LICENSE file for more information.
