Support external cameras in your iPadOS app
Learn how you can discover and connect to external cameras in your iPadOS app using the AVFoundation capture classes. We'll show you how to rotate video from both external and built-in cameras, support external microphones with USB-C, and perform audio routing. Explore telephony support, tunings for optimal echo cancellation, and best practices for external camera adoption.
♪ ♪ Nikolas: Hello and welcome to "Support external cameras in your iPadOS app." I'm Nikolas Gelo from the Camera Software team, and this session is about how your iPad app can start using external cameras. Stage Manager's powerful set of features includes the ability to extend your iPad's display across multiple screens. And with iPadOS 17, your app can start using external cameras such as the one in the Apple Studio Display. On this iPad Pro, FaceTime is open, and it's using the camera in the display that it's connected to. This is great because now the people on the other side of the call have a better viewing angle of me when I run the app on this big Apple Studio Display. I can also use Center Stage with it, which helps keep me in the frame as I move around.
FaceTime, Code Scanner, and WebKit use external cameras, and they are great examples of what your app can do. When using monitors that don't have built-in cameras, like the Apple Pro Display XDR, people often place a USB camera on top of it. If the USB camera is connected to the monitor, then when the monitor is plugged into the iPad, the camera will also be available to your app. Your iPad app can use external cameras and webcams to take photos and record movies. They also support other system camera features, like the Portrait Blur and Studio Light video effects available from Control Center.
iPads with USB-C connectors support external cameras. Your app can use devices that conform to the USB Video Class, or UVC, specification. It defines a standard for USB devices to support video streaming. And there are many popular cameras your app can use. Some external cameras have built-in microphones, which are also available to your app. Some manufacturers make non-camera devices that conform to the UVC spec, like HDMI switchers that change between multiple inputs to output a single video stream. iPadOS allows your app to use devices like these. External camera support is a great enhancement to iPad's rich media ecosystem.
I'll show how your app can use them by starting with discovery and usage. Next, I'll circle back and demystify video rotation. Then I'll cover how your app can use microphones that are included with external cameras. And lastly, I'll discuss best practices for your app.
First up, I'll discuss how your iPad app can start using external cameras. iPad apps use the camera for many features, like taking photos, recording movies, or sending camera frames over the network for video calls. The AVFoundation framework allows your app to use built-in and external cameras, specifically with its AVCapture-prefixed classes. Let's review how an app can use the camera. First, an app uses AVCaptureDevices, which represent cameras and microphones. Then they are wrapped in AVCaptureDeviceInputs, which allow them to be plugged into an AVCaptureSession. The AVCaptureSession is the central control object of the AVCapture graph.
AVCaptureOutputs render data from inputs in various ways. The MovieFileOutput records QuickTime movies. The PhotoOutput captures high-quality stills and Live Photos. Data outputs, such as the VideoDataOutput or AudioDataOutput, deliver video or audio buffers from the camera or mic to your app.
And there are other kinds of data outputs, such as Metadata and Depth. For live camera preview, there's a special type of output, the AVCaptureVideoPreviewLayer, which is a subclass of CALayer.
Data flows from the capture inputs to compatible outputs through AVCaptureConnections. These classes are available on iOS, macOS, and tvOS.
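Here's a minimal sketch of that graph in code, assuming a built-in back camera and a photo output; the error type is hypothetical, and session-queue handling is omitted:

```swift
import AVFoundation

enum CameraError: Error { case noCameraAvailable } // hypothetical error type

func makeCaptureSession() throws -> AVCaptureSession {
    let session = AVCaptureSession()
    session.beginConfiguration()

    // An AVCaptureDevice represents a camera; wrap it in an input to attach it.
    guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video,
                                               position: .back) else {
        throw CameraError.noCameraAvailable
    }
    let cameraInput = try AVCaptureDeviceInput(device: camera)
    if session.canAddInput(cameraInput) {
        session.addInput(cameraInput)
    }

    // Outputs render data from the inputs; connections are formed automatically.
    let photoOutput = AVCapturePhotoOutput()
    if session.canAddOutput(photoOutput) {
        session.addOutput(photoOutput)
    }

    session.commitConfiguration()
    return session
}
```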
If you're new to AVCapture, I invite you to learn more at the Capture Setup start page on developer.apple.com.
New in iPadOS 17, your app can access external cameras with AVCapture. If your app already uses built-in cameras, you can make simple updates to start using external ones. Discovering them is easy. Each external camera is represented by an AVCaptureDevice instance. And you can find them with existing API from AVCaptureDevice and AVCaptureDeviceDiscoverySession. There are three main attributes of an AVCaptureDevice: Its media type, device type, and position. External cameras provide video media data just like built-in cameras do. And their device type is external. For macOS app developers who are familiar with using external cameras, this deprecates the external unknown device type. Because external cameras can move independently from the iPad, their device position is unspecified. These three attributes can be used to find external cameras with the AVCapture API. It's easy to start using external cameras in your app. In this session, I'll modify the popular sample camera app, AVCam, to stream from the external camera in an Apple Studio Display. You can download the completed version of AVCam with all of the changes I make in this session from developer.apple.com.
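In code, finding external cameras with those three attributes might look like this sketch, assuming an iPadOS 17 deployment target:

```swift
import AVFoundation

// External cameras: video media type, .external device type, unspecified position.
let discoverySession = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.external],
    mediaType: .video,
    position: .unspecified
)

if let externalCamera = discoverySession.devices.first {
    print("Found external camera: \(externalCamera.localizedName)")
}
```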
Currently, the app uses built-in cameras, and it lets the user switch between a front and rear-facing camera when a button is pressed. When AVCam launches, it starts with a rear-facing camera. I'll change the code to have the app prefer looking for an external camera before a built-in one.
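A sketch of that change might look like the following; the function name is hypothetical, and the built-in fallback chain only approximates AVCam's full device-selection logic:

```swift
import AVFoundation

func preferredStartupCamera() -> AVCaptureDevice? {
    // Prefer an external camera when one is connected.
    let externalCameras = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.external], mediaType: .video, position: .unspecified).devices

    if let external = externalCameras.first {
        return external
    } else if let backCamera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                       for: .video, position: .back) {
        // Fall back to a built-in rear camera, then the front camera.
        return backCamera
    } else {
        return AVCaptureDevice.default(.builtInWideAngleCamera,
                                       for: .video, position: .front)
    }
}
```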
With the iPad connected to the Apple Studio Display, I'll run the app, and when it launches, it will use the external camera.
This is great. AVCam is now using an external camera, and all it needed was a few lines of code. The AVCaptureVideoPreviewLayer mirrors external cameras by default, which is suitable for using the camera in the Apple Studio Display. You can disable this behavior if you prefer. I'll describe how to do this in the best practices section at the end of this session. Now I'll move the app to the iPad's display.
And just for fun, I'll unplug the external camera.
Uh-oh, the app's camera preview is frozen, and it's not using any camera now. AVCam will need more changes to handle connection and disconnection events of external cameras. External cameras require special care because, unlike built-in ones, the user can connect and disconnect them from the iPad at any time. Your app can monitor these events to know when a camera has become available or can no longer be used. If the same physical device is reconnected, it will be represented using a new instance of AVCaptureDevice. There is existing API your app can use to listen for connection and disconnection events.
You can key-value observe the isConnected property on AVCaptureDevice or the devices property on an AVCaptureDeviceDiscoverySession that updates as cameras come and go. AVCaptureDevices also post notifications when their connection status changes, and your app can observe them to monitor a camera's availability. The system calls key value observation code and posts notifications on background queues. So be sure to synchronize your handling with your AVCaptureSession queue and the main thread.
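Here's a sketch of both approaches; the queue label and fallback behavior are assumptions, so synchronize with your own session queue:

```swift
import AVFoundation

final class CameraConnectionMonitor {
    private let sessionQueue = DispatchQueue(label: "session.queue")
    private var isConnectedObservation: NSKeyValueObservation?
    private var notificationTokens: [NSObjectProtocol] = []

    func startMonitoring(camera: AVCaptureDevice) {
        // Option 1: key-value observe the device's isConnected property.
        isConnectedObservation = camera.observe(\.isConnected, options: [.new]) { [weak self] device, _ in
            guard let self, !device.isConnected else { return }
            // KVO fires on a background queue; hop to the session queue before
            // reconfiguring the capture session.
            self.sessionQueue.async {
                // Switch to a built-in camera here.
            }
        }

        // Option 2: listen for the system-wide connect and disconnect notifications.
        let center = NotificationCenter.default
        notificationTokens.append(center.addObserver(forName: .AVCaptureDeviceWasConnected,
                                                     object: nil, queue: nil) { notification in
            if let device = notification.object as? AVCaptureDevice {
                print("Camera connected: \(device.localizedName)")
            }
        })
        notificationTokens.append(center.addObserver(forName: .AVCaptureDeviceWasDisconnected,
                                                     object: nil, queue: nil) { notification in
            if let device = notification.object as? AVCaptureDevice {
                print("Camera disconnected: \(device.localizedName)")
            }
        })
    }
}
```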
Going back to AVCam, I'll add some code to listen for connection and disconnection events of an external camera. After the app looks for the default device, it observes when the camera is disconnected. And when that happens, AVCam switches to a built-in camera.
Now when the app launches, it still uses an external camera. And when it disconnects, the app switches to a built-in camera.
But when the external camera is reconnected, AVCam doesn't switch to it.
How should AVCam handle external cameras being connected while it's running? Should it switch to it after I've plugged it in? A tricky aspect of adopting external cameras in your iPad app is handling connection and disconnection events. To make this easier, iPadOS is introducing API for automatic camera selection. The API allows your app to integrate with the operating system to use the best available camera. It is another way for your app to change cameras. macOS Ventura introduced API for automatic camera selection to support Continuity Camera. The behaviors I describe in this session are specific to iOS.
For more information on how to use this API for Mac, see our previous session "Bringing Continuity Camera to your macOS app" from 2022 and its section "Building a magical experience." Automatic camera selection works by using two new class properties introduced to AVCaptureDevice on iOS: userPreferredCamera and systemPreferredCamera.
Both of these properties are key-value observable.
userPreferredCamera is a read/write property that indicates the user's choice of what camera should be used. It should be set whenever a user picks a camera in your app. Doing so allows the system to learn the user's preference.
systemPreferredCamera is a read-only property which specifies the best camera to use as determined by the system. By default, the system recommends using the front camera, but if you would like to use a back camera instead, your app can inform the system of its desired behavior. As different cameras are chosen by the user, the recommendation changes. But you might be wondering how the system knows which camera is best. I'll dive into that in a bit. I'll first describe AVCaptureDevice's userPreferredCamera property. For this property, the system stores a short history of chosen cameras for each app across launches and system reboots. It allows your app to combine the user's history with the system's knowledge of which cameras are currently connected. So if a camera is disconnected, the system returns the next available camera based on the user's history.
If there is no user selection history, or none of the preferred cameras are connected, the system will always try to return a camera that's ready to use and prioritize cameras that have been previously streamed. Your app can use this property to let the system store your user's camera preference.
AVCaptureDevice's systemPreferredCamera property intelligently returns the best camera to use. It first checks the user's preference. And when the user connects an external camera to the iPad, the system returns the new device. This is because, when a user connects a new camera, they're implicitly indicating their intent to use it. These two inputs determine the system preferred camera.
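In practice, using the two class properties might look like this sketch; switchCamera(to:) stands in for your own session-reconfiguration code:

```swift
import AVFoundation

func adoptAutomaticCameraSelection() {
    // Read the system's recommendation for the best camera to use right now.
    if let recommended = AVCaptureDevice.systemPreferredCamera {
        switchCamera(to: recommended)
    }
}

// Call this whenever the user explicitly picks a camera in your UI, so the
// system can learn and persist the preference across launches.
func userDidSelect(camera: AVCaptureDevice) {
    AVCaptureDevice.userPreferredCamera = camera
    switchCamera(to: camera)
}

func switchCamera(to camera: AVCaptureDevice) {
    // Reconfigure your AVCaptureSession with the new device here.
}
```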
The automatic camera selection API is flexible for an app to choose how it integrates with the system. While only iPad supports external cameras, iPhone apps can also use the API to store the user's preferred camera. Some apps allow users to change cameras, while others stick to one without giving a way to switch. The API allows apps to choose between automatic and manual camera selection.
FaceTime, Code Scanner, and WebKit are great examples that have different camera selection behaviors to suit their needs. When FaceTime launches, it always uses the front or external camera. And during calls, it allows users to switch between built-in cameras. But when an external camera is used, it hides the camera switch button. FaceTime enables this behavior by setting userPreferredCamera when it switches devices and observing the systemPreferredCamera property for when an external device is plugged in. It also makes its own decisions for when it is appropriate to use the automatic camera selection API. For example, while you can use the back camera in a FaceTime Video call, it always uses the front or external camera on the main screen that shows the list of calls.
Code Scanner, available from Control Center, has different behavior. It uses the back-facing camera when it launches, and it doesn't allow the user to change cameras, but it does listen to the systemPreferredCamera property and switches when notified. The WebKit framework allows webpages to access the iPad's cameras. While it allows switching to any camera, it returns the system preferred camera as the first one in its list.
Now that I've shown you how automatic camera selection works, I'll add support for it in AVCam. AVCam is a traditional photography app, since you can take photos and record movies with it. It's different from FaceTime and Code Scanner, which are communication and utility apps, respectively, and WebKit, which is a system framework. Now instead of needing a series of "if, else if" statements to find the external camera with fallbacks, AVCam just needs one line to get the system preferred camera. Since this is the first time the app is using the automatic camera selection API, the system returns the built-in front camera. But AVCam prefers to continue starting with the back camera. Before it gets the system preferred camera, the app checks if this is the first time it has queried for it by looking for a value stored in the app's user defaults. If no value is saved, then the app hasn't set its initial state for automatic camera selection. So if this is the first time launching, the app sets the user preferred camera to be the back device. The app finds the back camera using an AVCaptureDeviceDiscoverySession, which sorts the list of the devices using the provided device types. Then it sets the user preferred camera and saves a value in the app's user preferences, so it only does this setup once.
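A sketch of that first-launch setup might look like this; the user-defaults key and the device-type ordering are assumptions:

```swift
import AVFoundation

let initialSetupKey = "didSetInitialUserPreferredCamera" // hypothetical key

func configureInitialCameraSelectionIfNeeded() {
    let defaults = UserDefaults.standard
    guard !defaults.bool(forKey: initialSetupKey) else { return }

    // Prefer a back camera on first launch. The discovery session returns
    // devices sorted by the order of the device types provided.
    let backCameraSession = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInDualCamera, .builtInWideAngleCamera],
        mediaType: .video,
        position: .back)

    if let backCamera = backCameraSession.devices.first {
        AVCaptureDevice.userPreferredCamera = backCamera
    }
    defaults.set(true, forKey: initialSetupKey)
}
```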
To handle connections and disconnections of external cameras, it doesn't have to observe the connection status of a specific camera anymore. Instead, AVCam key value observes the system preferred camera property. This allows it to automatically switch to the best available camera. In its KVO handling, the app gets the new system preferred camera and switches to it. But what if AVCam is in the middle of a recording? The app shouldn't interrupt the recording by switching cameras. So the app only switches if it is not in the middle of a recording. Then when the movie recording finishes, the app queries for the system preferred camera to see if it is different from what it is currently using. If the system preferred camera did change, the app switches to it. This way it doesn't interrupt the recording. Decisions like these are ones you'll have to make when adopting external cameras and the automatic camera selection API. Do what makes the most sense for your app.
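Here's a sketch of that deferral logic; the controller's members and method names are hypothetical, and the observation of systemPreferredCamera that drives it is assumed to live elsewhere in your app:

```swift
import AVFoundation

final class CameraController {
    // Hypothetical members of your own session controller.
    let movieFileOutput = AVCaptureMovieFileOutput()
    var currentCamera: AVCaptureDevice?

    func switchCamera(to camera: AVCaptureDevice) {
        // Reconfigure the capture session with the new device here.
    }

    // Called from your observation of AVCaptureDevice.systemPreferredCamera.
    func systemPreferredCameraChanged(to newCamera: AVCaptureDevice?) {
        guard let newCamera else { return }
        // Don't interrupt an in-progress recording by switching cameras.
        guard !movieFileOutput.isRecording else { return }
        switchCamera(to: newCamera)
    }

    // Called from the movie file output's recording-finished delegate callback.
    func movieRecordingDidFinish() {
        // Re-check the recommendation once the recording has finished.
        if let preferred = AVCaptureDevice.systemPreferredCamera,
           preferred != currentCamera {
            switchCamera(to: preferred)
        }
    }
}
```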
AVCam has a button to switch cameras, which changes between front and back devices. On this iPad Pro, there are multiple cameras that can be used at both positions. So AVCam has some logic for choosing which camera to use at a given position. How should the button work now that the app supports external cameras? I'll choose to treat an external camera like it is front-facing. The camera in this Apple Studio Display is facing me like the iPad's built-in front camera.
If the changeCamera function has no specific device to switch to, the app checks the position of the current device. In the switch statement that checks the current device's position, the app looks for a rear-facing camera if it is currently using one with an unspecified or front position. External cameras report that their position is unspecified. And if the app is using a rear-facing camera, it switches to an external device if one is available. Otherwise, it switches to a built-in front-facing camera. To find external cameras, the app creates an AVCaptureDeviceDiscoverySession using the external device type, the video media type, and unspecified device position.
Then in the switch statement when the current device's position is back, it first looks for an external camera. And if one is not found, it switches to a built-in front-facing camera.
Then when the app finds the camera it would like to use, it tells the system of the selection by setting the userPreferredCamera class property on AVCaptureDevice. Setting this property allows the system to learn the user's preferences.
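A sketch of that logic might look like the following; the simple wide-angle fallbacks only approximate AVCam's full device selection:

```swift
import AVFoundation

// Treat external cameras as front-facing, and switch to the rear camera
// from any other position.
func changeCamera(from currentCamera: AVCaptureDevice) -> AVCaptureDevice? {
    let externalCameras = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.external], mediaType: .video, position: .unspecified).devices

    var newCamera: AVCaptureDevice?
    switch currentCamera.position {
    case .unspecified, .front:
        // External cameras report .unspecified, so switch to a rear camera.
        newCamera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)
    case .back:
        // Prefer an external camera if one is connected; otherwise use the front camera.
        newCamera = externalCameras.first
            ?? AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front)
    @unknown default:
        newCamera = nil
    }

    // Tell the system about the selection so it can learn the user's preference.
    if let newCamera {
        AVCaptureDevice.userPreferredCamera = newCamera
    }
    return newCamera
}
```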
You can choose how your app will support external cameras and how to allow a user to switch between them. For AVCam, I chose to allow a user to switch between front, back, and external cameras by treating external cameras like they are front-facing. This way, the camera switch button only changes between two devices. AVCam is almost ready to support external cameras. There is just one more aspect to handle. This whole time I've been using AVCam, the iPad has been mounted in landscape with the USB-C port on the right. If I rotate the iPad, the external camera's preview is now upside down.
But the external camera in the display hasn't moved. Just the iPad has.
AVCam doesn't have this rotation problem with built-in cameras though. This is because the app doesn't know how to orient external cameras whose position is independent from the iPad. AVCam will need further modifications to properly display an external camera's video preview. Next up, I'll discuss why video rotation is important so that live preview and captured photos and movies appear correctly.
Video rotation is not a new concept for camera apps. But when using external cameras, it is important to know that they move independently from the iPad. Apps are accustomed to built-in cameras, and because of this, they rely on the iPad's orientation to rotate the camera's video and use the AVCaptureVideoOrientation enum. This is what AVCam was doing in the previous demo. It tried rotating the external camera to match the iPad's orientation. In iPadOS 17, AVCaptureVideoOrientation is deprecated, as well as the APIs that use this enum. It describes how the iPad is oriented and assumes the camera rotates with the device. It's not expressive enough for orienting external cameras, which move independently. To use this enum, an app typically converts from UIDeviceOrientation, which also describes the iPad and is an indirect signal for orienting a camera. So we're introducing new API to handle video rotation.
New to all platforms, including iPadOS, the AVCaptureDeviceRotationCoordinator class can help properly orient any camera. The class's initializer takes an AVCaptureDevice and optionally a CALayer that displays the camera's video preview. Apps often use AVCaptureVideoPreviewLayer or AVSampleBufferDisplayLayer to show camera preview. Both of these are subclasses of CALayer and can be passed to the initializer. Apps that use Metal or other rendering methods can simply pass the layer of the UIView displaying the camera's preview. The coordinator has two properties: a video rotation angle for horizon-level preview and a separate angle for horizon-level capture. Both of these read-only properties return an angle in degrees and are key-value observable.
Previewing and capturing content horizon-level means that the video frames from the camera are always upright relative to gravity, no matter if the device is in portrait, landscape, or upside down.
Use the videoRotationAngleForHorizonLevelPreview to display video frames in the CALayer passed to the coordinator's initializer. It describes how much rotation to apply for preview. The angle is relative to the UIKit and SwiftUI coordinate systems.
The videoRotationAngleForHorizonLevelCapture allows your app to take photos and movies, so they are always upright when someone views them later. This property describes the physical orientation of the camera. And its value may be different than the angle the app needs for preview. These two properties have different purposes. To explain video rotation, I'll start with scenarios you are familiar with when using built-in cameras. Later, when I modify AVCam, I'll explain how these concepts apply to external cameras. The camera app on iPhone is a good example of the differences between the video rotation angles for horizon-level preview and capture. In this example of the app displaying the back camera's preview, the iPhone is in portrait. The UIKit coordinate system's origin is at the top left of the drawing area, where its positive x-axis extends to the right and positive y-axis extends down. The back camera sensor's coordinate system has a different origin.
The camera sensor first scans along the height of the phone, then along the width. To account for the physical orientation of the camera, the app rotates the camera's video frames 90 degrees for preview in the UI. It also rotates photos and movies it captures, so they are upright when viewed later.
It has different behavior when the iPhone is in landscape. The app only displays the UI in portrait, no matter the device's orientation. You can tell based on where the home affordance indicator is. For the Camera app on iPhone, it always stays on the side with the port. The UIKit coordinate system's origin is still at the top left of the drawing area, and in this case, it stays fixed to a single location on the device, since the app's UI only supports one orientation. And the back camera sensor's coordinate system still differs from the UI's. Since the app only displays the UI in portrait, it applies a constant 90 degrees of rotation for preview, no matter the iPhone's orientation. But unlike for preview, the app applies a different amount of rotation when taking photos and movies when the iPhone is in landscape. When the iPhone is in the camera sensor's native orientation, the app does not need to rotate photos or movies for them to appear upright.
All of this talk of rotation can really make your head spin. But the AVCaptureDeviceRotationCoordinator takes care of this complexity and provides correct angles to preview and capture from all cameras. Rely on it to provide angles rather than trying to calculate them yourself. As your app switches cameras, be sure to create a new rotation coordinator.
To apply video rotation, use the angles the coordinator provides with new API on AVCaptureConnection. Only connections that deliver video or depth media data support rotation. To check whether a connection supports an angle, you can call its isVideoRotationAngleSupported method. To have the connection perform rotation, set its videoRotationAngle property to a supported value. Use videoRotationAngleForHorizonLevelPreview to display the camera preview. Apps using AVCaptureVideoPreviewLayer can apply the property's value to an AVCaptureConnection instance's videoRotationAngle property. Apps can also use it when displaying buffers from a video data output in a CALayer. To synchronize with system animations, change the preview rotation immediately in your app's key value observation code.
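Putting the preview side of this together might look like the following sketch; in Swift, the class is spelled AVCaptureDevice.RotationCoordinator, and the handler type and method names here are assumptions:

```swift
import AVFoundation
import UIKit

final class PreviewRotationHandler {
    private var rotationCoordinator: AVCaptureDevice.RotationCoordinator?
    private var previewAngleObservation: NSKeyValueObservation?

    // Call this each time the app switches cameras, so the coordinator
    // tracks the new device.
    func startTrackingRotation(for device: AVCaptureDevice,
                               previewLayer: AVCaptureVideoPreviewLayer) {
        let coordinator = AVCaptureDevice.RotationCoordinator(device: device,
                                                              previewLayer: previewLayer)
        rotationCoordinator = coordinator

        // Apply the current angle, then observe changes. Updates arrive on the
        // main queue, so the UI can be updated directly.
        applyPreviewRotation(coordinator.videoRotationAngleForHorizonLevelPreview,
                             to: previewLayer)
        previewAngleObservation = coordinator.observe(
            \.videoRotationAngleForHorizonLevelPreview, options: [.new]
        ) { [weak self] _, change in
            guard let angle = change.newValue else { return }
            self?.applyPreviewRotation(angle, to: previewLayer)
        }
    }

    private func applyPreviewRotation(_ angle: CGFloat,
                                      to previewLayer: AVCaptureVideoPreviewLayer) {
        guard let connection = previewLayer.connection,
              connection.isVideoRotationAngleSupported(angle) else { return }
        connection.videoRotationAngle = angle
    }
}
```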
Your app can expect to receive updates to this property on the main queue to update its UI. Not all apps use AVCaptureVideoPreviewLayer for displaying camera preview. Some apps display buffers from a video data output when applying custom effects or filters. One option for displaying custom preview is to use the AVSampleBufferDisplayLayer. Avoid requesting rotation by setting the angle on the video data output's AVCaptureConnection. Changing the connection's angle causes a frame delivery interruption as the capture render pipeline reconfigures itself to apply the new amount of rotation. Instead, rotate the CALayer displaying the camera preview. Doing so allows your app's camera preview to rotate smoothly.
Use videoRotationAngleForHorizonLevelCapture when capturing photos and movies, so they're level relative to gravity. Your app can apply the property's value on capture connections to a photo output or movie file output. Alternatively, if your app uses a video data output with an AVAssetWriter for recording custom movies, avoid rotating the video with AVCaptureConnection. Instead, set the rotation with an AVAssetWriterInput instance's transform property, which alters the output file's metadata. With this approach, video apps apply the rotation during playback, which uses less energy than rotating each frame with the capture connection. Your app needs to convert the rotation angle from degrees because an asset writer input uses a CGAffineTransform that applies rotations in radians.
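That degrees-to-radians conversion is small; a sketch of it, with a hypothetical function name, might look like this:

```swift
import AVFoundation
import CoreGraphics

// Convert the capture angle (degrees) to radians for the asset writer input's
// affine transform; the rotation is then applied at playback time via metadata.
func configureRotation(for writerInput: AVAssetWriterInput,
                       captureAngleInDegrees angle: CGFloat) {
    let radians = angle * .pi / 180
    writerInput.transform = CGAffineTransform(rotationAngle: radians)
}
```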
Some outputs efficiently apply rotation without added overhead. For instance, a movie file output applies rotation using a QuickTime track matrix. The photo output handles orientation with Exif tags. And the preview layer transforms its contents to perform rotations. However, your app may increase a device's power consumption if it has the video or depth data outputs perform rotation, because they use more memory and energy to rotate their buffers. Instead, your app can take a more efficient approach by rotating the CALayer that previews buffers from video or depth outputs.
Use AVCaptureDeviceRotationCoordinator on all available platforms, including iOS, tvOS, and macOS. Mac Catalyst and iOS apps on Mac can also use it.
Your app can use the rotation coordinator to correctly orient photos or movies and display video preview for any camera. And it helps your app handle complex layouts with Stage Manager or when it is on an external display.
Now it's time for the final changes to support external cameras in AVCam. When configuring the capture session, the app sets up its camera preview. So it creates a device rotation coordinator, which gives the app the rotation angles it needs for preview and capture.
When creating a coordinator, the app updates the preview layer with the current rotation angle for preview.
It also observes changes to the angle and updates the preview.
When AVCam switches devices, it also creates a new rotation coordinator, so the preview looks right for the new camera.
When taking photos, the app uses the rotation angle for capture to make sure they're upright when someone views them later. And it does the same when recording movies.
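A sketch of applying the capture angle when taking a photo might look like this; the function shape and empty photo settings are assumptions:

```swift
import AVFoundation

func capturePhoto(with photoOutput: AVCapturePhotoOutput,
                  rotationCoordinator: AVCaptureDevice.RotationCoordinator,
                  delegate: AVCapturePhotoCaptureDelegate) {
    // Apply the capture angle so the saved photo is upright relative to gravity.
    let angle = rotationCoordinator.videoRotationAngleForHorizonLevelCapture
    if let connection = photoOutput.connection(with: .video),
       connection.isVideoRotationAngleSupported(angle) {
        connection.videoRotationAngle = angle
    }
    photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: delegate)
}
```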
With these changes, AVCam is ready to support external cameras. Now when I rotate the iPad, the external camera appears correctly.
We've covered a lot so far in this session, and I thank you for following along with me. Now that I've shown how your iPad app can use external cameras, I'll discuss how you can also use microphones included with these devices. Some webcams and displays include mics. When they are plugged into an iPad, they can be used by your app. iPadOS 17 has improved support for external microphones on iPads with USB-C. Telephony apps that use Core Audio's AUVoiceIO audio unit can now use external microphones like those included with webcams or displays. Previously, the only external wired devices these apps could use were headset mics. AUVoiceIO is a popular interface, since it performs echo cancellation, and new tunings have been introduced for external mics. Voice Isolation mode available from Control Center removes unwanted background noise, such as typing on keyboards, mouse clicks, or leaf blowers running somewhere in the neighborhood. Your app can use this system feature with external microphones.
The iOS audio routing system allows only one microphone to be used at a time. It also automatically changes to the last connected microphone. This is because, just like when connecting a camera, the user is indicating the newly connected microphone should be used. On iOS, the system returns only one AVCaptureDevice for the microphone. You can find it by searching for the device with the audio media type or with the new microphone device type, which deprecates builtInMicrophone because not all mics are built into the iPad. The audio routing system decides which available microphone to use, be it built-in or external. When the system changes the input route, the microphone AVCaptureDevice's localizedName property changes to reflect the device in use.
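Finding that single microphone device might look like this sketch:

```swift
import AVFoundation

// The system exposes one microphone AVCaptureDevice whose route it manages;
// its localizedName reflects the mic currently in use.
if let microphone = AVCaptureDevice.default(.microphone, for: .audio, position: .unspecified) {
    print("Active microphone: \(microphone.localizedName)")
}
```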
Your app can use AVAudioSession for more control over the microphone. You can use it to configure your app's audio behaviors by setting a category or mode. And you can choose to use a specific mic, like one included with an external camera, by setting the preferred input.
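A sketch of that configuration might look like the following; the category, mode, and USB-port check are assumptions to suit a video-calling use case:

```swift
import AVFoundation

func preferExternalMicrophone() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .videoChat, options: [])

    // Look for a USB input among the available inputs and prefer it,
    // such as the microphone built into an external display or webcam.
    if let usbInput = session.availableInputs?.first(where: { $0.portType == .usbAudio }) {
        try session.setPreferredInput(usbInput)
    }
    try session.setActive(true)
}
```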
For the final topic of this session, I'll discuss some best practices for your app when using external cameras. As you begin adoption, consider what makes most sense for your app. Earlier, I showed how FaceTime, Code Scanner, and WebKit chose to support external cameras differently. Use them as examples of different ways your app can adopt. Configure your iPad for wireless debugging with Xcode, since the USB-C port will be in use by an external camera.
Some capabilities that your app can expect from built-in cameras may not be supported by external devices. For example, if your app relies on depth data capture for some feature, you might have to disable it when an external camera is in use. Apps that use multiple cameras at the same time with an AVCaptureMultiCamSession can add external cameras for creative capture setups.
iPadOS gives external cameras some treatment that it also applies to front-facing cameras. The AVCaptureVideoPreviewLayer mirrors external cameras by default. This works well when the camera is facing the iPad user. But this isn't suitable for all use cases. If your app's users stream from HDMI switchers or point the external camera away from them, consider allowing users to disable preview mirroring.
In a previous section, I described camera rotation. While your app likely won't need to apply video rotation for an external camera, be aware that if you do, the system rotates external cameras clockwise towards the scene they are facing. This is the same way it applies rotation for built-in cameras.
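Turning off the default mirroring described above might look like this sketch:

```swift
import AVFoundation

// Disable the default mirroring for an external camera's preview, for example
// when the camera faces away from the user or streams from an HDMI switcher.
func disablePreviewMirroring(for previewLayer: AVCaptureVideoPreviewLayer) {
    guard let connection = previewLayer.connection,
          connection.isVideoMirroringSupported else { return }
    // Take manual control of mirroring before changing it.
    connection.automaticallyAdjustsVideoMirroring = false
    connection.isVideoMirrored = false
}
```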
Prepare your app to handle cameras with different capabilities.
For example, some external cameras may only report two formats, like a VGA format of 640x480 and an HD format of 1280x720. And some external cameras support pixel formats that are not typically used on iOS. We've chosen to convert these to more common formats that iOS camera apps are used to handling. Uncompressed formats like yuvs and 2vuy are converted to 420v. And compressed formats like streaming JPEG and H264 are converted to 420f.
Because an external camera can have formats of any size, it may not support all AVCaptureSessionPresets. For example, the HD 4K preset requires the device to have a compatible format. Your app can check whether it can use a preset by calling the supportsSessionPreset method on AVCaptureDevice. Your app can configure an external camera, including changing its resolution, frame rate, and zoom factor. iPadOS supports a limited set of the camera controls available in the USB Video Class specification. So query the AVCaptureDevice for its capabilities.
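Checking capabilities before configuring the session might look like this sketch; the logging function and the 720p preset choice are assumptions:

```swift
import AVFoundation
import CoreMedia

func inspectCapabilities(of camera: AVCaptureDevice, session: AVCaptureSession) {
    // External cameras may expose only a few formats, such as 640x480 and 1280x720.
    for format in camera.formats {
        let dimensions = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
        print("Format: \(dimensions.width)x\(dimensions.height)")
    }

    // Verify the device can satisfy a preset before applying it to the session.
    if camera.supportsSessionPreset(.hd1280x720), session.canSetSessionPreset(.hd1280x720) {
        session.sessionPreset = .hd1280x720
    }
}
```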
Let's wrap up everything I just talked about. I showed how you can discover and use external cameras, how to properly rotate a camera's video frames, how to use external microphones, and lastly, best practices for your app. We're excited to see how you start using external cameras in your iPad app. Thank you, and I hope your app rocks. ♪ ♪