Dive into the technical aspects of audio on your device, including codecs, format support, and customization options.

Audio Documentation

Post

Replies

Boosts

Views

Activity

AVCaptureDevice exception in setting WhiteBalance
I have the following code, adapted from the AVCamManual sample code, to set white balance. I still see crash reports in analytics where an exception is raised when setting white balance:

*** -[AVCaptureDevice temperatureAndTintValuesForDeviceWhiteBalanceGains:] whiteBalanceGains contain an out-of-range value - red, green, and blue gain

Here is my code; it is not clear to me how the gains end up out of range.

public func normalizedGains(gains: AVCaptureDevice.WhiteBalanceGains) -> AVCaptureDevice.WhiteBalanceGains {
    var g = gains
    if let device = videoDevice {
        g.redGain = max(1.0, g.redGain)
        g.blueGain = max(1.0, g.blueGain)
        g.greenGain = max(1.0, g.greenGain)
        g.redGain = min(device.maxWhiteBalanceGain, g.redGain)
        g.blueGain = min(device.maxWhiteBalanceGain, g.blueGain)
        g.greenGain = min(device.maxWhiteBalanceGain, g.greenGain)
    }
    return g
}

And my code to set white balance:

public func setTemperatureAndTint(colorTemperature: Float?, tint: Float?) {
    if let device = videoDevice {
        var tint = tint
        var colorTemperature = colorTemperature
        if colorTemperature == nil {
            colorTemperature = device.temperatureAndTintValues(for: device.deviceWhiteBalanceGains).temperature
        }
        if tint == nil {
            tint = device.temperatureAndTintValues(for: device.deviceWhiteBalanceGains).tint
        }
        let temperatureTint = AVCaptureDevice.WhiteBalanceTemperatureAndTintValues(temperature: colorTemperature!, tint: tint!)
        NSLog("Setting tint \(temperatureTint.tint)")
        do {
            try device.lockForConfiguration()
            device.setWhiteBalanceModeLocked(with: normalizedGains(gains: device.deviceWhiteBalanceGains(for: temperatureTint)), completionHandler: nil)
            device.unlockForConfiguration()
            wbLockedtoGray = false
        } catch {
            NSLog("Unable to change white balance gain \(error)")
        }
    }
}

Is there anything I am doing wrong?
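For reference, a minimal sketch of the conversion with the current gains clamped first. This assumes (unverified) that deviceWhiteBalanceGains can momentarily report values outside 1.0...maxWhiteBalanceGain, which would explain the exception coming from the temperature/tint conversion rather than from setWhiteBalanceModeLocked:

import AVFoundation

// Hypothetical sketch: clamp the *current* device gains before converting them to
// temperature/tint, in case they are momentarily out of range (assumption, not verified).
func currentTemperatureAndTint(for device: AVCaptureDevice) -> AVCaptureDevice.WhiteBalanceTemperatureAndTintValues {
    var gains = device.deviceWhiteBalanceGains
    let maxGain = device.maxWhiteBalanceGain
    gains.redGain   = min(max(1.0, gains.redGain), maxGain)
    gains.greenGain = min(max(1.0, gains.greenGain), maxGain)
    gains.blueGain  = min(max(1.0, gains.blueGain), maxGain)
    // The conversion call the crash report points at, now fed clamped gains.
    return device.temperatureAndTintValues(for: gains)
}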
0
0
346
Oct ’23
Quality of Video stream displayed through AVCaptureSession breaks while trying to zoom camera and capture a photo
Hi,

We are using the AVKit/AVFoundation capture system to take photos in our app, and we need to zoom the camera up to a certain limit. If we set a zoomFactor on the AVCaptureDevice, we receive awkward (blurred) video frames from the camera. Our app works fine on all iPhone/iPad devices except those that support Center Stage. Looking into Apple's default Camera app, we understand it is implemented using UIImagePickerController. We tried multiple combinations of AVCaptureDevice.Format / AVCaptureSession.Preset, but nothing helped. We want to achieve zoom (front camera) through AVKit; the code snippet we use is below. Please help us with this.

session.sessionPreset = AVCaptureSession.Preset.photo

var bestFormat: AVCaptureDevice.Format?
var bestFrameRateRange: AVFrameRateRange?
for format in device.formats {
    for range in format.videoSupportedFrameRateRanges {
        if range.maxFrameRate > bestFrameRateRange?.maxFrameRate ?? 0 {
            bestFormat = format
            bestFrameRateRange = range
        }
    }
}

if let bestFormat = bestFormat, let bestFrameRateRange = bestFrameRateRange {
    do {
        try device.lockForConfiguration()
        // Set the device's active format.
        device.activeFormat = bestFormat
        // Set the device's min/max frame duration.
        let duration = bestFrameRateRange.minFrameDuration
        device.activeVideoMinFrameDuration = duration
        device.activeVideoMaxFrameDuration = duration
        device.videoZoomFactor = 2.0
        device.unlockForConfiguration()
    } catch {
        // Handle error.
    }
}
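One thing that may be worth checking (an assumption, not a confirmed fix): on Center Stage–capable devices the usable zoom range of a format can be narrower while Center Stage is active, so a sketch that clamps the requested zoom against those limits looks like this:

import AVFoundation
import CoreGraphics

// Sketch under stated assumptions: clamp the requested zoom to what the active format
// allows, taking Center Stage limits into account when Center Stage is enabled.
func applyZoom(_ requested: CGFloat, on device: AVCaptureDevice) throws {
    let format = device.activeFormat
    var maxZoom = format.videoMaxZoomFactor
    if AVCaptureDevice.isCenterStageEnabled, format.isCenterStageSupported {
        // Assumption: the zoom range is restricted while Center Stage is active.
        maxZoom = min(maxZoom, format.videoMaxZoomFactorForCenterStage)
    }
    try device.lockForConfiguration()
    device.videoZoomFactor = min(max(1.0, requested), maxZoom)
    device.unlockForConfiguration()
}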
0
0
540
Oct ’23
AudioUnit V2 app with RenderCallback glitchy audio on iOS 17 with iPhone 14/15
Hi,

I have an app developed with AudioUnit RemoteIO and render callbacks. The app has been performing fine, except on iOS 17 with devices like the iPhone 14 or iPhone 15. On an iPhone 14, the same app (a metronome app) performed fine with iOS 16; when the customer updated to iOS 17, the audio suddenly became glitchy, with ghost sounds and other artifacts. This does not happen on an iPhone 11 Pro with iOS 17 (works fine!). However, I have been able to reproduce it on an iPhone 15 Pro with iOS 17. It works OK at lower BPMs, and when the BPM goes over a certain threshold, the audio starts getting glitchy.

The audio buffers are precomputed, so the render callback is relatively straightforward. Has anyone else seen this kind of issue on the iPhone 14/iPhone 15 running iOS 17? I'm following up with Apple on this, but thought I would see if others are facing similar issues with their apps.

Thanks,
Sridhar
2
0
838
Oct ’23
storing AVAsset in SwiftData
Hi,

I am creating an app whose data can include videos or images. While @Attribute(.externalStorage) helps with images, for AVAssets I actually would like access to the URL behind that data (it would be wasteful to load the data and then save it again just to get a URL). One key requirement is to keep all of this clean enough that I can use (private) CloudKit syncing with the resulting model.

All the best,
Christoph
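For illustration, one possible model shape (an assumption, not a confirmed approach): keep image bytes in external storage, and keep videos as files managed by the app, persisting only a bookmark so the URL stays resolvable. The type and property names below are hypothetical.

import Foundation
import SwiftData

// Sketch under stated assumptions; MediaItem and its properties are hypothetical names.
@Model
final class MediaItem {
    // Image bytes can live in SwiftData-managed external storage.
    @Attribute(.externalStorage) var imageData: Data?
    // For video, persist a bookmark to a file the app manages itself,
    // so the underlying URL remains accessible without round-tripping the data.
    var videoBookmark: Data?

    init(imageData: Data? = nil, videoBookmark: Data? = nil) {
        self.imageData = imageData
        self.videoBookmark = videoBookmark
    }

    func videoURL() -> URL? {
        guard let videoBookmark else { return nil }
        var isStale = false
        return try? URL(resolvingBookmarkData: videoBookmark, bookmarkDataIsStale: &isStale)
    }
}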
0
0
389
Oct ’23
ShazamKit Cost
We're looking to integrate ShazamKit, but can't find any details of the associated costs. Are there fees or rate limits for matching? And is attribution to the matched song on Apple Music required? Thank you.
0
0
574
Oct ’23
iOS/iPadOS 17.1 has new issue with QR scanning
After upgrading to iOS 17.1, I can't scan some QR codes. The same codes could be scanned correctly before. In detail:

iPadOS 17.0 and earlier: works well
iPadOS 17.1: something goes wrong

Sample project code:

func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) {
    captureSession.stopRunning()
    if let metadataObject = metadataObjects.first {
        guard let readableObject = metadataObject as? AVMetadataMachineReadableCodeObject else { return }
        guard let stringValue = readableObject.stringValue else {
            foundButInvalid()
            return
        }
        found(code: stringValue)
    }
}

Sample QR code
3
1
1.3k
Oct ’23
Change picture-in-picture forward/backward values in AVPictureInPictureController
How can the playback (forward and backward) skip button values be changed in an AVPictureInPictureController? My backward and forward buttons have a 15-second value by default (a screenshot from my app is attached), but I've found other apps have 10 seconds (for instance, the Apple TV iOS app). In an Apple forum discussion I've read that AVPlayerViewController adapts its capabilities and controls to the asset being played. But the backward/forward values in PiP seem to stay the same for all videos regardless of duration, both in my app and in the apps I've looked at, and I can't find a way to change them.
0
0
476
Nov ’23
Standalone rendering of Audio Unit in AUv3 host fails with NoConnection error
I am trying to migrate an Audio Unit host based on the AUv2 C API to the newer AUv3 API. While the migration itself was relatively straightforward (in terms of getting it to compile), the actual rendering fails at run time with error -10876, a.k.a. kAudioUnitErr_NoConnection.

The app does not use AUGraph or AVAudioEngine; perhaps that is an issue? Since the AUv3 and AUv2 APIs are bridged in both directions and the rendering works fine with the v2 API, I would expect there to be some way to make it work via the v3 API, though. Perhaps someone has an idea why (or under which circumstances) the render block throws this error?

For context, the app is Mixxx, an open-source DJing application, and here is the full diff of my AUv2 -> AUv3 migration: https://github.com/fwcd/mixxx/pull/5/files
1
0
765
Nov ’23
[AVCaptureDevice devicesWithMediaType:] does not find USB device on macOS 14.1
I use the following API to find a specific USB device (a Roxio video capture USB device):

[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];

It works fine on macOS 14.0 and earlier, but it finds nothing on macOS 14.1. I replaced it with the following API and it still doesn't work:

AVCaptureDeviceDiscoverySession *session = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeExternalUnknown] mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionUnspecified];

How can we solve this, or are there other solutions?
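One avenue that might be worth trying (an assumption, not a confirmed fix): macOS 14 deprecates AVCaptureDeviceTypeExternalUnknown in favor of the external device type, so a discovery session built on the newer type would look roughly like this Swift sketch:

import AVFoundation

// Sketch under stated assumptions: on macOS 14 the .external device type replaces
// the deprecated .externalUnknown, so try discovering external video devices with it.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.external],          // macOS 14+; use .externalUnknown on earlier systems
    mediaType: .video,
    position: .unspecified
)
print(discovery.devices.map(\.localizedName))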
0
0
272
Nov ’23
Microphone mode
Is there any API to check which microphone mode is active for my macOS application? There is an API to check the microphone mode for an AVCaptureDevice, but the status bar allows selecting a microphone mode for an application that reads microphone audio (not for the microphone itself).
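For reference, the capture-level API mentioned above is a pair of class properties; whether they reflect the mode applied to an app that does not capture through AVCaptureDevice is exactly the open question here (sketch, with that caveat):

import AVFoundation

// Sketch: the class-level microphone mode properties (macOS 12+).
// Assumption/caveat: these may only be meaningful for apps capturing via AVCaptureDevice.
let preferred = AVCaptureDevice.preferredMicrophoneMode   // what the user picked in Control Center
let active = AVCaptureDevice.activeMicrophoneMode         // what the system is currently applying
print("preferred: \(preferred), active: \(active)")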
1
1
389
Nov ’23
CVPixelBufferPool poor performance vis-à-vis direct allocation
I have been allocating pixel buffers from a CVPixelBufferPool; the code is adapted from various older Apple sample code such as RosyWriter. I see that the direct API, CVPixelBufferCreate, is highly performant and rarely causes frame drops, as opposed to allocating from a pixel buffer pool, where I regularly get frame drops. Is this a known issue or a bad use of the API?

Here is the code for creating the pixel buffer pool:

private func createPixelBufferPool(_ width: Int32, _ height: Int32, _ pixelFormat: FourCharCode, _ maxBufferCount: Int32) -> CVPixelBufferPool? {
    var outputPool: CVPixelBufferPool? = nil
    let sourcePixelBufferOptions: NSDictionary = [
        kCVPixelBufferPixelFormatTypeKey: pixelFormat,
        kCVPixelBufferWidthKey: width,
        kCVPixelBufferHeightKey: height,
        kCVPixelFormatOpenGLESCompatibility: true,
        kCVPixelBufferIOSurfacePropertiesKey: [:] as CFDictionary
    ]
    let pixelBufferPoolOptions: NSDictionary = [kCVPixelBufferPoolMinimumBufferCountKey: maxBufferCount]
    CVPixelBufferPoolCreate(kCFAllocatorDefault, pixelBufferPoolOptions, sourcePixelBufferOptions, &outputPool)
    return outputPool
}

private func createPixelBufferPoolAuxAttributes(_ maxBufferCount: size_t) -> NSDictionary {
    // CVPixelBufferPoolCreatePixelBufferWithAuxAttributes() will return kCVReturnWouldExceedAllocationThreshold
    // if we have already vended the max number of buffers
    return [kCVPixelBufferPoolAllocationThresholdKey: maxBufferCount]
}

private func preallocatePixelBuffersInPool(_ pool: CVPixelBufferPool, _ auxAttributes: NSDictionary) {
    // Preallocate buffers in the pool, since this is for real-time display/capture
    var pixelBuffers: [CVPixelBuffer] = []
    while true {
        var pixelBuffer: CVPixelBuffer? = nil
        let err = CVPixelBufferPoolCreatePixelBufferWithAuxAttributes(kCFAllocatorDefault, pool, auxAttributes, &pixelBuffer)
        if err == kCVReturnWouldExceedAllocationThreshold {
            break
        }
        assert(err == noErr)
        pixelBuffers.append(pixelBuffer!)
    }
    pixelBuffers.removeAll()
}

And here is the usage:

bufferPool = createPixelBufferPool(outputDimensions.width, outputDimensions.height, outputPixelFormat, Int32(maxRetainedBufferCount))
if bufferPool == nil {
    NSLog("Problem initializing a buffer pool.")
    success = false
    break bail
}
bufferPoolAuxAttributes = createPixelBufferPoolAuxAttributes(maxRetainedBufferCount)
preallocatePixelBuffersInPool(bufferPool!, bufferPoolAuxAttributes!)

And then creating pixel buffers from the pool:

err = CVPixelBufferPoolCreatePixelBufferWithAuxAttributes(kCFAllocatorDefault, bufferPool!, bufferPoolAuxAttributes, &dstPixelBuffer)
if err == kCVReturnWouldExceedAllocationThreshold {
    // Flush the texture cache to potentially release the retained buffers and try again to create a pixel buffer
    err = CVPixelBufferPoolCreatePixelBufferWithAuxAttributes(kCFAllocatorDefault, bufferPool!, bufferPoolAuxAttributes, &dstPixelBuffer)
}
if err != 0 {
    if err == kCVReturnWouldExceedAllocationThreshold {
        NSLog("Pool is out of buffers, dropping frame")
    } else {
        NSLog("Error at CVPixelBufferPoolCreatePixelBuffer %d", err)
    }
    break bail
}

When used with AVAssetWriter, I see a lot of frame drops caused by the kCVReturnWouldExceedAllocationThreshold error. No frame drops are seen when I directly allocate the pixel buffer without using a pool:

CVPixelBufferCreate(kCFAllocatorDefault, Int(dimensions.width), Int(dimensions.height), outputPixelFormat, sourcePixelBufferOptions, &dstPixelBuffer)

What could be the cause?
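One variable that may be worth isolating (an assumption, not a diagnosis): the aux-attribute allocation threshold is what makes the pool return kCVReturnWouldExceedAllocationThreshold, so giving the pool headroom beyond the preallocated minimum is one thing to experiment with. The counts below are hypothetical.

import Foundation
import CoreVideo

// Sketch under stated assumptions: let the pool vend more buffers than the preallocated
// minimum, since a downstream consumer such as AVAssetWriter may retain buffers briefly.
let minimumBufferCount = 6                          // hypothetical
let allocationThreshold = minimumBufferCount * 2    // hypothetical headroom

let poolOptions: NSDictionary = [kCVPixelBufferPoolMinimumBufferCountKey: minimumBufferCount]
let auxAttributes: NSDictionary = [kCVPixelBufferPoolAllocationThresholdKey: allocationThreshold]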
0
0
705
Nov ’23
Video orientation doesn't work in AVPlayerViewController
When attempting to present an AVPlayerViewController without animations, the video orientation does not function as expected. However, when the animation parameter is set to true, the video orientation works correctly.

The following code does not produce the desired video orientation behavior when animation is disabled:

parentViewController.present(playerViewController, animated: false)

In contrast, the desired video orientation is achieved with animation enabled:

parentViewController.present(playerViewController, animated: true)
0
0
415
Nov ’23
Correct settings to record HDR/SDR with AVAssetWriter
I have set up AVCaptureVideoDataOutput with 10-bit 4:2:0 YCbCr sample buffers. I use Core Image to process these pixel buffers for simple scaling/translation.

var dstBounds = CGRect.zero
dstBounds.size = dstImage.extent.size
/* srcImage is created from the sample buffer received from the video data output */
_ciContext.render(dstImage, to: dstPixelBuffer!, bounds: dstImage.extent, colorSpace: srcImage.colorSpace)

I then set the color attachments on this dstPixelBuffer based on the colorProfile set in the app settings (BT.709 or BT.2020):

switch colorProfile {
case .BT709:
    CVBufferSetAttachment(dstPixelBuffer!, kCVImageBufferColorPrimariesKey, kCVImageBufferColorPrimaries_ITU_R_709_2, .shouldPropagate)
    CVBufferSetAttachment(dstPixelBuffer!, kCVImageBufferTransferFunctionKey, kCVImageBufferTransferFunction_ITU_R_709_2, .shouldPropagate)
    CVBufferSetAttachment(dstPixelBuffer!, kCVImageBufferYCbCrMatrixKey, kCVImageBufferYCbCrMatrix_ITU_R_709_2, .shouldPropagate)
case .HLG2100:
    CVBufferSetAttachment(dstPixelBuffer!, kCVImageBufferColorPrimariesKey, kCVImageBufferColorPrimaries_ITU_R_2020, .shouldPropagate)
    CVBufferSetAttachment(dstPixelBuffer!, kCVImageBufferTransferFunctionKey, kCVImageBufferTransferFunction_ITU_R_2100_HLG, .shouldPropagate)
    CVBufferSetAttachment(dstPixelBuffer!, kCVImageBufferYCbCrMatrixKey, kCVImageBufferYCbCrMatrix_ITU_R_2020, .shouldPropagate)
}

These pixel buffers are then vended to an AVAssetWriter whose videoSettings is set to the recommendedSettings from the video data output. But the output looks completely washed out, especially for SDR (BT.709). What am I doing wrong?
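One thing that might matter here (an assumption, not a confirmed diagnosis): the CIContext's own color management also affects how the rendered pixels land in dstPixelBuffer, so making the output color space explicit is one thing to try. The isHDRPath flag below is hypothetical and stands in for the app's colorProfile switch.

import CoreImage
import CoreGraphics

// Sketch under stated assumptions: create the CIContext with an explicit output color
// space that matches the attachments being set on dstPixelBuffer.
let isHDRPath = false
let outputColorSpace = isHDRPath
    ? CGColorSpace(name: CGColorSpace.itur_2100_HLG)!
    : CGColorSpace(name: CGColorSpace.itur_709)!

let ciContext = CIContext(options: [
    .outputColorSpace: outputColorSpace,
    .workingFormat: CIFormat.RGBAh        // half-float working format for wide gamut
])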
0
0
676
Nov ’23
CAEDRMetadata.hlg(ambientViewingEnvironment:) API crash
I am trying to use the new API CAEDRMetadata.hlg(ambientViewingEnvironment:) introduced in iOS 17.0. Since ambientViewingEnvironmentData is dynamic, I understand the edrMetadata of CAMetalLayer needs to be set on every draw call. But doing so causes CAMetalLayer to freeze and even crash.

if let pixelBuffer = image.pixelBuffer,
   let aveData = pixelBuffer.attachments.propagated[kCVImageBufferAmbientViewingEnvironmentKey as String] as? Data {
    if #available(iOS 17.0, *) {
        metalLayer.edrMetadata = CAEDRMetadata.hlg(ambientViewingEnvironment: aveData)
    } else {
        // Fallback on earlier versions
    }
}
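A sketch of one mitigation idea (purely an assumption, not a verified workaround): only rebuild and reassign the metadata when the ambient viewing environment data actually changes, rather than on every draw call.

import Foundation
import QuartzCore
import CoreVideo

// Sketch under stated assumptions: cache the last ambient viewing environment data and
// only touch layer.edrMetadata when it changes, instead of reassigning it every frame.
var lastAVEData: Data?

func updateEDRMetadata(on metalLayer: CAMetalLayer, from pixelBuffer: CVPixelBuffer) {
    guard #available(iOS 17.0, *) else { return }
    guard let aveData = pixelBuffer.attachments.propagated[kCVImageBufferAmbientViewingEnvironmentKey as String] as? Data else { return }
    if aveData != lastAVEData {
        lastAVEData = aveData
        metalLayer.edrMetadata = CAEDRMetadata.hlg(ambientViewingEnvironment: aveData)
    }
}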
0
0
408
Nov ’23
Adding multiple AVCaptureVideoDataOutput stalls captureSession
Adding multiple AVCaptureVideoDataOutput instances is officially supported in iOS 16 and works well, except for certain configurations such as ProRes (YCbCr422 pixel format), where the session fails to start if two video data outputs are added. Is this a known limitation or a bug? Here is the code:

device.activeFormat = device.findFormat(targetFPS, resolution: targetResolution, pixelFormat: kCVPixelFormatType_422YpCbCr10BiPlanarVideoRange)!
NSLog("Device supports tone mapping \(device.activeFormat.isGlobalToneMappingSupported)")
device.activeColorSpace = .HLG_BT2020
device.activeVideoMinFrameDuration = CMTime(value: 1, timescale: CMTimeScale(targetFPS))
device.activeVideoMaxFrameDuration = CMTime(value: 1, timescale: CMTimeScale(targetFPS))
device.unlockForConfiguration()

self.session?.addInput(input)

let output = AVCaptureVideoDataOutput()
output.alwaysDiscardsLateVideoFrames = true
output.setSampleBufferDelegate(self, queue: self.samplesQueue)
if self.session!.canAddOutput(output) {
    self.session?.addOutput(output)
}

let previewVideoOut = AVCaptureVideoDataOutput()
previewVideoOut.alwaysDiscardsLateVideoFrames = true
previewVideoOut.automaticallyConfiguresOutputBufferDimensions = false
previewVideoOut.deliversPreviewSizedOutputBuffers = true
previewVideoOut.setSampleBufferDelegate(self, queue: self.previewQueue)
if self.session!.canAddOutput(previewVideoOut) {
    self.session?.addOutput(previewVideoOut)
}

self.vdo = vdo
self.previewVDO = previewVideoOut
self.session?.startRunning()

It works for other formats such as 10-bit YCbCr video-range HDR sample buffers, but there are a lot of frame drops when recording with AVAssetWriter at 4K@60 fps. Are these known limitations or bad use of the API?
1
0
600
Nov ’23
Record on built-in mic and simultaneously play back processed audio on AirPods
Basically, for this iPhone app I want to be able to record from either the built-in microphone or from a connected USB audio device while simultaneously playing back processed audio on connected AirPods. It's a pretty simple AVAudioEngine setup that includes a couple of effects units. The category is set to .playAndRecord with the .allowBluetooth and .allowBluetoothA2DP options added.

With no attempt to set the preferred input and AirPods connected, the AirPods mic is used and output also goes to the AirPods. If I call setPreferredInput with either the built-in mic or a USB audio device, I get the input I want, but then output always goes to the speaker. I don't really see a good explanation for this, and overrideOutputAudioPort does not seem to have suitable options.

Testing this on an iPhone 14 Pro.
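A minimal sketch of the session setup described above (the function name and exact option set are assumptions, not a verified fix for the speaker-routing behavior):

import AVFoundation

func configureSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .default,
                            options: [.allowBluetooth, .allowBluetoothA2DP])
    try session.setActive(true)

    // Explicitly pick the built-in mic (or a USB input); this is the step after which
    // the post reports output falling back to the speaker instead of the AirPods.
    if let builtInMic = session.availableInputs?.first(where: { $0.portType == .builtInMic }) {
        try session.setPreferredInput(builtInMic)
    }
}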
3
0
739
Nov ’23
AVPlayer playing HLS over TLS-PSK connection
Hello,

I'm trying to play a local playlist using AVPlayer and AVAssetResourceLoaderDelegate over a TLS-PSK connection. I'm facing two obstacles: while I am able to download the m3u8 files myself, when it comes to the media chunks I can only redirect the URL, which doesn't give me much control over the connection; and it seems that URLSession does not support TLS-PSK. Is there a way to accomplish this?

Thanks in advance.
0
0
294
Nov ’23
Using AVPlayer to play and AVAssetResourceLoaderDelegate to read data, errors occasionally occur during playback
I use AVPlayer for playback and AVAssetResourceLoaderDelegate to read the data. The following errors occasionally occur during playback:

-11819: Cannot Complete Action
-11800: The operation could not be completed
-11829: Cannot Open
-11849: Operation Stopped
-11870: This operation could not be completed
-1002: unsupported URL
-11850: Operation stopped
-1: unknown error
-17377
0
0
490
Nov ’23