Dive into the technical aspects of audio on your device, including codecs, format support, and customization options.

Audio Documentation


Is there a way to adjust (reduce) the upper limit of the system volume
Sometimes when I'm putting on or taking off clothes, I accidentally bump the Digital Crown of my Apple Watch or AirPods Max, and the volume suddenly becomes very loud. This has bothered me for a long time. I followed the instructions at https://support.apple.com/zh-sg/guide/iphone/iphb71f9b54d/ios, but I couldn't find the relevant setting: the system only offers "Reduce Loud Audio" rather than an actual cap on the volume (iOS 17.4). I searched the App Store but couldn't find any related apps. I asked an AI and it suggested a solution, so I want to learn Swift and build the app myself (I've been learning for less than a week). The general idea is to listen for AVAudioSession's routeChange event through NotificationCenter, then use MPVolumeView to get its slider and set the slider's value to cap the volume. However, when I debugged it, setting the value had no effect. Where might the problem be, and how should I fix it?

```swift
@objc func setMaximumVolume() {
    if !enableMaxvolume { return }
    let volumeView = MPVolumeView()
    if let slider = volumeView.subviews.first as? UISlider {
        slider.value = Float(self.maximumVolume / 100)
        print("setMaximumVolume: \(slider.value)")
    }
}
```
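For what it's worth, a commonly reported pitfall with this technique (undocumented behavior, not an API contract) is that the MPVolumeView must actually be attached to the view hierarchy, and the slider write usually has to be deferred until after layout. A minimal sketch, where `containerView` and `maxVolume` (0...1) are illustrative names, not from the original post:

```swift
import MediaPlayer
import UIKit

// Sketch only: clamp the system volume by writing to MPVolumeView's slider.
func clampSystemVolume(to maxVolume: Float, in containerView: UIView) {
    let volumeView = MPVolumeView(frame: .zero)
    volumeView.isHidden = true
    containerView.addSubview(volumeView) // writes are ignored if the view is detached

    // Defer the write; setting the slider synchronously after creation
    // is typically ignored by the system.
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.1) {
        if let slider = volumeView.subviews.compactMap({ $0 as? UISlider }).first,
           slider.value > maxVolume {
            slider.value = maxVolume // system volume follows the slider
        }
    }
}
```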
0 replies · 0 boosts · 411 views · Apr ’24
AudioSession activation while App is in background or killed
Hello, I'm developing a voice communication app using the Livekit SDK. Everything works fine in the foreground: the AudioSession is activated and audio is transmitted. However, I would like to add a feature: my app should receive audio even when it's in the background or terminated. I know I can run code in that state by sending a background push notification, but the one thing that doesn't work in that case is AudioSession activation. It fails with the error "Session activation failed" and no further clues. I tried every combination of category and mode, but no success. Background Modes have been enabled in Xcode: Audio, AirPlay, and Picture in Picture; Background Processing. Is this a limitation of Livekit? I would be grateful if someone could point me in the right direction.
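As a hedged aside, here is a minimal sketch of the session setup that usually accompanies the "Audio, AirPlay, and Picture in Picture" background mode. Note that activating an audio session from a silent background push is heavily restricted by the system; call-style apps typically rely on CallKit plus PushKit VoIP pushes, which provide an active audio session when the incoming call is reported:

```swift
import AVFoundation

// Sketch: a typical voice-call session configuration. This alone does not
// grant background activation; that generally requires CallKit/PushKit.
func activateVoiceSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .voiceChat, options: [.allowBluetooth])
    try session.setActive(true)
}
```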
0 replies · 0 boosts · 639 views · Apr ’24
Replacing AVCaptureVideoOrientation with AVCaptureDevice.RotationCoordinator
On iOS, working with a video feed, I'm getting a yellow warning: "'AVCaptureVideoOrientation' was deprecated in iOS 17.0: Use AVCaptureDeviceRotationCoordinator instead". But I haven't been able to figure out how to get AVCaptureDevice.RotationCoordinator to work, and I haven't found any example of its usage in the Developer Forums or on the wider internet (the one mention of it in a WWDC session doesn't illustrate its use). Can anyone offer a working example in Swift?
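A hedged sketch of how the coordinator can be wired up (iOS 17+): it is created per device, optionally tied to a preview layer, and its rotation-angle properties are key-value observable, so the observed angle can be forwarded to a capture or preview connection:

```swift
import AVFoundation

// Sketch: keep a connection's rotation matched to the device orientation.
final class RotationHandler {
    private let coordinator: AVCaptureDevice.RotationCoordinator
    private var observation: NSKeyValueObservation?

    init(device: AVCaptureDevice,
         previewLayer: AVCaptureVideoPreviewLayer,
         connection: AVCaptureConnection) {
        coordinator = AVCaptureDevice.RotationCoordinator(device: device,
                                                          previewLayer: previewLayer)
        // The angle properties are KVO-observable; apply changes to the connection.
        observation = coordinator.observe(\.videoRotationAngleForHorizonLevelPreview,
                                          options: [.initial, .new]) { coordinator, _ in
            let angle = coordinator.videoRotationAngleForHorizonLevelPreview
            if connection.isVideoRotationAngleSupported(angle) {
                connection.videoRotationAngle = angle
            }
        }
    }
}
```

For captured (rather than previewed) frames, the analogous property is videoRotationAngleForHorizonLevelCapture.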
2 replies · 0 boosts · 756 views · Apr ’24
Other Audio Ducking in AVAudioSession
https://developer.apple.com/videos/play/wwdc2023/10235/ - In this WWDC session, at 3:19, Apple introduced the "other audio ducking" feature. In iOS 17 we can control the amount of other-audio ducking through AVAudioEngine. Is this also possible with AVAudioSession? We are using an AVAudioSession for a VoIP call while concurrently playing a video through an AVPlayer, but the volume of the AVPlayer is considerably low. Does anyone have ideas on how to achieve the level of control that AVAudioEngine offers?
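For reference, a sketch of the AVAudioEngine-side API from that session (iOS 17+): the ducking amount is configured on the voice-processing input node, which is presumably why plain AVAudioSession has no direct equivalent:

```swift
import AVFAudio

// Sketch: minimize how much the voice processor ducks other audio
// (e.g. a concurrently playing AVPlayer).
func configureMinimalDucking() throws {
    let engine = AVAudioEngine()
    try engine.inputNode.setVoiceProcessingEnabled(true)
    engine.inputNode.voiceProcessingOtherAudioDuckingConfiguration = .init(
        enableAdvancedDucking: true,
        duckingLevel: .min // duck other audio as little as possible
    )
}
```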
0 replies · 0 boosts · 547 views · Apr ’24
Capturing HDR pixel buffers with AVPlayerItemVideoOutput
On a Vision Pro I load an HDR video served over HLS using AVPlayer. Per FFmpeg, the video has: pixel format yuv420p10le, color space / YCbCr matrix bt2020nc, color primaries bt2020, transfer function smpte2084. I wanted to try letting AVFoundation do all of the color conversion instead of writing my own YUV-to-RGB shader. To display a 10-bit texture in a drawable queue, the destination Metal texture format must be MTLPixelFormat.rgba16Float (no other formats above 8 bits are supported), so I capture in kCVPixelFormatType_64RGBAHalf since it's pretty close. It's worth noting that the AVAsset shows no track information; presumably because it's HLS? I am using AVPlayerItemVideoOutput to get pixel buffers:

```swift
AVPlayerItemVideoOutput(outputSettings: [
    AVVideoColorPropertiesKey: [
        AVVideoColorPrimariesKey: AVVideoColorPrimaries_ITU_R_2020,
        AVVideoTransferFunctionKey: AVVideoTransferFunction_SMPTE_ST_2084_PQ,
        AVVideoYCbCrMatrixKey: AVVideoYCbCrMatrix_ITU_R_2020
    ],
    kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_64RGBAHalf),
    kCVPixelBufferMetalCompatibilityKey as String: true
])
```

I can change these settings in real time and see that they affect my drawable queue. The BT.2020 primaries do not look correct to me: the image is very bright and washed out. When I switch to BT.709 it looks closer to the output of the AVPlayer. The AVPlayer by itself doesn't look terrible, just a little dark maybe. When I leave out the outputSettings and let AVPlayerItemVideoOutput choose its own color settings, it also appears to choose BT.2020. Is it enough to supply these outputSettings and expect an RGB pixel buffer that exactly matches them, or do I have to capture in YUV and do all of the conversion manually? Am I misunderstanding something about color settings here? I am definitely not an expert. Thanks
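One small diagnostic worth running (a sketch; `videoOutput` and `itemTime` are placeholders): inspect which color attachments the output actually tagged the buffer with, to compare against the requested outputSettings:

```swift
import AVFoundation
import CoreVideo

// Sketch: dump the color-related attachments of a delivered pixel buffer.
func logColorAttachments(of videoOutput: AVPlayerItemVideoOutput, at itemTime: CMTime) {
    guard let buffer = videoOutput.copyPixelBuffer(forItemTime: itemTime,
                                                   itemTimeForDisplay: nil) else { return }
    // Passing nil for the attachment-mode pointer: only the values matter here.
    print(CVBufferCopyAttachment(buffer, kCVImageBufferColorPrimariesKey, nil) as Any,
          CVBufferCopyAttachment(buffer, kCVImageBufferTransferFunctionKey, nil) as Any,
          CVBufferCopyAttachment(buffer, kCVImageBufferYCbCrMatrixKey, nil) as Any)
}
```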
0 replies · 0 boosts · 529 views · Apr ’24
AVFoundation: [AVCaptureSession stopRunning] crash on macOS 14.4.1
As the title describes, I hit a crash when calling [AVCaptureSession stopRunning] on macOS 14.4.1. The crash report is below:

```
Process:               Nebula for Mac [31922]
Path:                  /Applications/Nebula for Mac.app/Contents/MacOS/Nebula for Mac
Identifier:            ai.nreal.nebula.mac
Version:               0.8.0.1098 (0.8.0)
Code Type:             ARM-64 (Native)
Parent Process:        launchd [1]
User ID:               501

Date/Time:             2024-04-11 14:12:34.6474 +0800
OS Version:            macOS 14.4.1 (23E224)
Report Version:        12
Anonymous UUID:        C438684A-95E7-7DA1-D063-81E1A5FBF5DC
Sleep/Wake UUID:       3EB85031-82AC-4BDB-8F28-FAF4CBD28CA1

Time Awake Since Boot: 110000 seconds
Time Since Wake:       1108 seconds

System Integrity Protection: enabled

Crashed Thread:        0  Dispatch queue: com.apple.avfoundation.proprietarydefaults.singleton.source_queue.0x202f8b460

Exception Type:        EXC_CRASH (SIGTRAP)
Exception Codes:       0x0000000000000000, 0x0000000000000000
Termination Reason:    Namespace SIGNAL, Code 5 Trace/BPT trap: 5
Terminating Process:   Nebula for Mac [31922]

Thread 0 Crashed::  Dispatch queue: com.apple.avfoundation.proprietarydefaults.singleton.source_queue.0x202f8b460
0   libsystem_kernel.dylib   0x19c1a61f4 mach_msg2_trap + 8
1   libsystem_kernel.dylib   0x19c1b8b24 mach_msg2_internal + 80
2   libsystem_kernel.dylib   0x19c1aee34 mach_msg_overwrite + 476
3   libsystem_kernel.dylib   0x19c1a6578 mach_msg + 24
4   libdispatch.dylib        0x19c0513b0 _dispatch_mach_send_and_wait_for_reply + 544
5   libdispatch.dylib        0x19c051740 dispatch_mach_send_with_result_and_wait_for_reply + 60
6   libxpc.dylib             0x19bef2af0 xpc_connection_send_message_with_reply_sync + 288
7   AVFCapture               0x1b9e5565c -[CMIOProprietaryDefaultsSource setObject:forKey:] + 140
8   AVFCapture               0x1b9e57044 __58-[AVCaptureProprietaryDefaultsSingleton setObject:forKey:]_block_invoke + 36
9   libdispatch.dylib        0x19c0363e8 _dispatch_client_callout + 20
10  libdispatch.dylib        0x19c0458d8 _dispatch_lane_barrier_sync_invoke_and_complete + 56
11  AVFCapture               0x1b9e597e0 -[AVCaptureProprietaryDefaultsSingleton _runBlockOnProprietaryDefaultsSourceQueueSync:] + 136
12  AVFCapture               0x1b9e56fbc -[AVCaptureProprietaryDefaultsSingleton setObject:forKey:] + 180
13  AVFCapture               0x1b9e776a0 -[AVCaptureDALDevice _refreshCenterStageUnavailableReasons] + 400
14  AVFCapture               0x1b9e7d0fc -[AVCaptureDALDevice updateActivelyProvidingInputCountForActiveUseState:] + 488
15  AVFCapture               0x1b9e33474 -[AVCaptureSession_Tundra _updateNewActiveUseState:forConnection:] + 196
16  AVFCapture               0x1b9e32e4c -[AVCaptureSession_Tundra _setRunning:] + 428
17  AVFCapture               0x1b9e32a28 -[AVCaptureSession_Tundra stopRunning] + 432
18  libnr_api.dylib          0x1446d7514 0x144478000 + 2487572
19  libnr_api.dylib          0x14468a690 0x144478000 + 2172560
20  libnr_api.dylib          0x14468bcb0 0x144478000 + 2178224
21  libnr_api.dylib          0x1444d0268 0x144478000 + 361064
22  libnr_api.dylib          0x1444ecb00 0x144478000 + 477952
23  libnr_api.dylib          0x1444ec724 0x144478000 + 476964
24  libnr_api.dylib          0x144541bcc 0x144478000 + 826316
25  libnr_api.dylib          0x144543e00 0x144478000 + 835072
26  libnr_api.dylib          0x144543f88 0x144478000 + 835464
27  libnr_api.dylib          0x144542ca8 0x144478000 + 830632
28  GameAssembly.dylib       0x12117d4c4 0x120000000 + 18339012
29  GameAssembly.dylib       0x1211894e0 0x120000000 + 18388192
30  GameAssembly.dylib       0x121165fe4 0x120000000 + 18243556
31  GameAssembly.dylib       0x1202e4248 0x120000000 + 3031624
32  GameAssembly.dylib       0x12116931c 0x120000000 + 18256668
33  GameAssembly.dylib       0x1201dcdf0 0x120000000 + 1953264
34  GameAssembly.dylib       0x1201dcd2c 0x120000000 + 1953068
35  UnityPlayer.dylib        0x10428dc60 0x103c38000 + 6642784
36  UnityPlayer.dylib        0x104295170 0x103c38000 + 6672752
37  UnityPlayer.dylib        0x1042b1620 0x103c38000 + 6788640
38  UnityPlayer.dylib        0x103f788d0 0x103c38000 + 3410128
39  UnityPlayer.dylib        0x1040d8c4c 0x103c38000 + 4852812
40  UnityPlayer.dylib        0x1040d8c98 0x103c38000 + 4852888
41  UnityPlayer.dylib        0x1040d8f2c 0x103c38000 + 4853548
42  UnityPlayer.dylib        0x104b104b8 0x103c38000 + 15566008
43  UnityPlayer.dylib        0x104b10304 0x103c38000 + 15565572
44  Foundation               0x19d430224 __NSFireTimer + 104
45  CoreFoundation           0x19c2e1f90 CFRUNLOOP_IS_CALLING_OUT_TO_A_TIMER_CALLBACK_FUNCTION + 32
46  CoreFoundation           0x19c2e1c34 __CFRunLoopDoTimer + 972
47  CoreFoundation           0x19c2e176c __CFRunLoopDoTimers + 356
48  CoreFoundation           0x19c2c4ba4 __CFRunLoopRun + 1856
49  CoreFoundation           0x19c2c3e0c CFRunLoopRunSpecific + 608
50  HIToolbox                0x1a6a5f000 RunCurrentEventLoopInMode + 292
51  HIToolbox                0x1a6a5ec90 ReceiveNextEventCommon + 220
52  HIToolbox                0x1a6a5eb94 _BlockUntilNextEventMatchingListInModeWithFilter + 76
53  AppKit                   0x19fb1c970 _DPSNextEvent + 660
54  AppKit                   0x1a030edec -[NSApplication(NSEventRouting) _nextEventMatchingEventMask:untilDate:inMode:dequeue:] + 700
55  AppKit                   0x19fb0fcb8 -[NSApplication run] + 476
56  AppKit                   0x19fae6f54 NSApplicationMain + 880
57  UnityPlayer.dylib        0x104b0ffe4 PlayerMain(int, char const**) + 944
58  dyld                     0x19be5e0e0 start + 2360
```
2 replies · 0 boosts · 470 views · Apr ’24
AVAssetReaderTrackOutput reset(forReadingTimeRanges:) skips sample
I'm doing random-access sampling from an AVAsset of a local H.264 video file:

```swift
let track = asset.tracks(withMediaType: .video)[0]
let assetReader = try! AVAssetReader(asset: asset)
let trackOutput = AVAssetReaderTrackOutput(track: track, outputSettings: nil)
trackOutput.supportsRandomAccess = true
assetReader.add(trackOutput)
assetReader.startReading()
...
let targetFrameDTS = CMTime(value: 56, timescale: 30)
let timeRange = CMTimeRange(
    start: CMTimeAdd(targetFrameDTS, CMTime(value: -1, timescale: 30)),
    duration: CMTime(value: 2, timescale: 30)
)

// reset the output to be near the target frame's decode time
trackOutput.reset(forReadingTimeRanges: [NSValue(timeRange: timeRange)])

while assetReader.status == .reading {
    guard let sample = trackOutput.copyNextSampleBuffer() else { break }
    let dts = CMSampleBufferGetDecodeTimeStamp(sample)
    print("\(dts.value)/\(dts.timescale)")
}
```

For some reason, with some values of targetFrameDTS, copyNextSampleBuffer skips samples. In my particular case the output is: ... 47/30 48/30 50/30 51/30 54/30 55/30 57/30. Why is that?
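One thing worth checking, offered as a hedged aside rather than a confirmed diagnosis: with outputSettings: nil an H.264 track is delivered as compressed samples in decode order, and B-frame reordering makes decode timestamps non-monotonic relative to presentation timestamps. Printing both can show whether frames are genuinely missing or merely reordered:

```swift
import AVFoundation

// Sketch: print decode and presentation timestamps side by side to
// distinguish real gaps from B-frame reordering. `trackOutput` and
// `assetReader` are the objects from the post above.
func dumpTimestamps(from trackOutput: AVAssetReaderTrackOutput,
                    reader assetReader: AVAssetReader) {
    while assetReader.status == .reading {
        guard let sample = trackOutput.copyNextSampleBuffer() else { break }
        let dts = CMSampleBufferGetDecodeTimeStamp(sample)
        let pts = CMSampleBufferGetPresentationTimeStamp(sample)
        print("dts \(dts.value)/\(dts.timescale)  pts \(pts.value)/\(pts.timescale)")
    }
}
```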
0 replies · 0 boosts · 381 views · Apr ’24
LiDAR Accuracy
I am using LiDAR to measure the distance between a target point and an iPhone Pro. I get the correct distance only when I am more than 70 cm away from the target point, but I need the value to be accurate below 70 cm as well. Is this a coding-level issue, or a limitation of the LiDAR hardware?
1 reply · 0 boosts · 684 views · Apr ’24
AVAudioSession multiRoute disables volume buttons
My app continuously records audio from the background. Based on user feedback, I set the AVAudioSession to the .multiRoute category with the .mixWithOthers option, because otherwise, when the device is connected to a car with CarPlay, output from the car's radio is muted. The only drawback seems to be that in this setup the hardware volume buttons no longer control the phone's volume, which users also dislike. I've searched the docs, this forum, and others for any documentation of this behavior, and for a way to either set up the session so it handles volume changes again, or to receive notifications of these button presses and forward them to the right place. Unfortunately, I didn't find anything. Can anyone offer ideas?
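For reference, a sketch of the configuration described above, plus a KVO observation of outputVolume. As far as I know this reports volume changes but does not restore hardware-button control while .multiRoute is active:

```swift
import AVFoundation

// Sketch: multiRoute session plus volume observation. Keep the returned
// observation alive for as long as you want callbacks.
func configureMultiRouteSession() throws -> NSKeyValueObservation {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.multiRoute, options: [.mixWithOthers])
    try session.setActive(true)
    return session.observe(\.outputVolume, options: [.new]) { session, _ in
        print("system volume is now \(session.outputVolume)")
    }
}
```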
1 reply · 0 boosts · 547 views · Apr ’24
AVAudioEngine Dolby Atmos
Hi! I have a music app using AVAudioEngine. Right now I have it set up to play multichannel tracks, and it shows "Multichannel" in the volume controls. However, I can't figure out how to get it to use Dolby Atmos. Is there something that needs to be enabled? Is it even possible with AVAudioEngine? I've seen apps that can play Dolby Atmos, but they don't have an EQ feature, so I'm guessing they are not using AVAudioEngine.
2 replies · 0 boosts · 579 views · Apr ’24
kAudioHardwarePropertyDevices does not list AirPlay sound output device
The CoreAudio device-listing API kAudioHardwarePropertyDevices does not list an AirPlay device when a virtual audio driver is selected as the sound output device in System Settings. The virtual audio driver is developed by us and is named BoomAudio. We need BoomAudio selected as the system sound output so that we can capture system audio and apply Boom effects/enhancements. But whenever BoomAudio is selected as the sound output, the AirPlay device does not appear in the device-list API, so we cannot play through to the AirPlay output device. Steps:
1. Select BoomAudio as the sound output in System Settings. (The issue also occurs if any other sound output device, such as headphones or internal speakers, is selected.)
2. If an Apple TV is connected, do not AirPlay the system display; only the system's sound output should be AirPlayed.
3. Build and run the attached sample project "SampleAirplayAudio".
4. Click the button "Sound Output Device List".
Output: the AirPlay device is not listed in the Xcode console. BoomAudio can be installed from: https://d3jbf8nvvpx3fh.cloudfront.net/gdassets/airplaydts/Boom+2+Installer.zip The sample project 'SampleAirplayAudio' is available at: https://d3jbf8nvvpx3fh.cloudfront.net/gdassets/airplaydts/SampleAirplayAudio.zip We have already filed a bug report with Apple Feedback Assistant: FB7543204
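For readers unfamiliar with the API in question, this is the enumeration being described, as a sketch: listing all audio devices via kAudioHardwarePropertyDevices on the system object. If the AirPlay device is absent here, it is absent from the HAL's device list itself:

```swift
import CoreAudio

// Sketch: return the IDs of all audio devices known to the HAL.
func listAudioDeviceIDs() -> [AudioDeviceID] {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioHardwarePropertyDevices,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain)
    var dataSize: UInt32 = 0
    guard AudioObjectGetPropertyDataSize(AudioObjectID(kAudioObjectSystemObject),
                                         &address, 0, nil, &dataSize) == noErr else { return [] }
    var deviceIDs = [AudioDeviceID](repeating: 0,
                                    count: Int(dataSize) / MemoryLayout<AudioDeviceID>.size)
    guard AudioObjectGetPropertyData(AudioObjectID(kAudioObjectSystemObject),
                                     &address, 0, nil, &dataSize, &deviceIDs) == noErr else { return [] }
    return deviceIDs
}
```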
0 replies · 0 boosts · 560 views · Apr ’24
CoreMediaErrorDomain error -12852 HLS Fairplay since iOS 17.4.1
Dear Apple Developer Forum, we have customers complaining that they can no longer play live streams (HLS FairPlay) with our application since upgrading their phones to iOS 17.4.1. We can't reproduce the problem in-house, but the error code sent to our analytics platform is CoreMediaErrorDomain error -12852. Would it be possible to get more information on this error, especially its potential cause, and, if the app is not responsible, how we can help our customers? Kind regards, Cédric
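One way to gather more detail from affected users, as a sketch: forward the player item's error log to analytics when playback fails (`playerItem` is a placeholder for the failing AVPlayerItem):

```swift
import AVFoundation

// Sketch: dump the HLS error log of a failing item; these entries often
// carry more context than the top-level CoreMediaErrorDomain code.
func reportErrorLog(for playerItem: AVPlayerItem) {
    guard let log = playerItem.errorLog() else { return }
    for event in log.events {
        print(event.errorStatusCode, event.errorDomain, event.errorComment ?? "-")
    }
}
```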
1 reply · 0 boosts · 1.1k views · Apr ’24
Extracting metadata from MXF files
I'm trying to read metadata from MXF files, without success: I get an empty track array from the AVAsset. I saw mentions of "MTRegisterProfessionalVideoWorkflowFormatReaders", but there is absolutely no documentation and I don't know where to look. Has anyone encountered this? Please help with any information.
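A hedged sketch built around the function mentioned above: on macOS, MediaToolbox's professional video workflow format readers reportedly need to be registered once, early, before creating assets from formats like MXF (`mxfURL` is a placeholder for a local MXF file URL):

```swift
import AVFoundation
import MediaToolbox

// Sketch: register pro-video format readers, then load the MXF's tracks.
func loadMXFTracks(at mxfURL: URL) async throws -> [AVAssetTrack] {
    MTRegisterProfessionalVideoWorkflowFormatReaders()
    let asset = AVURLAsset(url: mxfURL)
    return try await asset.load(.tracks)
}
```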
1 reply · 0 boosts · 461 views · May ’24
AVCaptureVideoPreviewLayer transforms are not working. Need pan/zoom solution.
I am implementing pan and zoom for an app that uses a custom USB camera device on iPadOS. I use the update function below to apply scale and translation transforms, but they are not working. By re-enabling the default animation I can see that the scale transform initially takes effect, but then the image animates back to its original scale; this all happens in a fraction of a second, but I can see it. The translation transform seems to have no effect at all. Printing the value of AVCaptureVideoPreviewLayer.transform before and after does show that my values have been applied.

```swift
private func updateTransform() {
#if false
    // Disable default animation.
    CATransaction.begin()
    CATransaction.setDisableActions(true)
    defer { CATransaction.commit() }
#endif

    // Apply the transform.
    logger.debug("\(String(describing: self.videoPreviewLayer.transform))")
    let transform = CATransform3DIdentity
    let translateTransform = CATransform3DTranslate(transform, translationX, translationY, 0)
    let scaleTransform = CATransform3DScale(transform, scale, scale, 1)
    videoPreviewLayer.transform = CATransform3DConcat(translateTransform, scaleTransform)
    logger.debug("\(String(describing: self.videoPreviewLayer.transform))")
}
```

My question is this: how can I properly implement pan/zoom for an AVCaptureVideoPreviewLayer? Or, even better, if you see a problem with my current approach or understand why the transforms I am applying don't stick, please share that information.
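One alternative worth trying (a sketch, not the only approach): back a plain UIView with the preview layer and let a UIScrollView drive pan and zoom, so no manual CATransform3D competes with the layer's own layout:

```swift
import AVFoundation
import UIKit

// A UIView whose backing layer is the capture preview layer itself.
final class PreviewView: UIView {
    override class var layerClass: AnyClass { AVCaptureVideoPreviewLayer.self }
    var previewLayer: AVCaptureVideoPreviewLayer { layer as! AVCaptureVideoPreviewLayer }
}

// A scroll view that pans and pinch-zooms the preview view.
final class ZoomingPreview: UIScrollView, UIScrollViewDelegate {
    let preview = PreviewView()

    override init(frame: CGRect) {
        super.init(frame: frame)
        delegate = self
        minimumZoomScale = 1
        maximumZoomScale = 4
        preview.frame = bounds
        addSubview(preview)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    // The scroll view scales and pans this view; the preview layer tracks it.
    func viewForZooming(in scrollView: UIScrollView) -> UIView? { preview }
}
```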
0 replies · 0 boosts · 504 views · May ’24