Dive into the technical aspects of audio on your device, including codecs, format support, and customization options.

Audio Documentation

Post · Replies · Boosts · Views · Activity

FB13398940: Removing a CMIOObjectPropertyListenerBlock ...doesn't do anything?
I've added a listener block for camera notifications. This works as expected: the listener block is invoked when the camera is activated/deactivated. However, when I call CMIOObjectRemovePropertyListenerBlock to remove the listener block, the call succeeds but camera notifications are still delivered to the listener block. The header file states that this function "Unregisters the given CMIOObjectPropertyListenerBlock from receiving notifications when the given properties change," so I'd assume that once it's called, no more notifications would be delivered. Sample code:

#import <Foundation/Foundation.h>
#import <CoreMediaIO/CMIOHardware.h>
#import <AVFoundation/AVCaptureDevice.h>

int main(int argc, const char * argv[]) {
    AVCaptureDevice* camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    OSStatus status = -1;
    CMIOObjectID deviceID = 0;
    CMIOObjectPropertyAddress propertyStruct = {0};
    propertyStruct.mSelector = kAudioDevicePropertyDeviceIsRunningSomewhere;
    propertyStruct.mScope = kAudioObjectPropertyScopeGlobal;
    propertyStruct.mElement = kAudioObjectPropertyElementMain;

    deviceID = (UInt32)[camera performSelector:NSSelectorFromString(@"connectionID") withObject:nil];

    CMIOObjectPropertyListenerBlock listenerBlock =
        ^(UInt32 inNumberAddresses, const CMIOObjectPropertyAddress addresses[]) {
            NSLog(@"Callback: CMIOObjectPropertyListenerBlock invoked");
        };

    status = CMIOObjectAddPropertyListenerBlock(deviceID, &propertyStruct,
        dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), listenerBlock);
    if (noErr != status) {
        NSLog(@"ERROR: CMIOObjectAddPropertyListenerBlock() failed with %d", status);
        return -1;
    }

    NSLog(@"Monitoring %@ (uuid: %@ / %x)", camera.localizedName, camera.uniqueID, deviceID);
    sleep(10);

    status = CMIOObjectRemovePropertyListenerBlock(deviceID, &propertyStruct,
        dispatch_get_main_queue(), listenerBlock);
    if (noErr != status) {
        NSLog(@"ERROR: CMIOObjectRemovePropertyListenerBlock() failed with %d", status);
        return -1;
    }

    NSLog(@"Stopped monitoring %@ (uuid: %@ / %x)", camera.localizedName, camera.uniqueID, deviceID);
    sleep(10);
    return 0;
}

Compiling and running this code outputs:

Monitoring FaceTime HD Camera (uuid: 3F45E80A-0176-46F7-B185-BB9E2C0E436A / 21)
Callback: CMIOObjectPropertyListenerBlock invoked
Callback: CMIOObjectPropertyListenerBlock invoked
Stopped monitoring FaceTime HD Camera (uuid: 3F45E80A-0176-46F7-B185-BB9E2C0E436A / 21)
Callback: CMIOObjectPropertyListenerBlock invoked
Callback: CMIOObjectPropertyListenerBlock invoked

Note the last two log messages showing that the CMIOObjectPropertyListenerBlock is still invoked, even though CMIOObjectRemovePropertyListenerBlock has successfully been invoked. Am I just doing something wrong here? Or is the API broken?
3 replies · 0 boosts · 827 views · Nov ’23
AVPlayer observePlayingState with @ObservationTracked possible?
I am following this Apple article on how to set up an AVPlayer. The only difference is I am using @Observable instead of an ObservableObject with @Published vars. Using @Observable results in the following error: Cannot find '$isPlaying' in scope. If I remove the "$" symbol I get a bit more insight: Cannot convert value of type 'Bool' to expected argument type 'Published<Bool>.Publisher'. If I change my class to an ObservableObject, it works fine. Still, is there any way to get this to work with @Observable?
1 reply · 0 boosts · 528 views · Nov ’23
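A sketch of one possible workaround, assuming the goal is just to mirror the player's state into the @Observable model (PlayerModel and observePlayingState are invented names): since an @Observable property has no Published publisher to pass to assign(to:), subscribe with sink and write to the plain stored property instead.

```swift
import AVFoundation
import Combine
import Observation

@Observable
final class PlayerModel {
    var isPlaying = false
    let player = AVPlayer()
    @ObservationIgnored private var cancellables = Set<AnyCancellable>()

    // With @Observable there is no $isPlaying publisher, so instead of
    // .assign(to: &$isPlaying), sink into the stored property; the
    // @Observable macro tracks the mutation itself.
    func observePlayingState() {
        player.publisher(for: \.timeControlStatus)
            .receive(on: DispatchQueue.main)
            .sink { [weak self] status in
                self?.isPlaying = (status == .playing)
            }
            .store(in: &cancellables)
    }
}
```

The KVO publisher on AVPlayer still works here; only the @Published-specific assign(to:) sugar goes away.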
AVFoundation player.publisher().assign() for @Observable/@ObservationTracked?
Following this Apple article, I copied their code over for observePlayingState(). The only difference is I am using @Observable instead of ObservableObject with @Published for var isPlaying. We get a bit more insight after removing the $ symbol, leading to a more telling error: Cannot convert value of type 'Bool' to expected argument type 'Published<Bool>.Publisher'. Is there any way to get this working with @Observable?
2 replies · 0 boosts · 740 views · Nov ’23
AVPlayerVideoOutput on visionOS
I'm trying to decode MV-HEVC videos on visionOS, but I noticed the normal APIs for using AVPlayer (AVPlayerVideoOutput and some new methods for setting videoOutput on AVPlayer) are not available in visionOS 1.0. They're only available in iOS 17.2 and macOS 14.2; the header claims visionOS 1.1, but the Apple documentation doesn't say that anywhere. Does this mean there's really no way to work on this functionality at this time?! This seems like a major omission, given we can't even target visionOS 1.1 with the beta version of Xcode. Can you please move this API forward into visionOS 1.0?
2 replies · 0 boosts · 550 views · Nov ’23
Cannot get access to camera
I have been working on an Augmented Reality (AR) application using Unity. I am facing a critical issue with camera access when my app is published on the App Store, and I'm reaching out to seek your guidance and assistance in resolving this matter. Here's a brief overview of the problem:

During development and testing in Unity, the camera functionality in my AR app works as expected: I can access and utilize the device's camera for AR features without any issues. However, when I publish the app to the App Store and users download it, they are unable to access the device's camera within the app. This is a significant problem, as camera access is fundamental to the app's functionality and user experience.

I have taken several steps to ensure that I have correctly configured camera access permissions both in Unity's Player Settings and within Xcode:

In Unity's Player Settings, I have provided a "Camera Usage Description" to explain why the app needs camera access.
In Xcode, I have also included the necessary privacy descriptions for camera access in the Info.plist file.

Despite these efforts, the issue persists: users cannot access the camera when they download the app from the App Store. I have reviewed Apple's documentation and guidelines regarding camera access permissions, and I believe I have followed them correctly. I am eager to resolve this issue promptly to ensure my AR app provides a seamless experience for users.

If there are any specific steps or configurations that I might have missed, or any additional requirements or changes I need to make to enable camera access for my app on the App Store, I would greatly appreciate your guidance. If there is any additional information or logs you require from my end to assist in diagnosing and resolving this issue, please let me know and I will provide them promptly.
1 reply · 0 boosts · 585 views · Nov ’23
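For reference, this is the Info.plist entry iOS checks before granting camera access; the description string here is only a placeholder.

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera for AR features.</string>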
AVPlayer with multiple audio tracks plays audio differently at start
Hi, I'm trying to play multiple video/audio files with AVPlayer using AVMutableComposition. Each video/audio file can play simultaneously, so I put each one in an individual track. I use only local files.

let second = CMTime(seconds: 1, preferredTimescale: 1000)
let duration = CMTimeRange(start: .zero, duration: second)
var currentTime = CMTime.zero

for _ in 0...4 {
    let mutableTrack = composition.addMutableTrack(
        withMediaType: .audio,
        preferredTrackID: kCMPersistentTrackID_Invalid
    )
    try mutableTrack?.insertTimeRange(
        duration,
        of: audioAssetTrack,
        at: currentTime
    )
    currentTime = currentTime + second
}

When I set many audio tracks (maybe more than 5), the first part sounds a little different from the original when it starts; it seems like the front part of the audio is skipped. But when I set only two tracks, AVPlayer plays the same as the original file.

avPlayer.play()

How can I fix it? Why do audio tracks that don't have any playing parts affect the start? Please let me know.
1 reply · 2 boosts · 959 views · Dec ’23
AudioComponentInstanceNew crashes on iOS 17 when Address Sanitizer is enabled
I tried the same code on iOS 17 and iOS 16 with Address Sanitizer enabled; on iOS 17 it crashes. Why? Can anyone help me?

AudioComponent comp = NULL;
AudioComponentDescription compDesc = {0};
compDesc.componentType = kAudioUnitType_Output;
compDesc.componentSubType = kAudioUnitSubType_RemoteIO;
compDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
compDesc.componentFlags = 0;
compDesc.componentFlagsMask = 0;

comp = AudioComponentFindNext(NULL, &compDesc);
if (comp == NULL) {
    assert(false);
}

AudioUnit tempAudioUnit;
OSStatus osResult = AudioComponentInstanceNew(comp, &tempAudioUnit);
if (osResult != noErr) {
    assert(false);
}
2 replies · 0 boosts · 799 views · Dec ’23
I can't add more than about two thousand CMSampleBuffers with AVAssetWriter
Hello, I am deaf-blind and I program with a braille display. Currently, I am experiencing a difficulty with one of my apps. Basically, I'm converting AVAudioPCMBuffer to CMSampleBuffer, and so far so good. I want to add several CMSampleBuffers to a video written with AVAssetWriter. The problem is that I can only add roughly two thousand CMSampleBuffers. I'm trying to create a video: I put photos that are in an array and then add audio from CMSampleBuffers. But I can't add many CMSampleBuffers; it only goes up to two thousand something. I do not know what else to do. Help me. Below is a small excerpt of the code:

let queue = DispatchQueue(label: "AssetWriterQueue")
let audioProvider = SampleProvider(buffers: audioBuffers)
let videoProvider = SampleProvider(buffers: videoBuffers)
let audioInput = createAudioInput(audioBuffers: audioBuffers)
let videoInput = createVideoInput(videoBuffers: videoBuffers)
let adaptor = createPixelBufferAdaptor(videoInput: videoInput)
let assetWriter = try AVAssetWriter(outputURL: url, fileType: .mp4)
assetWriter.add(videoInput)
assetWriter.add(audioInput)
assetWriter.startWriting()
assetWriter.startSession(atSourceTime: .zero)

await withCheckedContinuation { continuation in
    videoInput.requestMediaDataWhenReady(on: queue) {
        let time = videoProvider.getPresentationTime()
        if let buffer = videoProvider.getNextBuffer() {
            adaptor.append(buffer, withPresentationTime: time)
        } else {
            videoInput.markAsFinished()
            continuation.resume()
        }
    }
}

await withCheckedContinuation { continuation in
    audioInput.requestMediaDataWhenReady(on: queue) {
        if let buffer = audioProvider.getNextBuffer() {
            audioInput.append(buffer)
        } else {
            audioInput.markAsFinished()
            continuation.resume()
        }
    }
}
0 replies · 0 boosts · 402 views · Dec ’23
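Not an answer to the exact count, but one pattern worth checking against the excerpt, sketched here with hypothetical next/finish closures in place of SampleProvider: requestMediaDataWhenReady(on:using:) expects its block to keep appending while the input reports isReadyForMoreMediaData, and appending a single buffer per invocation can starve or stall the writer on long media.

```swift
import AVFoundation
import CoreMedia

// Hedged sketch: drain buffers in a loop each time the input becomes
// ready, instead of appending exactly one buffer per callback.
func drain(_ input: AVAssetWriterInput,
           on queue: DispatchQueue,
           next: @escaping () -> CMSampleBuffer?,
           finish: @escaping () -> Void) {
    input.requestMediaDataWhenReady(on: queue) {
        while input.isReadyForMoreMediaData {
            guard let buffer = next() else {
                input.markAsFinished()   // no more data: close out the input
                finish()
                return
            }
            input.append(buffer)
        }
    }
}
```

The same loop shape applies to the pixel-buffer adaptor path for the video input.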
Display jitter with "ProMotion"
Our DJ application Mixxx renders scrolling waveforms at 60 Hz. This looks perfectly smooth on an older 2015 MacBook Pro. However, it looks jittery on a new M1 device with "ProMotion" enabled; selecting a fixed 60 Hz refresh rate fixes the issue. We are looking for a way to tell macOS that it can expect 60 Hz renderings from Mixxx and must not display them early (at 120 Hz) even if the pictures are ready. The alternative would be to read out the display settings and ask the user to select 60 Hz. Is there an API to:
hint the display driver that we render at 60 Hz
read out the refresh rate settings?
0 replies · 1 boost · 624 views · Dec ’23
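One avenue worth exploring, offered as a sketch rather than a confirmed fix: macOS 14 exposes CADisplayLink via NSView.displayLink(target:selector:), and its preferredFrameRateRange lets the app hint the cadence it renders at. Here `view` and `Renderer.tick` are placeholder names.

```swift
import AppKit
import QuartzCore

// Hedged sketch: ask the display link to pace callbacks at 60 Hz even on
// a 120 Hz ProMotion panel, instead of presenting frames early.
let link = view.displayLink(target: renderer, selector: #selector(Renderer.tick(_:)))
link.preferredFrameRateRange = CAFrameRateRange(minimum: 60, maximum: 60, preferred: 60)
link.add(to: .main, forMode: .common)
```

For reading the current mode, CGDisplayCopyDisplayMode's refreshRate is the usual query on macOS.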
App crashes: CA::Render::InterpolatedFunction::encode(CA::Render::Encoder*)
I've started seeing several users getting an app crash that I've been unable to find the root cause for so far. I've tried running the app in a release build with Address Sanitizer and zombie-object checks enabled but have been unable to reproduce it. It only occurs for iOS 17 users. Any ideas on how I can troubleshoot this?

Crashed: com.apple.main-thread
EXC_BAD_ACCESS KERN_INVALID_ADDRESS 0x0000000000000000

0  libsystem_platform.dylib  0xed4     _platform_memmove + 52
1  QuartzCore  0x137120  CA::Render::InterpolatedFunction::encode(CA::Render::Encoder*) const + 248
2  QuartzCore  0x136f40  CA::Render::GradientLayer::encode(CA::Render::Encoder*) const + 44
3  QuartzCore  0x2e384   CA::Render::Layer::encode(CA::Render::Encoder*) const + 284
4  QuartzCore  0x2e224   CA::Render::encode_set_object(CA::Render::Encoder*, unsigned long, unsigned int, CA::Render::Object*, unsigned int) + 196
5  QuartzCore  0x2b654   invocation function for block in CA::Context::commit_transaction(CA::Transaction*, double, double*) + 244
6  QuartzCore  0x2b4fc   CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 368
7  QuartzCore  0x2b488   CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
8  QuartzCore  0x2b4bc   CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 304
9–18  QuartzCore  0x2b488  CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252 (same frame repeated ×10)
19 QuartzCore  0x6fc60   CA::Context::commit_transaction(CA::Transaction*, double, double*) + 11192
20 QuartzCore  0x66574   CA::Transaction::commit() + 648
21 UIKitCore   0x31b5ec  __34-[UIApplication _firstCommitBlock]_block_invoke_2 + 36
22 CoreFoundation  0x373a8  __CFRUNLOOP_IS_CALLING_OUT_TO_A_BLOCK__ + 28
23 CoreFoundation  0x35b9c  __CFRunLoopDoBlocks + 356
24 CoreFoundation  0x33a9c  __CFRunLoopRun + 848
25 CoreFoundation  0x33668  CFRunLoopRunSpecific + 608
26 GraphicsServices  0x35ec  GSEventRunModal + 164
27 UIKitCore   0x22c2b4  -[UIApplication _run] + 888
28 UIKitCore   0x22b8f0  UIApplicationMain + 340
29 Coach       0x799d8   main + 14 (main.m:14)
30 ???         0x1abefadcc (Missing)
11 replies · 8 boosts · 2.9k views · Dec ’23
How Can I Access The Secondary MV-HEVC Frame
I’m working with the Spatial Video related APIs in AVFoundation, and while I can create an AVAssetReader that reads an AVAssetTrack that reports a .containsStereoMultiviewVideo media characteristic (on a spatial video recorded by an iPhone 15 Pro), the documentation doesn’t make it clear how I can obtain the secondary video frame from that track. Does anyone know where to look? I've scoured the forums, documentation, and other resources, and I've had no luck. Thanks!
4 replies · 0 boosts · 961 views · Dec ’23
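One hedged reading of the 17.2-era API, in case it helps others searching: the asset reader can be asked to decode specific MV-HEVC layers through the decompression properties in the track output's settings. The `[0, 1]` layer IDs below are an assumption for illustration; the real IDs should be read from the track's format description rather than hard-coded, and `videoTrack` is assumed to be the stereo multiview track.

```swift
import AVFoundation
import VideoToolbox

// Hedged sketch: request both MV-HEVC video layers from the decoder via
// the asset reader's output settings.
let outputSettings: [String: Any] = [
    AVVideoDecompressionPropertiesKey: [
        kVTDecompressionPropertyKey_RequestedMVHEVCVideoLayerIDs as String: [0, 1]
    ]
]
let output = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: outputSettings)
```

From there the delivered sample buffers carry per-layer data that can be pulled apart by tag; the exact retrieval path is worth verifying against the CoreMedia tagged-buffer APIs.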
Avoiding microphone permission popup on macOS Sonoma
I am working on an app that uses Core Audio through the JUCE library for audio. The problem I'm trying to solve is that when the app uses a full-duplex audio interface, such as one from the Focusrite Scarlett series, for output, the app shows a dialog requesting permission to use the microphone. The root cause of the issue is that by default, Core Audio opens full-duplex devices for both input and output. On previous macOS versions, I was able to work around the problem by disabling the input stream before starting the IOProc: setting AudioHardwareIOProcStreamUsage to all zero for input. On macOS Sonoma this disables input so that the microphone indicator is not shown, but the permission popup is still shown. What other reasons are there to show the popup? I have noticed that Chrome and Slack have the same problem: they show the microphone popup when trying to play sounds on the Focusrite. Deezer, for example, manages without the popup.
2 replies · 0 boosts · 981 views · Dec ’23
Library to identify musical note
Developing for iPhone/iPad/Mac. I have an idea for a music training app, but need to know of supporting libraries for recognizing a musical note's fundamental frequency in close to real time (about 100 ms delay). Accuracy should be within a few cents (hundredths of a semitone). A search for "music" turned up the Core MIDI library, which is fine if I want to take input from MIDI, but I want to be open to audio input too. I also found MusicKit, which seems to be a programmer's API for digging into Apple Music. Meta questions: Should I be using different search terms? Where are libraries listed? Who are the names in third-party libraries?
0 replies · 0 boosts · 597 views · Dec ’23
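Not a library pointer, but a sanity check on feasibility: a naive autocorrelation estimator (all names below are invented for illustration) can already pull the fundamental of a clean tone to within a few cents. Real-time accuracy on noisy polyphonic input is where a dedicated DSP library earns its keep.

```swift
import Foundation

// Naive autocorrelation pitch estimator: score every candidate lag in the
// musically useful range and return the frequency of the best-scoring one.
// A production detector would window the signal, normalize each lag's
// score, and interpolate the peak for sub-sample (sub-cent) precision.
func estimateFrequency(_ samples: [Float], sampleRate: Float) -> Float? {
    let minLag = Int(sampleRate / 2000)   // search no higher than ~2 kHz
    let maxLag = Int(sampleRate / 50)     // ...and no lower than ~50 Hz
    guard minLag > 0, samples.count > maxLag else { return nil }
    var bestLag = 0
    var bestScore: Float = 0
    for lag in minLag...maxLag {
        var score: Float = 0
        for i in 0..<(samples.count - lag) {
            score += samples[i] * samples[i + lag]   // correlation at this lag
        }
        if score > bestScore {
            bestScore = score
            bestLag = lag
        }
    }
    return bestLag > 0 ? sampleRate / Float(bestLag) : nil
}
```

At 44.1 kHz a 440 Hz tone lands on a lag of roughly 100 samples, so the integer-lag estimate is off by only a few cents; whole-number lags are exactly why sub-cent accuracy needs peak interpolation or a longer analysis window.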
AVPlayer can't resume after stalling
I have repeatedly verified this: if you limit the connection speed to a host serving a video file (mp4), AVPlayer stalls, but after you restore a fast connection to the host, the player does not resume playback. If you check the status, there are no errors, just an empty buffer:

AVPlayer.error is nil.
AVPlayerItem.error is nil.
AVPlayerItem.isPlaybackBufferEmpty is true
AVPlayerItem.isPlaybackLikelyToKeepUp is false

Even if you wait a long time, nothing happens, and tapping the play button doesn't help either; the player is frozen forever. Only calling "seek" or the "playImmediately" method unfreezes the player and resumes playback. It happens not every time, maybe one time in four. It seems like AVPlayer has a bug. What do you think?
0 replies · 0 boosts · 489 views · Dec ’23
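A sketch of the workaround the post itself hints at, wrapped up so it can be dropped into a player controller (monitorStall is a hypothetical name, not a known fix):

```swift
import AVFoundation
import Combine

// Hedged sketch: when the item reports it can keep up again after a
// stall, nudge a player that never resumed on its own.
func monitorStall(of item: AVPlayerItem, player: AVPlayer) -> AnyCancellable {
    item.publisher(for: \.isPlaybackLikelyToKeepUp)
        .filter { $0 }                       // only fire when playback can keep up
        .receive(on: DispatchQueue.main)
        .sink { [weak player] _ in
            guard let player, player.timeControlStatus != .playing else { return }
            player.playImmediately(atRate: 1.0)
        }
}
```

Keeping the returned AnyCancellable alive for the item's lifetime is the caller's job.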
ClassInfo Audio Unit Property not being set
I have a music player that is able to save and restore AU parameters using the kAudioUnitProperty_ClassInfo property. For non-Apple AUs, this works fine. But for any of the Apple units, the class info can be set only the first time after the audio graph is built; subsequent sets of the property do not stick, even though the OSStatus code is 0 upon return. Previously this worked fine, but at some point (I'm not sure when) the Apple-provided AUs changed their behavior, and it is now causing me problems. Can anyone help shed light on this? Thanks in advance for the help. Jeff Frey
0 replies · 0 boosts · 661 views · Jan ’24
Application Crashed: com.apple.main-thread EXC_BAD_ACCESS KVO_IS_RETAINING_ALL_OBSERVERS_OF_THIS_OBJECT_IF_IT_CRASHES_AN_OBSERVER_WAS_OVERRELEASED_OR_SMASHED + 76
Application Crashed: com.apple.main-thread
EXC_BAD_ACCESS KERN_INVALID_ADDRESS 0x000000000000001e

0  libobjc.A.dylib  0x2df58  object_isClass + 16
1  Foundation       0x1c9bc  KVO_IS_RETAINING_ALL_OBSERVERS_OF_THIS_OBJECT_IF_IT_CRASHES_AN_OBSERVER_WAS_OVERRELEASED_OR_SMASHED + 76
2  Foundation       0x1bd60  NSKeyValueWillChangeWithPerThreadPendingNotifications + 300
3  AVFoundation     0x1380   -[AVPlayerAccessibility willChangeValueForKey:] + 72
4  AVFCore          0x13954  -[AVPlayer _noteNewPresentationSizeForPlayerItem:] + 48
5  AVFCore          0x1fbb0  __avplayeritem_fpItemNotificationCallback_block_invoke + 4336
6  libdispatch.dylib 0x26a8  _dispatch_call_block_and_release + 32
7  libdispatch.dylib 0x4300  _dispatch_client_callout + 20
8  libdispatch.dylib 0x12998 _dispatch_main_queue_drain + 984
9  libdispatch.dylib 0x125b0 _dispatch_main_queue_callback_4CF + 44
10 CoreFoundation   0x3701c  CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE + 16
11 CoreFoundation   0x33d28  __CFRunLoopRun + 1996
12 CoreFoundation   0x33478  CFRunLoopRunSpecific + 608
13 GraphicsServices 0x34f8   GSEventRunModal + 164
14 UIKitCore        0x22c62c -[UIApplication _run] + 888
15 UIKitCore        0x22bc68 UIApplicationMain + 340
16 UIKitCore        0x4563d0 __swift_destroy_boxed_opaque_existential_1Tm + 12220
17 AajTak           0x84c4   main + 4333552836 (QuizLeaderboardViewModel.swift:4333552836)

com.livingMedia.AajTakiPhone_issue_4e4b5f148b75496175c3900a1405bd62_crash_session_3ff23a3e8e854c4ab68de2789fe76c5b_DNE_0_v2_stacktrace.txt
0 replies · 0 boosts · 555 views · Jan ’24
selecting a specific audio output device
There is a method setPreferredInput in AVAudioSession that can be used to select a different input device. But is there any similar function, like "setPreferredOutput", so that in my app I can select a specific audio output device to play audio? I do not want the user to change it through system interfaces (such as the Control Center), but through logic inside the app. Thanks!
0 replies · 0 boosts · 429 views · Jan ’24
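For what it's worth, AVAudioSession does not appear to offer a setPreferredOutput counterpart; the closest in-app levers are the built-in speaker override and handing route selection to the system picker. A sketch, under those assumptions:

```swift
import AVFoundation
import AVKit

// Hedged sketch: toggle between the built-in speaker and the session's
// default route; for arbitrary outputs (AirPlay, Bluetooth), present
// AVRoutePickerView and let the user pick.
let session = AVAudioSession.sharedInstance()
do {
    try session.overrideOutputAudioPort(.speaker)   // force built-in speaker
    // try session.overrideOutputAudioPort(.none)   // revert to default route
} catch {
    print("route override failed: \(error)")
}

let routePicker = AVRoutePickerView()   // system UI for choosing other outputs
```

This keeps route choice user-driven for external devices, which matches how the system treats output routing on iOS.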
Play Music While Camera Is Open
I am creating a camera app where I would like music from another app (Apple Music, Spotify, etc.) to continue playing once the app is opened. Currently I am using .mixWithOthers to do this in my viewDidLoad:

let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(AVAudioSession.Category.playback, options: [.mixWithOthers])
    try audioSession.setActive(true)
} catch {
    print("error trying to record and play audio")
}

However, I am running into an issue where the music only plays if you resume music playback after you start recording a video. Otherwise, when you open the app, the music stops when you see the preview. The interesting thing is that if you start playing music while recording, then once you stop, the music continues to play in the preview view. If you close the app (not force close) and reopen, then music playback continues as expected. However, once you force close the app, it returns to the original behavior. I've tried to research this and have not been able to find anything. Any help is appreciated. Let me know if more details are needed.
1 reply · 0 boosts · 793 views · Jan ’24
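One setting worth ruling out for the question above, offered as a guess rather than a confirmed fix: AVCaptureSession manages the app's audio session itself by default, which can replace a category the app configured earlier the moment the capture preview starts.

```swift
import AVFoundation

// Hedged sketch: opt the capture session out of reconfiguring the app's
// audio session so a .playback + .mixWithOthers category set by the app
// stays in effect while the preview runs.
let captureSession = AVCaptureSession()
captureSession.automaticallyConfiguresApplicationAudioSession = false
```

With this set, the app becomes responsible for choosing a category compatible with audio recording if the captured videos need sound.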