Capturing system audio no longer works with macOS Sequoia

Our capture application records system audio via a HAL plug-in. However, with the latest macOS 15 Sequoia, all audio buffer values are zero.

I am attaching sample code that replicates the problem. Compile as a Command Line Tool application with Xcode.

STEPS TO REPRODUCE

1. Install the BlackHole 2ch audio driver: https://existential.audio/blackhole/download/?code=1579271348
2. Start some system audio, e.g. a YouTube video.
3. Compile and run the sample application.

On macOS up to Sonoma, you will hear the audio via the loopback and see amplitude values in the debug/console window.

On macOS Sequoia, you will hear no audio and the amplitude values are 0.

#import <AVFoundation/AVFoundation.h>
#import <CoreAudio/CoreAudio.h>

#define BLACKHOLE_UID @"BlackHole2ch_UID"
#define DEFAULT_OUTPUT_UID @"BuiltInSpeakerDevice"

@interface AudioCaptureDelegate : NSObject <AVCaptureAudioDataOutputSampleBufferDelegate>
@end

void setDefaultAudioDevice(NSString *deviceUID);

@implementation AudioCaptureDelegate
// Receive samples from the CoreAudio/HAL driver and print amplitude values for testing.
// This is where samples would normally be copied and passed downstream for further
// processing, which is not needed in this simple sample application.
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Access the audio data in the sample buffer
    CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
    if (!blockBuffer) {
        NSLog(@"No audio data in the sample buffer.");
        return;
    }
    
    size_t length = 0;
    char *data = NULL;
    if (CMBlockBufferGetDataPointer(blockBuffer, 0, NULL, &length, &data) != kCMBlockBufferNoErr || !data) {
        NSLog(@"Could not get a pointer to the audio data.");
        return;
    }
    
    // Process the audio samples to calculate the average amplitude
    int16_t *samples = (int16_t *)data;
    size_t sampleCount = length / sizeof(int16_t);
    int64_t sum = 0;
    
    for (size_t i = 0; i < sampleCount; i++) {
        sum += abs(samples[i]);
    }
    
    // Calculate and log the average amplitude
    float averageAmplitude = sampleCount ? (float)sum / sampleCount : 0.0f;
    NSLog(@"Average Amplitude: %f", averageAmplitude);
}
@end

// Set the default audio device to BlackHole while testing, or back to the speakers when done.
// Called by main().
void setDefaultAudioDevice(NSString *deviceUID) {
    AudioObjectPropertyAddress address;
    AudioDeviceID deviceID = kAudioObjectUnknown;
    UInt32 size;
    
    CFStringRef uidString = (__bridge CFStringRef)deviceUID;
    
    // Gets the device corresponding to the given UID.
    AudioValueTranslation translation;
    translation.mInputData = &uidString;
    translation.mInputDataSize = sizeof(uidString);
    translation.mOutputData = &deviceID;
    translation.mOutputDataSize = sizeof(deviceID);
    size = sizeof(translation);
    address.mSelector = kAudioHardwarePropertyDeviceForUID;
    address.mScope = kAudioObjectPropertyScopeGlobal;
    address.mElement = kAudioObjectPropertyElementMain;
    
    OSStatus status = AudioObjectGetPropertyData(kAudioObjectSystemObject, &address, 0, NULL, &size, &translation);
    
    if (status != noErr) {
        NSLog(@"Error: Could not retrieve audio device ID for UID %@. Status code: %d", deviceUID, (int)status);
        return;
    }

    AudioObjectPropertyAddress propertyAddress;
    
    propertyAddress.mSelector = kAudioHardwarePropertyDefaultOutputDevice;
    propertyAddress.mScope = kAudioObjectPropertyScopeGlobal;
    propertyAddress.mElement = kAudioObjectPropertyElementMain;
    status = AudioObjectSetPropertyData(kAudioObjectSystemObject, &propertyAddress, 0, NULL, sizeof(AudioDeviceID), &deviceID);

    if (status == noErr) {
        NSLog(@"Default audio device set to %@", deviceUID);
    } else {
        NSLog(@"Failed to set default audio device: %d", status);
    }
}

// Sets the BlackHole device as the default output and configures it as an AVCaptureDeviceInput.
// Sets the speakers as the loopback/preview output so we can hear what is being captured.
// Sets up a queue to receive captured samples.
// Runs the session for 30 seconds, then restores the speakers as the default output.
int main(int argc, const char * argv[]) {
    @autoreleasepool {
        
        // Create the capture session
        AVCaptureSession *session = [[AVCaptureSession alloc] init];
        
        // Select the audio device
        AVCaptureDevice *audioDevice = nil;
        NSString *audioDriverUID = BLACKHOLE_UID;
    
        setDefaultAudioDevice(audioDriverUID);

        audioDevice = [AVCaptureDevice deviceWithUniqueID:audioDriverUID];
        
        if (!audioDevice) {
            NSLog(@"Audio device %@ not found!", audioDriverUID);
            return -1;
        } else {
            NSLog(@"Using audio device: %@", audioDriverUID);
        }
     
        
        // Configure the audio input with the selected device (Blackhole)
        NSError *error = nil;
        AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
        
        if (error || !audioInput) {
            NSLog(@"Failed to create audio input: %@", error);
            return -1;
        }
        
        [session addInput:audioInput];
        
        // Configure the audio data output
        AVCaptureAudioDataOutput *audioOutput = [[AVCaptureAudioDataOutput alloc] init];
        AudioCaptureDelegate *delegate = [[AudioCaptureDelegate alloc] init];
        dispatch_queue_t queue = dispatch_queue_create("AudioCaptureQueue", NULL);
        [audioOutput setSampleBufferDelegate:delegate queue:queue];
        
        [session addOutput:audioOutput];
        
        // Set audio settings
        NSDictionary *audioSettings = @{
            AVFormatIDKey: @(kAudioFormatLinearPCM),
            AVSampleRateKey: @48000,
            AVNumberOfChannelsKey: @2,
            AVLinearPCMBitDepthKey: @16,
            AVLinearPCMIsFloatKey: @NO,
            AVLinearPCMIsNonInterleaved: @NO
        };
        [audioOutput setAudioSettings:audioSettings];
        
        AVCaptureAudioPreviewOutput *loopback_output = [[AVCaptureAudioPreviewOutput alloc] init];
        loopback_output.volume = 1.0;
        loopback_output.outputDeviceUniqueID = DEFAULT_OUTPUT_UID;
        [session addOutput:loopback_output];
        NSLog(@"session addOutput for preview/loopback: %@", loopback_output.outputDeviceUniqueID);
        
        // Start the session
        [session startRunning];
        
        NSLog(@"Capturing audio data for 30 seconds...");
        [[NSRunLoop currentRunLoop] runUntilDate:[NSDate dateWithTimeIntervalSinceNow:30.0]];
        
        // Stop the session
        [session stopRunning];
        NSLog(@"Capture session stopped.");
        
        setDefaultAudioDevice(DEFAULT_OUTPUT_UID);
    }
    return 0;
}

Can you file a Feedback and post the ID please?

I filed FB15620713

Is the input audio data silent when using only the AVCapture API? Is there valid audio input when using the HAL API to record input from the BlackHole device?

Testing with only the AVCapture API also did not work.
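
For reference, a HAL-level check of the kind the second question describes would look roughly like the sketch below. This is only an outline, not code from the application: it installs an IOProc directly on the BlackHole device (deviceID would be resolved from BLACKHOLE_UID with the same kAudioHardwarePropertyDeviceForUID lookup used in setDefaultAudioDevice above) and assumes the device delivers Float32 samples, which is BlackHole's default stream format.

#import <Foundation/Foundation.h>
#import <CoreAudio/CoreAudio.h>
#include <math.h>

// Sketch: log the average amplitude of the input buffers the HAL delivers from
// the device, to see whether the input is silent at this level as well.
static OSStatus inputIOProc(AudioObjectID inDevice,
                            const AudioTimeStamp *inNow,
                            const AudioBufferList *inInputData,
                            const AudioTimeStamp *inInputTime,
                            AudioBufferList *outOutputData,
                            const AudioTimeStamp *inOutputTime,
                            void *inClientData) {
    for (UInt32 b = 0; b < inInputData->mNumberBuffers; b++) {
        const float *samples = (const float *)inInputData->mBuffers[b].mData;
        UInt32 count = inInputData->mBuffers[b].mDataByteSize / sizeof(float);
        float sum = 0.0f;
        for (UInt32 i = 0; i < count; i++) {
            sum += fabsf(samples[i]);
        }
        NSLog(@"HAL input buffer %u average amplitude: %f", (unsigned)b, count ? sum / count : 0.0f);
    }
    return noErr;
}

// Usage, with deviceID obtained as in setDefaultAudioDevice:
//     AudioDeviceIOProcID procID = NULL;
//     AudioDeviceCreateIOProcID(deviceID, inputIOProc, NULL, &procID);
//     AudioDeviceStart(deviceID, procID);
//     ... let it capture for a while ...
//     AudioDeviceStop(deviceID, procID);
//     AudioDeviceDestroyIOProcID(deviceID, procID);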

But as I was testing, I discovered the following:

If I set the default audio device after starting the session instead of before starting it, then audio works on Sequoia.

In the sample application, when I move the setDefaultAudioDevice(audioDriverUID) call down past [session startRunning], I see audio values and hear audio via the loopback:

        ...
        // Start the session
        [session startRunning];
        
        setDefaultAudioDevice(audioDriverUID); // moved from before the session was created
        ...

Apparently, something changed in Sequoia that requires the session to be running before the default audio device is configured.
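
To summarize the difference (condensed from the sample above, with everything else unchanged):

        // Silent on Sequoia (works up to Sonoma):
        setDefaultAudioDevice(audioDriverUID);
        // ... create the session, add the input and outputs ...
        [session startRunning];

        // Works on Sequoia:
        // ... create the session, add the input and outputs ...
        [session startRunning];
        setDefaultAudioDevice(audioDriverUID);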
