Recording audio from a microphone using the AVFoundation framework does not work after reconnecting the microphone

Different microphones can be connected via a 3.5 mm jack, via USB, or via Bluetooth; the behavior is the same in all cases.

The code below requests access to the microphone (connected to the 3.5 mm audio jack) and starts an audio capture session; while the session runs, the microphone-in-use icon is displayed. Capture continues for a few seconds, then the session stops and the icon disappears. After a pause of a few seconds, a second attempt is made to access the same microphone and start another capture session; the icon is displayed again. After a few more seconds this second session is stopped as well, and the icon disappears.

Next, repeat the same steps, but after the first session stops, unplug the microphone and plug it back in before starting the second session. The second attempt then appears to succeed: no part of the program returns an error, but the microphone-in-use icon is not displayed, and this is the problem. After the program exits and is restarted, the icon is displayed again.

The icon is only the tip of the iceberg: the underlying problem is that no sound can actually be recorded from the microphone after reconnecting it until the program is restarted.

Is this normal behavior for the AVFoundation framework? Is there a way to make access to the microphone work correctly after reconnecting it, so that the usage indicator is displayed? What additional steps should the programmer take in this case? Is this behavior described anywhere in the documentation?

Below is code that demonstrates the described behavior.

I am also attaching an example of the microphone usage indicator icon.

Computer: MacBook Pro 13-inch (2020), Intel Core i7, macOS Sequoia 15.1.

#include <chrono>
#include <condition_variable>
#include <iostream>
#include <mutex>
#include <thread>

#include <AVFoundation/AVFoundation.h>
#include <Foundation/NSString.h>
#include <Foundation/NSURL.h>

AVCaptureSession* m_captureSession = nullptr;
AVCaptureDeviceInput* m_audioInput = nullptr;
AVCaptureAudioDataOutput* m_audioOutput = nullptr;

std::condition_variable conditionVariable;
std::mutex mutex;
bool responseToAccessRequestReceived = false;

void receiveResponse()
{
    std::lock_guard<std::mutex> lock(mutex);
    responseToAccessRequestReceived = true;
    conditionVariable.notify_one();
}

void waitForResponse()
{
    std::unique_lock<std::mutex> lock(mutex);
    conditionVariable.wait(lock, [] { return responseToAccessRequestReceived; });
}

void requestPermissions()
{
    responseToAccessRequestReceived = false;
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio completionHandler:^(BOOL granted)
    {
        const auto status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeAudio];
        std::cout << "Request completion handler granted: " << (int)granted << ", status: " << status << std::endl;
        receiveResponse();
    }];

    waitForResponse();
}

void timer(int timeSec)
{
    for (auto timeRemaining = timeSec; timeRemaining > 0; --timeRemaining)
    {
        std::cout << "Timer, remaining time: " << timeRemaining << "s" << std::endl;
        std::this_thread::sleep_for(std::chrono::seconds(1));
    }
}

bool updateAudioInput()
{
    [m_captureSession beginConfiguration];

    if (m_audioOutput)
    {
        AVCaptureConnection *lastConnection = [m_audioOutput connectionWithMediaType:AVMediaTypeAudio];
        [m_captureSession removeConnection:lastConnection];
    }

    if (m_audioInput)
    {
        [m_captureSession removeInput:m_audioInput];
        [m_audioInput release];
        m_audioInput = nullptr;
    }

    AVCaptureDevice* audioInputDevice = [AVCaptureDevice deviceWithUniqueID: [NSString stringWithUTF8String: "BuiltInHeadphoneInputDevice"]];

    if (!audioInputDevice)
    {
        std::cout << "Error creating audio input device" << std::endl;
        return false;
    }

    // Note: the NSError out-parameter is only meaningful when the call fails,
    // so check the return value first rather than pre-allocating an error.
    NSError *error = nil;
    m_audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioInputDevice error:&error];
    if (!m_audioInput && error)
    {
        const auto code = [error code];
        const char* domainC = [error domain] ? [[error domain] UTF8String] : "unknown";
        std::cout << "deviceInputWithDevice failed, code: " << code << ", domain: " << domainC << std::endl;
    }

    if (m_audioInput && [m_captureSession canAddInput:m_audioInput])
    {
        [m_audioInput retain];
        [m_captureSession addInput:m_audioInput];
    }
    else
    {
        std::cout << "Failed to create audio device input" << std::endl;
        return false;
    }

    if (!m_audioOutput)
    {
        m_audioOutput = [[AVCaptureAudioDataOutput alloc] init];
        if (m_audioOutput && [m_captureSession canAddOutput:m_audioOutput])
        {
            [m_captureSession addOutput:m_audioOutput];
        }
        else
        {
            std::cout << "Failed to add audio output" << std::endl;
            return false;
        }
    }

    [m_captureSession commitConfiguration];

    return true;
}

void start()
{
    std::cout << "Starting..." << std::endl;

    const bool updatingResult = updateAudioInput();
    if (!updatingResult)
    {
        std::cout << "Error while updating audio input" << std::endl;
        return;
    }

    [m_captureSession startRunning];
}

void stop()
{
    std::cout << "Stopping..." << std::endl;
    [m_captureSession stopRunning];
}

int main()
{
    requestPermissions();

    m_captureSession = [[AVCaptureSession alloc] init];

    start();
    timer(5);
    stop();

    timer(10);

    start();
    timer(5);
    stop();
}


Hi there! AVCaptureSession is normally used from GUI apps, where a main run loop is running on which notifications can fire and be acted upon. It looks like your test app is not running a run loop and is instead just sleeping. That's the most likely problem.

Instead of sleeping, run the main run loop for a second at a time, like this:

	while ( ! quit ) {
		NSDate* aSecondFromNow = [[NSDate alloc] initWithTimeIntervalSinceNow:1.0];
		[[NSRunLoop currentRunLoop] runUntilDate:aSecondFromNow];
		[aSecondFromNow release]; 
	}
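
Applied to the sample above, this means replacing the timer() helper with a variant that pumps the run loop between log lines. A minimal sketch (runLoopTimer is a hypothetical name, not part of any API; untested, based on the snippet above):

	// Hypothetical replacement for the sample's timer(): instead of
	// sleeping, run the main run loop for one second per iteration so
	// that AVFoundation's device-connection notifications can be
	// delivered between the two capture sessions.
	void runLoopTimer(int timeSec)
	{
	    for (auto timeRemaining = timeSec; timeRemaining > 0; --timeRemaining)
	    {
	        std::cout << "Timer, remaining time: " << timeRemaining << "s" << std::endl;
	        NSDate* aSecondFromNow = [[NSDate alloc] initWithTimeIntervalSinceNow:1.0];
	        [[NSRunLoop currentRunLoop] runUntilDate:aSecondFromNow];
	        [aSecondFromNow release];
	    }
	}

The calls to timer(5) and timer(10) in main() would then become runLoopTimer(5) and runLoopTimer(10).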