ProRes encoding on M1 Max fails for high bit depth buffers

I have code that has worked for many years for writing ProRes files, but it now fails on the new M1 Max MacBook Pro. Specifically, if I construct buffers with the pixel type kCVPixelFormatType_64ARGB, the pixel buffer pool becomes nil after a few frames of writing. The same code works fine on non-Max processors (Intel, and the base M1 running natively).
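As a sanity check (a hedged sketch, not a fix), CoreVideo can report whether a pixel format is registered at all, which is a separate question from whether the adaptor's pool can vend buffers of it:

```objc
#import <CoreVideo/CoreVideo.h>

// Returns YES if CoreVideo has a format description registered for the
// given pixel format.  Note: registration alone does not guarantee that
// an AVAssetWriterInputPixelBufferAdaptor pool can vend buffers of it.
static BOOL PixelFormatIsRegistered(OSType format)
{
    CFDictionaryRef desc =
        CVPixelFormatDescriptionCreateWithPixelFormatType(kCFAllocatorDefault,
                                                          format);
    if (desc == NULL)
        return NO;
    CFRelease(desc);
    return YES;
}

// Usage: PixelFormatIsRegistered(kCVPixelFormatType_64ARGB)
```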

Here's a sample main that demonstrates the problem. Am I doing something wrong here?

//  main.m
//  TestProresWriting
//

#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>

int main(int argc, const char * argv[]) {
    @autoreleasepool {
        int timescale = 24;
        int width = 1920;
        int height = 1080;

        NSURL *url = [NSURL URLWithString:@"file:///Users/diftil/TempData/testfile.mov"];

        NSLog(@"Output file = %@", [url absoluteURL]);
        NSFileManager *fileManager = [NSFileManager defaultManager];
        NSError *error = nil;
        [fileManager removeItemAtURL:url error:&error];

        // Set up the writer
        AVAssetWriter *trackWriter = [[AVAssetWriter alloc] initWithURL:url
                                                    fileType:AVFileTypeQuickTimeMovie
                                                        error:&error];

        // Set up the track
        NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                       AVVideoCodecTypeAppleProRes4444, AVVideoCodecKey,
                                       [NSNumber numberWithInt:width], AVVideoWidthKey,
                                       [NSNumber numberWithInt:height], AVVideoHeightKey,
                                       nil];

        

        AVAssetWriterInput *track = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                        outputSettings:videoSettings];

        // Set up the adapter

        NSDictionary *attributes = [NSDictionary
                                    dictionaryWithObjects:
                                    [NSArray arrayWithObjects:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_64ARGB], // This pixel type causes problems on M1 Max, but works on everything else
                                     [NSNumber numberWithUnsignedInt:width],[NSNumber numberWithUnsignedInt:height],
                                     nil]
                                    forKeys:
                                    [NSArray arrayWithObjects:(NSString *)kCVPixelBufferPixelFormatTypeKey,
                                     (NSString*)kCVPixelBufferWidthKey, (NSString*)kCVPixelBufferHeightKey,
                                     nil]];

        /*
        NSDictionary *attributes = [NSDictionary
                                    dictionaryWithObjects:
                                    [NSArray arrayWithObjects:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32ARGB], // This pixel type works on M1 Max
                                     [NSNumber numberWithUnsignedInt:width],[NSNumber numberWithUnsignedInt:height],
                                     nil]
                                    forKeys:
                                    [NSArray arrayWithObjects:(NSString *)kCVPixelBufferPixelFormatTypeKey,
                                     (NSString*)kCVPixelBufferWidthKey, (NSString*)kCVPixelBufferHeightKey,
                                     nil]];
        */

        AVAssetWriterInputPixelBufferAdaptor *pixelBufferAdaptor = [AVAssetWriterInputPixelBufferAdaptor
                            assetWriterInputPixelBufferAdaptorWithAssetWriterInput:track
                            sourcePixelBufferAttributes:attributes];

        // Add the track and start writing

        [trackWriter addInput:track];
        [trackWriter startWriting];

        CMTime startTime = CMTimeMake(0, timescale);
        [trackWriter startSessionAtSourceTime:startTime];

        // Spin until the input is ready (a busy-wait is acceptable for this small demo).
        while (!track.readyForMoreMediaData);

        int frameTime = 0;

        CVPixelBufferRef frameBuffer = NULL;

        for (int i = 0; i < 100; i++)
        {
            NSLog(@"Frame %d", i);
            CVPixelBufferPoolRef pixelBufferPool = pixelBufferAdaptor.pixelBufferPool;
            if (pixelBufferPool == NULL)
            {
                NSLog(@"PixelBufferPool is invalid.");
                exit(1);
            }

            CVReturn ret = CVPixelBufferPoolCreatePixelBuffer(NULL, pixelBufferPool, &frameBuffer);

            if (ret != kCVReturnSuccess)
            {
                NSLog(@"Error creating framebuffer from pool");
                exit(1);
            }

            CVPixelBufferLockBaseAddress(frameBuffer, 0);
            // This is where we would put image data into the buffer.  Nothing right now.
            CVPixelBufferUnlockBaseAddress(frameBuffer, 0);

            while (!track.readyForMoreMediaData);

            CMTime presentationTime = CMTimeMake(frameTime+(i*timescale), timescale);
            BOOL result = [pixelBufferAdaptor appendPixelBuffer:frameBuffer
                                           withPresentationTime:presentationTime];
            if (result == NO)
            {
                NSLog(@"Error appending to track.");
                exit(1);
            }

            CVPixelBufferRelease(frameBuffer);
        }

        // Close everything.  The writer must be explicitly finished,
        // or the output file will not be finalized.
        if (trackWriter.status == AVAssetWriterStatusWriting)
        {
            [track markAsFinished];
            dispatch_semaphore_t done = dispatch_semaphore_create(0);
            [trackWriter finishWritingWithCompletionHandler:^{
                dispatch_semaphore_signal(done);
            }];
            dispatch_semaphore_wait(done, DISPATCH_TIME_FOREVER);
        }

        NSLog(@"Completed.");
    }
    return 0;
}

It isn't fixed.

Just an update: this bug is still present in current macOS releases. It's still very annoying.

I have a Mac Studio with an M1 Ultra that has been behaving strangely with 4444, specifically with alpha-channel transparency. When I export a project as 4444 from Final Cut Pro, it has transparency, but when I encode video to 4444 using Syphon (crucial to my workflow), the alpha channel is not transparent but black. On my previous Intel machine, this was not an issue. I've tried exporting from Syphon as HAP Q and then converting to 4444 using an application called AVF Batch Converter, but this does not work on my M1 Ultra either: the background remains black, not transparent.

However, I can send files to my video editor as HAP Q, and he can use the exact same AVF Batch Converter settings to get 4444 with an alpha channel on his Intel i9 iMac. I don't understand why the M1 Ultra is having this issue. If I could export 4444 from Syphon as I could on my previous Intel computer, that would be amazing.

I have received a notification from the Apple bug system that this is fixed in the latest macOS beta (I have not verified it yet).

I am a full-time cinematographer and video editor. I ran into this issue after switching from an Intel-based iMac Pro to the M2 Ultra Mac Studio earlier this year, and I'm having a hard time finding documentation of it; this thread is the closest I have come. Even though it is a year old, this is still a problem.

Apple ProRes 4444 XQ render files exported by a render artist using Houdini worked fine on my Intel Mac but exhibit a broken alpha channel on my new Mac Studio. The only remedy I've found is to take the files and re-export them to MXF from an Intel-based Mac in Adobe Media Encoder; re-exporting from the Mac Studio does not fix the issue.

The files exhibit the erroneous behavior in Adobe programs and QuickTime Player, but not in VLC. I don't have the means to determine the root cause; I can only offer the one workaround I found. I tried several different settings within Premiere to get the footage to play back correctly, to no avail. Essentially it comes out blocky frame by frame, as if the alpha channel is inverting itself incorrectly in segments.

videoSettings and attributes are not identical dictionaries. I solved a similar problem by using the same dictionary for both variables.
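For reference, a minimal sketch (untested here) of the two dictionaries from the sample in literal syntax, sharing the same width and height values. Note that the two dictionaries cannot be literally identical: output settings take AVVideo* keys, while the adaptor's source attributes take kCVPixelBuffer* keys, so the most one can align is the values they have in common.

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch only: the two dictionaries from the sample, written with
// dictionary literals so the shared width/height values are obviously
// consistent between them.
int width = 1920;
int height = 1080;

NSDictionary *videoSettings = @{
    AVVideoCodecKey  : AVVideoCodecTypeAppleProRes4444,
    AVVideoWidthKey  : @(width),
    AVVideoHeightKey : @(height),
};

NSDictionary *attributes = @{
    (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_64ARGB),
    (NSString *)kCVPixelBufferWidthKey           : @(width),
    (NSString *)kCVPixelBufferHeightKey          : @(height),
};
```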
