Hi there, I'm having some trouble with AVAudioMixerNode: it works fine with a single input, but outputs silence or a very quiet buzzing when more than one input node is connected. My setup has voice processing enabled, the input going to a sink, and N source nodes going to the main mixer node, which goes to the output node. In all cases I connect nodes in the graph with the same declared format: 48 kHz, 1-channel, Float32 PCM.
This is working great for 1 source node, but as soon as I add a second it breaks. I can reproduce this behaviour in the SignalGenerator sample, when the same format is used everywhere. Again, it'll work fine with 1 source node even in this configuration, but add another and there's silence.
Am I doing something wrong with formats here? Is this expected? As I understood it, with voice processing on and a mixer node in use, I should be able to use my own format essentially everywhere in my graph.
My SignalGenerator modified repro example follows:
import Foundation
import AVFoundation

// True replicates my real app's behaviour, which is broken.
// You can remove one source node connection
// to make it work even when this is true.
let showBrokenState: Bool = true

// SignalGenerator constants.
let frequency: Float = 440
let amplitude: Float = 0.5
let duration: Float = 5.0

let twoPi = 2 * Float.pi

let sine = { (phase: Float) -> Float in
    return sin(phase)
}

let whiteNoise = { (phase: Float) -> Float in
    return ((Float(arc4random_uniform(UINT32_MAX)) / Float(UINT32_MAX)) * 2 - 1)
}

// My "application" format.
let format: AVAudioFormat = .init(commonFormat: .pcmFormatFloat32,
                                  sampleRate: 48000,
                                  channels: 1,
                                  interleaved: true)!

// Engine setup.
let engine = AVAudioEngine()
let mainMixer = engine.mainMixerNode
let output = engine.outputNode
try! output.setVoiceProcessingEnabled(true)

let outputFormat = engine.outputNode.inputFormat(forBus: 0)
let sampleRate = Float(format.sampleRate)
let inputFormat = format

var currentPhase: Float = 0
let phaseIncrement = (twoPi / sampleRate) * frequency

let srcNodeOne = AVAudioSourceNode { _, _, frameCount, audioBufferList -> OSStatus in
    let ablPointer = UnsafeMutableAudioBufferListPointer(audioBufferList)
    for frame in 0..<Int(frameCount) {
        let value = sine(currentPhase) * amplitude
        currentPhase += phaseIncrement
        if currentPhase >= twoPi {
            currentPhase -= twoPi
        }
        if currentPhase < 0.0 {
            currentPhase += twoPi
        }
        for buffer in ablPointer {
            let buf: UnsafeMutableBufferPointer<Float> = UnsafeMutableBufferPointer(buffer)
            buf[frame] = value
        }
    }
    return noErr
}

let srcNodeTwo = AVAudioSourceNode { _, _, frameCount, audioBufferList -> OSStatus in
    let ablPointer = UnsafeMutableAudioBufferListPointer(audioBufferList)
    for frame in 0..<Int(frameCount) {
        let value = whiteNoise(currentPhase) * amplitude
        currentPhase += phaseIncrement
        if currentPhase >= twoPi {
            currentPhase -= twoPi
        }
        if currentPhase < 0.0 {
            currentPhase += twoPi
        }
        for buffer in ablPointer {
            let buf: UnsafeMutableBufferPointer<Float> = UnsafeMutableBufferPointer(buffer)
            buf[frame] = value
        }
    }
    return noErr
}

engine.attach(srcNodeOne)
engine.attach(srcNodeTwo)
engine.connect(srcNodeOne, to: mainMixer, format: inputFormat)
engine.connect(srcNodeTwo, to: mainMixer, format: inputFormat)
engine.connect(mainMixer, to: output, format: showBrokenState ? inputFormat : outputFormat)

// Put the input node to a sink just to match the formats and make VP happy.
let sink: AVAudioSinkNode = .init { timestamp, numFrames, data in
    .zero
}
engine.attach(sink)
engine.connect(engine.inputNode, to: sink, format: showBrokenState ? inputFormat : outputFormat)

mainMixer.outputVolume = 0.5

try! engine.start()
CFRunLoopRunInMode(.defaultMode, CFTimeInterval(duration), false)
engine.stop()
Hello,
I'm facing an issue with Xcode 15 and iOS 17: it seems impossible to get AVAudioEngine's audio input node to work in the simulator.
inputNode reports a 0-channel, 0 kHz input format,
and connecting the input node to any other node, or installing a tap on it, consistently fails.
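For reference, a minimal sketch of the failing check (assuming a plain AVAudioEngine with no extra configuration):
import AVFoundation

let engine = AVAudioEngine()
let input = engine.inputNode

// On the iOS 17.0 simulator this reports 0 channels and 0 Hz.
let format = input.inputFormat(forBus: 0)
print("Input format: \(format)")

// Guarding on the format is the only way to avoid the failure when tapping.
if format.channelCount > 0 && format.sampleRate > 0 {
    input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, time in
        // Process microphone buffers here.
    }
    try? engine.start()
}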
What we tested:
Everything works fine on iOS simulators <= 16.4, even with Xcode 15.
Nothing works on iOS simulator 17.0 on Xcode 15.
Everything works fine on iOS 17.0 device with Xcode 15.
More details on this here: https://github.com/Fesongs/InputNodeFormat
Any idea on this? Something I'm missing?
Thanks for your help!
Tom
PS: I filed a bug on Feedback Assistant, but it usually takes ages to get any answer so I'm also trying here.
Is it possible to access "From my mac" photos/PHAssetCollection through PhotoKit in iOS?
"From my mac" photos/videos are media synced from a mac where iCloud Photos are turned off on the iOS device, like what we did in the ole' days before iCloud Photos.
I have set up an iOS device with "From my mac" albums present in Photos.app, but in my own app I don't seem to be able to access those collections/photos through PhotoKit using all the defined PHAssetCollectionTypes.
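For reference, a minimal sketch of the kind of enumeration I've tried (none of the "From my mac" collections ever show up):
import Photos

// Enumerate album and smart-album collections; the synced albums never appear.
for type in [PHAssetCollectionType.album, .smartAlbum] {
    let collections = PHAssetCollection.fetchAssetCollections(with: type, subtype: .any, options: nil)
    collections.enumerateObjects { collection, _, _ in
        print(collection.assetCollectionType.rawValue, collection.localizedTitle ?? "untitled")
    }
}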
Are these directly synced photos simply not available through PhotoKit, so you would have to revert to the deprecated ALAssetsLibrary?
Recently I've been trying to play some AV1-encoded streams on my iPhone 15 Pro Max. First, I check for hardware support:
VTIsHardwareDecodeSupported(kCMVideoCodecType_AV1); // YES
Then I need to create a CMFormatDescription in order to pass it into a VTDecompressionSession. I've tried the following:
{
    mediaType: 'vide'
    mediaSubType: 'av01'
    mediaSpecific: {
        codecType: 'av01'  dimensions: 394 x 852
    }
    extensions: {{
        CVFieldCount = 1;
        CVImageBufferChromaLocationBottomField = Left;
        CVImageBufferChromaLocationTopField = Left;
        CVPixelAspectRatio = {
            HorizontalSpacing = 1;
            VerticalSpacing = 1;
        };
        FullRangeVideo = 0;
    }}
}
but VTDecompressionSessionCreate gives me error -8971 (codecExtensionNotFoundErr, I assume).
So it has something to do with the extensions dictionary? I can't find anywhere which set of extensions is necessary for it to work.
VideoToolbox has convenient functions for creating descriptions of AVC and HEVC streams (CMVideoFormatDescriptionCreateFromH264ParameterSets and CMVideoFormatDescriptionCreateFromHEVCParameterSets), but not for AV1.
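For reference, here is roughly how I create the description above (a sketch; extensions stands for the dictionary printed earlier):
import CoreMedia

var formatDesc: CMFormatDescription?
let status = CMVideoFormatDescriptionCreate(
    allocator: kCFAllocatorDefault,
    codecType: kCMVideoCodecType_AV1,
    width: 394,
    height: 852,
    extensions: extensions as CFDictionary, // the extensions dictionary shown above
    formatDescriptionOut: &formatDesc)
// The description itself is created fine; the error only comes from VTDecompressionSessionCreate.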
As of today I am using Xcode 15.0 with the iOS 17.0.0 SDK.
I'm using iCloud Music Library, macOS 14.1 (23B74), and iOS 17.1.
I'm using MusicKit to find songs that do not have artwork. On iOS, Song.artwork will be nil for items I know do not have artwork. On macOS, Song.artwork is not nil, yet when those songs are shown in Music.app, they do not have artwork. Is this expected? Alternatively, is there a more correct way to determine that a Song has no artwork?
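For reference, a sketch of the kind of check I'm doing (assuming MusicLibraryRequest, which requires iOS 16 / macOS 13 or later; my real code is similar):
import MusicKit

// Fetch library songs and filter out the ones without artwork.
var request = MusicLibraryRequest<Song>()
let response = try await request.response()
let missingArtwork = response.items.filter { $0.artwork == nil }
// On iOS this matches the songs that show no artwork in Music.app;
// on macOS it comes back empty even for those same songs.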
I have also filed FB13315721.
Thank you for any tips!
The WWDC23 Platform State of the Union mentioned that using the volume buttons to trigger the camera shutter is coming later this year. This is mentioned at 0:30:15.
Would anyone know when this will be available?
My app uses PHLivePhoto.request to generate Live Photos, but it leaks memory if I pass a custom targetSize.
PHLivePhoto.request(withResourceFileURLs: [imageUrl, videoUrl], placeholderImage: nil, targetSize: targetSize, contentMode: .aspectFit) { [weak self] (livePhoto, info) in /* handle livePhoto */ }
Changing targetSize to CGSize.zero resolves the problem.
PHLivePhoto.request(withResourceFileURLs: [imageUrl, videoUrl], placeholderImage: nil, targetSize: CGSize.zero, contentMode: .aspectFit) { [weak self] (livePhoto, info) in /* handle livePhoto */ }
Hello,
I'm using kAudioDevicePropertyDeviceIsRunningSomewhere to check whether an internal or external microphone is being used.
My code works well for the internal microphone and for microphones connected with a cable.
External microphones connected over Bluetooth, however, do not report their status:
the query always succeeds, but the status is always reported as inactive.
The main relevant parts of my code:
static inline AudioObjectPropertyAddress
makeGlobalPropertyAddress(AudioObjectPropertySelector selector) {
    AudioObjectPropertyAddress address = {
        selector,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster,
    };
    return address;
}

static BOOL getBoolProperty(AudioDeviceID deviceID,
                            AudioObjectPropertySelector selector)
{
    AudioObjectPropertyAddress const address =
        makeGlobalPropertyAddress(selector);
    UInt32 prop;
    UInt32 propSize = sizeof(prop);
    OSStatus const status =
        AudioObjectGetPropertyData(deviceID, &address, 0, NULL, &propSize, &prop);
    if (status != noErr) {
        return 0; // This line never gets executed in my tests: the call above
                  // always succeeds, but the value it returns is always "false".
    }
    return static_cast<BOOL>(prop == 1);
}

...

__block BOOL microphoneActive = NO;
iterateThroughAllInputDevices(^(AudioObjectID object, BOOL *stop) {
    if (getBoolProperty(object, kAudioDevicePropertyDeviceIsRunningSomewhere) != 0) {
        microphoneActive = YES;
        *stop = YES;
    }
});
What could cause this and how could it be fixed?
Thank you for your help in advance!
Are there any plans to give developers access to the iPhone 15 series' 24 MP photo capture? I'm wondering whether apps other than the built-in Camera can support it.
Dear Apple devs,
I hope this message finds you well. I am writing to seek your guidance and assistance in a matter related to my blog, which features affiliate links to the Apple Music store.
Currently, I manage all aspects of my blog independently. However, I find myself facing the formidable task of updating 300 articles, and I am exploring options to streamline and automate this process.
To achieve this automation, I am looking into the creation of affiliate links through the Apple Music API and the affiliate program. While this approach promises efficiency and enhanced user experience, I must acknowledge a financial constraint. Acquiring a developer account, a prerequisite for API access, involves a cost of €99, which presents a challenge within my budgetary constraints.
I have come across information suggesting the possibility of requesting a fee waiver, yet the process entails an initial payment with no guarantees. This leads me to seek your advice on the most viable and cost-effective solution to address this predicament.
Your insights and recommendations would be greatly appreciated, as they will guide my decision on the best course of action moving forward. I am committed to upholding the integrity of my blog and ensuring its continued success, and your expertise will be instrumental in achieving this goal.
Thank you in advance for your time and assistance. I eagerly await your response and look forward to resolving this matter with your support.
I am using DRM content in my player and am facing this error frequently. We use the licenseUrl, certificateUrl, and DRM token, but the content does not play and we keep getting this error.
A few iOS versions ago, the stitching algorithm for panoramas was updated, and in my opinion the results look worse for what I'm using the panoramas for. I explored developing a custom panorama app but couldn't find an API for taking panoramic photos, much less for stitching them. Is there an API in AVFoundation or elsewhere for capturing a panoramic photo and stitching it?
I have two CIContexts configured with the following options:
let options1: [CIContextOption: Any] = [.cacheIntermediates: false,
                                        .outputColorSpace: NSNull(),
                                        .workingColorSpace: NSNull()]
let options2: [CIContextOption: Any] = [.cacheIntermediates: false]
And an MTKView with CAMetalLayer configured with HDR output.
metalLayer = self.layer as? CAMetalLayer
metalLayer?.wantsExtendedDynamicRangeContent = true
metalLayer?.colorspace = CGColorSpace(name: CGColorSpace.itur_2100_HLG)
colorPixelFormat = .bgr10a2Unorm
The two sets of options produce different outputs when the input is in BT.2020 pixel buffers, but I believe the outputs shouldn't differ: the first option simply disables color management, while the second performs intermediate buffer calculations in the extended linear sRGB color space and then converts those buffers to the BT.2020 color space in the output.
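For reference, the contexts are created like this (a sketch; device stands for the MTKView's MTLDevice):
let context1 = CIContext(mtlDevice: device, options: options1) // color management disabled
let context2 = CIContext(mtlDevice: device, options: options2) // default color management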
Hi there, I am trying to retrieve the URLs of 450+ podcast episodes using `api_url = f'https://itunes.apple.com/lookup?id={podcast_id}&entity=podcastEpisode&limit=200'`. Even if I modify the limit, I can only get 200 items.
Does anyone have an idea about this issue?
Any response is appreciated!
Best!
I have a website that had been publishing to Apple News for a good while via RSS feed. At some point I implemented the WordPress plugin for publishing, created the PAPI file, and that was working. I could log in to Apple News Publisher and see my most recent articles. The RSS feed also still shows in the ANP dashboard.
This appears to have stopped working in September, as those are the most recent articles I can see in the Apple News Publisher dashboard. If I attempt to publish an article within the plugin, it returns an error about "ONLY_PREVIEW_ALLOWED".
So it appears my channel is back in a preview only mode. But nowhere in the News Publisher dashboard can I see anything to resubmit for approval. If I go to All Published Articles I can see where they stopped in September. If I go to Draft Articles and click "Preview these articles" I can see the channel in the Apple News app and it contains the most recent articles published.
But back in the Apple News Publisher dashboard I don't have anything that allows me to submit for publishing again. Any advice? Thanks!
I just ripped a CD into my iTunes library, and when it plays from the CD, it plays gapless as intended, but when ripped and uploaded to my iTunes and iPhone, there is a gap. I hate it. I know previously you could have gapless playback. Can you PLEASE bring it back? It is a simple fix: update your software and bring back gapless playback, please. People have complained about it before. DO something about it.
The log of the crashed thread is as follows:
Thread 4 Crashed:: Dispatch queue: CA DispatchGroup
0 libwx_osx_cocoau_core-3.1.1.0.0.dylib 0x10f3707f0 wxMacCoreGraphicsBrushData::CalculateShadingValues(void*, double const*, double*) + 240
1 CoreGraphics 0x7ff80acda1ce CGFunctionEvaluate + 243
2 CoreGraphics 0x7ff80acda032 function_evaluate + 369
3 CoreGraphics 0x7ff80acd9758 ripc_AcquireFunction + 831
4 CoreGraphics 0x7ff80acd8934 ripc_DrawShading + 5841
5 CoreGraphics 0x7ff80b1e6e7a CG::DisplayListExecutor::drawShading(CG::DisplayListEntryShading const*) + 438
6 CoreGraphics 0x7ff80b2de843 CG::DisplayList::executeEntries(std::__1::__wrap_iter<std::__1::shared_ptr<CG::DisplayListEntry const>*>, std::__1::__wrap_iter<std::__1::shared_ptr<CG::DisplayListEntry const>*>, CGContextDelegate*, CGRenderingState*, CGGStack*, CGRect const*, __CFDictionary const*, bool) + 195
7 CoreGraphics 0x7ff80ad5cb03 CG::DisplayList::execute(CGContextDelegate*, CGRenderingState*, CGGStack*, CGRect const*, __CFDictionary const*) + 341
8 CoreGraphics 0x7ff80ad5c869 CGDisplayListDrawInContextDelegate + 617
9 AppKit 0x7ff80900769a -[NSViewBackingLayerContentLayer drawInContext:] + 57
10 QuartzCore 0x7ff80dd19293 CABackingStoreUpdate_ + 630
11 QuartzCore 0x7ff80dd77f2e invocation function for block in CA::Layer::display_() + 53
12 QuartzCore 0x7ff80dd182f2 -[CALayer _display] + 2253
13 QuartzCore 0x7ff80dd43d67 display_callback(void*, void*) + 97
14 QuartzCore 0x7ff80dd43ce6 CA::DispatchGroup::dispatch(bool) + 108
15 libdispatch.dylib 0x7ff80510d59a _dispatch_client_callout + 8
16 libdispatch.dylib 0x7ff805113668 _dispatch_lane_serial_drain + 816
17 libdispatch.dylib 0x7ff805114100 _dispatch_lane_invoke + 377
18 libdispatch.dylib 0x7ff80511daee _dispatch_root_queue_drain_deferred_wlh + 271
19 libdispatch.dylib 0x7ff80511d3fd _dispatch_workloop_worker_thread + 451
20 libsystem_pthread.dylib 0x7ff8052b1c47 _pthread_wqthread + 327
21 libsystem_pthread.dylib 0x7ff8052b0b97 start_wqthread + 15
Thread 4 crashed with X86 Thread State (64-bit):
rax: 0x0000000000000000 rbx: 0x000000030ee7ea60 rcx: 0x0000600000a4df58 rdx: 0x000000030ee7ea60
rdi: 0x000060000237b9a8 rsi: 0x000000030ee7ea10 rbp: 0x000000030ee7ea00 rsp: 0x000000030ee7ea00
r8: 0xe000000000000002 r9: 0x1ffffffffffffffe r10: 0x000000008ee09001 r11: 0x0000000000000019
r12: 0x0000000000000001 r13: 0x0000000000000001 r14: 0x000060000237bdb0 r15: 0x000000030ee7ea10
rip: <unavailable> rfl: 0x0000000000000242
tmp0: 0x000000010fa69800 tmp1: 0x00007ff80acda1ce tmp2: 0x00007ff89da92500
Binary Images:
0x20606a000 - 0x206109fff dyld (*) <d5406f23-6967-39c4-beb5-6ae3293c7753> /usr/lib/dyld
0x10f2c9000 - 0x10f2d8fff libobjc-trampolines.dylib (*) <7e101877-a6ff-3331-99a3-4222cb254447> /usr/lib/libobjc-trampolines.dylib
0x10ebb0000 - 0x10ebc3fff tdsearch-x64.dylib (*) <1bd85264-ee0d-36f4-ab94-a6d81ec1bb0f> /Applications/BaKoMa TeX/*/tdsearch-x64.dylib
0x10f2fd000 - 0x10f7c1fff libwx_osx_cocoau_core-3.1.1.0.0.dylib (*) <0362fcaf-20bc-39e8-8a36-a8736662480e> /Applications/BaKoMa TeX/*/libwx_osx_cocoau_core-3.1.1.0.0.dylib
0x10ffb0000 - 0x1101a6fff libwx_baseu-3.1.1.0.0.dylib (*) <a0e6ac20-4be1-3f6d-810f-f116d5f29279> /Applications/BaKoMa TeX/*/libwx_baseu-3.1.1.0.0.dylib
0x10ecf2000 - 0x10ed6bfff libwx_osx_cocoau_aui-3.1.1.0.0.dylib (*) <97bc52cb-e328-361a-a7ea-58e1b7c04f58> /Applications/BaKoMa TeX/*/libwx_osx_cocoau_aui-3.1.1.0.0.dylib
0x10ef30000 - 0x10efc8fff libwx_osx_cocoau_html-3.1.1.0.0.dylib (*) <3abef03a-ea27-3b81-bdbe-136afe43eae2> /Applications/BaKoMa TeX/*/libwx_osx_cocoau_html-3.1.1.0.0.dylib
0x1104a8000 - 0x1105c4fff libwx_osx_cocoau_adv-3.1.1.0.0.dylib (*) <048f9971-9720-32ee-9b41-aa1224eba010> /Applications/BaKoMa TeX/*/libwx_osx_cocoau_adv-3.1.1.0.0.dylib
0x10ee4a000 - 0x10ee7cfff libwx_baseu_net-3.1.1.0.0.dylib (*) <4ef643bb-33ba-353e-8f1f-42e3cfd80259> /Applications/BaKoMa TeX/*/libwx_baseu_net-3.1.1.0.0.dylib
0x10ec63000 - 0x10ec7efff liblzma.5.dylib (*) <e4406e42-7bc4-3945-a1a4-e9b6874ef052> /usr/local/Cellar/xz/5.2.5/lib/liblzma.5.dylib
0x7ff7ffc5a000 - 0x7ff7ffc89fff runtime (*) <2c5acb8c-fbaf-31ab-aeb3-90905c3fa905> /usr/libexec/rosetta/runtime
0x10e235000 - 0x10e288fff libRosettaRuntime (*) <a61ec9e9-1174-3dc6-9cdb-0d31811f4850> /Library/Apple/*/libRosettaRuntime
0x104b78000 - 0x104d88fff texword (*) <feb2b054-bd1a-36a5-8803-605478982dc1> /Applications/BaKoMa TeX/*/texword
0x0 - 0xffffffffffffffff ??? (*) <00000000-0000-0000-0000-000000000000> ???
0x7ff805274000 - 0x7ff8052aeff7 libsystem_kernel.dylib (*) <4df0d732-7fc4-3200-8176-f1804c63f2c8> /usr/lib/system/libsystem_kernel.dylib
0x7ff8052af000 - 0x7ff8052bafff libsystem_pthread.dylib (*) <c64722b0-e96a-3fa5-96c3-b4beaf0c494a> /usr/lib/system/libsystem_pthread.dylib
0x7ff80dcf6000 - 0x7ff80e09dff9 com.apple.QuartzCore (1.11) <75bd9503-d1ab-32d2-bd5b-89ec89d3e8dd> /System/Library/Frameworks/QuartzCore.framework/Versions/A/QuartzCore
0x7ff8088c1000 - 0x7ff809cc6ffb com.apple.AppKit (6.9) <27fed5dd-d148-3238-bc95-1dac5dd57fa1> /System/Library/Frameworks/AppKit.framework/Versions/C/AppKit
0x7ff805314000 - 0x7ff8057acffc com.apple.CoreFoundation (6.9) <4d842118-bb65-3f01-9087-ff1a2e3ab0d5> /System/Library/Frameworks/CoreFoundation.framework/Versions/A/CoreFoundation
0x7ff8100c5000 - 0x7ff810360ff4 com.apple.HIToolbox (2.1.1) <06bf0872-3b34-3c7b-ad5b-7a447d793405> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/HIToolbox.framework/Versions/A/HIToolbox
0x7ff80acad000 - 0x7ff80b53effc com.apple.CoreGraphics (2.0) <c709e588-6adf-33ad-b7c8-5dbe61c7694d> /System/Library/Frameworks/CoreGraphics.framework/Versions/A/CoreGraphics
0x7ff80510a000 - 0x7ff805150ffd libdispatch.dylib (*) <4472f1a5-1d47-3665-ac8d-7adb0e9d2d87> /usr/lib/system/libdispatch.dylib
0x7ff805154000 - 0x7ff8051dbfff libsystem_c.dylib (*) <83c7b73c-86fe-32f9-85dd-f46fa2c1315b> /usr/lib/system/libsystem_c.dylib
Overview
Hi, I'm an Android engineer at Mapbox. We are building SDKs that would include Apple Music integrations. The issue is that we cannot publish an .aar file that itself includes .aar files:
Direct local .aar file dependencies are not supported when building an AAR
So the question is: why isn't the Apple Music Android SDK published to Maven? It's unusual to have to download Android SDKs manually and then bundle the raw files in your app anyway. To be specific, the error above is about these files:
mediaplayback-release-1.1.1.aar
musickitauth-release-1.1.2.aar
If the artifacts were uploaded to Maven, then we could make them part of a build.gradle like this:
implementation("com.apple.music:mediaplayback:1.1.1")
implementation("com.apple.music:musickitauth:1.1.2")
Questions
Has someone published these SDKs somewhere so we can include them without a raw download?
Is there a legal reason not to publish them ourselves?
Since the latest updates of macOS, probably 14.1, I have not been able to play media using SystemMusicPlayer.shared or MPMusicPlayerController.systemMusicPlayer. ApplicationMusicPlayer still works.
This is the error code I am encountering:
systemMusicPlayer _establishConnectionIfNeeded failed [application failed to launch]
Failed to prepareToPlay with error: Error Domain=MPMusicPlayerControllerErrorDomain Code=10 "Failed to obtain remoteObject [nil server]" UserInfo={NSDebugDescription=Failed to obtain remoteObject [nil server]}
Here is a small example to reproduce the bug.
MPMusicPlayerController.systemMusicPlayer.setQueue(with: ["1445887715"])
MPMusicPlayerController.systemMusicPlayer.prepareToPlay()
Or:
extension PlayableMusicItem {
    // Not working
    func playWithSystemPlayer() {
        Task {
            SystemMusicPlayer.shared.queue = [self]
            try await SystemMusicPlayer.shared.play()
        }
    }

    // Working
    func playWithApplicationPlayer() {
        Task {
            ApplicationMusicPlayer.shared.queue = [self]
            try await ApplicationMusicPlayer.shared.play()
        }
    }
}
I have configured AVCaptureVideoDataOutput to deliver 10-bit 4:2:0 YCbCr sample buffers. I use Core Image to process these pixel buffers for simple scaling/translation.
var dstBounds = CGRect.zero
dstBounds.size = dstImage.extent.size
// srcImage is created from the sample buffer received from the video data output.
_ciContext.render(dstImage, to: dstPixelBuffer!, bounds: dstImage.extent, colorSpace: srcImage.colorSpace)
I then set the color attachments on dstPixelBuffer according to the colorProfile selected in the app settings (BT.709 or BT.2020).
switch colorProfile {
case .BT709:
    CVBufferSetAttachment(dstPixelBuffer!, kCVImageBufferColorPrimariesKey, kCVImageBufferColorPrimaries_ITU_R_709_2, .shouldPropagate)
    CVBufferSetAttachment(dstPixelBuffer!, kCVImageBufferTransferFunctionKey, kCVImageBufferTransferFunction_ITU_R_709_2, .shouldPropagate)
    CVBufferSetAttachment(dstPixelBuffer!, kCVImageBufferYCbCrMatrixKey, kCVImageBufferYCbCrMatrix_ITU_R_709_2, .shouldPropagate)
case .HLG2100:
    CVBufferSetAttachment(dstPixelBuffer!, kCVImageBufferColorPrimariesKey, kCVImageBufferColorPrimaries_ITU_R_2020, .shouldPropagate)
    CVBufferSetAttachment(dstPixelBuffer!, kCVImageBufferTransferFunctionKey, kCVImageBufferTransferFunction_ITU_R_2100_HLG, .shouldPropagate)
    CVBufferSetAttachment(dstPixelBuffer!, kCVImageBufferYCbCrMatrixKey, kCVImageBufferYCbCrMatrix_ITU_R_2020, .shouldPropagate)
}
These pixel buffers are then vended to an AVAssetWriter whose videoSettings are set to the recommended settings from the video data output. But the output looks completely washed out, especially for SDR (BT.709). What am I doing wrong?
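For completeness, the writer side is set up roughly like this (a sketch; videoOutput stands for the AVCaptureVideoDataOutput):
let settings = videoOutput.recommendedVideoSettingsForAssetWriter(writingTo: .mov)
let writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
writerInput.expectsMediaDataInRealTime = true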