I have attempted to use VideoMaterial with an HDR HLS stream, and also a TextureResource.DrawableQueue with rgba16Float in a ShaderGraphMaterial.
I'm capturing to 64RGBAHalf with AVPlayerItemVideoOutput and converting that to rgba16Float.
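Here's a simplified sketch of that capture-and-convert path (error handling and frame pacing omitted; the stream URL is a placeholder, and copyPixelBuffer(_:into:) stands in for my own Metal conversion into the drawable's texture):

```swift
import AVFoundation
import QuartzCore
import RealityKit

// Placeholder URL; the real item is an HDR HLS playlist.
let playerItem = AVPlayerItem(url: URL(string: "https://example.com/hdr/stream.m3u8")!)

// Ask AVPlayerItemVideoOutput for 64RGBAHalf frames.
let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_64RGBAHalf,
    kCVPixelBufferMetalCompatibilityKey as String: true
])
playerItem.add(videoOutput)

// DrawableQueue backing the texture bound to my ShaderGraphMaterial
// (connected via TextureResource.replace(withDrawables:)).
let descriptor = TextureResource.DrawableQueue.Descriptor(
    pixelFormat: .rgba16Float,
    width: 3840,
    height: 2160,
    usage: [.shaderRead, .shaderWrite],
    mipmapsMode: .none
)
let drawableQueue = try TextureResource.DrawableQueue(descriptor)

// Per frame: grab the latest pixel buffer and copy it into the next drawable.
func updateTexture() {
    let itemTime = videoOutput.itemTime(forHostTime: CACurrentMediaTime())
    guard videoOutput.hasNewPixelBuffer(forItemTime: itemTime),
          let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime,
                                                        itemTimeForDisplay: nil),
          let drawable = try? drawableQueue.nextDrawable() else { return }
    copyPixelBuffer(pixelBuffer, into: drawable.texture)  // my Metal conversion step
    drawable.present()
}
```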
Neither approach seems to display HDR properly or behave the way a raw AVPlayer does.
Since we can't configure any EDR metadata or color space for a RealityView, how do we display HDR video? Is using rgba16Float supposed to be enough?
Is it a mistake to expect the 64RGBAHalf capture to handle HDR properly, and should I instead capture YUV and do the conversion myself?
Thank you
Converting to the rgba16Float pixel format alone won't render HDR in VideoMaterial; the format has sufficient range, so it's a good start, but you'll also need a transfer function such as PQ or HLG.
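For reference, PQ (SMPTE ST 2084) relates the encoded code value to linear light as shown below. This is just the reference math written out in Swift, not a RealityKit API, and whether you apply this curve or its inverse depends on what your shader and conversion expect:

```swift
import Foundation

/// Illustrative PQ (SMPTE ST 2084) EOTF: maps a nonlinear code value in [0, 1]
/// to linear luminance in nits (cd/m²). HLG uses a different curve (ITU-R BT.2100).
func pqEOTF(_ encoded: Double) -> Double {
    let m1 = 2610.0 / 16384.0          // 0.1593017578125
    let m2 = 2523.0 / 4096.0 * 128.0   // 78.84375
    let c1 = 3424.0 / 4096.0           // 0.8359375
    let c2 = 2413.0 / 4096.0 * 32.0    // 18.8515625
    let c3 = 2392.0 / 4096.0 * 32.0    // 18.6875

    let e = pow(max(encoded, 0), 1.0 / m2)
    let linear = pow(max(e - c1, 0) / (c2 - c3 * e), 1.0 / m1)
    return 10_000.0 * linear           // PQ reference peak of 10,000 nits
}
```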
VideoMaterial is expected to work with an HDR HLS stream, so I'm looking into what you need. Any details you can provide on your use of TextureResource.DrawableQueue and ShaderGraphMaterial would be appreciated.
Alternatively, you can use AVPlayerViewController or VideoPlayerComponent to play back HDR content.
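For example, here's a minimal VideoPlayerComponent sketch (the URL is a placeholder for your own HDR HLS stream):

```swift
import AVFoundation
import RealityKit

// Minimal sketch: play an HDR HLS stream through VideoPlayerComponent.
// Replace the placeholder URL with your own stream.
func makeVideoEntity() -> Entity {
    let url = URL(string: "https://example.com/hdr/stream.m3u8")!
    let player = AVPlayer(url: url)

    let entity = Entity()
    entity.components.set(VideoPlayerComponent(avPlayer: player))

    player.play()
    return entity
}
```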