Does anyone know if HDR video is supported in a RealityView?

I have attempted to use VideoMaterial with an HDR HLS stream, and also a TextureResource.DrawableQueue with rgba16Float in a ShaderGraphMaterial.

I'm capturing to 64RGBAHalf with AVPlayerItemVideoOutput and converting that to rgba16Float.
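
Roughly, the capture side looks like this (a minimal sketch; the stream URL is a placeholder):

```swift
import AVFoundation
import CoreVideo

// Minimal sketch of the capture path described above; the stream URL is a placeholder.
let item = AVPlayerItem(url: URL(string: "https://example.com/hdr-stream.m3u8")!)

// Ask the video output for 64RGBAHalf (16-bit float per channel) frames.
let attributes: [String: Any] = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_64RGBAHalf,
    kCVPixelBufferMetalCompatibilityKey as String: true
]
let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: attributes)
item.add(videoOutput)

let player = AVPlayer(playerItem: item)
```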

In both cases, I don't believe the video is displaying HDR properly or behaving like a raw AVPlayer.

Since we can't configure any EDR metadata or color space for a RealityView, how do we display HDR video? Is using rgba16Float supposed to be enough?

Is it a mistake to expect the 64RGBAHalf capture to handle HDR properly? Should I capture YUV and do the conversion myself?

Thank you

Answered by DTS Engineer in 803426022

Converting to the rgba16Float pixel format alone won't render HDR in VideoMaterial. It provides sufficient range, so it's a good start, but you'll also need a transfer function like PQ or HLG.
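
To illustrate what the transfer function is: PQ and HLG are signaled by the color space or by CVPixelBuffer attachments, not by the pixel format itself. A sketch, not specific to VideoMaterial:

```swift
import CoreGraphics
import CoreVideo

// BT.2100 color spaces carry the PQ / HLG transfer functions.
let pq = CGColorSpace(name: CGColorSpace.itur_2100_PQ)
let hlg = CGColorSpace(name: CGColorSpace.itur_2100_HLG)

// If you vend CVPixelBuffers yourself, attach color information so
// downstream consumers know how to interpret the float values.
func tagAsHLG(_ buffer: CVPixelBuffer) {
    CVBufferSetAttachment(buffer, kCVImageBufferColorPrimariesKey,
                          kCVImageBufferColorPrimaries_ITU_R_2020, .shouldPropagate)
    CVBufferSetAttachment(buffer, kCVImageBufferTransferFunctionKey,
                          kCVImageBufferTransferFunction_ITU_R_2100_HLG, .shouldPropagate)
}
```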

VideoMaterial is expected to work with an HDR HLS stream, so I'm looking into what you need and would appreciate any details you can provide about your use of TextureResource.DrawableQueue and ShaderGraphMaterial.

Alternatively, you can use AVPlayerViewController or VideoPlayerComponent to play back HDR content.
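
For example, a minimal VideoPlayerComponent setup (the stream URL is a placeholder):

```swift
import AVFoundation
import RealityKit

// Sketch: let RealityKit handle playback, including HDR, via VideoPlayerComponent.
let player = AVPlayer(url: URL(string: "https://example.com/hdr-stream.m3u8")!)
let entity = Entity()
entity.components.set(VideoPlayerComponent(avPlayer: player))
player.play()
```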

DISCLAIMER: This is a guess on my part

I'm not sure the Vision Pro display ever gets brighter than about 280 nits, and something like EDR seems to be in use at all times for anything that is HDR. I would love to be proven wrong, however.

It turns out that VideoMaterial does not require any additional configuration for HDR HLS streaming, so the issue is likely elsewhere.

Again, any details you can provide on the entire processing pipeline would be helpful.

Oh hello there

When you render HDR video on a Metal layer, you have to set .colorspace, .edrMetadata, and .wantsExtendedDynamicRangeContent on the layer.
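
i.e., on a plain CAMetalLayer the setup looks something like this (HLG as an example):

```swift
import Metal
import QuartzCore
import CoreGraphics

// Sketch: HDR configuration on a CAMetalLayer (HLG as an example).
let layer = CAMetalLayer()
layer.pixelFormat = .rgba16Float
layer.colorspace = CGColorSpace(name: CGColorSpace.itur_2100_HLG)
layer.wantsExtendedDynamicRangeContent = true
layer.edrMetadata = CAEDRMetadata.hlg
```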

How does that work with a VideoMaterial, where we can't set any of that when using a TextureResource.DrawableQueue?

I can't use AVPlayerViewController or VideoPlayerComponent because I need to use a ShaderGraphMaterial to process the video.

You also probably have to get the display to match the dynamic range using AVDisplayCriteria(refreshRate:videoDynamicRange:).
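
Something like this, although that's a tvOS-style API and I'm not sure what the visionOS equivalent is:

```swift
import AVFoundation
import AVKit
import UIKit

// Sketch: ask the display to match the content's dynamic range (tvOS-style API).
// `window` is assumed to be an existing UIWindow.
func matchDisplay(for window: UIWindow) {
    window.avDisplayManager.preferredDisplayCriteria =
        AVDisplayCriteria(refreshRate: 30.0, videoDynamicRange: .hlg)
}
```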

So these are all things that an AVPlayer handles that you have to account for when trying to present these textures, correct?

One of the main reasons I have to use a ShaderGraphMaterial is that there doesn't seem to be a way to render a texture to a specific eye outside of RealityKit's Camera Index Switch node, as in the sketch below.
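
For context, this is roughly how the per-eye textures get wired into the material; the material path, parameter names, and realityKitContentBundle are placeholders from my project setup:

```swift
import RealityKit
import RealityKitContent

// Sketch: a Reality Composer Pro material whose graph uses a Camera Index
// Switch node to pick between left- and right-eye textures.
// The scene/material path and parameter names are placeholders.
func makeStereoMaterial(left: TextureResource,
                        right: TextureResource) async throws -> ShaderGraphMaterial {
    var material = try await ShaderGraphMaterial(named: "/Root/StereoMaterial",
                                                 from: "Scene.usda",
                                                 in: realityKitContentBundle)
    try material.setParameter(name: "LeftTexture", value: .textureResource(left))
    try material.setParameter(name: "RightTexture", value: .textureResource(right))
    return material
}
```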

@DTS Engineer I think RealityKit may not be suited to doing this, and the ability to address a texture to each eye with a Metal layer would be needed instead. What do you think?
