sceneUnderstanding occlusion does not work with blendMode = alpha in iOS 18

If I import a USDZ model with blendMode set to alpha, occlusion does not work on iPhone with iOS 18. How should transparent materials and occlusion be used together in the new RealityKit? Additionally, new artifacts have appeared when transparent objects overlap each other: the transparent areas do not blend; instead, parts of the model simply do not render.
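
For context, occlusion is enabled in the app roughly like this (a simplified sketch, not the exact project code; the model name and anchoring are placeholders):

import ARKit
import RealityKit

// Minimal sketch: enable scene-mesh occlusion for an ARView and place a USDZ model.
// "model" is a placeholder for the USDZ that uses blendMode = alpha.
func configureOcclusion(in arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh   // requires a LiDAR-equipped device
    }
    arView.session.run(configuration)

    // Hide virtual content behind the reconstructed real-world mesh.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)

    if let model = try? Entity.load(named: "model") {
        let anchor = AnchorEntity(plane: .horizontal)
        anchor.addChild(model)
        arView.scene.addAnchor(anchor)
    }
}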

Answered by DTS Engineer in 805521022

This also applies to AR Quick Look.
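
For reproduction, the model is opened in AR Quick Look roughly like this (a minimal sketch; the class and file name are placeholders, not the original code):

import UIKit
import QuickLook

// Minimal sketch: present a USDZ file in AR Quick Look via QLPreviewController.
// "model.usdz" is a placeholder for the file linked later in this thread.
final class ModelPreviewController: UIViewController, QLPreviewControllerDataSource {
    func presentQuickLook() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        let url = Bundle.main.url(forResource: "model", withExtension: "usdz")!
        return url as NSURL
    }
}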

Hello @Cektant,

Could you provide a link to an example project that reproduces the issues you mentioned?

Best regards,

Greg

Here are two videos:

https://youtube.com/shorts/U51bsfvQ2FM
https://youtube.com/shorts/cFctihaemsI

The first video was recorded on iOS 17.5.1, where occlusion works correctly; the second on iOS 18.0, where it does not. The app builds are identical. Here is the shader code.

#include <metal_stdlib>
#include <RealityKit/RealityKit.h>
using namespace metal;

constexpr sampler textureSampler(coord::normalized,
                                 address::repeat,
                                 filter::linear,
                                 mip_filter::linear);

[[visible]]
void pngImage(realitykit::surface_parameters params) {
    auto surface = params.surface();
    float2 uv = params.geometry().uv0() * 8;
    auto tex = params.textures();
    half3 color = (half3)tex.base_color().sample(textureSampler, uv).rgb;
    half opacity = tex.opacity().sample(textureSampler, uv).r;
    if (color.r > 0.15 && color.g > 0.15 && color.b > 0.15) {
        surface.set_emissive_color(color * 1.5);
        opacity = 0.5;
    } else {
        opacity = 0.0;
    }
    surface.set_opacity(opacity);
}

The same problem occurs, for example, with this shader.

[[visible]]
void strobe(realitykit::surface_parameters params) {
    auto surface = params.surface();
    half3 white = half3(1.0, 1.0, 1.0);
    surface.set_base_color(white);
    surface.set_emissive_color(white);
    float time = params.uniforms().time();
    half opacity = abs(sin(time * 2) * 0.5 + 0.5);
    surface.set_opacity(opacity);
}
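
For reference, the shaders above are attached to entities through a CustomMaterial built roughly like this (a minimal sketch, not the project's actual setup; the function name, lighting model, and blending value are assumptions):

import RealityKit
import Metal

// Minimal sketch: wrap the "strobe" surface shader above in a CustomMaterial.
// Everything except the shader name is illustrative.
func makeStrobeMaterial() throws -> CustomMaterial {
    guard let device = MTLCreateSystemDefaultDevice(),
          let library = device.makeDefaultLibrary() else {
        fatalError("Metal is unavailable")
    }
    let surfaceShader = CustomMaterial.SurfaceShader(named: "strobe", in: library)
    var material = try CustomMaterial(surfaceShader: surfaceShader, lightingModel: .unlit)
    // Per this thread, set_opacity() in the shader only has a visible effect when the
    // material's blending mode is transparent (exact Opacity spelling may vary by SDK version).
    material.blending = .transparent(opacity: 1.0)
    return material
}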

Hello @Cektant,

Sorry, I'm struggling to see the difference between the two videos. It appears that occlusion is working in both of those videos?

Best regards,

Greg

@DTS Engineer

I will try to explain more clearly. The first image shows the correct behavior (iOS 17), while the second shows the incorrect behavior (iOS 18). As you can see, the occlusion is not working.

I will also attach two videos of the same build so that you can see the strange behavior on iOS 18:

https://youtube.com/shorts/VPCMBiASxLc?feature=share
https://youtube.com/shorts/Q8LL4lU2cZE?feature=share

You can also try this model in Quick Look on iOS 17 and 18 to see the difference: https://vizbl-2.s3.us-west-1.amazonaws.com/native/cc60dd92-58c2-4867-82a1-55afae8caf9f.usdz

I also noticed the following change.

In iOS 16 and 17:

materialPlane = UnlitMaterial(color: .init(white: 1.0, alpha: 0.0))

created a fully transparent material.

In iOS 18, with this approach, the material becomes fully opaque and white. To fix this, I added:

materialPlane?.blending = .transparent(opacity: 0.0)

The difference in rendering behavior across iOS versions is obvious; something has changed in how the alpha value is handled.
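
Put together, the workaround looks like this (a minimal sketch; in the real project materialPlane is an optional property, here it is a local variable for illustration):

import RealityKit
import UIKit

// Minimal sketch of the behavior change described above.
// On iOS 16/17, the alpha of the color alone produced a fully transparent material.
var materialPlane = UnlitMaterial(color: UIColor(white: 1.0, alpha: 0.0))

// On iOS 18, the blending mode must be set explicitly as well,
// otherwise the material renders fully opaque and white.
materialPlane.blending = .transparent(opacity: 0.0)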

Hey @Cektant,

Thank you for clarifying, I see the difference now!

Please file a bug report for the issue demonstrated in your video using Feedback Assistant; I have no workaround to suggest for that issue.

You also noted that, in iOS 16 and 17,

materialPlane = UnlitMaterial(color: .init(white: 1.0, alpha: 0.0))

created a fully transparent material, while in iOS 18 the same code produces a fully opaque white material unless you also set:

materialPlane?.blending = .transparent(opacity: 0.0)

This change was intentional and is listed in the visionOS 2 Release Notes (see id 118210191; although it appears only in the visionOS 2 Release Notes, it applies to all platforms). It is expected that you now have to set the blending mode explicitly in iOS 18.

Best regards,

Greg
