Access to ARKit underlying scene mesh in visionOS

Is there any way to reset the scan data that Apple Vision Pro stores on-device, so that every new scan in my application starts from scratch rather than the space being instantly recognized? The Apple Vision Pro Privacy Overview (https://www.apple.com/privacy/docs/Apple_Vision_Pro_Privacy_Overview.pdf) states:

"visionOS builds a three-dimensional model to map your surroundings on-device. Apple Vision Pro uses a combination of camera and LiDAR data to map the area around you and save that model on-device. The model enables visionOS to alert you about real-life obstacles, as well as appropriately reflect the lighting and shadows of your physical space. visionOS uses audio ray tracing to analyze your room’s acoustic properties on-device to adapt and match sound to your space. The underlying scene mesh is stored on-device and encrypted with your passcode if one is set"

How can I access and erase the, and I quote, “underlying scene mesh stored on-device”?
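For context, visionOS apps can *read* the live scene mesh through ARKit's `SceneReconstructionProvider`, even though there is no public API to erase the map the system persists. A minimal sketch of reading `MeshAnchor` updates, assuming a visionOS immersive-space app with the `NSWorldSensingUsageDescription` key set (this only runs on a real device, not in the simulator):

```swift
import ARKit

@MainActor
final class SceneMeshReader {
    let session = ARKitSession()
    let provider = SceneReconstructionProvider()

    func start() async throws {
        // Prompts for world-sensing permission; requires a running
        // immersive space on device.
        try await session.run([provider])

        // Async stream of mesh chunks as the system reconstructs the room.
        for await update in provider.anchorUpdates {
            let anchor = update.anchor  // MeshAnchor
            switch update.event {
            case .added, .updated:
                // Vertex buffer of this reconstructed mesh chunk.
                print("mesh \(anchor.id): \(anchor.geometry.vertices.count) vertices")
            case .removed:
                print("mesh \(anchor.id) removed")
            }
        }
    }
}
```

Note this surfaces only the meshes the system chooses to deliver; it gives no control over, or access to, the encrypted on-device model the privacy overview describes.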

Answered by JoonAhn in 802192022

There is no way for us developers to access or control how visionOS builds MeshAnchors. Apple's MeshAnchor reconstruction is a complicated system function; it is not perfect and sometimes generates unwanted, unexpected MeshAnchors. We have to accept the current behavior and wait until it is resolved. Apple engineers are aware of the issue.

FindSurface Real-Time Preview - Apple Vision Pro https://youtu.be/CGjhfKxjpUU

FindSurface Real-Time 1 - Apple Vision Pro https://youtu.be/2aSMBrPTEtg


Apple Vision Pro

Planes, spheres, cylinders, cones, and tori in the scene can now be effortlessly detected and measured in real time at up to 120 fps (found-per-second).

Check out the source code of the AVP app in CurvSurf's GitHub repository for more details: GitHub Link
