Change zFar in RealityView

I am working on a RealityView on iOS 18 that needs to render objects farther away than 1,000 meters. My app is used outdoors in open areas. I am using RealityView with content.camera = .spatialTracking, and I have turned off occlusion, collisions, and plane detection with a simple scene-understanding configuration like this:

let configuration = SpatialTrackingSession.Configuration(
    tracking: [.camera],
    sceneUnderstanding: [], // We don't want occlusion, collisions, etc.
    camera: .back)
let session = SpatialTrackingSession()
// run(_:) returns any requested capabilities that are unavailable on this device.
if let unavailable = await session.run(configuration) {
    print("unavailable \(unavailable)")
}
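
For context, here is a simplified sketch of where I set the camera mode. The view name, the sphere, and the 1,500 m position are placeholders just to show the kind of distances involved:

import SwiftUI
import RealityKit

struct OutdoorARView: View {
    var body: some View {
        RealityView { content in
            // Device camera feed with spatial tracking on iOS 18.
            content.camera = .spatialTracking

            // Placeholder entity; my real content sits hundreds of meters out
            // and gets clipped by the default far plane.
            let marker = ModelEntity(mesh: .generateSphere(radius: 5))
            marker.position = [0, 0, -1500] // 1,500 m in front of the camera
            content.add(marker)
        }
    }
}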

Is this possible with .spatialTracking in a RealityView, or with ARView?

I have my RealityView working on visionOS inside an ImmersiveSpace. On visionOS I don't have the camera as a passthrough; it is a virtual scene with world tracking set up via a WorldTrackingProvider, and I can render objects farther away than 1,000 meters. I would like to do the same thing on iOS. I don't need the camera passthrough, but I do need world tracking.
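
Roughly, the visionOS side looks like this (simplified sketch; the type name is a placeholder and error handling is trimmed):

import ARKit

// World tracking via ARKit for visionOS, driving the RealityView
// inside the ImmersiveSpace (rendering details omitted).
final class WorldTrackingModel {
    let arSession = ARKitSession()
    let worldTracking = WorldTrackingProvider()

    func start() async {
        do {
            try await arSession.run([worldTracking])
        } catch {
            print("Failed to start world tracking: \(error)")
        }
    }
}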

I see that PerspectiveCameraComponent lets me set the near and far clipping planes, but I don't see how I can use that camera with world tracking.
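
This is the kind of setup I am picturing; a sketch only, where the .virtual camera mode and the 10,000 m far plane are my assumptions about how it might be wired up, and I still don't see how to drive this camera's transform from world tracking:

import SwiftUI
import RealityKit

struct VirtualCameraView: View {
    var body: some View {
        RealityView { content in
            // Fully virtual scene, no camera passthrough.
            content.camera = .virtual

            // Camera entity with an extended far clipping plane
            // (10,000 m is just an example value).
            let cameraEntity = Entity()
            cameraEntity.components.set(
                PerspectiveCameraComponent(near: 0.1, far: 10_000, fieldOfViewInDegrees: 60)
            )
            content.add(cameraEntity)
        }
    }
}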
