My visionOS app was rejected for not supporting gaze-and-pinch features, but as far as I can tell there is no way to track the user's eye movements in visionOS. (The intended means of interaction for my app is an extended gamepad.) I am able to detect the pinch but can't find any way to detect where the user is looking. ARFaceAnchor and lookAtPoint do not appear to be available in visionOS. So how do we go about doing this? For context, I am porting a Metal game to visionOS from iOS.
When using a CAMetalLayer, you can use the standard UIKit and SwiftUI input support. The event location will be the intersection of the user's gaze with the CAMetalLayer at the moment of the pinch.
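A minimal sketch of what that looks like in SwiftUI, assuming a hypothetical `MetalView` (a UIViewRepresentable wrapping your CAMetalLayer-backed view). The spatial tap's location is where the gaze hit the layer when the user pinched:

```swift
import SwiftUI

struct GameView: View {
    var body: some View {
        MetalView() // hypothetical wrapper around your CAMetalLayer view
            .gesture(
                SpatialTapGesture()
                    .onEnded { value in
                        // value.location is the gaze/pinch point in the
                        // view's coordinate space; forward it to the game.
                        print("Pinched at \(value.location)")
                    }
            )
    }
}
```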
For a CompositorLayer, you can use the onSpatialEvent callback, which provides the pinch together with the gaze ray captured when the pinch began: https://developer.apple.com/documentation/compositorservices/layerrenderer/4245856-onspatialevent
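Roughly like this, assuming `layerRenderer` is the LayerRenderer your CompositorLayer render loop receives, and `handlePinch`/`endPinch` are hypothetical hooks into your game. If I'm reading the docs right, each event's selectionRay is the gaze ray frozen at the moment the pinch started:

```swift
import CompositorServices
import SwiftUI

func configureInput(for layerRenderer: LayerRenderer) {
    layerRenderer.onSpatialEvent = { events in
        for event in events {
            switch event.phase {
            case .active:
                if let ray = event.selectionRay {
                    // Intersect ray.origin + t * ray.direction with the
                    // scene to find what the user pinched on (hypothetical).
                    handlePinch(origin: ray.origin, direction: ray.direction)
                }
            case .ended, .cancelled:
                endPinch(id: event.id) // hypothetical cleanup hook
            @unknown default:
                break
            }
        }
    }
}
```

Note that, unlike an in-game cursor, the ray doesn't update continuously while the user looks around; you only get it per pinch event.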