iOS Object Capture Questions

I'm really excited about the Object Capture APIs coming to iOS, and the guided capture UI shown in the WWDC session. I have a few unanswered questions:

  • Where is the sample code available from?
  • Are the new Object Capture APIs on iOS limited to certain devices?
  • Can we capture images from the front facing cameras?

I’m running Xcode 15.0 beta (15A5160n) and building on an iPhone 14 Pro Max simulator running iOS 17.0 (21A5248u), and I’ve set the iOS Deployment Target to 17.0, but the compiler cannot find the ObjectCaptureView or ObjectCaptureSession APIs from RealityKit. I see in the documentation that these are in Beta.

Is there something I’m missing?
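For reference, here is a minimal sketch of the kind of view I'm trying to compile, based on the API names from the WWDC session (the images directory here is just a placeholder):

```swift
import SwiftUI
import RealityKit

// Minimal sketch following the WWDC session's API names.
// This is what fails to compile for me against the simulator SDK.
struct CaptureView: View {
    @State private var session = ObjectCaptureSession()

    var body: some View {
        ObjectCaptureView(session: session)
            .onAppear {
                // Placeholder directory for the captured images.
                let imagesDirectory = FileManager.default.temporaryDirectory
                    .appendingPathComponent("Images/")
                session.start(imagesDirectory: imagesDirectory)
            }
    }
}
```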

You'll probably need the sample code, which is unavailable as of Wednesday afternoon.

We apologize that the sample app is not yet available as anticipated. Please stay tuned; we expect it to be available soon in the developer documentation for Object Capture.

Supported devices for the guided capture UI are listed in the session video: the iPhone 12 Pro (with LiDAR) and newer, as well as the LiDAR-equipped iPad Pro (3rd generation) and newer. We recommend checking ObjectCaptureSession.isSupported on a given device before creating a session. (Note that the Simulator is not supported.)
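The support check described above can be sketched like this (the fallback behavior is just an illustration):

```swift
import RealityKit

// Gate session creation on device support. isSupported is false on the
// Simulator and on devices lacking the required cameras and LiDAR.
if ObjectCaptureSession.isSupported {
    let session = ObjectCaptureSession()
    // Present ObjectCaptureView(session: session) and start the session here.
} else {
    // Fall back gracefully, e.g. show an "unsupported device" message.
    print("Object Capture is not supported on this device.")
}
```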

The ObjectCaptureSession utilizes only the back-facing cameras and LiDAR.

Hey! Any updates?

The most recent version of this code works great. Thanks for the update.

Is there up-to-date code?

The link above is gone. Also, one of the commenters mentioned that it was broken as of iOS 17.
