Discuss spatial computing on Apple platforms and how to design and build an entirely new universe of apps and games for Apple Vision Pro.

CompositorServices Or RealityKit
I have been focusing on developing a visionOS application. While I am currently quite familiar with RealityKit, CompositorServices has also captured my attention, but I have not yet learned it. Could you please clarify whether it is essential for me to learn CompositorServices? I would also appreciate insights into the respective advantages of RealityKit and CompositorServices.
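For orientation, a minimal sketch of the two entry points (assuming visionOS with SwiftUI; the scene IDs, the sphere content, and the render-loop comment are placeholders, and a custom CompositorLayerConfiguration can be supplied instead of the default). RealityKit renders an entity hierarchy for you; Compositor Services hands you a LayerRenderer so you can drive your own Metal pipeline:

```swift
import SwiftUI
import RealityKit
import CompositorServices

@main
struct SpatialApp: App {
    var body: some Scene {
        // RealityKit path: describe content as entities; the system renders it.
        ImmersiveSpace(id: "realitykit-space") {
            RealityView { content in
                content.add(ModelEntity(mesh: .generateSphere(radius: 0.2)))
            }
        }

        // Compositor Services path: you own the Metal render loop.
        ImmersiveSpace(id: "compositor-space") {
            CompositorLayer { layerRenderer in
                // Spawn a render thread here and draw each frame with Metal,
                // pulling frames from layerRenderer (placeholder; see Apple's
                // "Drawing fully immersive content using Metal" sample).
            }
        }
    }
}
```

In short: if RealityKit covers your rendering needs, Compositor Services is not essential; it matters mainly when you need full control of the GPU pipeline.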
1 reply · 0 boosts · 273 views · Oct ’24
Deactivated entities in scene not found with entity.findEntity(named:)?
I have an .rkassets package from which I load my scene. In the scene, I use entity.findEntity(named: "..") to find entities I want to activate or deactivate. When entities are deactivated in the *.usda, they are not found by this method. Further inspection suggests that the deactivated entities are not compiled into the build at all. Is there anything I can set that prevents these deactivated entities from being skipped during the build?
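One workaround worth trying, as a rough sketch (the scene name "Scene", the entity name "Gizmo", and the realityKitContentBundle identifier are placeholders from a typical Reality Composer Pro package): leave the entities enabled in the .usda so they are compiled into the build, then disable them in code immediately after loading.

```swift
import RealityKit
import RealityKitContent   // placeholder: the Swift package generated for the .rkassets

func loadSceneWithHiddenEntities() async throws -> Entity {
    // Load the scene with every entity left active in Reality Composer Pro.
    let scene = try await Entity(named: "Scene", in: realityKitContentBundle)

    // Disable the ones that should start hidden. findEntity(named:) succeeds
    // here because the entities were not stripped from the build.
    if let hidden = scene.findEntity(named: "Gizmo") {   // "Gizmo" is a placeholder name
        hidden.isEnabled = false
    }
    return scene
}
```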
1 reply · 0 boosts · 178 views · Oct ’24
Manipulate objects in Reality Composer Pro
The documentation at https://developer.apple.com/documentation/visionos/designing-realitykit-content-with-reality-composer-pro states: "Reality Composer Pro treats your imported assets as read-only." This is a huge obstacle for me, as I need to make multiple adjustments to the scene. I somehow managed to import one of my assets into the scene so that I could manipulate it directly, but now I can't figure out how I did it. As I have to prepare further assets and would like to do this directly in Reality Composer Pro, I'm looking for a way to actually load them into the scene. Any idea how this can be done?
5 replies · 0 boosts · 288 views · Oct ’24
Make subdivision surfaces work in Reality Composer Pro
Information is light on the new subdivision-surface support for USD models in RealityKit, and so far I have been unable to get one of my models to actually subdivide within Reality Composer Pro or Quick Look (or when viewing on Vision Pro). I've exported a few test models from Houdini and verified that they contain 'uniform token subdivisionScheme = "catmullClark"'. I've started with some very lightweight, basic meshes, but when viewed they simply look like polygonal meshes; no subdividing occurs at runtime. Is there a trick to getting them to actually smooth out?
5 replies · 0 boosts · 271 views · Oct ’24
Quick Look viewer (3D/AR) breaking in-app
We have a native iOS app that supports the upload and display of USDZ files. It has been working well since the beta (late 2022) and live launch (late 2023) until now. Recently, however, we have had reports from some users, at least those on Max-model phones (iPhone 14 and 15 Pro Max). When they tap to open a 3D file, the Quick Look player is triggered as expected. But for affected users the controls along the top of the player (the X/close button, the AR | Object toggle, and the share button) move too high up the screen and get stuck, untappable, behind the phone's status bar (time, camera cutout, connection, battery). This means that when they open a USDZ file in AR or 3D view, they have to force-quit the app to get out of it again.

This doesn't happen when they open a USDZ file from Files, Dropbox, etc. on the same phone (which also uses the Quick Look player). The controls only move up and get stuck when launching a USDZ from within our app. I'm at a loss to figure out what might be causing this on some phones but not others, and only when opening a USDZ file from our app. So far we have replicated the issue on a single iPhone 14 Pro Max and a 15 Pro Max, both running iOS 18+. We have tested other 15 Pro Maxes on the same OS, as well as Pros, regular iPhones, and minis, and they do not show the issue.

You would think that a USDZ file is a USDZ file, and that your iPhone knows what to do with it and opens it in the Quick Look player regardless of where the file is opened from. Why would the navigation items move when the USDZ file is opened from within our app, and why only for some users? We will continue to troubleshoot and test, but I wanted to throw this out to the community in case anyone has experienced it or has theories that would expedite our testing. Your thoughts are most appreciated! Here is a video showing the expected (correct) behaviour: https://www.dropbox.com/scl/fi/0sp8s4opaf2m4gukkcbrk/How-opening-a-USDZ-should-behave_correct-behaviour.MP4?rlkey=tzzau9x91mwox66gsgguryhep&st=qiykmne9&dl=0 and a screenshot is attached below of what is happening on one affected user's iPhone 15 Pro Max.
1 reply · 0 boosts · 274 views · Oct ’24
Apple's Object Capture
We are currently using Apple's Object Capture module and wonder whether it would be possible to collect the following data:
- Device information
- Current translation / rotation
- Focal length embedded in the image headers
- GPS localisation information
- Information about the exposure time
- White balances and the color correction matrices

We also have two additional questions:
- Is there an option to block close-up accommodation of the camera?
- Is there a way for the Object Capture module to take a video instead of a series of pictures?
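On the metadata side, a minimal sketch (assuming the captured shots end up as ordinary HEIC/JPEG files whose URL you have; whether Object Capture actually writes all of these fields is exactly the open question above) of reading focal length, exposure time, and GPS data from an image header with ImageIO:

```swift
import Foundation
import ImageIO

// Print EXIF and GPS metadata embedded in one captured image.
// The URL is a placeholder; point it at an image produced by the capture session.
func printCaptureMetadata(at url: URL) {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any]
    else { return }

    if let exif = properties[kCGImagePropertyExifDictionary] as? [CFString: Any] {
        print("Focal length:", exif[kCGImagePropertyExifFocalLength] ?? "n/a")
        print("Exposure time:", exif[kCGImagePropertyExifExposureTime] ?? "n/a")
    }
    if let gps = properties[kCGImagePropertyGPSDictionary] as? [CFString: Any] {
        print("Latitude:", gps[kCGImagePropertyGPSLatitude] ?? "n/a",
              "Longitude:", gps[kCGImagePropertyGPSLongitude] ?? "n/a")
    }
}
```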
1 reply · 0 boosts · 266 views · Oct ’24
QuickLook and .heic
I"m trying to create a simple app for my students that will display .heic images taken with a nikon and them converted to .heic in the photos app. My attempts only result in the QuickLook viewer showing the images in 2d. Any guidance? Here is my ContentView: import SwiftUI import QuickLook struct ContentView: View { @State private var showQuickLook = false @State private var previewURL: URL? = nil // State to store the URL for Quick Look var body: some View { VStack { Button("See it in 3D") { // Set the URL for the file from the bundle and toggle Quick Look presentation if let imageURL = Bundle.main.url(forResource: "Michelia_fuego", withExtension: "heic") { previewURL = imageURL // Set the preview URL if the image is found showQuickLook.toggle() // Toggle to trigger Quick Look presentation } else { print("File not found") // Print error if the file is missing } } .quickLookPreview($previewURL) // Binding to the URL } } } #Preview { ContentView() }
2 replies · 0 boosts · 271 views · Oct ’24
Can visionOS windows be smarter about following users?
I haven't found a way to programmatically position the main view in visionOS apps, which seems intentional. While this aligns with user-controlled window placement, it creates a challenge: as users move, they must constantly reposition the main window manually. A potential solution could be a feature that quickly brings the window to the user, perhaps via a custom gesture. This might improve user experience significantly. Given my current understanding of visionOS, I may be missing something. I'd appreciate any insights or alternative perspectives on this issue. Thoughts?
2 replies · 0 boosts · 281 views · Oct ’24
Generating procedural textures sample code error.
Error message: validateComputeFunctionArguments:1149: failed assertion `Compute Function(textureShader): Shader uses texture(texture[0]) as read-write, but hardware does not support read-write texture of this pixel format.'
OS: visionOS 2.1 (22N5548c) simulator.
Sample: https://developer.apple.com/documentation/visionos/generating-procedural-textures-in-visionos
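For reference, this assertion typically means the Metal device (here the simulator's) only offers tier-1 read-write texture support, which is limited to the R32 pixel formats. A small sketch, assuming the sample's compute pass can be adapted to a supported format, of querying the tier before picking one:

```swift
import Metal

// Report which pixel formats the current device can bind as read-write textures.
func logReadWriteTextureSupport(of device: MTLDevice) {
    switch device.readWriteTextureSupport {
    case .tier2:
        print("Tier 2: rgba8Unorm, rgba32Float, etc. can be used read-write")
    case .tier1:
        print("Tier 1: restrict read-write textures to r32Float / r32Uint / r32Sint")
    default:
        print("No read-write support: read from one texture and write to another")
    }
}
```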
1 reply · 0 boosts · 237 views · Oct ’24
How to Display Transparent GIF on a 3D Point in AR Environment?
I’m currently working on a project where I need to display a transparent GIF at a specific 3D point within an AR environment. I’ve tried various methods, including using RealityKit with UnlitMaterial and TextureResource, but the GIF still appears with a black background instead of being transparent. Does anyone have experience with displaying animated transparent GIFs on an AR plane or point? Any guidance on how to achieve true transparency for GIFs in ARKit or RealityKit would be appreciated.
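One direction that may help, as a rough sketch (it assumes each GIF frame has already been decoded to a CGImage that still carries its alpha channel; the exact blending/opacity API spellings should be double-checked against the current RealityKit SDK): build the texture from that frame and give an UnlitMaterial transparent blending, since with the default opaque blending the alpha channel is composited over black.

```swift
import CoreGraphics
import RealityKit

// Build an unlit, alpha-blended material from one decoded GIF frame.
// `frame` must be a CGImage that still contains its alpha channel.
func makeTransparentMaterial(from frame: CGImage) throws -> UnlitMaterial {
    let texture = try TextureResource.generate(from: frame,
                                               options: .init(semantic: .color))
    var material = UnlitMaterial()
    material.color = .init(tint: .white, texture: .init(texture))
    material.blending = .transparent(opacity: .init(floatLiteral: 1.0)) // honor texture alpha
    material.opacityThreshold = 0.0   // GIF transparency is 1-bit, so cutout-style alpha is acceptable
    return material
}
```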
2 replies · 0 boosts · 203 views · Oct ’24
HandTracking
Hi guys, I'm currently developing a game for the Vision Pro and I'm trying to figure out how hand tracking works so I can make a superpower appear when the user looks at their hand and widens it. But I'm really struggling to wrap my head around the whole concept and how to implement it in my code. Is there anything out there (other than the Apple docs), or anyone who could help shed some light on the whole idea and how I could actually implement it usefully? It would be much appreciated. Thanks!
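As a starting point, a rough sketch of the visionOS hand-tracking flow with ARKit (it assumes an open immersive space, granted hand-tracking authorization, and the required usage description; the thumb-to-little-finger distance threshold for an "open hand" is just an illustrative guess):

```swift
import ARKit
import simd

// Stream hand anchors and do a crude "open hand" check by measuring the
// distance between the thumb tip and the little-finger tip.
func trackHands() async throws {
    let session = ARKitSession()
    let handTracking = HandTrackingProvider()
    try await session.run([handTracking])

    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        guard anchor.isTracked, let skeleton = anchor.handSkeleton else { continue }

        let thumb = skeleton.joint(.thumbTip).anchorFromJointTransform.columns.3
        let little = skeleton.joint(.littleFingerTip).anchorFromJointTransform.columns.3
        let spread = simd_distance(SIMD3(thumb.x, thumb.y, thumb.z),
                                   SIMD3(little.x, little.y, little.z))

        if spread > 0.12 {   // placeholder threshold (~12 cm) for an "open" hand
            // Trigger the superpower effect for anchor.chirality (.left or .right).
        }
    }
}
```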
1 reply · 0 boosts · 220 views · Oct ’24
Synchronize 3D object animation inside a volume with SharePlay
I have been experimenting with experiences in which I would like to use SharePlay to allow the app to be used by multiple users. Currently I have achieved sharing a volume containing a Reality Composer Pro scene; the scene contains some entities with an animation. So far I have been able to correctly share the volume and its content, with the animation playing without problems, but once I activate SharePlay, different users see different moments of the animation, or no animation at all. Is there a way to synchronize animations between all users, no matter when someone joins the SharePlay session, other than communicating the animation time when someone joins?
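One possible approach, sketched below (the AnimationStartMessage type and the idea of broadcasting a wall-clock start date over GroupSessionMessenger are assumptions for illustration, not an official pattern): whoever starts playback broadcasts the start time, and every participant, including late joiners, seeks their local AnimationPlaybackController by the elapsed time. Clock skew between devices limits how tight this can be.

```swift
import Foundation
import GroupActivities
import RealityKit

// Broadcast to all participants when the shared animation (re)started.
struct AnimationStartMessage: Codable, Sendable {
    let startDate: Date
}

@MainActor
final class AnimationSync {
    private var messenger: GroupSessionMessenger?

    func configure(session: GroupSession<some GroupActivity>,
                   entity: Entity,
                   animation: AnimationResource) {
        let messenger = GroupSessionMessenger(session: session)
        self.messenger = messenger

        // Everyone (including late joiners) receives the start date and seeks.
        Task {
            for await (message, _) in messenger.messages(of: AnimationStartMessage.self) {
                let controller = entity.playAnimation(animation.repeat())
                // Seek by elapsed time; wrap by the clip duration for looping clips.
                controller.time = Date().timeIntervalSince(message.startDate)
            }
        }
    }

    // Called by the participant who starts playback.
    func start(entity: Entity, animation: AnimationResource) async {
        entity.playAnimation(animation.repeat())
        try? await messenger?.send(AnimationStartMessage(startDate: Date()))
    }
}
```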
0 replies · 0 boosts · 249 views · Oct ’24
Skysphere flickering with attachment at initial display of scene
There is a flickering and slight dimming occurring specifically on the skysphere, at initial load of the scene, when using an Attachment. This is observed in the simulator and on the real device. Since we cannot upload a video illustrating the undesirable behaviour, I have to describe how to set up the project for you to observe it. To replicate the issue, follow these steps:
1. Create a new visionOS app using the Xcode template, see image.
2. Configure the project to launch directly into an immersive space (set Preferred Default Scene Session Role to Immersive Space Application Session Role in Info.plist), see image.
3. Replace all Swift files with those you will find in the attached texts.
4. Add the skysphere image asset Skydome_8k found in the Apple sample app Presenting an artist’s scene.
5. Launch the app in debug mode via Xcode onto the AVP device or simulator.
6. Continuously open and dismiss the skysphere by pressing the Open Skysphere and Close buttons.
7. Observe the skysphere flicker and dim upon display of the skysphere.
The current workaround is commented in the file ThreeSixtySkysphereRealityView at lines 65, 70, 71, and 72. Uncomment these lines and the flickering and dimming do not occur. Are we using attachments wrongly? Is this behavior known and documented? Or is there really a bug in visionOS?
Attached files: AppModel, InitialImmersiveView, MainImmersiveView, TestSkysphereAttachmentFlickerApp, ThreeSixtySkysphereRealityView
1 reply · 0 boosts · 216 views · Oct ’24
Issue with tracking multiple images using ARKit on visionOS
We are using the ARKit image tracking feature on visionOS 2.0 with three pre-registered images. The image tracking works, but only one image is actively tracked at a time. When more than one target image is visible to the camera, it has difficulty detecting and tracking the other images. Is this the expected behavior in visionOS, or is there something we need to do to resolve this issue?
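For reference, a minimal sketch of the setup being described (the resource group name "AR Resources" is a placeholder), which may help others reproduce whether more than one anchor is ever tracked simultaneously:

```swift
import ARKit

// Track several pre-registered reference images and log what is detected.
func runImageTracking() async throws {
    // "AR Resources" is a placeholder name for the reference-image group
    // in the asset catalog that holds the three registered images.
    let referenceImages = ReferenceImage.loadReferenceImages(inGroupNamed: "AR Resources")

    let session = ARKitSession()
    let imageTracking = ImageTrackingProvider(referenceImages: referenceImages)
    try await session.run([imageTracking])

    for await update in imageTracking.anchorUpdates {
        let anchor = update.anchor
        print(update.event, anchor.referenceImage.name ?? "unnamed", "tracked:", anchor.isTracked)
    }
}
```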
0 replies · 0 boosts · 210 views · Oct ’24