Discuss Spatial Computing on Apple Platforms.

Post · Replies · Boosts · Views · Activity

Determining if an ObjectAnchor is currently observed
I'm writing code using ObjectAnchor for visionOS. If an object is tracked and then becomes not visible (either because the user looked in a different direction, or because the tracked object was occluded by another object), it is still tracked and you keep getting anchor updates (i.e., object permanence). For my application, it would be very helpful if I could determine whether the object is currently being observed, or whether it is not currently observed and just assumed to be in the same location as seen previously. ObjectAnchor.isTracked just seems to indicate whether it is getting anchor updates. I don't see anything in ObjectAnchor or AnchorUpdate that would let me determine whether the object is currently observed. Does anyone know of a way to do this, or would this be a feature request?
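For context, a minimal sketch of how such anchor updates are typically consumed with ARKit's object tracking on visionOS; the reference-object loading is elided, and this only illustrates the behavior described above, not a solution:

import ARKit

// Sketch: consume object-anchor updates and inspect isTracked.
// As noted above, isTracked appears to reflect whether updates are still being
// delivered, not whether the object is currently observed by the cameras.
func trackObjects(referenceObjects: [ReferenceObject]) async throws {
    let session = ARKitSession()
    let objectTracking = ObjectTrackingProvider(referenceObjects: referenceObjects)
    try await session.run([objectTracking])

    for await update in objectTracking.anchorUpdates {
        let anchor = update.anchor
        print(update.event, anchor.isTracked, anchor.originFromAnchorTransform)
    }
}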
Replies: 2 · Boosts: 0 · Views: 155 · Activity: 3w
Synchronizing Physical Properties of EntityEquipment in TableTopKit
I am working on adding synchronized physical properties to EntityEquipment in TableTopKit, allowing seamless coordination during GroupActivities sessions between players.

Current Approach and Limitations

I have tried setting EntityEquipment's state to DieState and treating it as a TossableRepresentation object. This approach achieves basic physical properties synchronized across players. However, it has several limitations:

No Collision Detection Between Dice: Multiple dice do not collide with each other.
Shape Limitations: Custom shapes, like parallelepipeds, cannot be configured.

Below is my existing code for base EntityEquipment without physical properties:

struct CubeWithPhysics: EntityEquipment {
    let id: ID
    let entity: Entity
    var initialState: BaseEquipmentState

    init(id: ID, entity: Entity) {
        self.id = id
        self.entity = entity
        initialState = .init(parentID: .tableID, pose: .init(position: .zero, rotation: .zero), entity: self.entity)
    }
}

I’d appreciate any guidance on the recommended approach to adding synchronized physical properties to EntityEquipment.
Replies: 4 · Boosts: 4 · Views: 363 · Activity: 3w
Reality Composer Pro timelines management, but using code
Is it possible to manage the behavior of a timeline entirely from code? I am exploring the Compose interactive 3D content in Reality Composer Pro sample project after seeing the related video, but the example only shows the use of Behaviors in RCP to activate timeline actions. I was wondering if it is possible to somehow retrieve some kind of timeline controller that gives me access to its state, just as AnimationPlaybackController does for single animations. What I would like to achieve is being able to play/pause them and retrieve the current timestamp, in order to allow synchronization between different users over SharePlay.
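One possible workaround is sketched below, assuming the RCP timeline content also surfaces as an animation on the loaded entity; whether timelines actually appear in availableAnimations is an assumption, not documented behavior:

import RealityKit

// Sketch: drive playback through an AnimationPlaybackController.
func startTimeline(on entity: Entity) -> AnimationPlaybackController? {
    // Assumption: the timeline is exposed via availableAnimations on the RCP entity.
    guard let animation = entity.availableAnimations.first else { return nil }
    return entity.playAnimation(animation, transitionDuration: 0, startsPaused: false)
}

// For SharePlay-style synchronization, the controller offers:
// controller.pause()              // pause locally
// let timestamp = controller.time // read the current playback time to share with peers
// controller.resume()             // resume once peers agree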
Replies: 1 · Boosts: 0 · Views: 210 · Activity: Oct ’24
Xcode 16 crashes my Vision Pro Object Tracking App
I created an Object & Hand Tracking app based on the sample code released here by Apple: https://developer.apple.com/documentation/visionos/exploring_object_tracking_with_arkit The app worked great and everything was fine, but I realized I was coding on Xcode 16 beta 3, so I installed the latest Xcode 16 from the App Store and tested my app there, and it completely crashed. No idea why. Here is the console output:

dyld[1457]: Symbol not found: _$ss13withTaskGroup2of9returning9isolation4bodyq_xm_q_mScA_pSgYiq_ScGyxGzYaXEtYas8SendableRzr0_lF
Referenced from: <3AF14FE4-0A5F-381C-9FC5-E2520728FC65> /private/var/containers/Bundle/Application/F74E88F2-874F-4AF4-9D9A-0EFB51C9B1BD/Hand Tracking.app/Hand Tracking.debug.dylib
Expected in: <2F158065-9DC8-33D2-A4BF-CF0C8A32131B> /usr/lib/swift/libswift_Concurrency.dylib

It was working perfectly fine on Xcode 16 beta 3, which makes me think it's an Xcode 16 issue, but I have no idea how to fix this. I also installed Xcode 16.2 beta (the newest beta), but I get the same error. Please help if anyone knows what is wrong!
Replies: 1 · Boosts: 0 · Views: 253 · Activity: 3w
RealityView Limits on visionOS
I'm asking myself what the limits of RealityView are. For example, is it possible to place an entity at position (x = 800 m, y = 0, z = -900 m)? What happens if I walk from my origin (0, 0, 0) to that point? Will I see the entity then? Does anyone know where the limits are?
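For reference, a minimal sketch of the placement being asked about; the entity and material are arbitrary, and whether rendering behaves well at that distance is exactly the open question:

import SwiftUI
import RealityKit

struct FarEntityView: View {
    var body: some View {
        RealityView { content in
            let sphere = ModelEntity(
                mesh: .generateSphere(radius: 5),  // large radius so it has a chance of being visible from afar
                materials: [SimpleMaterial(color: .red, isMetallic: false)]
            )
            sphere.position = SIMD3<Float>(800, 0, -900)  // roughly 1.2 km from the origin
            content.add(sphere)
        }
    }
}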
Replies: 1 · Boosts: 0 · Views: 138 · Activity: Oct ’24
WebXR Immersive AR Support & DOM Overlays
As mentioned in https://forums.developer.apple.com/forums/thread/756736?answerId=810096022#810096022, is there any update on full support for the WebXR AR Module, which should enable immersive-ar mode? Are features such as DOM overlays and WebGPU bindings on the roadmap? Is it possible to capture stereoscopic video, either internally or externally or via AirPlay, for debugging purposes? Thanks
Replies: 0 · Boosts: 2 · Views: 254 · Activity: 4w
How to convert a RealityKit SpatialTapGesture value to an entity coordinate on iOS?
I have an app with a visionOS target, and I want to add an iOS target. Both are based on RealityKit. I want to use a SpatialTapGesture to get the tap coordinate local to the tapped entity. In visionOS this is easy:

SpatialTapGesture(coordinateSpace: .local)
    .targetedToAnyEntity()
    .onEnded { tap in
        let entity = tap.entity
        let localPoint3D = tap.convert(tap.location3D, from: .local, to: entity)
        // …
    }

However, according to the docs, the convert function seems to exist only on visionOS, not on iOS. So how can I do this conversion on iOS? PS: This was already posted on Stack Overflow without success. There, I tried to find a workaround, but I failed.
Replies: 8 · Boosts: 0 · Views: 287 · Activity: Oct ’24
Spatial streaming from iPhone
Hi, I am trying to stream spatial video in real time from my iPhone 16. I am able to record spatial video as a file output using:

let videoDeviceOutput = AVCaptureMovieFileOutput()

However, when I try to grab the raw sample buffer, it doesn't include any spatial information:

let captureOutput = AVCaptureVideoDataOutput()

// when initializing the camera
session.addOutput(captureOutput)
captureOutput.setSampleBufferDelegate(self, queue: sessionQueue)

// finally
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    // use sample buffer (but no spatial data available here)
}

Is this how it's supposed to work, or am I missing something? This video: https://developer.apple.com/videos/play/wwdc2023/10071 gives us a clue towards setting up spatial streaming, and I've got the backend all ready for 3D HLS streaming. Now I am only stuck on how to send the video stream to my server.
Replies: 1 · Boosts: 0 · Views: 324 · Activity: Oct ’24
Synchronizing Multi-User AR Experiences on Apple Vision Pro
Hello Developers,

I am currently in the initial planning stages of my bachelor thesis in computer science, where I will be developing an application in collaboration with a manufacturer of large-scale machinery. One of the core features I aim to implement is the ability for multiple Apple Vision Pro users to view the same object in augmented reality simultaneously, each from their respective positions relative to the object.

I am still exploring how best to achieve this feature. My initial approach involves designating one device as the host of a "room" within the application, allowing other users to join. If I can accurately determine the relative positions of all users to the host device, it should be possible to display the AR content correctly in terms of angle, size, and location for each user.

Despite my research, I haven't found much information on similar projects, and I would appreciate any insights or suggestions. Specifically, I am curious about common approaches for synchronizing AR experiences across multiple devices. Given that the Apple Vision Pro does not have a GPS sensor, I am also looking for alternative methods to precisely determine the positions of multiple devices relative to each other.

Any advice or shared experiences would be greatly appreciated!

Best regards,
Revin
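One common approach for the synchronization part is SharePlay via GroupActivities: one participant hosts the session and everyone exchanges entity poses over a GroupSessionMessenger. The sketch below is only a starting point under that assumption; the activity, the PoseMessage type, and how a shared physical origin is established are all placeholders:

import GroupActivities
import simd

// Hypothetical activity for a shared machine-inspection session.
struct MachineInspectionActivity: GroupActivity {
    var metadata: GroupActivityMetadata {
        var meta = GroupActivityMetadata()
        meta.title = "Inspect Machine"
        meta.type = .generic
        return meta
    }
}

// Hypothetical pose message; the quaternion is sent as raw components because simd_quatf is not Codable.
struct PoseMessage: Codable {
    var position: SIMD3<Float>
    var rotation: SIMD4<Float>
}

func join(session: GroupSession<MachineInspectionActivity>) {
    let messenger = GroupSessionMessenger(session: session)
    session.join()

    // Apply poses received from other participants to the shared content.
    Task {
        for await (message, _) in messenger.messages(of: PoseMessage.self) {
            // Update the shared entity's transform here.
            _ = message
        }
    }
}

// Sending the local pose (e.g. from the host):
// try await messenger.send(PoseMessage(position: p, rotation: q))

Messaging alone does not align everyone to the physical machine; anchoring each device to a common reference on the machine itself (for example an image or a tracked reference object) is one way to get relative positions without GPS.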
Replies: 4 · Boosts: 0 · Views: 239 · Activity: Oct ’24
Is dismissWindow actually asynchronous?
On visionOS, I have discovered that if dismissWindow is followed immediately by a call to openWindow, the new window does not open where the user is looking. Instead, it appears at the same location as the dismissed window. However, if I open the new window after a small delay, or after UIScene's willDeactivateNotification, the new window correctly opens in front of the user. (I tested this within an open immersive space.) Does this imply that dismissWindow is actually asynchronous, in the sense that it requires extra time to reset certain internal state before the next openWindow can be called? What is the best practice for closing a window and then opening a new window in front of the user's current head position?
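A sketch of the delay-based workaround described above; the window identifiers and the delay value are placeholders, and whether any fixed delay is reliable is part of the question:

import SwiftUI

struct WindowSwitcher: View {
    @Environment(\.dismissWindow) private var dismissWindow
    @Environment(\.openWindow) private var openWindow

    var body: some View {
        Button("Switch windows") {
            dismissWindow(id: "oldWindow")   // placeholder window id
            Task {
                // Give the dismissal time to settle before opening the next window.
                try? await Task.sleep(for: .milliseconds(250))
                openWindow(id: "newWindow")  // placeholder window id
            }
        }
    }
}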
Replies: 0 · Boosts: 0 · Views: 186 · Activity: Oct ’24
How to set the AttractionCenter for a ParticleEmitterComponent in a System with real time updates
I am trying to achieve an effect such that the particles of a particle system are attracted to my hand entity. The hand entity is essentially an AnchorEntity that is tracking my right hand.

let particleEmitterEntities = context.entities(matching: particleEmitterQuery, updatingSystemWhen: .rendering)

for particleEmitterEntity in particleEmitterEntities {
    if var particleEmitter = particleEmitterEntity.components[ParticleEmitterComponent.self] {
        // Trying to get the world-space position of the hand
        // (I also tried relative to particleEmitterEntity).
        particleEmitter.mainEmitter.attractionCenter = rightHandEntity.position(relativeTo: nil)
        particleEmitterEntity.components[ParticleEmitterComponent.self] = particleEmitter
    } else {
        fatalError("Cannot find particle emitter")
    }
}

The particle attraction center doesn't seem to update. Another issue I am noticing: my particle system often doesn't show the particle image and just renders a placeholder square when I do this; when I comment this code out, I get the right particle image. I believe this is due to the number of times this loop runs to update the position of the attraction center. What is the right way to do an effect where the particles are attracted to my hand?
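For context, a sketch of the System that a loop like the one above would live in; the query is standard, but the way the hand entity is looked up here (a named-entity search) is purely a placeholder assumption:

import RealityKit

struct HandAttractionSystem: System {
    static let particleEmitterQuery = EntityQuery(where: .has(ParticleEmitterComponent.self))

    init(scene: RealityKit.Scene) {}

    func update(context: SceneUpdateContext) {
        // Placeholder: a real implementation needs a reference to the hand AnchorEntity.
        guard let rightHandEntity = context.scene.findEntity(named: "RightHand") else { return }

        for emitterEntity in context.entities(matching: Self.particleEmitterQuery,
                                              updatingSystemWhen: .rendering) {
            guard var emitter = emitterEntity.components[ParticleEmitterComponent.self] else { continue }
            emitter.mainEmitter.attractionCenter = rightHandEntity.position(relativeTo: nil)
            emitterEntity.components[ParticleEmitterComponent.self] = emitter
        }
    }
}

// Register once, for example at app launch:
// HandAttractionSystem.registerSystem()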
Replies: 3 · Boosts: 0 · Views: 244 · Activity: Oct ’24
Quick Look viewer (3D/AR) breaking in-app
We have a native iOS app that supports the upload and display of USDZ files. It has been working great since beta (late 2022) and live launch (late 2023) until now. But recently we had reports from some users, on Max model phones (14 + 15 Pro Max) at least. When they tap and launch a 3D file, the Quick Look player is triggered. So far so good. But for affected users the controls along the top of the player, X (close), AR | Object (toggle), and the share button, are moving too high up the phone screen and getting stuck (untappable) behind the phone's top status bar (time, camera bug, connection, battery). This means that when they open a USDZ file in AR or 3D view they have to hard-close the app to get out of it again.

This doesn't happen when they open a USDZ file from Files, Dropbox etc. on their phone (which also uses the Quick Look player). The controls only move up and get stuck when launching a USDZ from within our app. I'm at a loss to figure out what might be causing this on some phones and not others, and why only when opening a USDZ file from our app! So far we have replicated this issue on a single iPhone 14 Pro Max and a 15 Pro Max, both running iOS 18+. We have tested on other 15 Pro Max devices on the same OS, plus Pros, regular iPhones, and minis, and are not experiencing the issue.

You would think that a USDZ file is a USDZ file and that your iPhone knows what to do with it and opens it in the Quick Look player regardless of where you open the file from. Why would the navigation items be moving if you open the USDZ file from within our app, and why only for some select users? We will continue to troubleshoot and test, but I wanted to throw this out to the community in case anyone has experienced this or has any theories that would expedite our testing. Your thoughts are most appreciated!

Here is a video showing the expected (correct) behaviour: https://www.dropbox.com/scl/fi/0sp8s4opaf2m4gukkcbrk/How-opening-a-USDZ-should-behave_correct-behaviour.MP4?rlkey=tzzau9x91mwox66gsgguryhep&st=qiykmne9&dl=0 and a screenshot attached below of what is happening on one of the affected users' iPhone 15 Pro Max.
Replies: 1 · Boosts: 0 · Views: 268 · Activity: Oct ’24
USDZ with Blend Shapes Workflow Recommendations
Hi, since RealityKit 4 now supports Blend Shapes, I was wondering if there are any workflow or tooling recommendations for baking/exporting them into a USDZ. Are Blender or Cinema 4D capable of doing that out of the box? Should we look into NVIDIA Omniverse (https://docs.omniverse.nvidia.com/connect/latest/blender/manual.htm)? So far this topic seems very sparsely documented and I would appreciate any hints. Thank you!
Replies: 2 · Boosts: 0 · Views: 357 · Activity: Oct ’24
RealityKit Subdivide
In the Discover RealityKit APIs for iOS, macOS, and visionOS presentation, there was a slide at the end highlighting new features not covered in the video. One of them was surface subdivision, but I have not been able to find any documentation or APIs that support this feature. Does anyone have any further details on how this works in RealityKit?
Replies: 4 · Boosts: 1 · Views: 590 · Activity: Jul ’24
Make subdivision surfaces work in Reality Composer Pro
Information is light on the new subdivision support for USD models in RealityKit, and I have so far been unable to get one of my models to actually subdivide within Reality Composer Pro or Quick Look (or when viewing on Vision Pro). I've exported a few test models from Houdini and verified that they contain uniform token subdivisionScheme = "catmullClark". I've started with some very lightweight, basic meshes. But, when viewing, they simply look like polygonal meshes. There's no subdividing occurring at runtime when viewing the models. Is there a trick to getting them to actually smooth out?
Replies: 5 · Boosts: 0 · Views: 269 · Activity: Oct ’24
Understanding the Distance for Physical Object Visibility in Vision Pro Immersive System
I understand that the system helps maintain user comfort by automatically adjusting the opacity of content in certain situations, like when someone moves too quickly or gets too close to a physical object. The content in front of them dims briefly to allow a clearer view of their surroundings. I'd like to know the specific distance at which the system begins to show the physical object, or what criteria are used for this adjustment.
Replies: 0 · Boosts: 0 · Views: 173 · Activity: Oct ’24