Reality Composer Pro


Prototype and produce content for AR experiences using Reality Composer Pro.



Help Building spatial video app using Quick Look preview
Hello everyone, I am looking to build a simple app for displaying a spatial video using the Quick Look preview API. I have been following this video, which is useful: https://developer.apple.com/videos/play/wwdc2024/10166/. I am new to building apps in Xcode, and I could do with some advice on how to build the rest of the project shown in that video. Is there source code or an example project available anywhere for an app that uses the Quick Look preview API?
Replies: 0 · Boosts: 0 · Views: 352 · Activity: Jul ’24
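
A minimal sketch of the Quick Look route on visionOS using the QuickLook framework's PreviewApplication API; the bundled video name "MySpatialVideo.mov" is hypothetical, and you would substitute your own spatial (MV-HEVC) file:

    import SwiftUI
    import QuickLook

    struct SpatialVideoPreviewView: View {
        var body: some View {
            Button("Preview Spatial Video") {
                // Hypothetical bundled resource; replace with your own spatial video.
                guard let url = Bundle.main.url(forResource: "MySpatialVideo",
                                                withExtension: "mov") else { return }
                // Hands the file to the system Quick Look previewer, which
                // presents spatial video much like the Photos app does.
                _ = PreviewApplication.open(urls: [url], selectedURL: nil)
            }
        }
    }
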
Timeline Animation in Reality Composer Pro
I'm using Reality Composer Pro Version 2.0 (448.0.10.0.2), available in Xcode 16 beta 4. When adding an animation from the Animation Library component on my armature to a timeline, the animation does not 'freeze' on the last frame. Is there a way to 'freeze' the first or last frames when adding animations to the timeline? And how should I expect the first and last keys of my animations to behave with the default rest pose in the imported USD file?
Replies: 1 · Boosts: 0 · Views: 608 · Activity: Jul ’24
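
Not from the thread, but a hedged RealityKit-side sketch of holding a clip's final pose at playback time by re-wrapping it with fillMode .forwards; whether AnimationResource exposes the clip's definition this way on your beta, and whether this maps onto the Reality Composer Pro timeline behavior, are assumptions:

    import RealityKit

    func playAndHold(_ entity: Entity) {
        guard let clip = entity.availableAnimations.first else { return }
        // Re-wrap the clip so it holds its last frame instead of
        // snapping back to the rest pose when playback ends.
        if let held = try? AnimationResource.generate(
            with: AnimationView(source: clip.definition, fillMode: .forwards)
        ) {
            entity.playAnimation(held)
        }
    }
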
RealityKit scene with the Entity Component System
I'm following the WWDC session on interactive 3D content in Reality Composer Pro and Apple's documentation:
https://developer.apple.com/wwdc24/10102
https://developer.apple.com/documentation/realitykit/implementing-systems-for-entities-in-a-scene#Retrieve-entities-with-an-entity-query
However, this simple code declaring a dummy component and system produces a compile error:
/Users/Workspaces/repository/Packages/RealityKitContent/Sources/RealityKitContent/RobotComponent.swift:18:24 Static property 'query' is not concurrency-safe because non-'Sendable' type 'EntityQuery' may have shared mutable state

    class MySystem: System {
        // Define a query to return all entities with a MyComponent.
        private static let query = EntityQuery(where: .has(MyComponent.self))

        // Initializer is required. Use an empty implementation if there's no setup needed.
        required init(scene: Scene) { }

        // Iterate through all entities containing a MyComponent.
        func update(context: SceneUpdateContext) {
            for entity in context.entities(
                matching: Self.query,
                updatingSystemWhen: .rendering
            ) {
                // Make per-update changes to each entity here.
            }
        }
    }

I'm using Xcode beta 3 and the project targets visionOS 2.
Replies: 1 · Boosts: 0 · Views: 466 · Activity: Jul ’24
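
A hedged sketch of one common workaround for this beta-era diagnostic: under Swift 6 strict concurrency a non-Sendable stored static is flagged, so either construct the query inside update(context:) or opt the constant out of the check with nonisolated(unsafe) until EntityQuery gains Sendable:

    import RealityKit

    struct MyComponent: Component {}

    class MySystem: System {
        // Workaround: opt this constant out of the concurrency check.
        // The query is created once and never mutated, so this is
        // safe in practice.
        nonisolated(unsafe) private static let query =
            EntityQuery(where: .has(MyComponent.self))

        required init(scene: Scene) { }

        func update(context: SceneUpdateContext) {
            for entity in context.entities(matching: Self.query,
                                           updatingSystemWhen: .rendering) {
                // Per-update changes to each entity go here.
            }
        }
    }
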
LowLevelMesh: Triangle Colors
I am trying to follow the documentation with the beta version of visionOS and the new RealityKit LowLevelMesh construct (https://developer.apple.com/documentation/realitykit/lowlevelmesh) that draws a triangle. Although the code specifies different colors for each of the three vertices, the triangle renders in white. I believe the missing link may be a shader graph material, but because I will be drawing millions of triangles, with colors defined at the nodes and interpolated over the area of each triangle, I want to make sure it is efficient, whether with shader graph materials or perhaps Metal.

In an earlier version of the app I'm working on, I successfully used a shader graph material with MeshDescriptor.primitives as polygons for tetrahedrons. However, that is inefficient for more than 1,000 tetrahedrons (and crashes), so I'm trying to use the new LowLevelMesh instead (with each tetrahedron split into four triangles). However, I can't get very far using the example code from the documentation (which produces the white triangles), even trying the default shader graph (GridMaterial), without getting quite a few error messages. I fix the errors with the suggested fixes and then get new ones (whack-a-mole) until it all seems to be broken.

So in addition to my general question of shader graph vs. Metal for a LowLevelMesh, a concrete example of using a shader graph material with LowLevelMesh would be most appreciated! Thanks.
Replies: 3 · Boosts: 0 · Views: 668 · Activity: Jul ’24
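
One hedged reading of the white-triangle symptom: LowLevelMesh can carry a per-vertex color attribute, but that color only reaches the screen if the bound material actually samples it (for example, a Reality Composer Pro shader graph that feeds the color geometric property into the surface's base color); RealityKit's stock materials ignore it. A sketch of declaring the attribute, modeled on the documentation's triangle sample, with illustrative names:

    import Metal
    import RealityKit

    // Vertex layout: position plus a per-vertex color.
    struct ColoredVertex {
        var position: SIMD3<Float>
        var color: SIMD4<Float>
    }

    func makeColoredTriangleMesh() throws -> LowLevelMesh {
        var descriptor = LowLevelMesh.Descriptor()
        descriptor.vertexCapacity = 3
        descriptor.indexCapacity = 3
        descriptor.indexType = .uint32
        descriptor.vertexAttributes = [
            .init(semantic: .position, format: .float3,
                  offset: MemoryLayout<ColoredVertex>.offset(of: \.position)!),
            // The .color semantic makes the attribute available to materials
            // that read vertex color; it does nothing on its own.
            .init(semantic: .color, format: .float4,
                  offset: MemoryLayout<ColoredVertex>.offset(of: \.color)!)
        ]
        descriptor.vertexLayouts = [
            .init(bufferIndex: 0, bufferOffset: 0,
                  bufferStride: MemoryLayout<ColoredVertex>.stride)
        ]
        return try LowLevelMesh(descriptor: descriptor)
    }
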
Reality Composer Pro - Audio not working for .USDZ
Based on info online, I'm under the impression we can add spatial audio to USDZ files using Reality Composer Pro; however, I've been unable to hear this audio outside of the preview audio in the scene inspector. Attached is a screenshot showing how I've laid out the scene. I see the 3D object fine on mobile and Vision Pro, but can't get the audio to loop. I have ensured the audio file in the scene is linked as the resource for the spatial audio node. Am I off in setting this up, is it broken, or is this simply not a feature that can be saved back to USDZ? In the following link they note their USDZ could "play an audio track while viewing the model", but the model isn't there anymore. Can someone confirm where I might be off, please?
Replies: 3 · Boosts: 0 · Views: 716 · Activity: Jul ’24
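
As a cross-check against the USDZ round trip, a hedged RealityKit sketch of attaching looping spatial audio to an entity in code; "AmbientLoop.wav" is a hypothetical resource name:

    import RealityKit

    func attachLoopingAudio(to entity: Entity) async throws {
        // Hypothetical audio file in the app or RealityKitContent bundle;
        // shouldLoop makes the clip repeat indefinitely.
        let resource = try await AudioFileResource(
            named: "AmbientLoop.wav",
            configuration: .init(shouldLoop: true)
        )
        // Mark the entity as a spatial (positional) audio source.
        entity.components.set(SpatialAudioComponent())
        _ = entity.playAudio(resource)
    }
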
Cloud Service for Apple Vision Pro App
For all the AVP devs out there: what cloud service are you using to load content in your app with extremely low latency? I tried using CloudKit and it did not work well at all; latency was super bad :/ Firebase looks like the most promising option at this point? I wish Apple would create an ultra-low-latency cloud service for streaming high-quality content such as USDZ files and scenes made in Reality Composer Pro.
Replies: 1 · Boosts: 0 · Views: 515 · Activity: Jul ’24
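
Whichever backend wins, the client-side load path is the same: download to a local file, then load it with RealityKit, caching on disk so repeat views skip the network entirely. A hedged sketch; the rename scheme is illustrative and the URL is whatever your CDN serves:

    import Foundation
    import RealityKit

    func loadRemoteUSDZ(from remote: URL) async throws -> Entity {
        // Download to a temporary file; Entity(contentsOf:) expects a
        // file URL with a recognized extension, hence the rename.
        let (tempFile, _) = try await URLSession.shared.download(from: remote)
        let usdzFile = tempFile.deletingPathExtension().appendingPathExtension("usdz")
        try? FileManager.default.removeItem(at: usdzFile) // tolerate re-downloads
        try FileManager.default.moveItem(at: tempFile, to: usdzFile)
        return try await Entity(contentsOf: usdzFile)
    }
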