I'm following the WWDC session on interactive 3D content in Reality Composer Pro and Apple's documentation:
https://developer.apple.com/wwdc24/10102
https://developer.apple.com/documentation/realitykit/implementing-systems-for-entities-in-a-scene#Retrieve-entities-with-an-entity-query
However, this simple code declaring a dummy component and system produces a compile error:
/Users/Workspaces/repository/Packages/RealityKitContent/Sources/RealityKitContent/RobotComponent.swift:18:24 Static property 'query' is not concurrency-safe because non-'Sendable' type 'EntityQuery' may have shared mutable state
import RealityKit

// Define a dummy component.
struct MyComponent: Component, Codable {}

// Define a system that operates on entities with a MyComponent.
class MySystem: System {
    // Define a query to return all entities with a MyComponent.
    private static let query = EntityQuery(where: .has(MyComponent.self))

    // Initializer is required. Use an empty implementation if there's no setup needed.
    required init(scene: Scene) { }

    // Iterate through all entities containing a MyComponent.
    func update(context: SceneUpdateContext) {
        for entity in context.entities(
            matching: Self.query,
            updatingSystemWhen: .rendering
        ) {
            // Make per-update changes to each entity here.
        }
    }
}
I'm using Xcode beta 3 and the project targets visionOS 2.
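For what it's worth, the diagnostic is the strict-concurrency check flagging a shared static property of a non-Sendable type. A sketch of one workaround (my assumption, not an official fix) is to build the query inside update(context:) instead of storing it in a static let; marking the property nonisolated(unsafe) also silences the diagnostic, at the cost of opting out of the check.

import RealityKit

// Sketch: construct the query per update so there is no shared static state.
// MyComponent is the dummy component from the snippet above.
class MySystemWorkaround: System {
    required init(scene: Scene) { }

    func update(context: SceneUpdateContext) {
        let query = EntityQuery(where: .has(MyComponent.self))
        for entity in context.entities(matching: query, updatingSystemWhen: .rendering) {
            // Make per-update changes to each entity here.
            _ = entity
        }
    }
}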
Reality Composer Pro
Prototype and produce content for AR experiences using Reality Composer Pro.
Given that one can add custom components and expose them via Reality Composer Pro, how do I implement my components/system so that when I make a parameter change, it gets applied to the entity in the RCP viewport?
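For context, here is a minimal sketch of the custom-component pattern RCP can surface in its inspector; SpinComponent and its speed property are hypothetical names. Codable stored properties of a registered component appear as editable parameters in the inspector. (Whether a custom system's updates run live inside the RCP viewport is exactly the open question here.)

import RealityKit

// Hypothetical component: Codable stored properties become editable
// parameters in the Reality Composer Pro inspector.
public struct SpinComponent: Component, Codable {
    // Exposed in the inspector; read by a system at runtime.
    public var speed: Float = 1.0
    public init() {}
}

// Register once at startup so both the editor and the runtime know about it:
// SpinComponent.registerComponent()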
I have an Entity exported from Blender. After loading it in a RealityView, the "Body" and "Mesh" entities have no ModelComponent, but they do have material binding references. How can I update their materials?
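A minimal sketch, assuming the ModelComponent actually lives on a descendant entity (common with Blender/USD exports): walk the hierarchy and replace materials wherever a ModelComponent is found. The entity name "Body" and the SimpleMaterial in the usage comment are placeholders.

import RealityKit

// Walk the hierarchy; wherever a ModelComponent exists, swap its materials.
func replaceMaterials(in root: Entity, with material: any Material) {
    if var model = root.components[ModelComponent.self] {
        // Keep the same number of material slots (at least one).
        let slotCount = max(model.materials.count, 1)
        model.materials = Array(repeating: material, count: slotCount)
        root.components.set(model)
    }
    for child in root.children {
        replaceMaterials(in: child, with: material)
    }
}

// Usage (hypothetical): tint everything under the "Body" entity.
// if let body = loadedEntity.findEntity(named: "Body") {
//     replaceMaterials(in: body, with: SimpleMaterial(color: .red, isMetallic: false))
// }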
I have read this thread about sending a notification to play animations in RCP.
If I now want to pause and resume later, or stop and reset the timeline, is there a way to do so?
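For reference, this is the trigger pattern I understand the referenced thread to describe (the identifier "MyTimeline" and the scene entity are placeholders); it only covers starting the timeline, so pausing/resuming and stopping/resetting remain the open question.

import Foundation
import RealityKit

// Post the notification that a "Notification" trigger in an RCP behavior
// listens for, which then starts the associated timeline.
func triggerTimeline(on sceneEntity: Entity) {
    guard let scene = sceneEntity.scene else { return }
    NotificationCenter.default.post(
        name: Notification.Name("RealityKit.NotificationTrigger"),
        object: nil,
        userInfo: [
            "RealityKit.NotificationTrigger.Scene": scene,
            "RealityKit.NotificationTrigger.Identifier": "MyTimeline"
        ]
    )
}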
The object capture feature in the Reality Composer app is only available on iOS and iPadOS at the moment. Will this feature be available for visionOS in the near future?
Reality Composer App Store
https://apps.apple.com/us/app/reality-composer/id1462358802
I can't see any tools or buttons related to the Timeline. In my interface there are just Project Browser, Shader Graph, Audio Mixer, and Statistics. Why? How do I enable or open it? Please help me, thanks.
I am trying to follow the documentation for the beta version of visionOS with the new RealityKit LowLevelMesh construct (https://developer.apple.com/documentation/realitykit/lowlevelmesh) that draws a triangle. Although the code specifies a different color for each of the three vertices, the triangle renders in white.
I believe the missing link may be a shader graph material, but because I will be drawing millions of triangles, with colors defined at the nodes and interpolated over the area of each triangle, I want to make sure it is efficient, whether with shader graph materials or perhaps Metal.
With an earlier version of the app I'm working on, I successfully used a shader graph material with MeshDescriptor.primitives as polygons for tetrahedrons. However, that is inefficient for more than 1,000 tetrahedrons (and crashes), so I'm trying to use the new LowLevelMesh instead (with each tetrahedron split into four triangles). However, I can't get very far using the example code from the documentation (which results in the white triangles), even when trying the default shader graph (GridMaterial), without getting quite a few error messages. I try to apply the suggested fixes and then get new errors (whack-a-mole) until everything seems to be broken.
So in addition to my general question of shader graph vs. Metal for a LowLevelMesh, a concrete example of using a shader graph material with a LowLevelMesh would be most appreciated! Thanks.
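Not a verified answer, but a sketch of the direction implied above: the default materials ignore per-vertex color, so the LowLevelMesh would be paired with a Shader Graph material that routes vertex color into its base color. "VertexColorMaterial" and "Scene.usda" are hypothetical names for a material authored in Reality Composer Pro.

import RealityKit
import RealityKitContent

// Wrap an already-built LowLevelMesh in a MeshResource and pair it with a
// Shader Graph material (hypothetical name) that surfaces vertex color.
func makeColoredTriangleEntity(from lowLevelMesh: LowLevelMesh) async throws -> ModelEntity {
    let mesh = try MeshResource(from: lowLevelMesh)
    let material = try await ShaderGraphMaterial(
        named: "/Root/VertexColorMaterial",
        from: "Scene.usda",
        in: realityKitContentBundle
    )
    return ModelEntity(mesh: mesh, materials: [material])
}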
For years, the preliminary behaviours provided a way to trigger an action sequence (now called a timeline) when the user came close to an object.
I could not find the same in the new Reality Composer Pro.
Hey, is there a way to create a good ground shadow shader? I'm using a ground plane with an unlit material and I can't get the ground shadow to work properly. If I use a PBR material it works better, but I can barely see the shadow and I want more control over its intensity.
Hey, I need help achieving realistic fog and clouds for immersive spaces. Making 3D planes with transparent fog/cloud textures works, but they cause issues when many of them overlap each other. I can't get a good result with particles either.
Thanks in advance!
I loaded a USDZ of a room model. After putting it into a RealityView, the entire model surrounded me. Even when a SwiftUI view was in front of me, I couldn't interact with it with my fingers. How do I set things up so that SwiftUI responds to my finger tap gesture first?
Based on info online, I'm under the impression we can add spatial audio to USDZ files using Reality Composer Pro; however, I've been unable to hear this audio outside of the preview audio in the scene inspector. Attached is a screenshot of how I've laid out the scene.
I see the 3D object fine on mobile and Vision Pro, but can't get the audio to loop. I have ensured the audio file in the scene is linked as the resource for the spatial audio node. Am I setting this up wrong, is it broken, or is this simply not a feature that saves back to USDZ? In the following link they note their USDZ could "play an audio track while viewing the model", but the model isn't there anymore.
Can someone confirm where I might be off, please?
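In case it helps as a fallback, here is a sketch of attaching and looping the audio from code rather than relying on what the USDZ export preserves; the file name and gain are placeholders, and the exact resource initializer may differ by SDK version.

import RealityKit

// Attach a spatial audio emitter to an entity and loop a bundled audio file.
func addLoopingSpatialAudio(to entity: Entity) async throws {
    entity.components.set(SpatialAudioComponent(gain: -5))
    let resource = try await AudioFileResource(
        named: "AmbientLoop.wav",
        configuration: .init(shouldLoop: true)
    )
    entity.playAudio(resource)
}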
For all the AVP devs out there, what cloud service are you using to load content in your app that has extremely low latency? I tried using CloudKit and it did not work well at all. Latency was super bad :/
Firebase looks like the most promising at this point??
Wish Apple would create an ultra low latency cloud service for streaming high quality content such as USDZ files and scenes made in Reality Composer Pro.
Hi, I would just like a reality check: can anyone else rename a Timeline in Reality Composer Pro, as shown in the "Compose interactive 3D content in Reality Composer Pro" presentation from several weeks ago?
Because I cannot. Thank you!
Can RealityView and a custom render engine (Metal) be mixed for rendering? For example, I want to use Metal for post-processing.
The RealityKit API has guidance for immersive scenes. What should I pay attention to when using full and mixed modes separately or at the same time? And how can I control how the scene is blended with the surroundings, for example by brightening it?
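On the mode question specifically, here is a minimal sketch of declaring one ImmersiveSpace that supports both mixed and full immersion and switching between them at runtime; the space ID and view names are placeholders.

import SwiftUI
import RealityKit

struct MyImmersiveView: View {
    var body: some View {
        RealityView { content in
            // Add entities here.
        }
    }
}

@main
struct MyImmersiveApp: App {
    // Start in mixed immersion; set to .full at runtime to hide passthrough.
    @State private var immersionStyle: any ImmersionStyle = .mixed

    var body: some Scene {
        ImmersiveSpace(id: "MyImmersiveSpace") {
            MyImmersiveView()
        }
        // Allow both styles; flip `immersionStyle` to switch between them.
        .immersionStyle(selection: $immersionStyle, in: .mixed, .full)
    }
}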
I used other software to export USDZ files, hoping to further adjust the PBR and other parameters of the model in Reality Composer Pro. Because the USDZ is a single unit, I cannot use the mouse to select a specific model within the USDZ in the viewport; I have to find the models I want to modify one by one in the list on the left.
This way of working is too inefficient. Is there a better way?
Or is there a way to break the USDZ file apart into its many sub-models and texture/material files, so that I can select a model with the mouse in the Reality Composer Pro viewport and then modify its PBR settings? That would be much more efficient.
Hi,
I am a bit confused about the Reality Composer Pro workflow. I searched many threads and got a reply saying that Reality Composer Pro only works with Vision Pro.
I also found Reality Composer (not Pro), which appears to support building iPhone apps, but I cannot find a way to download it.
So my question is: what options exist for building an iPhone AR app? Can someone clearly explain this? Please let me know what I missed.
Thanks.
In the editor (Reality Composer Pro), I can edit and attach the IBL component in real time, and the preview in the editor looks correct. But when I load the parsed scene in Xcode, some models appear black. I have tried many model formats (usda, usdc, usdz), and the final result is the same. However, I can create IBL effects through code, and those render correctly. I suspect that the IBL component exported by the editor and parsed by RealityKit has a limit on the number of materials.
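For comparison, this is roughly the code-side IBL setup the post says renders correctly; "SkyEnvironment" is a placeholder for an environment resource (a lighting image) in the app bundle.

import RealityKit

// Attach an image-based light to the scene root and have it receive that light.
func applyImageBasedLight(to root: Entity) async throws {
    let environment = try await EnvironmentResource(named: "SkyEnvironment")
    root.components.set(ImageBasedLightComponent(source: .single(environment), intensityExponent: 1.0))
    // Each entity that should be lit needs a receiver pointing at the entity holding the light.
    root.components.set(ImageBasedLightReceiverComponent(imageBasedLight: root))
}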