Reality Composer Pro

Prototype and produce content for AR experiences using Reality Composer Pro.

Post

Replies

Boosts

Views

Activity

Potential problem when projecting massive 3D entity
Hi all, I am developing a conceptual prototype to explore how Apple Vision Pro can be applied to real estate sales, but I have run into a problem and would like to consult the developer community. When rendering a massive 3D entity, the app sometimes works fine, but sometimes my Vision Pro reboots while the app is running (all running apps shut down and the Apple logo appears). I have a 1:1 3D entity imported into Reality Composer Pro, and I managed to build a simple Vision Pro app that renders it. It works great now that I finally got it working and can see the 1:1 building in front of me. You can see the full demo video for details: https://www.icloud.com/iclouddrive/0d5QA-sYSehmLF9rEMXsCUyDg#REPtt02_Demo_v1.1 But sometimes, while rendering the 1:1 building, my Vision Pro suddenly reboots, and so far I have no clues about the root cause. Could it be caused by running out of memory, or by a safety mechanism (since the 3D entity is large and can block a big area of the view)? Can anyone, perhaps even someone official, provide some guidance here and help identify the root cause? Thanks much!
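A note on narrowing this down: a sudden reboot with very large assets is consistent with the app (or the compositor) exhausting memory, so one diagnostic is to load the entity asynchronously and try it first at reduced scale. A minimal sketch, assuming a Reality Composer Pro package named RealityKitContent and a scene called "Building" (both hypothetical):

import SwiftUI
import RealityKit
import RealityKitContent  // assumption: the app's RCP package

struct BuildingView: View {
    var body: some View {
        RealityView { content in
            // Load the heavy entity asynchronously so the app stays
            // responsive while the asset streams in.
            if let building = try? await Entity(named: "Building", in: realityKitContentBundle) {
                // Diagnostic: start far below 1:1 scale. If the scaled-down
                // model never triggers the reboot, the crash likely
                // correlates with memory or overdraw at full size.
                building.scale = SIMD3<Float>(repeating: 0.01)
                content.add(building)
            }
        }
    }
}

If the small version is stable, stepping the scale back up while watching memory in Xcode's debug gauges should show whether the footprint grows toward the device limit before the reboot.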
0
0
324
Jun ’24
Add a modifier to a single model in the Reality Composer Pro scene
We can add many models in a Reality Composer Pro scene, but when I use RealityView to display the scene and add modifiers in SwiftUI, the modifiers take effect on everything, and I don't want that. I would like the modifier to apply to a single model in the Reality Composer Pro scene. How can I add a modifier to just one model in the Reality Composer Pro scene?
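SwiftUI modifiers attach to the whole RealityView, so per-model behavior is normally done by looking up the entity and giving it components or an entity-targeted gesture. A minimal sketch, where the scene name "Scene" and entity name "Robot" are hypothetical:

import SwiftUI
import RealityKit
import RealityKitContent  // assumption: the app's RCP package

struct SceneView: View {
    var body: some View {
        RealityView { content in
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(scene)
                // Look up one model by the name it has in the RCP hierarchy.
                if let robot = scene.findEntity(named: "Robot") {
                    // Components apply per entity, unlike SwiftUI modifiers.
                    robot.components.set(HoverEffectComponent())
                    robot.components.set(InputTargetComponent())
                    // Gestures also need collision shapes; author them in RCP
                    // or add one in code (the size here is a placeholder).
                    robot.components.set(CollisionComponent(shapes: [.generateBox(size: [0.2, 0.2, 0.2])]))
                }
            }
        }
        // An entity-targeted gesture fires only for matching entities.
        .gesture(
            TapGesture()
                .targetedToEntity(where: .has(HoverEffectComponent.self))
                .onEnded { value in
                    value.entity.scale *= 1.1
                }
        )
    }
}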
2
0
533
Jun ’24
Apply a SpriteKit scene as a material in Reality Composer Pro
Hello, I'm trying to move my app to visionOS. My app is used by pilots to study airplane systems; it is a 3D airplane cockpit built with SceneKit, and I use SpriteKit scenes to animate the cockpit instruments. SceneKit allows applying a SpriteKit scene as a material, so I could easily animate all the different instruments and indications there, but I can't find this option in Reality Composer Pro. Is this possible? Any suggestions I can look into for animating and simulating instruments?
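RealityKit has no direct equivalent of SceneKit's SKScene-as-material, but one workaround is to render the SpriteKit scene into a texture and apply that to a RealityKit material; for continuously animated instruments the render would be repeated (or driven through TextureResource.DrawableQueue). A rough sketch of the one-frame version, with all names illustrative and an offscreen SKView assumed acceptable:

import SpriteKit
import RealityKit

// Snapshot one frame of an SKScene and wrap it in an unlit material.
// For live instruments, re-render per frame or move to
// TextureResource.DrawableQueue instead of a one-shot texture.
func unlitMaterial(from scene: SKScene) throws -> UnlitMaterial {
    let view = SKView(frame: CGRect(origin: .zero, size: scene.size))
    view.presentScene(scene)
    guard let skTexture = view.texture(from: scene) else {
        throw CocoaError(.featureUnsupported)
    }
    let resource = try TextureResource.generate(from: skTexture.cgImage(),
                                                options: .init(semantic: .color))
    var material = UnlitMaterial()
    material.color = .init(tint: .white, texture: .init(resource))
    return material
}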
2
0
553
Jun ’24
Blender to Reality Composer Pro 2.0 to SwiftUI + RealityKit visionOS Best Practices
Hi, I'm very new to 3D and am currently porting a SwiftUI iOS app to visionOS 2.0. I saw WWDC24 feature Blender in multiple spatial videos, and I have begun integrating Blender models and animations into my visionOS app (I would also like to integrate skeletons and programmatic rigging; more on that later). I'm wondering if there are "Best Practices" for this workflow, from Blender to USD to RCP 2.0 to visionOS 2 in Xcode. I've cobbled together the following, which has some obvious holes:

I've been able to find some pre-rigged and pre-animated models online that serve as a great starting point. As a reference, here is a free model from Sketchfab, a simple rigged skeleton with six built-in animations: https://sketchfab.com/3d-models/skeleton-character-low-poly-8856e0138f424d68a8e0b40e185951f6

When exporting to USD from Blender, I haven't been able to export more than one animation per USD file. Is there a workflow to export multiple animations in a single USDC file, or is this just not possible? As a temporary workaround, here is a Python script I've been using to loop through all Blender animations and export a model for each animation:

import bpy
import os

# Set the directory where you want to save the USD files
output_directory = "/path/to/export"

# Ensure the directory exists
if not os.path.exists(output_directory):
    os.makedirs(output_directory)

# Function to export the current scene as USD over a given frame range
def export_scene_as_usd(output_path, start_frame, end_frame):
    bpy.context.scene.frame_start = start_frame
    bpy.context.scene.frame_end = end_frame
    # Export the scene as a USD file
    bpy.ops.wm.usd_export(
        filepath=output_path,
        export_animation=True
    )

# Save the current scene name
original_scene = bpy.context.scene.name

# Iterate through each action and export it as a USD file
for action in bpy.data.actions:
    # Create a temporary copy of the scene for this action
    bpy.context.window.scene = bpy.data.scenes[original_scene].copy()
    new_scene = bpy.context.scene

    # Link the action to all relevant objects
    for obj in new_scene.objects:
        if obj.animation_data is not None:
            obj.animation_data.action = action

    # Determine the frame range for the action
    start_frame, end_frame = action.frame_range

    # Export the scene as a USD file named after the action
    output_path = os.path.join(output_directory, f"{action.name}.usdc")
    export_scene_as_usd(output_path, int(start_frame), int(end_frame))

    # Delete the temporary scene to free memory
    bpy.data.scenes.remove(new_scene)

print("Export completed.")

I have also been able to successfully export rigging armatures as a single skeleton, with each "bone" getting imported into Reality Composer Pro 2.0 when exporting/importing manually. I would like to have all of these animations available in a single scene to be used in a RealityView in visionOS, so I have placed all the animation models in an RCP scene and created named Timeline Action animations for each, showing the correct model and hiding the rest when triggering a specific animation. I apply materials/textures to each with Shader Graph so they appear the same. Then, in SwiftUI, I use notifications (as shown here: https://forums.developer.apple.com/forums/thread/756978) to trigger each RCP Timeline Action animation from code.

Two questions:

Is there a better way than having multiple models of the same skeleton, each with a different animation, in a scene in order to trigger multiple animations? Or would this require recreating the Blender animations with skeleton rigging and keyframes from within RCP Timelines?

If I want to programmatically create custom animations and move parts of the skeleton/armatures, do I need to define custom components in RCP and use IKRig to define the movement of each of the "bones" in Xcode?

I'm looking for any tips/tricks/workflow from experienced engineers or 3D artists that could make a more efficient/optimized workflow using Blender, USD, RCP 2, and visionOS 2 with SwiftUI. Thanks so much, I appreciate any help! I am very excited about all the new tools that keep evolving to make spatial apps really fun to build!
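On the first question, one commonly suggested alternative to duplicating the skeleton once per animation is to load each single-animation USD export, harvest its AnimationResource, and play them all on one base model; this assumes the joint/skeleton names match across the exports, which the per-action Blender export above preserves. A rough sketch with hypothetical asset names:

import RealityKit
import RealityKitContent  // assumption: the app's RCP package

// Load the rigged model once, then pull the animations out of the
// single-animation USD exports and play them on that one entity.
func loadSkeletonWithAnimations() async throws -> (Entity, [String: AnimationResource]) {
    let base = try await Entity(named: "SkeletonBase", in: realityKitContentBundle)
    var animations: [String: AnimationResource] = [:]
    for name in ["Idle", "Walk", "Run"] {  // hypothetical export names
        let carrier = try await Entity(named: name, in: realityKitContentBundle)
        if let animation = carrier.availableAnimations.first {
            animations[name] = animation
        }
    }
    return (base, animations)
}

// Later, to cross-fade into one of them:
// if let walk = animations["Walk"] {
//     base.playAnimation(walk, transitionDuration: 0.3)
// }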
1
2
605
Jun ’24
VisionOS Object Capture get entity position
Hello :) As the title says, I have used RCP with reference objects to detect items in the real world. My next step is to detect how close the user's finger is to that object. I tried to get the entity's position relative to the root, but found that the position is somehow always the same, regardless of how and where I move the camera or the object. The entity has a child transform with a collision component, which is meant to detect a collision when the finger is close enough so I can calculate the distance, but that fails as well... Any help will be appreciated, ty
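One quick check is to query the transform in world space (relativeTo: nil) rather than relative to the entity's own root; a static local transform under an anchor can look "frozen" even while the anchor itself moves. A small sketch; the fingertip anchor is an assumption about how the hand is tracked:

import RealityKit
import simd

// Compare the detected object's world-space position with a tracked
// fingertip. position(relativeTo: nil) returns world coordinates, which
// should change as the object or the viewer moves.
func fingerDistance(object: Entity, fingerTip: Entity) -> Float {
    let objectWorld = object.position(relativeTo: nil)
    let fingerWorld = fingerTip.position(relativeTo: nil)
    return simd_distance(objectWorld, fingerWorld)
}

// Hypothetical setup: an anchor that follows the right index fingertip.
// let fingerTip = AnchorEntity(.hand(.right, location: .indexFingerTip))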
1
0
542
Jul ’24
PortalComponent invalid
Here is my code:

let portal = Entity()
portal.components[ModelComponent.self] = .init(
    mesh: .generatePlane(width: Float(size.width),
                         height: Float(size.height),
                         cornerRadius: 0.02),
    materials: [PortalMaterial()]
)
portal.components[PortalComponent.self] = .init(target: world)
// Note: a clipping plane's normal must be non-zero; (0, 0, 0) is degenerate.
portal.components[PortalComponent.self]?.clippingPlane = .init(
    position: SIMD3(x: 0, y: 0, z: 0),
    normal: SIMD3(x: 0, y: 1, z: 0)
)
portal.components.set(HoverEffectComponent())

I added a RealityView to multiple HStacks and implemented the portal effect. I found that the portal effect causes confusion in the rendering order on some machines, as shown in the figure.
2
0
457
Jul ’24
Editor (Reality Composer Pro): IBL component settings issue
In the editor (Reality Composer Pro), I can edit and attach the IBL component in real time, and the preview in the editor looks correct. However, when I load the exported scene in Xcode, some models appear black. I have tried many model formats (usda, usdc, usdz), and the result is the same. On the other hand, when I create the IBL effect in code, it renders normally. I suspect that the IBL component exported by the editor, as parsed by RealityKit, has a maximum number of materials it will light.
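For comparison, the code-side IBL setup that typically renders correctly looks roughly like this; the environment resource name is hypothetical:

import RealityKit

// Apply an image-based light in code instead of relying on the IBL
// component exported from the editor.
@MainActor
func applyIBL(to root: Entity) async throws {
    let environment = try await EnvironmentResource(named: "Sunlight")  // hypothetical asset
    let light = Entity()
    light.components.set(ImageBasedLightComponent(source: .single(environment)))
    // The receiver points at the entity that carries the light source.
    root.components.set(ImageBasedLightReceiverComponent(imageBasedLight: light))
    root.addChild(light)
}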
0
0
294
Jul ’24
Confusion about Reality Composer Pro
Hi, I am a bit confused about the Reality Composer Pro work pipeline. I searched many threads and got a reply about Reality Composer Pro; the answer was that Reality Composer Pro can only be used for Vision Pro. I also found Reality Composer (not Pro), which looks like it supports building iPhone apps, but I cannot find a way to download it. So my question is: what is the existing way to build an iPhone AR app? Can someone clearly explain this? Please let me know what I missed. Thanks
0
0
269
Jul ’24
How to select an object in a USDZ file with the mouse in the 3D view in Reality Composer Pro
I used other software to export USDZ files, hoping to further adjust the PBR and other parameters of the model in Reality Composer Pro. Because the USDZ is imported as a single unit, I cannot use the mouse to select a specific model inside the USDZ in the viewport; I have to find the models I want to modify one by one in the list on the left. This way of working is too inefficient. Is there a better way? Or is there a way to disassemble the USDZ file into its many sub-models and texture/material files, so that I can select a model with the mouse in the Reality Composer Pro viewport and then modify its PBR? That would be much more efficient.
2
0
657
Jul ’24
Cloud Service for Apple Vision Pro App
For all the AVP devs out there: what cloud service are you using to load content in your app with extremely low latency? I tried CloudKit and it did not work well at all; the latency was really bad. Firebase looks like the most promising option at this point. I wish Apple would create an ultra-low-latency cloud service for streaming high-quality content such as USDZ files and scenes made in Reality Composer Pro.
1
0
515
Jul ’24
Reality Composer Pro - Audio not working for .USDZ
Based on information online, I'm under the impression that we can add spatial audio to USDZ files using Reality Composer Pro; however, I've been unable to hear this audio outside of the preview audio in the scene inspector. Attached is a screenshot of how I've laid out the scene. I see the 3D object fine on mobile and Vision Pro, but I can't get the audio to loop. I have made sure the audio file in the scene is linked as the resource for the spatial audio node. Am I setting this up wrong, is it broken, or is this simply not a feature that can be saved back to USDZ? In the following link they note their USDZ could "play an audio track while viewing the model", but the model isn't there anymore. Can someone confirm where I might be going wrong, please?
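If the target is playback inside a RealityKit app (rather than a bare USDZ opened in Quick Look, where authored audio support is much more limited), one code-side approach is to load the audio resource from the RCP scene and play it on the entity. A sketch; the prim path, scene file, and package name are hypothetical:

import RealityKit
import RealityKitContent  // assumption: the app's RCP package

// Load spatial audio authored in an RCP scene and play it from code,
// instead of relying on the USDZ export to carry the audio.
@MainActor
func playSpatialAudio(on entity: Entity) async throws {
    entity.components.set(SpatialAudioComponent())
    let resource = try await AudioFileResource(
        named: "/Root/AmbientLoop",   // hypothetical audio prim path in the scene
        from: "Scene.usda",           // hypothetical RCP scene file
        in: realityKitContentBundle
    )
    // If the authored resource does not loop, reload the underlying file
    // with AudioFileResource.Configuration(shouldLoop: true) instead.
    entity.playAudio(resource)
}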
3
0
716
Jul ’24