Reality Composer Pro


Prototype and produce content for AR experiences using Reality Composer Pro.


Post · Replies · Boosts · Views · Activity

Export USDZ With Unlit Shader From Maya
Hello! I have the great fortune to be working on a joint Apple/(unnamed brand) app. I am teamed up with a photogrammetry vendor and will be assembling a scene in Maya, using a UDIM workflow from retouched assets in Substance Painter. This will be for an immersive environment, similar to the Joshua Tree and Hawaiian environments on AVP, which are great.

We did a first pass, bringing USDZ files into Reality Composer Pro and sending them to the headset, which was very cool. However, the models have a specular component that is washing everything out. After further research, I learned that I should not be using a Physically Based shader, but an Unlit shader instead. I had to create this manually in RCP, and then the model displayed properly. The issue is: every time I add a new asset, will I now have to manually create numerous custom shaders?

Ideally, I'd like to figure out how to specify an Unlit shader in Substance Painter so it exports correctly, and also how to export it properly from Maya. I tried a Maya SurfaceShader, which is essentially unlit, but it does not export properly. I found that a Maya StandardSurface exports properly, however it still imports as a Physically Based shader. The conundrum is that I'll be working with a UDIM workflow and would like to avoid manually creating and hooking up what could be 10-40 textures per USDZ file.

I guess what I'm trying to ask is: what is the preferred shader to use in Maya when exporting to USDZ so that it imports as Unlit? Or is there a way to easily switch from Physically Based to Unlit inside RCP? And NOT by doing it in Xcode/Swift, because the files I deliver need to be USDZ that the developer will assemble themselves. I'm using Maya because I need to work on my PC for the heavy GPU lifting, and also because it's just easier to assemble everything there vs. Reality Composer Pro, which is like the iMovie of game engines. ;) (Please take that constructively; it really needs to be more industry standard.)

I just need to make sure I can use Maya with the proper shaders to export my pieces, quickly send them to RCP (with the proper Unlit specification), and then over to the headset so I can check things in real time with my photogrammetry vendor. Any help or advice is greatly appreciated! I'm really excited to be working on my first Vision Pro app. Thx!
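For reference only, and as a hedged sketch of the in-code route this post is explicitly trying to avoid: a one-time RealityKit pass can walk a loaded entity and swap each material for an UnlitMaterial, keeping only the base color texture. The function name and texture handling below are illustrative, not a confirmed pipeline.

import RealityKit
import UIKit

// Sketch: replace every material in an entity tree with an unlit material that
// reuses the existing base color texture, so the capture is shown without lighting.
func convertToUnlit(_ entity: Entity) {
    if var model = entity.components[ModelComponent.self] {
        model.materials = model.materials.map { material -> any Material in
            var unlit = UnlitMaterial()
            if let pbr = material as? PhysicallyBasedMaterial,
               let texture = pbr.baseColor.texture {
                unlit.color = .init(tint: .white, texture: texture)
            }
            return unlit
        }
        entity.components.set(model)
    }
    for child in entity.children {
        convertToUnlit(child)
    }
}

An asset-level answer (exporting unlit-flagged materials from Maya or Substance Painter, or toggling the shader type in RCP) would of course fit the USDZ delivery constraint better.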
4 replies · 0 boosts · 513 views · Aug ’24
Having trouble loading audio file resources from the RCP bundle
RealityKitContent bundle resource issue

Recently I keep hitting weird loading bugs with the RealityKitContent bundle. I am trying to load an audio resource as an AudioFileResource or AudioFileGroupResource from a *.usda in the RealityKitContent bundle. My code is nothing complicated, simply:

let primPath: String = "/SampleAudios/SE_bounce_audio"
guard let resource = try? AudioFileGroupResource.load(named: primPath, from: "MyScene.usda", in: realityKitContentBundle) else { return }

At runtime the program "sometimes" (whenever I change something in RCP it sometimes works again, but the behavior is unpredictable) reports that it "Cannot find MyScene.usda:/SampleAudios/SE_bounce_audio in RealityKitContent.bundle". I put MyScene.usda in the root folder of the RealityKitContent package because I found that RealityKit simply cannot find any *.usda scene that isn't at the root level (possibly a bug in the way it indexes its files). I even double-checked my .usda file with usdview; the primPath is absolutely correct. I think there are some unknown issues with how RealityKitContent copies resources and builds the package. I played with the Package.swift file a bit to see if I could manually copy my resources (everything) and let the package carry them, but it just didn't work. So right now I keep the file untouched as below (only upgrading swift-tools-version to 6.0, since only that supports .visionOS(.v2)):

// swift-tools-version:6.0
// The swift-tools-version declares the minimum version of Swift required to build this package.

import PackageDescription

let package = Package(
    name: "RealityKitContent",
    platforms: [
        .visionOS(.v2)
    ],
    products: [
        // Products define the executables and libraries a package produces, and make them visible to other packages.
        .library(
            name: "RealityKitContent",
            targets: ["RealityKitContent"]),
    ],
    dependencies: [
        // Dependencies declare other packages that this package depends on.
        // .package(url: /* package url */, from: "1.0.0"),
    ],
    targets: [
        // Targets are the basic building blocks of a package. A target can define a module or a test suite.
        // Targets can depend on other targets in this package, and on products in packages this package depends on.
        .target(
            name: "RealityKitContent"
        ),
    ]
)

That is issue one, the RealityKitContent package build issue.

Audio file format issue

The other issue is about which audio file formats RCP supports. I remember a session (WWDC?) saying that .wav and .mp4 are supported as audio sources. But when I set up Spatial Audio, I find that *.wav or *.mp3 files can sometimes also be imported as an AudioSourceFile, and the behavior is unpredictable. With two *.wav files, SE_ball_hit_01.wav and SE_ball_hit_02.wav, only SE_ball_hit_01.wav is accepted; 02 is reported as having an unsupported format. Check out my screenshots for the details of the two files; they have almost the same format (same sample rate and channel count). I understand there might be different requirements for a source file used as Spatial versus Ambient audio, but I haven't figured that out, and I can't find anything helpful in the Apple documentation. So what are the rules?

Thanks for reading; any thoughts are welcome.
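One small diagnostic step that may help here, offered only as a sketch: the try? above discards the underlying error, so the flaky failures are hard to compare. The same call from the post, with the error surfaced:

do {
    let resource = try AudioFileGroupResource.load(
        named: "/SampleAudios/SE_bounce_audio",
        from: "MyScene.usda",
        in: realityKitContentBundle
    )
    // attach or play `resource` here
    _ = resource
} catch {
    // Logs the "Cannot find ... in RealityKitContent.bundle" error (or whatever else
    // RealityKit reports) instead of silently returning.
    print("Audio load failed: \(error)")
}

Comparing the logged error across runs that work and runs that don't should at least show whether it is always the same missing-prim failure.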
1 reply · 0 boosts · 398 views · Aug ’24
Reality Composer Pro node previews?
I have been digging into learning shader graphs by watching Unity shader graph content, since lots of the same concepts apply. One thing I noticed is that in Unity, each node in the shader graph has a little preview. I don't think this exists in Reality Composer Pro, but is there any way to mimic it (for example, can I hook up a node that lets me debug the graph at that point)? If not, I'm happy to just file a feedback about it, but I thought I'd ask!
3 replies · 0 boosts · 920 views · Mar ’24
Object Tracking (moving objects)
From my early testing, it seems like object tracking works best for static objects. For example, if I am holding something in my hand, the object tracker is slow to update. Is there anything that can be modified to decrease the tracking latency? I noticed that the Enterprise API has some override features; is this something that can only be done using the Enterprise API?
1 reply · 0 boosts · 419 views · Aug ’24
Material Reference from Reality Composer Pro
I have a model entity (from Reality Composer Pro) and I want to change its material in Swift. The material is also imported in Reality Composer Pro; I am copying the USDZ file of the material into the same directory as the script. This is the code I am using to reference the material:

do {
    // Load the file data
    if let materialURL = Bundle.main.url(forResource: "BlackABSPlastic", withExtension: "usdz") {
        let materialData = try Data(contentsOf: materialURL)

        // Check the first few bytes of the data to see if it matches expected types
        let headerBytes = materialData.prefix(4)
        let headerString = String(decoding: headerBytes, as: UTF8.self)

        // Print out the header information for debugging
        print("File header: \(headerString)")

        // Attempt to load the ShaderGraphMaterial
        let scratchedMetallicPaint = try await ShaderGraphMaterial(
            named: "BlackABSPlastic",
            from: materialData
        )
        print(scratchedMetallicPaint)
    } else {
        print("BlackABSPlastic.usdz file not found.")
    }
} catch {
    // Catch the error and print it
    print("BlackABSPlastic load failed: \(error)")

    // Attempt to infer file type based on the error or file content
    if let error = error as? DecodingError {
        switch error {
        case .typeMismatch(let type, _):
            print("Type mismatch: Expected \(type)")
        case .dataCorrupted(let context):
            print("Data corrupted: \(context.debugDescription)")
        default:
            print("Decoding error: \(error)")
        }
    } else {
        print("Unexpected error: \(error)")
    }
}

I am receiving these errors:

File header: PK
TBB Global TLS count is not == 1, instead it is: 2
Unable to create stage from in-memory buffer.
BlackABSPlastic load failed: internalImportError
Unexpected error: internalImportError

Am I doing anything wrong? I am able to access the materials of the model entity easily, but this seems to be something different. How can this be resolved? Thanks.
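A hedged observation and alternative: the "PK" header shows the .usdz is a ZIP archive, which may be why the in-memory stage cannot be created from raw Data. One route that avoids that path entirely is to keep the material inside the Reality Composer Pro package and load it by prim path with ShaderGraphMaterial(named:from:in:). The scene name and prim path below are illustrative and must match the actual RCP project:

import RealityKit
import RealityKitContent

// Sketch: load the shader graph material from the RCP package by prim path,
// then apply it to a model entity loaded from the same package.
func applyPackageMaterial(to modelEntity: ModelEntity) async throws {
    let material = try await ShaderGraphMaterial(
        named: "/Root/BlackABSPlastic",   // placeholder prim path
        from: "Materials.usda",           // placeholder scene file in the RCP package
        in: realityKitContentBundle
    )
    if var model = modelEntity.components[ModelComponent.self] {
        model.materials = [material]
        modelEntity.components.set(model)
    }
}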
1 reply · 0 boosts · 400 views · Aug ’24
visionOS – Anchoring particle system to hand also anchors spawned particles
Dear Apple Developer Forums,

I am just starting out developing in Swift, using RealityKit and Reality Composer Pro, as a project I'm working on is transitioning from Unity to native only. I am trying to attach a particle system to the user's right hand, emitting from a single point and showing a 'spatial trail' of sorts, basically acting as a visualizer of the hand's spatial history.

However, in Reality Composer Pro, when I anchor my particle emitter's parent entity using an Anchor component, all of the spawned particles are also anchored to the specified anchor position, even though the "Particles Inherit Transform" option is unticked (false). The behavior I expected is that the emitter itself is anchored, but the spawned particles retain their spawn position in world space.

Am I missing something, or does anchoring simply behave this way with particle systems? Thank you!

RCP 1.0, Xcode 15.4, visionOS 1.2
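If anchoring does behave this way, one hedged workaround (assuming the app has hand-tracking authorization so the anchor's transform is readable from code; the entity names and stored subscription are illustrative): keep the emitter parented to world space and copy the hand anchor's world pose every frame, so particles keep spawning at the hand but stay where they were emitted.

// Inside RealityView { content in ... }. `emitter` is the RCP particle entity and
// `updateSubscription` is a stored property that keeps the subscription alive.
let handAnchor = AnchorEntity(.hand(.right, location: .palm))
content.add(handAnchor)
content.add(emitter)   // world-space parent, NOT the hand anchor

updateSubscription = content.subscribe(to: SceneEvents.Update.self) { _ in
    // Follow the hand without re-parenting, so already-spawned particles are unaffected.
    emitter.setPosition(handAnchor.position(relativeTo: nil), relativeTo: nil)
}

If the anchor transform is not readable without ARKit hand tracking, the same idea works with a HandTrackingProvider feeding the emitter's position instead.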
1 reply · 0 boosts · 408 views · Aug ’24
visionOS AVPlayer issue
I wanted to report an issue I've encountered with the latest Beta 6 update concerning the immersive space feature. Before this update, when I was in an immersive space and clicked a window button to play a video using AVPlayer, I had the option to keep other windows open and accessible within the environment. Since updating, that no longer appears to be possible. Could you please investigate this issue? It would be helpful to know whether this is an intentional change or a bug affecting window management in immersive space. Thank you for your attention to this matter. I look forward to your response.
3 replies · 1 boost · 473 views · Aug ’24
Cinema 4D to Reality Composer Pro
Hello Dev team, for 3 weeks I have been looking for how to export static Cinema 4D objects WITH TEXTURES to Reality Composer Pro! I can export directly to USDA format, and that works well for the 3D model in Reality Composer Pro, BUT I can't get the textures onto my model. My model is simply not colored! Of course, I expect the textures to be applied in the right places, with the same appearance I have in Cinema 4D. Could you give me a process to do that, please? I'm using Cinema 4D R25 and the latest Xcode and Reality Composer Pro beta versions. Big, big thanks to anyone who can help me with this. It will unblock many things for me!!!! Cheers, Mathis
1 reply · 0 boosts · 371 views · Aug ’24
Weird error about EnvironmentResource
The following code runs normally on the visionOS 2.0 simulator:

func createEnvironmentResource(image: UIImage) -> EnvironmentResource? {
    do {
        let cube = try TextureResource(
            cubeFromEquirectangular: image.cgImage!,
            quality: .normal,
            options: TextureResource.CreateOptions(semantic: .hdrColor)
        )
        let environment = try EnvironmentResource(
            cube: cube,
            options: EnvironmentResource.CreateOptions(
                samplingQuality: .normal,
                specularCubeDimension: cube.width / 2
                // compression: .astc(blockSize: .block4x4, quality: .high)
            )
        )
        return environment
    } catch {
        print("error: \(error)")
    }
    return nil
}

When I run it on a real device, however, an error is reported at startup:

dyld[987]: Symbol not found: _$s10RealityKit19EnvironmentResourceC4cube7optionsAcA07TextureD0C_AC0A10FoundationE13CreateOptionsVtKcfC
Referenced from: <DEC8652C-109C-3B32-BE6B-FE634EC0D6D5> /private/var/containers/Bundle/Application/CD2FAAE0-415A-4534-9700-37D325DFA845/HomePreviewDEV.app/HomePreviewDEV.debug.dylib
Expected in: <403FB960-8688-34E4-824C-26E21A7F18BC> /System/Library/Frameworks/RealityFoundation.framework/RealityFoundation

What is the reason, and how can I solve it?
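A hedged reading of these symptoms: the mangled symbol demangles to EnvironmentResource.init(cube:options:), and a dyld "Symbol not found" at launch usually means the device's RealityFoundation is older than the SDK the app was built against, for example a device still on visionOS 1.x while the simulator is 2.0. If that initializer is indeed newer than the device's OS, guarding the call with an availability check avoids the launch crash (the fallback below is only a placeholder; the equirectangular texture initializer may need the same guard):

// Sketch, assuming EnvironmentResource(cube:options:) is unavailable before visionOS 2.0.
func makeEnvironment(from cube: TextureResource) -> EnvironmentResource? {
    if #available(visionOS 2.0, *) {
        return try? EnvironmentResource(
            cube: cube,
            options: EnvironmentResource.CreateOptions(
                samplingQuality: .normal,
                specularCubeDimension: cube.width / 2
            )
        )
    } else {
        // On older visionOS, fall back to a prebuilt environment shipped with the app,
        // or skip image-based lighting entirely.
        return nil
    }
}

Checking the device's installed visionOS version against the app's deployment target should confirm or rule this out.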
1 reply · 0 boosts · 339 views · Aug ’24
Inquiry About the Precision of Apple Vision Pro LiDAR
Hello everyone, I am a developer working on the Apple Vision Pro platform, currently developing an application that relies heavily on the Vision Pro LiDAR sensor. To ensure the accuracy and performance of my application, I would like to gather more detailed information about the technical specifications of the LiDAR sensor, particularly in the following areas:

1. Distance Accuracy: How accurate is the LiDAR sensor at different distances?
2. Spatial Resolution: What is the smallest object size that the sensor can detect?
3. Environmental Impact: How does the performance of the LiDAR sensor vary under different lighting conditions or environmental factors (e.g., reflective surfaces, fog)?

I would greatly appreciate any detailed information or technical documentation regarding these questions. If there are any developers or Apple staff members who have insights on this, your input would be highly valued. Thank you in advance for your assistance!
1 reply · 0 boosts · 303 views · Aug ’24
The notification is invalid
I can execute an action by having Xcode send a notification to Reality Composer Pro via NotificationCenter, and I can send notifications to Xcode through the Notification action in Reality Composer Pro. However, within my project, neither side receives the other's notifications. To check whether there was an error in my code, I created a simple demo project with the same code, and it works normally there. It is perplexing that I cannot resolve this issue in my main project. Do I need additional modifications?
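For comparison, and hedged since the project's exact setup isn't shown: the Xcode-to-RCP direction shown in the "Compose interactive 3D content in Reality Composer Pro" session posts a notification roughly like the snippet below, and a mismatch in either the identifier string or the scene passed in userInfo is a common reason a Behaviors component's Notification trigger never fires (targetEntity and the identifier are illustrative):

// The identifier must match the one typed into the Notification trigger in RCP,
// and the scene must be the RealityKit scene that contains the target entity.
if let scene = targetEntity.scene {
    NotificationCenter.default.post(
        name: Notification.Name("RealityKit.NotificationTrigger"),
        object: nil,
        userInfo: [
            "RealityKit.NotificationTrigger.Scene": scene,
            "RealityKit.NotificationTrigger.Identifier": "MyTrigger"   // illustrative
        ]
    )
}

Since the same code works in a demo project, comparing when the entity is actually added to a scene (so that targetEntity.scene is non-nil) in both projects may be worth a look.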
1 reply · 0 boosts · 366 views · Aug ’24
Drag Gesture on Entity with PhysicsBodyComponent is not behaving properly (glitching)
Hi everyone, I'm new to Swift and visionOS development in general, so please go easy on me.

Currently, I'm looking at a sample project from a WWDC23 session that uses RealityKit and ARKit to add a cube entity to a scene via a tap gesture. The link to the sample project is here. Instead of adding a cube, I changed the code to add a USDZ model instead. Here is my code:

func add3DModel(tapLocation: SIMD3<Float>) {
    let placementLocation = tapLocation + SIMD3<Float>(0, 0.1, 0)
    guard let entity = try? Entity.load(named: "cake-usdz", in: realityKitContentBundle) else {
        logger.error("failed to load 3D model")
        return
    }

    // calculate the collision box (the boundaries)
    let entitySize = entity.visualBounds(relativeTo: nil)
    let width = entitySize.max.x - entitySize.min.x
    let height = entitySize.max.y - entitySize.min.y
    let depth = entitySize.max.z - entitySize.min.z
    // logger.debug("width: \(width), height: \(height), depth: \(depth)")

    // set collision shape
    let collisionShape = ShapeResource.generateBox(size: SIMD3<Float>(width, height, depth))
    entity.components.set(CollisionComponent(shapes: [collisionShape]))

    // set the position and input types to indirect
    entity.setPosition(placementLocation, relativeTo: nil)
    entity.components.set(InputTargetComponent(allowedInputTypes: .indirect))

    let material = PhysicsMaterialResource.generate(friction: 0.8, restitution: 0.0)
    entity.components.set(PhysicsBodyComponent(
        shapes: [collisionShape],
        mass: 1.0,
        material: material,
        mode: .dynamic
    ))

    contentEntity.addChild(entity)
}

This works fine so far. But when I tried to add a drag gesture to drag the added entity around, weird glitches started happening with the model: it jumps up and down, and sometimes even rotates around itself. Below is my code for the drag gesture. I placed it directly below the code for the spatial tap gesture in the sample project.

.gesture(DragGesture().targetedToAnyEntity().onChanged({ value in
    let targetedEntity = value.entity
    targetedEntity.position = value.convert(value.location3D, from: .local, to: .scene)
}))

At first, I thought my code was wrong. But after looking around and removing the PhysicsBodyComponent from the added model, the entity moved as intended while dragging. I can't figure out a solution to this. Could anyone help me? I'm currently on Xcode 16 beta 2 and visionOS 2.0. Because I'm on beta, I'm unsure if this is a bug or if I just missed something. Thank you.
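One hedged explanation that fits these symptoms: the drag sets the position directly while the dynamic physics body keeps simulating, so the simulation and the gesture fight over the transform and the entity jitters. A common pattern is to switch the body to .kinematic for the duration of the drag and hand it back to .dynamic on release; a sketch based on the gesture code above:

.gesture(
    DragGesture()
        .targetedToAnyEntity()
        .onChanged { value in
            let entity = value.entity
            // Let the drag own the transform: a kinematic body follows the position
            // we set instead of being integrated by the simulation.
            if var body = entity.components[PhysicsBodyComponent.self], body.mode != .kinematic {
                body.mode = .kinematic
                entity.components.set(body)
            }
            entity.position = value.convert(value.location3D, from: .local, to: .scene)
        }
        .onEnded { value in
            // Hand the entity back to the physics simulation once the drag ends.
            if var body = value.entity.components[PhysicsBodyComponent.self] {
                body.mode = .dynamic
                value.entity.components.set(body)
            }
        }
)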
1 reply · 0 boosts · 344 views · Aug ’24
Update World Anchor using object anchor
Hi. I display buildings in a mixed immersive view. Right now the building appears in relation to the person when the view is opened (a world anchor). To position the building precisely, I want to use object tracking, so I set up a project following the WWDC object tracking session. That works well, sort of... With an object anchor, the 3D object tied to the anchor disappears as soon as the tracked object is out of view, and with big objects you never get the chance to look around. I figure I need to give my 3D object a world anchor, and only have that world anchor update when a change in the object anchor is detected. How do I do that? Preferably using the tools in Reality Composer Pro (or very well explained, as I am new to code).
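A hedged sketch of the code side (assuming an ObjectTrackingProvider is already running in an ARKitSession and the building entity is parented to a world-space root rather than to the object anchor; names are illustrative): update the building's world transform only while the tracked object is visible, and stop updating when tracking is lost so the last known pose sticks.

// Inside an async task. `provider` is a running ObjectTrackingProvider and
// `buildingEntity` lives under a world-space root entity, not under the anchor.
for await update in provider.anchorUpdates {
    // Keep the last known pose while the tracked object is out of view.
    guard update.anchor.isTracked else { continue }
    buildingEntity.setTransformMatrix(update.anchor.originFromAnchorTransform, relativeTo: nil)
}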
4 replies · 0 boosts · 399 views · Aug ’24
Entity.applyTapForBehaviors() only works on Simulator, not device
I created a simple Timeline animation with only a "Play Audio" action in RCP, plus a Behaviors component with an "OnTap" trigger that fires this Timeline animation. In my code, I simply run Entity.applyTapForBehaviors() when something happens. The audio plays normally on the simulator but cannot be played on the device. Is there a potential bug that leads to this behavior? Environment below:

Simulator version: visionOS 2.0 (22N5286g)
Xcode version: Version 16.0 beta 4 (16A5211f)
Device version: visionOS 2.0 beta (latest)
1 reply · 0 boosts · 390 views · Aug ’24
How to trigger actions by OnCollision in Behaviors Component
This is all about using notifications to trigger actions in RCP's new Timeline system. After watching Compose interactive 3D content in Reality Composer Pro, I'm starting to wonder why we need to call Entity.applyTapForBehaviors in code to trigger content in a Behaviors component at all, given that in the Behaviors component we have already chosen OnTap to let a "tap notification" trigger our action (on a selected target object). Following that logic, if I select the OnCollision trigger, I should be writing something like CollisionEvent.entityA.applyCollisionForBehaviors, which doesn't exist. And of course the collision on my object doesn't trigger the action (because I only set things up in RCP, not in code). Setting aside that this post pointed out we could use the Behaviors component's OnNotification trigger for now, I found that I can still use the OnTap trigger by calling Entity.applyTapForBehaviors inside my subscribed collision Began event. That actually works better than OnCollision. So what are the design principles here? And how can I trigger a collision notification so that my Behaviors component's OnCollision trigger actually works?
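For reference, a hedged sketch of the workaround described above (the stored subscription and entity names are illustrative): subscribe to the collision Began event in code and forward it to the entity's OnTap behavior, which fires the timeline set up in RCP.

// Store the returned EventSubscription (e.g. in app state) so it stays alive.
collisionSubscription = content.subscribe(to: CollisionEvents.Began.self, on: targetEntity) { event in
    // Forward the collision to the Behaviors component's OnTap trigger.
    event.entityA.applyTapForBehaviors()
}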
0 replies · 0 boosts · 318 views · Aug ’24