Reality Composer Pro


Prototype and produce content for AR experiences using Reality Composer Pro.



Inquiry About the Precision of Apple Vision Pro LiDAR
Hello everyone, I am a developer working on the Apple Vision Pro platform, currently developing an application that relies heavily on the Vision Pro LiDAR sensor. To ensure the accuracy and performance of my application, I would like to gather more detailed information about the technical specifications of the LiDAR sensor, particularly in the following areas:

1. Distance accuracy: How accurate is the LiDAR sensor at different distances?
2. Spatial resolution: What is the smallest object size that the sensor can detect?
3. Environmental impact: How does the performance of the LiDAR sensor vary under different lighting conditions or environmental factors (e.g., reflective surfaces, fog)?

I would greatly appreciate any detailed information or technical documentation regarding these questions. If there are any developers or Apple staff members who have insights on this, your input would be highly valued. Thank you in advance for your assistance!
1 reply · 0 boosts · 316 views · Aug ’24

Compose interactive 3D content in Reality Composer Pro -- Build Error
Compilation of the project for the WWDC 2024 session titled "Compose interactive 3D content in Reality Composer Pro" fails. After applying the fix mentioned here (https://developer.apple.com/forums/thread/762030?login=true), the project still won't compile. Using Xcode 16 beta 7, I get these errors:

    error: [xrsimulator] Component Compatibility: EnvironmentLightingConfiguration not available for 'xros 1.0', please update 'platforms' array in Package.swift
    error: [xrsimulator] Component Compatibility: AudioLibrary not available for 'xros 1.0', please update 'platforms' array in Package.swift
    error: [xrsimulator] Component Compatibility: BlendShapeWeights not available for 'xros 1.0', please update 'platforms' array in Package.swift
    error: [xrsimulator] Exception thrown during compile: compileFailedBecause(reason: "compatibility faults")
    error: Tool exited with code 1
4 replies · 0 boosts · 502 views · Aug ’24

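A minimal sketch of the change those "[xrsimulator] Component Compatibility" messages ask for, assuming the sample's RealityKitContent package still declares visionOS 1.0 in its platforms array (package and target names below mirror the default Reality Composer Pro package layout):

    // swift-tools-version:6.0
    import PackageDescription

    let package = Package(
        name: "RealityKitContent",
        platforms: [
            // Raised from .visionOS(.v1): the errors name components
            // (EnvironmentLightingConfiguration, AudioLibrary, BlendShapeWeights)
            // that are not available on 'xros 1.0'.
            .visionOS(.v2)
        ],
        products: [
            .library(name: "RealityKitContent", targets: ["RealityKitContent"])
        ],
        targets: [
            .target(name: "RealityKitContent")
        ]
    )
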
Weird error about EnvironmentResource
    func createEnvironmentResource(image: UIImage) -> EnvironmentResource? {
        do {
            let cube = try TextureResource(
                cubeFromEquirectangular: image.cgImage!,
                quality: .normal,
                options: TextureResource.CreateOptions(semantic: .hdrColor)
            )
            let environment = try EnvironmentResource(
                cube: cube,
                options: EnvironmentResource.CreateOptions(
                    samplingQuality: .normal,
                    specularCubeDimension: cube.width / 2
                    // compression: .astc(blockSize: .block4x4, quality: .high)
                )
            )
            return environment
        } catch {
            print("error: \(error)")
        }
        return nil
    }

When I put this code in the project, it runs normally on the visionOS 2.0 simulator. When it runs on the real device, an error is reported at startup:

    dyld[987]: Symbol not found: _$s10RealityKit19EnvironmentResourceC4cube7optionsAcA07TextureD0C_AC0A10FoundationE13CreateOptionsVtKcfC
    Referenced from: <DEC8652C-109C-3B32-BE6B-FE634EC0D6D5> /private/var/containers/Bundle/Application/CD2FAAE0-415A-4534-9700-37D325DFA845/HomePreviewDEV.app/HomePreviewDEV.debug.dylib
    Expected in: <403FB960-8688-34E4-824C-26E21A7F18BC> /System/Library/Frameworks/RealityFoundation.framework/RealityFoundation

What is the reason, and how can I solve it?
1 reply · 0 boosts · 361 views · Aug ’24

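A hedged reading of the dyld error: the EnvironmentResource(cube:options:) initializer is not present in the RealityFoundation framework installed on the device, which is the pattern you see when the device runs an older visionOS than the simulator/SDK the code was built against. A minimal sketch, assuming that initializer is only available from visionOS 2.0, is to gate the call behind an availability check (and to confirm the device really is on visionOS 2.0):

    import RealityKit

    // Assumes EnvironmentResource(cube:options:) is a visionOS 2.0 API; the
    // availability check keeps the symbol weakly linked on older systems.
    func makeEnvironment(from cube: TextureResource) -> EnvironmentResource? {
        if #available(visionOS 2.0, *) {
            return try? EnvironmentResource(
                cube: cube,
                options: EnvironmentResource.CreateOptions(
                    samplingQuality: .normal,
                    specularCubeDimension: cube.width / 2
                )
            )
        } else {
            // Older visionOS: skip the custom environment rather than crash at launch.
            return nil
        }
    }
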
visionOS – Anchoring particle system to hand also anchors spawned particles
Dear Apple Developer Forums,

I am just starting out developing in Swift, using RealityKit and Reality Composer Pro, as a project I'm working on is transitioning from Unity to native only. I am trying to attach a particle system to the user's right hand, emitting from a single point and showing a 'spatial trail' of sorts, basically acting as a visualizer of the hand's spatial history.

However, in Reality Composer Pro, when I anchor my particle emitter's parent entity using an Anchor component, even though the "Particles Inherit Transform" option is unticked (false), all of the spawned particles are also anchored to the specified anchor position. The expected behavior is that the emitter itself is anchored, but the spawned particles retain their spawn position in world space.

Am I missing something, or does anchoring simply behave this way in relation to particle systems? Thank you!

RCP 1.0, Xcode 15.4, visionOS 1.2
1 reply · 0 boosts · 432 views · Aug ’24

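For comparison, a minimal programmatic sketch of the same setup, with hypothetical entity names and assuming a visionOS immersive space is open (hand anchoring does not apply on iOS). It does not change how Reality Composer Pro interprets the "Particles Inherit Transform" flag, but recreating the emitter in code can help isolate whether the anchoring or the particle inheritance is responsible:

    import RealityKit

    // Hypothetical setup: an emitter entity parented to a right-palm anchor.
    func makeHandTrailEmitter() -> AnchorEntity {
        let handAnchor = AnchorEntity(.hand(.right, location: .palm))

        let emitterEntity = Entity()
        // Default emitter used as a stand-in for the Reality Composer Pro particle system.
        emitterEntity.components.set(ParticleEmitterComponent())
        handAnchor.addChild(emitterEntity)

        return handAnchor
    }

    // Added to a RealityView elsewhere, e.g. content.add(makeHandTrailEmitter())
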
Material Reference from Reality Composer Pro
I have a model entity (from Reality Composer Pro), and I want to change the material of the model entity in Swift. The material is also imported in Reality Composer Pro. I am copying the USDZ file of the material into the same directory as the script. This is the code I am using to reference the material:

    do {
        // Load the file data
        if let materialURL = Bundle.main.url(forResource: "BlackABSPlastic", withExtension: "usdz") {
            let materialData = try Data(contentsOf: materialURL)

            // Check the first few bytes of the data to see if it matches expected types
            let headerBytes = materialData.prefix(4)
            let headerString = String(decoding: headerBytes, as: UTF8.self)

            // Print out the header information for debugging
            print("File header: \(headerString)")

            // Attempt to load the ShaderGraphMaterial
            let ScratchedMetallicPaint = try await ShaderGraphMaterial(
                named: "BlackABSPlastic",
                from: materialData
            )
            print(ScratchedMetallicPaint)
        } else {
            print("BlackABSPlastic.usdz file not found.")
        }
    } catch {
        // Catch the error and print it
        print("BlackABSPlastic load failed: \(error)")

        // Attempt to infer file type based on the error or file content
        if let error = error as? DecodingError {
            switch error {
            case .typeMismatch(let type, _):
                print("Type mismatch: Expected \(type)")
            case .dataCorrupted(let context):
                print("Data corrupted: \(context.debugDescription)")
            default:
                print("Decoding error: \(error)")
            }
        } else {
            print("Unexpected error: \(error)")
        }
    }

I am receiving these errors:

    File header: PK
    TBB Global TLS count is not == 1, instead it is: 2
    Unable to create stage from in-memory buffer.
    BlackABSPlastic load failed: internalImportError
    Unexpected error: internalImportError

Am I doing anything wrong? I am able to access the materials of the model entity easily, but this seems to be something different. How can this be resolved? Thanks.
1 reply · 0 boosts · 435 views · Aug ’24

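A minimal sketch of the more common route, assuming the material lives in a Reality Composer Pro scene inside the project's RealityKitContent package rather than in a stand-alone USDZ copied next to the source file. The scene name "MaterialsScene" and the prim path "/Root/BlackABSPlastic" are placeholders; the real prim path can be read from the RCP hierarchy:

    import RealityKit
    import RealityKitContent

    func applyBlackABSPlastic(to model: ModelEntity) async {
        do {
            // Load the material by its prim path from a scene in the RCP bundle.
            let material = try await ShaderGraphMaterial(
                named: "/Root/BlackABSPlastic",   // placeholder prim path
                from: "MaterialsScene.usda",      // placeholder scene file
                in: realityKitContentBundle
            )
            // Replace the model's materials with the loaded one.
            model.model?.materials = [material]
        } catch {
            print("ShaderGraphMaterial load failed: \(error)")
        }
    }
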
How to use SpatialTapGesture to pin a SwiftUI view to entity
My goal is to pin an attachment view precisely at the point where I tap on an entity using SpatialTapGesture. However, the current code doesn't pin the attachment view accurately to the tapped point. Instead, it often appears in space rather than on the entity itself. The issue might be due to an incorrect conversion of coordinates or values.

My code:

    struct ImmersiveView: View {
        @State private var location: GlobeLocation?

        var body: some View {
            RealityView { content, attachments in
                guard let rootEntity = try? await Entity(named: "Scene", in: realityKitContentBundle) else { return }
                content.add(rootEntity)
            } update: { content, attachments in
                if let earth = content.entities.first?.findEntity(named: "Earth"),
                   let desView = attachments.entity(for: "1") {
                    let pinTransform = computeTransform(for: location ?? GlobeLocation(latitude: 0, longitude: 0))
                    earth.addChild(desView)
                    // desView.transform =
                    desView.setPosition(pinTransform, relativeTo: earth)
                }
            } attachments: {
                Attachment(id: "1") {
                    DescriptionView(location: location)
                }
            }
            .gesture(DragGesture().targetedToAnyEntity().onChanged({ value in
                value.entity.position = value.convert(value.location3D, from: .local, to: .scene)
            }))
            .gesture(SpatialTapGesture().targetedToAnyEntity().onEnded({ value in
            }))
        }

        func lookUpLocation(at value: CGPoint) -> GlobeLocation? {
            return GlobeLocation(latitude: value.x, longitude: value.y)
        }

        func computeTransform(for location: GlobeLocation) -> SIMD3<Float> {
            // Constants for Earth's radius. Adjust this to match the scale of your 3D model.
            let earthRadius: Float = 1.0

            // Convert latitude and longitude from degrees to radians
            let latitude = Float(location.latitude) * .pi / 180
            let longitude = Float(location.longitude) * .pi / 180

            // Calculate the position in Cartesian coordinates
            let x = earthRadius * cos(latitude) * cos(longitude)
            let y = earthRadius * sin(latitude)
            let z = earthRadius * cos(latitude) * sin(longitude)
            return SIMD3<Float>(x, y, z)
        }
    }

    struct GlobeLocation {
        var latitude: Double
        var longitude: Double
    }
7 replies · 0 boosts · 435 views · Aug ’24

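A minimal sketch of one way to fill in the empty SpatialTapGesture handler, assuming the goal is to express the tap point in the tapped entity's own space and park the attachment entity there. The pinEntity parameter stands in for attachments.entity(for: "1") after it has been parented to the globe; value.convert(_:from:to:) does the coordinate work:

    import SwiftUI
    import RealityKit

    // Hypothetical helper: move a pin entity to the tapped point on the tapped entity.
    func pin(_ pinEntity: Entity, at value: EntityTargetValue<SpatialTapGesture.Value>) {
        // Convert the tap location from the gesture's local space into the
        // tapped entity's coordinate space.
        let localPoint = value.convert(value.location3D, from: .local, to: value.entity)
        pinEntity.setPosition(localPoint, relativeTo: value.entity)
    }

    // Used from the view, e.g.:
    // .gesture(SpatialTapGesture().targetedToAnyEntity().onEnded { value in
    //     if let desView = pinEntity { pin(desView, at: value) }
    // })
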
Object Tracking (moving objects)
From my early testing, it seems like object tracking works best for static objects. For example, if I am holding something in my hand, the object tracker is slow to update. Is there anything that can be modified to decrease the tracking latency? I noticed that the Enterprise API has some override features; is this something that can only be done using the Enterprise APIs?
1 reply · 0 boosts · 457 views · Aug ’24

Export USDZ With Unlit Shader From Maya
Hello! I have the great fortune to be working on a joint Apple/(unnamed brand) app. I am teamed up with a photogrammetry vendor and will be assembling a scene in Maya, using a UDIM workflow with retouched assets from Substance Painter. This will be for an immersive environment, similar to the Joshua Tree and Hawaiian environments on AVP, which are great.

We did a first pass, bringing USDZ files into Reality Composer Pro and sending them to the headset; very cool. However, the models have a specular component that is washing everything out. After further research, I learned that I should not be using a physically based shader, but instead an unlit shader. I had to manually create this in RCP, and then it viewed properly. The issue is: every time I need to add a new asset, will I now have to manually create numerous custom shaders?

Ideally, I want to either figure out how to get an unlit shader specified in Substance Painter that exports properly, or get it to export from Maya. I tried using a Maya SurfaceShader (which is essentially unlit), but that does not export properly. I found that a Maya StandardSurface exports properly; however, it still imports as a physically based shader. The conundrum is that I'll be working with a UDIM workflow and would like to avoid having to manually create and hook up what could be between 10 and 40 textures per USDZ file.

I guess what I'm trying to ask is: what is the preferred shader to use in Maya when exporting to USDZ so that it imports as unlit? Or, is there a way to easily switch from physically based to unlit inside RCP? And NOT by doing it in Xcode/Swift, because the files I need to deliver must be USDZ files that the developer will assemble themselves.

I'm using Maya because I need to work on my PC for the heavy GPU lifting, and also, it's just easier to assemble everything there vs. Reality Composer Pro, which is like the iMovie of game engines. ;) (Please take that constructively; it really needs to be more industry standard.) I just need to make sure I can use Maya with the proper shaders to export my pieces, quickly send them to RCP (with the proper unlit specification), then over to the headset so I can check everything in real time with my photogrammetry vendor.

Any help or advice is greatly appreciated! I'm really excited to be working on my first Vision Pro app. Thx!
4 replies · 0 boosts · 580 views · Aug ’24

Having trouble loading audio file resources from the RCP bundle
RealityKitContent bundle resource issue

Recently I keep encountering weird loading bugs with the RealityKitContent bundle. I was trying to load an audio resource as an AudioFileResource or AudioFileGroupResource from a *.usda in the RealityKitContent bundle, with this method. My code is nothing complicated, simply:

    let primPath: String = "/SampleAudios/SE_bounce_audio"
    guard let resource = try? AudioFileGroupResource.load(
        named: primPath,
        from: "MyScene.usda",
        in: realityKitContentBundle
    ) else { return }

At runtime the program "sometimes" (whenever I change something in RCP it sometimes works again, but the behavior is unpredictable) reports that it "Cannot find MyScene.usda:/SampleAudios/SE_bounce_audio in RealityKitContent.bundle". I put MyScene.usda under the root folder of the RealityKitContent package because I found that RealityKit just cannot find any *.usda scene if you don't put it at the root level (could be a bug in the way it indexes its files). I even double-checked my .usda file with usdview; the primPath is absolutely correct. I think there are some unknown issues in how RealityKitContent copies resources and builds the package. I tried to play with the Package.swift file a bit to see if I could manually copy my resources (everything) and let the package carry them, but it just didn't work. So right now I keep this file untouched as below (just upgrading swift-tools-version to 6.0, since only that supports .visionOS(.v2)):

    // swift-tools-version:6.0
    // The swift-tools-version declares the minimum version of Swift required to build this package.

    import PackageDescription

    let package = Package(
        name: "RealityKitContent",
        platforms: [
            .visionOS(.v2)
        ],
        products: [
            // Products define the executables and libraries a package produces, and make them visible to other packages.
            .library(
                name: "RealityKitContent",
                targets: ["RealityKitContent"]),
        ],
        dependencies: [
            // Dependencies declare other packages that this package depends on.
            // .package(url: /* package url */, from: "1.0.0"),
        ],
        targets: [
            // Targets are the basic building blocks of a package. A target can define a module or a test suite.
            // Targets can depend on other targets in this package, and on products in packages this package depends on.
            .target(
                name: "RealityKitContent"
            ),
        ]
    )

That is just issue one, the RealityKitContent package build issue.

Audio file format issue

The other issue is about the audio file formats RCP supports. I remember a place (WWDC?) saying .wav and .mp4 are supported as audio sources. But when I try to set up Spatial Audio, I find that sometimes *.wav or *.mp3 files can also be imported as an AudioSourceFile, but the behavior is unpredictable. With two *.wav files, SE_ball_hit_01.wav and SE_ball_hit_02.wav, only SE_ball_hit_01.wav is supported; 02 is reported as having an unsupported format. Check out my screenshots to see the details of the two files; they have almost the same format (same sample rate and channels). I understand there might be different requirements for a source file to be used as Spatial or Ambient audio, but I haven't figured that out, and there is nothing I can find helpful in the Apple documentation. So what are the rules?

Thanks for reading, and any thought is welcome.
1 reply · 0 boosts · 426 views · Aug ’24

Is it possible to load Reality Composer Pro scenes from a URL? (visionOS)
My visionOS app has over 150 3D models that work flawlessly via Firebase URLs. I also have some RealityKitContent scenes (stored locally) that are getting pretty large, so I'm looking to move those into Firebase as well and call on them as needed. I can't find any documentation that has worked for this and can't find anyone who has talked about it. Does anyone know if this is possible? I tried exporting the Reality Composer Pro scenes as USDZs and then importing them, but kept getting material errors. Maybe there's a way to call on each 3D model separately and have RealityKit build them into the scene? I'm not really sure. Any help would be much appreciated!
1 reply · 0 boosts · 423 views · Aug ’24

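A minimal sketch of the pattern that usually works for plain 3D models, assuming the remote asset is a USDZ file: download it to a local URL, give it a .usdz extension, and load it with Entity(contentsOf:). Reality Composer Pro-specific features (custom components, timelines, shader-graph materials) generally don't survive a USDZ export, which may explain the material errors mentioned above; the URL is a placeholder supplied by the caller:

    import Foundation
    import RealityKit

    // Download a remote USDZ and load it as an Entity.
    func loadRemoteModel(from remoteURL: URL) async throws -> Entity {
        // Download to a temporary file.
        let (tempURL, _) = try await URLSession.shared.download(from: remoteURL)

        // RealityKit keys off the file extension, so copy to a .usdz path.
        let localURL = FileManager.default.temporaryDirectory
            .appendingPathComponent(UUID().uuidString)
            .appendingPathExtension("usdz")
        try FileManager.default.moveItem(at: tempURL, to: localURL)

        // Load the downloaded scene.
        return try await Entity(contentsOf: localURL)
    }
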
Component with SIMD3<Float>
I want to use SIMD values with a design-time component.

    public struct SomeComponent: Component, Codable {
        public var Magnitude: SIMD3<Float> = .zero
    }

Is extra work required? I had understood that serialization of simple values, including SIMD, would be handled by Reality Composer Pro.

At run time I get the error:

    decodeComponent: Unexpected error: keyNotFound(CodingKeys(stringValue: "Magnitude", intValue: nil), Swift.DecodingError.Context(codingPath: [], debugDescription: "No value associated with key CodingKeys(stringValue: \"Magnitude\", intValue: nil) (\"Magnitude\").", underlyingError: nil))
    Asset deserialization failed. Asset type "SceneAsset". Details: Failed to deserialize "/container/@shared/17/object". Reason: Failed to deserialize Swift Codable component of type RealityKitContent.SomeComponent.
2 replies · 0 boosts · 338 views · Sep ’24

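A hedged workaround sketch, assuming the keyNotFound error means the component instance stored in the .usda predates the Magnitude property (so the key simply isn't in the serialized data): a hand-written init(from:) with decodeIfPresent lets old scene data fall back to the default instead of failing. Re-saving the component on the entity in Reality Composer Pro after changing its stored properties is the other thing worth trying.

    import RealityKit

    public struct SomeComponent: Component, Codable {
        public var Magnitude: SIMD3<Float> = .zero

        public init() {}

        enum CodingKeys: String, CodingKey {
            case Magnitude
        }

        public init(from decoder: Decoder) throws {
            let container = try decoder.container(keyedBy: CodingKeys.self)
            // Tolerate scene data that was saved before Magnitude existed.
            Magnitude = try container.decodeIfPresent(SIMD3<Float>.self, forKey: .Magnitude) ?? .zero
        }
    }
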
Post Notification to RCP but Timeline won't fire
I am trying to use onNotification in a BehaviorComponent to fire up my composed timeline actions, which are formed by one TransformTo action, one Hide action, and a Notification action indicating that the other two actions are finished.

With this post, I successfully send a notification to RCP to fire up my timeline with an identification:

    NotificationCenter.default.post(
        name: NSNotification.Name("RealityKit.NotificationTrigger"),
        object: nil,
        userInfo: [
            "RealityKit.NotificationTrigger.Scene": scene,
            "RealityKit.NotificationTrigger.Identifier": "onSomethingStart"
        ]
    )

On the other hand, to subscribe to that Notification action, I append an onReceive below my RealityView and successfully receive my notification:

    private let notificationTrigger = NotificationCenter.default.publisher(
        for: Notification.Name("RealityKit.NotificationTrigger"))

    // Inside .onReceive(notificationTrigger) { out in ... }
    guard let entity = out.userInfo?["RealityKit.NotificationTrigger.SourceEntity"] as? Entity,
          let notificationName = out.userInfo?["RealityKit.NotificationTrigger.Identifier"] as? String
    else { return }
    debugPrint("Received notification: \(notificationName), entity name: \(entity.name)")

This means my timeline is fired up, because I can receive the notification from my timeline. But the other two actions just don't appear to be working. When I play the timeline in RCP, it works fine. Anything I missed to make it tick?

Xcode 16.1 beta, visionOS beta 9
2 replies · 0 boosts · 443 views · Sep ’24

Open Reality Composer Pro scenes as if they were files
Is there a good way to have an app open scenes from Reality Composer Pro without those scenes being part of the app, kind of like you would browse any other file?

I have a business where I provide walk-throughs of building plans. I do this by building an app containing a Reality Composer Pro scene of the customer's building. For each customer I duplicate the app and change the scene content (one app per customer). In order to scale my business, I would love to be able to distribute the scenes to the customers so they could just open them in the app, but I don't see how to do that.

If you have any idea how this can be done, that would be great! I have very little experience in coding, so please assume I don't know what you are talking about when explaining.
3 replies · 0 boosts · 292 views · Sep ’24

Blurred Background (RealityKit) Shader Graph Node not working on iOS/macOS
The Shader Graph node Blurred Background (RealityKit) – https://developer.apple.com/documentation/shadergraph/realitykit/blurred-background-(realitykit) – works fine within the Reality Composer Pro 2 editor but isn't working on iOS 18 or macOS 15. Instead of the blurred content, it just renders as opaque in a single color (Screenshot 2).

Interestingly, it also fails to render within Reality Composer Pro when no other entities are in the scene, e.g. when only a background skybox is set.

Expected behavior: it would be great if this node worked the same way as it does on visionOS, since this would allow for really interesting and nice effects in scenes.

Feedback ID: FB15081190
0 replies · 1 boost · 317 views · Sep ’24

ShaderGraphMaterial with Occlusion Surface Output fails to load on iOS and macOS
A ShaderGraphMaterial with an Occlusion Surface output generated with Reality Composer Pro 2 fails to load on iOS 18 and macOS 15 with the following error:

    RealityFoundation.ShaderGraphMaterial.LoadError.invalidTypeFound

(https://developer.apple.com/documentation/realitykit/shadergraphmaterial/loaderror/invalidtypefound)

This happens with both https://developer.apple.com/documentation/shadergraph/realitykit/occlusion-surface-(realitykit) and https://developer.apple.com/documentation/shadergraph/realitykit/shadow-receiving-occlusion-surface-(realitykit).

    RealityView { content in
        do {
            let bgEntity = ModelEntity(
                mesh: .generateCone(height: 0.5, radius: 0.1),
                materials: [SimpleMaterial(color: .red, isMetallic: true)]
            )
            bgEntity.position.z = -0.2
            content.add(bgEntity)

            let occlusionMaterial = try await ShaderGraphMaterial(
                named: "/Root/OcclusionMaterial",
                from: "OcclusionMaterial"
            )
            let testEntity = ModelEntity(
                mesh: .generateSphere(radius: 0.4),
                materials: [occlusionMaterial]
            )
            content.add(testEntity)
            content.cameraTarget = testEntity
        } catch {
            print("Shader Graph Load Error:")
            dump(error)
        }
    }
    .realityViewCameraControls(.orbit)
    .edgesIgnoringSafeArea(.all)

Feedback ID: FB15081296
0 replies · 0 boosts · 307 views · Sep ’24

How to remove the impact of AR real environment light sources on materials
The 3D furniture model I built uses some smooth, specular-reflection materials. I want them to reflect only the HDR image of the ImageBasedLight component I set myself, without reflecting light from the real AR environment. How can I achieve this in the following scenarios?

1. How do I avoid being affected by the real AR environment's light sources when using PBR materials?
2. When using Shader Graph, how can EnvironmentRadiance not be affected by the real AR environment's light sources?
2 replies · 0 boosts · 360 views · Sep ’24

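A minimal sketch of the custom image-based-light route the post refers to, assuming an EnvironmentResource named "MyHDRI" is available to load (a placeholder name) and that the furniture entity is in a RealityKit scene. It shows how the source and receiver components are wired up; whether this alone fully suppresses reflections of the real surroundings for PBR or EnvironmentRadiance-based materials is exactly the open question here:

    import RealityKit

    // Light `furniture` with a custom HDR image (resource name is a placeholder).
    func applyCustomIBL(to furniture: Entity) async throws {
        let environment = try await EnvironmentResource(named: "MyHDRI")

        // The entity that provides the image-based light...
        furniture.components.set(
            ImageBasedLightComponent(source: .single(environment), intensityExponent: 1.0)
        )
        // ...and the entity that should be lit by it.
        furniture.components.set(
            ImageBasedLightReceiverComponent(imageBasedLight: furniture)
        )
    }
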
Weird Reality Composer Pro Loop Timeline bug
TLDR: Timeline does not play the animation when Repeat Forever is checked.

Hi! I have created a timeline for my model that does a built-in emphasize animation. Then I added a behavior to my model and set OnAddedToScene with an action to run that timeline. It works perfectly well on my device. But I want the timeline to loop. I realized that there's no loop option in the timeline, but I noticed that I can loop it if I insert it into another timeline (the loop checkbox shows up). So I did that and had my model's behavior run that outer timeline. But then the model doesn't play the animation as intended.

Note: I am not making a Vision Pro app, but an iOS app leveraging ARKit and RealityKit.

Environment: iPhone 13 Pro Max with iOS 18.0

Code:

    struct ARViewContainer: UIViewRepresentable {
        func makeUIView(context: Context) -> ARView {
            let arView = ARView(frame: .zero)
            arView.session.run()
            Task {
                do {
                    let anchor = AnchorEntity(plane: .horizontal)
                    let emojiScene = try await Entity(named: "SunglassesScene", in: bubbleAR)
                    anchor.addChild(emojiScene)
                    arView.scene.addAnchor(anchor)
                } catch {
                    print("Failed to load models: \(error)")
                }
            }
            return arView
        }
    }

Thank you!
0 replies · 0 boosts · 331 views · Sep ’24

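A minimal sketch of a code-side fallback for the looping itself, assuming the emphasize animation ends up in the loaded entity's availableAnimations (a hypothetical assumption; timelines authored in Reality Composer Pro don't necessarily surface there). AnimationResource's repeat method produces a looping clip that playAnimation can run on iOS without relying on the nested-timeline loop checkbox:

    import RealityKit

    // Loop the first built-in animation on a loaded entity, if there is one.
    func loopFirstAnimation(on entity: Entity) {
        guard let animation = entity.availableAnimations.first else { return }
        // Repeating with .infinity keeps the clip playing until it is stopped.
        entity.playAnimation(animation.repeat(duration: .infinity),
                             transitionDuration: 0.25,
                             startsPaused: false)
    }
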