I am trying to create a simple custom shader that uses an image as the material texture and a depth map as the bump-map information. I have followed the official procedure from the "Explore materials in Reality Composer Pro" session, but the depth map is not processed.
What am I doing wrong?
(attached is a screenshot that shows the setup. I removed the image ref for clarity)
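In case it helps to rule out the runtime side, here is a minimal sketch of loading the Shader Graph material from code and binding the depth texture by hand; the material path "/Root/BumpMaterial", the promoted input name "DepthMap", the scene file "Scene.usda", and the texture name "depth_map" are all placeholders that would have to match your Reality Composer Pro setup.

import RealityKit
import RealityKitContent  // Swift package generated by the Reality Composer Pro project

// Minimal sketch: fetch the Shader Graph material and set its depth-map input explicitly.
// The parameter can only be set at runtime if it was promoted to an input of the graph.
func applyBumpMaterial(to model: ModelEntity) async {
    do {
        var material = try await ShaderGraphMaterial(named: "/Root/BumpMaterial",
                                                     from: "Scene.usda",
                                                     in: realityKitContentBundle)
        let depthTexture = try TextureResource.load(named: "depth_map",
                                                    in: realityKitContentBundle)
        try material.setParameter(name: "DepthMap",
                                  value: .textureResource(depthTexture))
        model.model?.materials = [material]
    } catch {
        print("Material setup failed: \(error)")
    }
}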
Reality Composer Pro
Prototype and produce content for AR experiences using Reality Composer Pro.
I would like to apply different textures to the front and back faces of a 3D model. Specifically, when applying a texture that cuts the object in half through opacity, I want to be able to observe the back face of the object and give it a different color from the front face.
In Unity, there is an 'isFrontFace' boolean node that allows different colours to be applied to the front and rear faces. However, I am unsure how to achieve the same effect in Reality Composer Pro!
The 3D model is already two-sided.
Where can I find a specification document for the displacement file "baked_mesh_disp0.exr" obtained from a Full Quality run in Reality Composer Pro?
I ran Reality Composer Pro, selected Full Quality, ran Create Model, and obtained a *.usdz file, which I renamed to *.zip and unzipped.
Inside I found 5 maps, including "baked_mesh_disp0.exr", and I want to know its data specification.
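I don't know of a published spec for that file, but as a starting point you can dump what the EXR actually contains using ImageIO; a minimal sketch, assuming the unzipped file sits at the path given below.

import Foundation
import ImageIO

// Minimal sketch: print the basic properties (pixel size, bit depth, color model)
// of the displacement map. Adjust the path to wherever you unzipped the USDZ.
let url = URL(fileURLWithPath: "baked_mesh_disp0.exr")
if let source = CGImageSourceCreateWithURL(url as CFURL, nil),
   let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any] {
    print("Width:", properties[kCGImagePropertyPixelWidth] ?? "unknown")
    print("Height:", properties[kCGImagePropertyPixelHeight] ?? "unknown")
    print("Bits per sample:", properties[kCGImagePropertyDepth] ?? "unknown")
    print("Color model:", properties[kCGImagePropertyColorModel] ?? "unknown")
    print("Has alpha:", properties[kCGImagePropertyHasAlpha] ?? "unknown")
} else {
    print("Could not read EXR properties")
}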
I created a simple primitive shape and I want to add a different color to each face of the cube. I was thinking of using Shader Graph, but I have no idea how to target each face with a different color. Any lead or help would be great. This tech is new, so help documentation is very sparse.
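For what it's worth, RealityKit can also do this entirely in code: generateBox has a splitFaces option that emits one submesh per face, so you can pass six materials and get one per face. A minimal sketch (the size and colors are just examples):

import RealityKit
import UIKit

// Minimal sketch: a cube whose six faces each receive their own material.
// splitFaces: true makes generateBox produce a separate part per face,
// so the materials array below is applied face by face.
func makeColoredCube() -> ModelEntity {
    let mesh = MeshResource.generateBox(width: 0.2, height: 0.2, depth: 0.2,
                                        cornerRadius: 0, splitFaces: true)
    let colors: [UIColor] = [.red, .green, .blue, .yellow, .orange, .purple]
    let materials = colors.map { SimpleMaterial(color: $0, isMetallic: false) }
    return ModelEntity(mesh: mesh, materials: materials)
}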
I have various .reality files published on a website as part of a learning product, which I deployed Feb. 2023 using the latest Reality Composer at the time.
Users informed me that none of the .reality files will open on iOS 17, which I have confirmed. They still open fine on iOS 16.
On iOS 17 the QuickLook viewer says "Object requires a newer version of iOS."
What gives? Did Apple deprecate the .reality format, or are these files designed to work on only one version of iOS?
I've tried converting a glTF to USDZ, but it only puts out a USDZ-conversion error. Under details it only says, "An unexpected error occurred while converting this file to USDZ. Please fix any other errors and try again."
This error just showed up recently. I'm on macOS 12.3.1 and Reality Converter 1.0 (47.1).
In my Reality Composer scene, I have added spatial audio. How do I play it from my Swift code?
I loaded the scene the following way:
myEntity = try await Entity(named: "grandScene", in: realityKitContentBundle)
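Not knowing your exact scene setup, the following is a minimal sketch; the resource path "/Root/SpatialAudio", the scene file name "grandScene.usda", and the emitter entity name "SpatialAudio" are placeholders that need to match what you authored.

import RealityKit
import RealityKitContent

// Minimal sketch: load the audio resource authored in the scene and play it on the
// entity that carries the spatial audio component (falling back to the scene root).
func playSpatialAudio(in sceneEntity: Entity) {
    do {
        let resource = try AudioFileResource.load(named: "/Root/SpatialAudio",
                                                  from: "grandScene.usda",
                                                  in: realityKitContentBundle)
        let emitter = sceneEntity.findEntity(named: "SpatialAudio") ?? sceneEntity
        emitter.playAudio(resource)
    } catch {
        print("Audio playback failed: \(error)")
    }
}

playAudio(_:) returns an AudioPlaybackController you can hold on to if you later need to pause or stop the clip.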
I'm having trouble figuring out how to open Reality Composer from Xcode. I've tried opening Xcode, selecting "Xcode" from the menu bar, then choosing "Open Developer Tool" and "More Developer Tools" to reach the download page, but Reality Composer is not listed there.
Additionally, when I try to check the details of Reality Composer through this link: https://developer.apple.com/augmented-reality/tools/, it redirects me to the iOS app page, and I can't find a download for Mac. I would appreciate it if someone could provide guidance!
The .obj, .mtl and .jpg files created by the 3D scanner were used to create .usdz files with Reality Converter.
Does the created .usdz file contain data on the size of the object? And how can I display it at actual size using Reality Composer?
Thank you in advance.
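RealityKit works in meters at runtime, so one quick way to check what size the converted model carries is to load it and read its visual bounds; a minimal sketch, assuming the file is bundled as "scan.usdz" (a placeholder name).

import RealityKit

// Minimal sketch: load the scanned model and print its bounding-box extents,
// which RealityKit reports in meters. "scan" is a placeholder asset name.
func printModelSize() throws {
    let entity = try Entity.load(named: "scan")
    let extents = entity.visualBounds(relativeTo: nil).extents
    print("Size in meters: width \(extents.x), height \(extents.y), depth \(extents.z)")
}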