Behavior:
The Orbit animation doesn't play, with either the OnTap trigger or the OnAddedToScene trigger. It is not an issue with my code, because when I tested with an Emphasize (Float) animation instead, it played perfectly.
Environment:
ARKit + RealityKit, iOS 18
My animation timeline settings:
A single Orbit animation block with a target and a pivot entity: 1 s duration, clockwise orbit direction, axis (0, 1, 0), 1 revolution, and blend layer 300.
My Behavior setting:
OnTap -> play the animation
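One way to narrow this down is to bypass the behavior system and play the same orbit from code with RealityKit's OrbitAnimation; if that works, the timeline export is the likely culprit. A minimal sketch, assuming `target` is the orbiting entity (a placeholder name); note that the code path orbits around the parent's origin, so the target may need to be parented under the pivot entity first:

import RealityKit

// Rebuild the timeline's orbit in code: 1 s, clockwise, axis (0, 1, 0), 1 revolution.
let orbit = OrbitAnimation(
    duration: 1.0,
    axis: [0, 1, 0],
    startTransform: target.transform,
    spinClockwise: true,
    rotationCount: 1.0
)
if let resource = try? AnimationResource.generate(with: orbit) {
    target.playAnimation(resource)
}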
How do I create a fire effect in Reality Composer Pro? Should I use a particle emitter?
I have no clue at all; any suggestions are welcome. Thank you.
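In general, yes: a particle emitter is the standard tool for fire. You can add a Particle Emitter component to an entity in Reality Composer Pro and tune it in the inspector, or configure one in code on a recent RealityKit (visionOS, or iOS 18). A rough sketch of a fire-like emitter; `fireEntity` is a placeholder, and every value is an illustrative guess to tune by eye:

import RealityKit

var emitter = ParticleEmitterComponent()
emitter.emitterShape = .cone                 // emit upward out of a cone
emitter.speed = 0.3                          // slow vertical drift
emitter.mainEmitter.birthRate = 500          // dense plume
emitter.mainEmitter.lifeSpan = 0.8           // short-lived flames
emitter.mainEmitter.size = 0.03
// Fade each particle from hot orange to deep red over its lifetime.
emitter.mainEmitter.color = .evolving(start: .single(.orange), end: .single(.red))
fireEntity.components.set(emitter)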
In RealityKit, I know that an HDR image can be pre-computed and, through the settings of the ImageBasedLight component, a specified specular object can be made to reflect the content of that HDR image.
But if the mirror-like object is very large, such as a continuous wall of glass doors, then after an IBL image is assigned to those doors, the reflection is visibly distorted as the viewer moves through the space. That is because an IBL image captures the surrounding environment at a single point, while the glass door is an extended surface.
Is there a truly real-time specular reflection setup in RealityKit that can reflect the models on the opposite side of the glass door?
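For reference, the point-sampled setup being described is roughly the following sketch ("GlassEnvironment" is a placeholder resource name); the question is whether anything beyond this single-probe model exists:

import RealityKit

// Light and reflect with a pre-computed environment image (one probe point).
let environment = try await EnvironmentResource(named: "GlassEnvironment")
glassDoor.components.set(ImageBasedLightComponent(source: .single(environment)))
// The receiver references the entity that carries the light component.
glassDoor.components.set(ImageBasedLightReceiverComponent(imageBasedLight: glassDoor))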
Hello! I am working on some cool project reconstructions for a client. They design lobby-sized installations with LED walls, and I am being tasked with converting these to AR at scale. I've got my first test in headset and it looks great! However, the desire to walk around beyond the 10' x 10' safe-area zone totally takes one out of the immersive VR experience, which is pretty counterintuitive. Is there any way for us developers to bypass this hard limit, so that clients who are requesting more room-scale options can actually enjoy this in VR?
Alternatively, is there a way to hook up a PS5 controller to a "player start" so I can navigate inside the VR volume? I'm really trying to embrace Reality Composer Pro, but it seems extremely limiting as I wait for Unreal Engine to get its act together. Sigh. Thanks for any help or suggestions.
Hello, I am getting the following errors on the console and my app crashes: the screen goes dark, the Apple logo appears, and the app terminates.
apply fence tx failed (client=0x61dbbfd7) [0xfffffecc (ipc/mig) server died]
[C:3] Error received: Connection interrupted.
Failed to commit transaction (client=0x94097449) [0x10000003 (ipc/send) invalid destination port]
Failed to commit transaction (client=0xe9684b50) [0x10000003 (ipc/send) invalid destination port]
[C:3-1] Error received: Connection interrupted.
Failed to commit transaction (client=0xbcac17e9) [0x10000003 (ipc/send) invalid destination port]
Failed to commit transaction (client=0x52392119) [0x10000003 (ipc/send) invalid destination port]
Failed to commit transaction (client=0xff841d17) [0x10000003 (ipc/send) invalid destination port]
Failed to commit transaction (client=0xdef5c915) [0x10000003 (ipc/send) invalid destination port]
Failed to commit transaction (client=0xefdc8bf3) [0x10000003 (ipc/send) invalid destination port]
Failed to commit transaction (client=0xd50c1eff) [0x10000003 (ipc/send) invalid destination port]
Failed to commit transaction (client=0x15690a46) [0x10000003 (ipc/send) invalid destination port]
Failed to commit transaction (client=0xf296f56b) [0x10000003 (ipc/send) invalid destination port]
Failed to commit transaction (client=0x61dbbfd7) [0x10000003 (ipc/send) invalid destination port]
apply fence tx failed (client=0x61dbbfd7) [0x10000003 (ipc/send) invalid destination port]
nw_read_request_report [C1] Receive failed with error "No message available on STREAM" (repeated 9 times)
nw_protocol_socket_reset_linger [C1:2] setsockopt SO_LINGER failed [22: Invalid argument]
apply fence tx failed (client=0x61dbbfd7) [0x10000003 (ipc/send) invalid destination port]
Failed to set override status for bind point component member.
Failed to set override status for bind point component member.
Failed to set override status for bind point component member.
Message from debugger: Terminated due to signal 9
Could you please tell me what the reason is and how I can resolve this? After launching two or three times, the app works fine from that point onwards, but this still happens from time to time while debugging.
Given my limited knowledge of physics, I would appreciate it if individuals with a solid understanding of the subject could provide insights into this matter. I have added a physics component to an entity in Reality Composer Pro, but I am seeking guidance on how to achieve the following:
Make an object float in the air (with a slight downward motion reminiscent of the moon’s surface)
Enable the object to move at a slow pace
Implement a strong rebound force
I would be grateful if you could provide appropriate values for these parameters. Thank you for your assistance.
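Not an authoritative answer, but a rough starting point in code for all three goals, assuming a `root` entity hosting the simulation and a `moonRock` model entity (both placeholder names); the values are guesses to tune by eye, and PhysicsSimulationComponent requires a recent RealityKit (visionOS, or iOS 18):

import RealityKit

// 1) Weak gravity (the Moon's is about 1.62 m/s²) for a slow, floaty fall.
var simulation = PhysicsSimulationComponent()
simulation.gravity = [0, -1.62, 0]
root.components.set(simulation)

// 3) High restitution gives a strong rebound on contact.
let material = PhysicsMaterialResource.generate(friction: 0.3, restitution: 0.9)

// 2) Damping keeps linear and angular motion slow.
var body = PhysicsBodyComponent(massProperties: .default, material: material, mode: .dynamic)
body.linearDamping = 2.0
body.angularDamping = 1.0
moonRock.components.set(body)
moonRock.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.1)]))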
I saw an OnNotification trigger in the Behaviors configuration of Reality Composer Pro, which asks me to enter a notification name. In other words, Swift code in Xcode has to post a notification carrying that name. I hope you can show me how to use Swift in Xcode to post a notification containing that notification name. (There is an answer at https://developer.apple.com/forums/thread/756978, but it doesn't work.)
Also, in the timeline in Reality Composer Pro there is a Notification action, which is used to send messages to Swift. How can I have Swift detect whether the Notification action has posted its message? (There is an answer at https://developer.apple.com/videos/play/wwdc2024/10102/, but it doesn't work.)
I have asked this question before (https://developer.apple.com/forums/thread/756978). Those answers used to work, but they are all invalid on the latest system. I hope you can help me. Thank you.
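For reference, the pattern both of those sources describe is the one below; whether it still fires on current OS builds is exactly what this question is about. `scene` is the entity loaded from the Reality Composer Pro bundle, and "MyNotification" stands for whatever name was entered in the trigger or action:

import Foundation
import Combine
import RealityKit

// 1) Trigger an OnNotification behavior from Swift:
NotificationCenter.default.post(
    name: Notification.Name("RealityKit.NotificationTrigger"),
    object: nil,
    userInfo: [
        "RealityKit.NotificationTrigger.Scene": scene,
        "RealityKit.NotificationTrigger.Identifier": "MyNotification"
    ]
)

// 2) Detect a timeline Notification action from Swift (keep the cancellable alive):
let subscription = NotificationCenter.default
    .publisher(for: Notification.Name("RealityKit.NotificationTrigger"))
    .sink { notification in
        let id = notification.userInfo?["RealityKit.NotificationTrigger.Identifier"] as? String
        if id == "MyNotification" {
            // The timeline's Notification action fired.
        }
    }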
If I create a visionOS app project, it automatically creates a RealityKitContent package. However, if I create a Multiplatform project (for visionOS, macOS, and iOS), the package is not added. Is it possible to add it manually? How? (using Xcode 16.1 beta 2)
Hello!
I would like to do exactly this:
https://youtu.be/Cun8K7ctKp0?si=TgWvtdw-VdlBVL0R
I can't seem to find any documentation on getting a PS5 controller hooked up properly in Reality Composer Pro and Xcode, driving a character with animation (or moving an object around freely), and then bringing that over to the Vision Pro.
Additionally, I would like to learn how to use a controller to move a VR camera around a scene, so that we can navigate custom-built spaces, similar to Meta virtual environments or Steam VR home environments.
Typically I would do this with Unreal Engine, but unfortunately AVP support is still in its infancy there. So I figured, why not try to do it natively?
Any help with concrete tutorials or documentation would be greatly appreciated.
Thx!
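For the input side at least, the GameController framework does recognize a DualSense controller paired with Vision Pro, so stick values can be read in Swift and applied to an entity. A minimal sketch; `character` and the movement speed are placeholder assumptions:

import GameController
import RealityKit

// Wait for a controller to connect, then track the left thumbstick.
NotificationCenter.default.addObserver(
    forName: .GCControllerDidConnect, object: nil, queue: .main
) { note in
    guard let controller = note.object as? GCController,
          let gamepad = controller.extendedGamepad else { return }
    gamepad.leftThumbstick.valueChangedHandler = { _, x, y in
        // Slide the character in the XZ plane; scale the speed to taste.
        character.position += SIMD3<Float>(x, 0, -y) * 0.02
    }
}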
I have been experimenting with some experiences in which I would like to use SharePlay to allow the app to be used by multiple users.
So far I have achieved sharing a volume containing a Reality Composer Pro scene; the scene contains some entities with an animation.
I have been able to correctly share the volume and its content, with the animation playing without problems, but once I activate SharePlay, different users see different moments of the animation, or no animation at all.
Is there a way to synchronize animations between all the users, no matter when someone entered the SharePlay session, aside from communicating the animation time once someone joins?
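As far as I know there is no built-in clock sync for RealityKit animations in a SharePlay session, so the usual fallback is the one mentioned above: broadcast the playback time over the group session and seek late joiners to it. A sketch using GroupActivities; the `AnimationSync` message type is a made-up name:

import Foundation
import GroupActivities
import RealityKit

// Hypothetical payload: the host's current animation playback time.
struct AnimationSync: Codable, Sendable {
    let playbackTime: TimeInterval
}

// Host side: broadcast the controller's time (e.g. when a participant joins).
func broadcast(_ controller: AnimationPlaybackController,
               over messenger: GroupSessionMessenger) async throws {
    try await messenger.send(AnimationSync(playbackTime: controller.time))
}

// All sides: seek the local controller to whatever time arrives.
func listen(_ controller: AnimationPlaybackController,
            on messenger: GroupSessionMessenger) async {
    for await (message, _) in messenger.messages(of: AnimationSync.self) {
        controller.time = message.playbackTime
    }
}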
The documentation at https://developer.apple.com/documentation/visionos/designing-realitykit-content-with-reality-composer-pro states:
Reality Composer Pro treats your imported assets as read-only.
This is a huge obstacle for me, as I need to make multiple adjustments to the scene.
I somehow managed to actually import one of my assets into the scene so that I can manipulate it directly, but now I can't figure out how I did it.
As I have to prepare further assets and would like to do this directly in Reality Composer Pro, I'm looking for a way to actually load them into the scene.
Any idea how this can be done?
I have an .rkassets package, from which I load my scene.
In the scene, I'm using entity.findEntity(named: "...") to find entities to activate/deactivate.
When entities are deactivated in the *.usda, they are not found with this method. Further inspection suggests that the deactivated entities are not compiled into the build at all.
Is there anything I can set that prevents these deactivated entities from being skipped during the build?
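I haven't found a build setting for this. A workaround sketch: leave the entity active in the .usda so it is compiled into the build, then disable it in code right after loading ("Scene" and "HiddenThing" are placeholder names):

import RealityKit
import RealityKitContent

let scene = try await Entity(named: "Scene", in: realityKitContentBundle)
if let hidden = scene.findEntity(named: "HiddenThing") {
    hidden.isEnabled = false   // equivalent to deactivating it in the editor
}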
I just did what the document at https://developer.apple.com/documentation/shadergraph/realitykit/cube-image-(realitykit) describes.
I have a .ktx file and use a CubeImage node to load it, followed by a Convert node, but the result renders black.
I checked it on my Vision Pro and it's still black; I don't know why. Is something wrong?
PS: I also used an Image node to load the same .ktx file and it shows one image, so I believe the .ktx file itself is correct. I also checked that on the Vision Pro.
Hello,
I am looking to create a shader to update an entity's rendering. As a basic example, say I want to recolour an entity but leave its original textures showing through.
I understand that with visionOS I need to use Reality Composer Pro to create the shader, but I'm at a loss as to how to reference, in the node graph, the original colour that I'm trying to update. All my attempts completely override the textures in the entity (and its sub-entities) that I want to affect. Also, the tutorials and examples I've looked at create new materials rather than adding an effect on top of existing materials.
Any hints or pointers?
Assuming this is possible, I've been trying to load the material in code and apply it to an entity. But do I need to do this for all child entities, or just the topmost one?
import RealityKit
import RealityKitContent

do {
    let entity = MyAssets.createModelEntity(.plane) // Loads from bundle and performs config
    let material = try await ShaderGraphMaterial(named: "/Root/TestMaterial",
                                                 from: "Test",
                                                 in: realityKitContentBundle)
    // Walk the entity hierarchy and replace the materials on every model.
    func apply(_ entity: Entity) {
        entity.components[ModelComponent.self]?.materials = [material]
        entity.children.forEach(apply)
    }
    apply(entity)
    root.addChild(entity)
} catch {
    fatalError(error.localizedDescription)
}
I am new to the graph editor and have been able to achieve some results. However, I am noticing that my graphs are getting very tangled, confusing, and hard to debug. I was wondering:
Is it possible to define variables that store the value of computations and refer to them in other parts of the graph, without having to link them graphically? This would help tidy up the tangled mess I have created. In the "Explore materials in Reality Composer Pro" video, I saw that it is possible to create "instances", but I am not sure whether that is what I need. For example, does the shader compiler optimize them so that there is no need to recompute each instance?
Is there any functionality to debug the graph, trying inputs and seeing what the numeric outputs would be?
We are developing a mixed reality app for Vision Pro using Reality Composer Pro, but we're consistently encountering a Protobuf-related crash whenever we enter the immersive space. Our Reality Composer Pro package is quite complex, with numerous objects. Could the complexity of the package be contributing to the issue, or could something else be at play? No errors are being flagged in the code or Reality Composer itself.
Here’s the error log:
[libprotobuf FATAL /Library/Caches/com.apple.xbs/Sources/REKit/ThirdParty/protobuf/src/google/protobuf/io/zero_copy_stream_impl_lite.cc:276] CHECK failed: (count) >= (0):
libc++abi: terminating due to uncaught exception of type google::protobuf::FatalException: CHECK failed: (count) >= (0):
Any insight on what might be causing this would be appreciated.
As you can see, it is a transparent spherical shell model with a ball inside. Everything looks normal from the front, but strange mesh triangles appear in the side and back views. I don't know whether this is expected, or what I need to do to remove these strange artifacts.
I am able to create a custom node graph by selecting nodes and then choosing the "Compose Node Graph" option in the context menu. After that, when I select my custom node graph, I see in the top-right panel that it is possible to define inputs and outputs. However, I was not able to figure out how to link those to the inputs and outputs in the underlying nodes.
I've tried importing a 3D KTX file into the Reality Composer Pro Shader Graph, but got no output. Does RealityKit not support Image3D yet?
First, I use pyktx to create a 3D KTX texture file, and it looks correct in the Finder preview:
import numpy as np
from pyktx import KtxTexture2, KtxTextureCreateInfo, KtxTextureCreateStorage, VkFormat

size = 32

# Build an RGB gradient volume: R varies along z, G along y, B along x.
image = np.zeros((size, size, size, 3), dtype=np.uint8)
for i in range(size):
    image[i, :, :, 0] = np.interp(i, [0, size], [50, 255])
    image[:, i, :, 1] = np.interp(i, [0, size], [100, 255])
    image[:, :, i, 2] = np.interp(i, [0, size], [150, 255])

info = KtxTextureCreateInfo(
    gl_internal_format=None,
    vk_format=VkFormat.VK_FORMAT_R8G8B8_SRGB,
    base_width=image.shape[2],
    base_height=image.shape[1],
    base_depth=image.shape[0],
    num_dimensions=3,
    num_levels=1,
    num_layers=1,
    num_faces=1,
    generate_mipmaps=False,
    is_array=False,
)

texture = KtxTexture2.create(info, KtxTextureCreateStorage.ALLOC)
# Upload every depth slice; for a 3D texture the third argument is the slice index.
for z in range(size):
    texture.set_image_from_memory(0, 0, z, image[z].tobytes())
texture.write_to_named_file(f'{size}.ktx')
Then I import it into the Shader Graph like this, but it seems nothing can be read.
In addition, I tested a 2D KTX file and it worked.
I also tested different vk_format values and both KTX1 and KTX2, but none worked.
I also tested Image3DPixel/Image2DArray/Image3DRead, and none of them works as long as the texture is 3D.