Compose interactive 3D content in Reality Composer Pro
Discover how the Timeline view in Reality Composer Pro can bring your 3D content to life. Learn how to create an animated story in which characters and objects interact with each other and the world around them using inverse kinematics, blend shapes, and skeletal poses. We'll also show you how to use built-in and custom actions, sequence your actions, apply triggers, and implement natural movements.
Chapters
- 0:00 - Introduction
- 2:54 - Introducing timelines
- 18:22 - Inverse kinematics
- 22:55 - Animation actions
- 27:23 - Blend Shapes animation
- 30:35 - Skeletal poses
Related Videos
WWDC24
- Break into the RealityKit debugger
- Discover RealityKit APIs for iOS, macOS, and visionOS
- Enhance the immersion of media viewing in custom environments
- Optimize your 3D assets for spatial computing
Hi, my name's Marin. I'm a RealityKit Tools engineer, and today I want to introduce you to a brand new feature in Reality Composer Pro called Timelines, which provides a new interactive way to make your 3D content come to life. Last year, we announced Reality Composer Pro to make the process of previewing and preparing 3D content for your apps easy. It consists of a visual editor that lets you compose your scene, organize your assets, add interactivity or physics, and even fine-tune the look of materials using the ShaderGraph editor. There is a whole lot you can do with Reality Composer Pro. If you're new to Reality Composer Pro, or just want a refresher, I highly recommend going through these sessions from last year's WWDC.
Reality Composer Pro lets you visually design, edit, and preview RealityKit content. You can compose scenes, configure components, create complex materials, add audio, and much more, all in one place. This year, we've added new features to help you create animations using the timeline editor, add lights, and author environments. In this session, we'll focus on the timeline editor. I'll show you how to build an interactive app using Timelines and some new animation APIs in RealityKit. Let's take a look.
You might have seen the BOT-anist app in some of the other sessions. Let's take the robot from that app and reboot it to make an interactive virtual experience of our own. In this app, we have three plants that start to wilt. If a plant is wilted, I can tap on it, which triggers the robot to move to the plant and water it. The plant slowly becomes healthy, and the robot goes back to the center of the platform. There's also a second robot watching a butterfly, trying to reach out and touch it as it moves. You can download the sample project linked to this session to follow along.
To build this app, we'll start by moving the robot to the plant using timelines in Reality Composer Pro.
Then, I'll show you how to utilize RealityKit's Full Body Inverse Kinematics system to make the robot reach out to water the plant.
After that, I'll walk through how to write animation actions in code which we use for rotating and moving the robot back to its starting position.
Next, I'll show you how we animated the plants using blend shape animation.
And last, we will add a skeletal pose animation for the second robot watching the butterfly.
Let's start with an introduction to timelines. Timelines is a brand new feature in Reality Composer Pro that allows you to sequence actions to be executed in a particular order or at a particular time. Reality Composer Pro makes it easy to edit and configure those actions. On the left panel is a list of all the timelines.
The center is the main Timeline Editor, and the right panel is a list of all the available built-in actions. Once you've created a timeline, you can initiate it to play based on a trigger.
Let's look at our first animation. In our experience, when I tap on a plant, it triggers the robot to spin and move toward the tapped plant.
Let me show you how I built this timeline sequence. Here, in Reality Composer Pro, I have a new tab in my bottom panel called Timelines. In the center there is a Create Timeline button. I'll go ahead and select that to create a timeline.
On the left-hand side, I have a list of all my available timelines, and in the center is my main timeline editor. This is where I can configure my actions and sequence them on a timeline. On the right-hand side is a list of all my pre-built actions that I can easily drag and drop onto my timeline. Let's rename our timeline MoveToPoppy. I'll select the timeline, double-tap it, then type MoveToPoppy.
First thing I will do is have the robot turn to face the tapped plant. For this, I will use a spin action from the list of prebuilt actions. I will drag the spin action onto my timeline.
I want to make my robot spin, so under the target field in the inspector panel I'll select choose, and then find my robot entity in the hierarchy or the viewport, and select it. When I am done with my selection, I will tap done.
Now let's configure how much the robot will spin by setting the revolutions in the inspector panel. Let's set it to 0.12.
I don't want to spin a full rotation, just a slight turn. Let's see what this looks like. There's a play button at the top of my timeline editor; if I tap on that, I can see a preview of what my action's going to look like.
Cool, I can see the robot turns, and it looks like it's facing the plant. After it spins, I'm going to make the robot move. For this, I'll use a Transform To action. I'll grab a Transform To action from the list of my pre-built actions, and drag and drop it onto my editor. I'll sequence it to happen about a second after the spin action. I'm going to target this action to my robot because I want to make the robot move toward the plant. So I'll select choose and find my robot in the hierarchy. Then tap done. Now I can see a visual cue of where the robot is going to move to. Currently, it's at the center of the platform, which we don't want. So let's go ahead and use the manipulator to move the robot's destination right in front of the poppy plant.
All right, I think that looks pretty good. Now, let's change the duration of the transform action. I'll change it from its default duration of one second to a larger number. This will make the robot take a little longer to move to the plant. I can do this by dragging the trailing edge of my action in the editor.
Or, I can set the duration in the inspector panel.
I can drag actions and move them from one track to another. This allows you to sequence multiple actions to run simultaneously. I'll drag the spin action onto a new track that I'll use later. Now let's preview this to see what it looks like by tapping the play button.
All right, I think it looks pretty good. It would be nice to give the robot an animation and audio while it's moving toward the plant. To do this, I'll create another timeline, which will coordinate the walk animation and the audio to go along with it. I'll tap the plus (+) button at the bottom of my timelines list.
Let's rename this timeline RobotMove. I'll select the new timeline, double-tap it, then type RobotMove.
To have an animation and audio play, I'll use two new components: an AnimationLibraryComponent and an AudioLibraryComponent. Let's take a look at these two components. An AnimationLibraryComponent is used to store and associate animations with an entity that will play them. You add an AnimationLibraryComponent to a rigged entity and then add animation resources to it. Reality Composer Pro makes it easy to add animations to the AnimationLibraryComponent: you tap the plus (+) button and select the animations from your project. This adds the animation resource to the AnimationLibraryComponent on that entity. You can then use it later for playing animations in code or using timelines in Reality Composer Pro.
The AudioLibraryComponent works very similarly to the AnimationLibraryComponent. An AudioLibraryComponent is used to store and associate audio resources with an entity that will play them. You add an AudioLibraryComponent to an entity and then add audio resources to it: tap the plus (+) button and select the audio files from your project. This adds the audio resource to the AudioLibraryComponent on that entity. You can then use it later for playing audio in code or using timelines.
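Here's a minimal sketch of what those lookups might look like later in code, assuming the dictionary-style accessors on each component and a hypothetical resource name, "walk", authored in Reality Composer Pro:

import RealityKit

// A minimal sketch, assuming the entity carries both library components and
// that a resource named "walk" was authored in Reality Composer Pro.
func playWalkClips(on robot: Entity) {
    // Look up and play an animation from the AnimationLibraryComponent.
    if let animationLibrary = robot.components[AnimationLibraryComponent.self],
       let walkAnimation = animationLibrary.animations["walk"] {
        _ = robot.playAnimation(walkAnimation)
    }
    // Look up and play an audio resource from the AudioLibraryComponent.
    if let audioLibrary = robot.components[AudioLibraryComponent.self],
       let walkAudio = audioLibrary.resources["walk"] {
        _ = robot.playAudio(walkAudio)
    }
}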
Let's go back into Reality Composer Pro and finish adding the audio and animations to our timeline. On the RobotMove timeline, I'll add an animation action using the new AnimationLibraryComponent. I've already added a USD animation to my project browser, which I'll use. To do this, I'll select the robot in the hierarchy. In the inspector panel, I can see that an AnimationLibraryComponent has already been added for me. This is because this entity has a USD animation already associated with it. Reality Composer Pro automatically adds the component and lists its default animation in the animations list. I can also add an AnimationLibraryComponent manually by selecting the Add Component button at the bottom of the inspector panel. If I select the default subtree animation, I can preview the animation using the play button or slice it into clips using the scissors icon.
This animation is a little longer than I want, so let's slice it into two clips: a shorter beginning animation and an end animation that I'll play back to back. To do this, I'll grab the playhead and drag it to where I want to make the first slice.
Then, I'll click the scissors icon. This creates two clips. Now let's make another slice so I can use just the ending. I'll drag the playhead again to where I want to make the second slice, and click the scissors icon.
Great, now I have two clips that I can play back to back. I'll rename the two clips startWalk and endWalk.
Now I'll select the RobotMove timeline so that I can sequence the animations back to back. I'll grab an animation action and drag it onto the timeline.
I'm going to target the action to my robot, so I will select choose and then pick the robot in the hierarchy. Then, set the animation to the startWalk clip.
I'll drag another animation action onto my timeline for the second clip.
I'll have the animation run immediately after the first. I'll also target the action to the robot, and set the animation to be the endWalk clip.
While the animation is running, I want to play an audio clip at the same time. To do this, I will use an AudioLibraryComponent and an audio file I've already added to my project. Let's select the robot in the hierarchy.
I'll add a new component by selecting the Add Component button in the inspector panel.
And pick Audio Library Component from the list.
I'll tap the plus (+) button and find the audio files needed. I'll add two audio files, one for the robot turning, and one for the robot moving.
Now, let's go back to the RobotMove timeline and set up when the audio action should run. I'm going to drag an audio action onto the timeline. I want the robot to emit the audio sound so I will select the robot as the emitter, and I want it to play the audio walk clip that I just added so I'll set the audio resource to be walk. Then, I'll drag the audio action so that it plays at the same time the walk animation runs.
All right, let's play this now.
Great, the animation and the audio played together. Now, I'm ready to use the RobotMove timeline in the MoveToPoppy timeline. Let's select our MoveToPoppy timeline, then I'm going to right-click on the RobotMove timeline and I will select the Insert into Timeline option. This will insert the RobotMove timeline into the MoveToPoppy timeline. Now, I have a timeline playing inside of another timeline. I can go ahead and move this nested timeline to any spot by dragging. I'd like to have the animation occur right after the spin finishes, and I will pair that with the transform action, so the animation and the transform occur simultaneously. I'll change the duration of the transform action so it starts moving a little bit after the animation starts and stops moving a little bit before the animation finishes. And I'll add an audio clip to play when the robot spins. I'll drag an audio action from my list of prebuilt actions and place it to happen along with the spin action. I'll set the robot to be the emitter, and the audio resource to be the rotation audio.
Let's go ahead and preview this to see what it looks like.
Awesome, the robot is moving to the plant with an animation and audio playing. Now that we've created our timeline, how do we initiate it to start playing? There are two ways to do this. We can either use the RealityKit API in code to call entity.playAnimation with the animation resource instance for the timeline, or we can use Reality Composer Pro's UI to add a Behaviors component, which will initiate the timeline without writing any code. To use Reality Composer Pro to initiate a timeline, you add a Behaviors component to the entity you want to trigger the timeline. There are a few different types of triggers that can initiate a timeline, such as a tap, a collision, the entity being added to the scene, or a notification you post in code, as sketched below.
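Here's a hedged sketch of that last option; the notification name and userInfo keys follow the pattern Reality Composer Pro behaviors use, and the identifier is whatever you set on the Notification trigger:

import Foundation
import RealityKit

// A hedged sketch of the notification-based trigger: posting this notification
// fires any Behaviors component whose trigger is a Notification with a
// matching identifier. `scene` is the RealityKit Scene containing the entity,
// and "MoveToPoppy" stands in for the identifier set in Reality Composer Pro.
func triggerTimeline(in scene: RealityKit.Scene) {
    NotificationCenter.default.post(
        name: Notification.Name("RealityKit.NotificationTrigger"),
        object: nil,
        userInfo: [
            "RealityKit.NotificationTrigger.Scene": scene,
            "RealityKit.NotificationTrigger.Identifier": "MoveToPoppy"
        ]
    )
}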
Let's jump back to Reality Composer Pro to set up a tap gesture on the plant that initiates the timeline. To set up the tap gesture, I'll start by selecting the plant that I want to tap on; in this case, the poppy plant in the hierarchy. I'll add a new component by tapping the Add Component button in the inspector panel and find the Behaviors component.
Then, I can tap on the plus (+) button and select the tap trigger because I want the plant to respond to a user tapping on it. When I tap on the plant, I want the tap gesture to trigger the MoveToPoppy timeline to play so I will select the MoveToPoppy timeline in the list.
Now, we need to tell the RealityView to respond to tap gestures. So let's go into Xcode.
On the RealityView, I'm going to add a tap gesture.
The tap gesture will be targetedToAnyEntity. At the end of the tap, so when you lift your finger, we're going to get the entity from the value passed to us and apply the tap behavior to the entity.
What that means is, when I tap on the entity - in our case, the poppy plant - I'm going to apply the tap behavior that I specified in Reality Composer Pro, which is the MoveToPoppy timeline.
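Here's a minimal sketch of that gesture wiring; the applyTapForBehaviors call is what hands the tap to the entity's Behaviors component:

import SwiftUI
import RealityKit

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            // Scene loading elided for brevity.
        }
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // Apply the tap behaviors authored in Reality Composer Pro;
                    // for the poppy plant, that plays the MoveToPoppy timeline.
                    _ = value.entity.applyTapForBehaviors()
                }
        )
    }
}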
I can now build and run to try this out. Let's tap on the poppy plant and see what happens.
Great! When I tap on the plant, I see the robot turn and move to the plant. We successfully created our first animation and initiated it with user input. That's pretty cool! Now for the second animation: after the robot arrives at the plant, I want it to reach its arm out to the plant. For this, I'll add a notification action and listen for it in code. When the notification fires, custom code will make the robot reach out to the plant. Let's go back into Reality Composer Pro and add a notification action.
From the list of pre-built actions I'll drag and drop a notification action onto the end of the timeline.
I will target this notification action to be the poppy plant. Then, I'll set the identifier of the notification to be ReachToPoppy.
This string is what will be passed to us when the notification fires. Now that we've set up the notification in Reality Composer Pro, let's add the code to listen for it in the immersive view. Back in Xcode, in the immersive view, I'll add the notification name, ReachToPoppy, then add a publisher for the notification.
Next, I'll handle receiving the notification on the RealityView using onReceive. When I receive the notification, I can grab the source entity from the userInfo dictionary and then run the code that makes the robot reach out to the plant.
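Here's a minimal sketch of that wiring; note the userInfo key for the source entity is an assumption here (the sample project shows the exact key):

import SwiftUI
import RealityKit

struct ImmersiveView: View {
    // The identifier set on the notification action in Reality Composer Pro.
    private let reachToPoppy = NotificationCenter.default.publisher(
        for: Notification.Name("ReachToPoppy"))

    var body: some View {
        RealityView { content in
            // Scene loading elided for brevity.
        }
        .onReceive(reachToPoppy) { output in
            // Assumed userInfo key for the entity that fired the action.
            guard let plant = output.userInfo?["RealityKit.NotificationTrigger.SourceEntity"] as? Entity
            else { return }
            // Kick off the custom reach logic described next.
            beginReach(toward: plant)
        }
    }

    // Hypothetical helper that starts the IK reach described next.
    private func beginReach(toward plant: Entity) { /* ... */ }
}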
Let's take a look at what the reach animation should look like. When the robot arrives at the plant, I want it to reach its arm out. To achieve this, I'll use RealityKit's Full Body Inverse Kinematics system.
Inverse Kinematics is the use of kinematic equations to determine the motion of a joint in a rigged skeletal structure to reach a desired position. You specify the target position and any constraints on the affected joints. The intermediate joint positions are calculated for you. A common example of this is moving a hand to a desired position and having the elbow automatically adjust to reach that position. You can use this to achieve natural looking movements for characters. I'll explain the new Inverse Kinematics API through a schematic diagram and then show you how to use it in code. First, I will instantiate an IKRig. The IKRig defines how the Inverse Kinematics solver should work. I'll give the IKRig the model skeleton and a configuration such as the number of iterations. If you're playing an animation on the same entity, then you need to give weights to the animation and weights for IK so the solver will know what to override. I will then add constraints. Constraints are limitations on how the joints can change. This can be used to prevent unnatural movements, like bending an elbow in the wrong direction.
Then, I'll create an IKResource from the IKRig we defined. The IKResource is the runtime data that the IK solver uses for processing. I'll create an IKComponent from the IKResource and add that to an entity. You can set up the solver when the component is instantiated and update it at runtime, as often as every frame.
It's important to note that RealityKit's IK solver solves over the full character skeleton simultaneously, not just on a subset of the joint hierarchy. Let's write the code using Inverse Kinematics to have the robot reach out to the plant.
To set up the Inverse Kinematics solver, I'll start by initializing an empty rig, passing in the model skeleton. Then, I'll update the global rig settings. With some trial and error, I found that setting maxIterations to 30 and the global forward kinematics weight to 0.02 looked best. And I'll reference the joint names on the rig's skeleton. In this example, we need to move the robot's hips, chest, and hand to get a natural-looking animation of the robot reaching its arm to the plant.
Then, I'll define constraints for the rig. This example sets up two parent constraints and a point constraint. A parent constraint constrains a joint's position and orientation. A point constraint constrains a joint's position. Next, I'll make a resource containing the rig, and add an IKComponent with the given resource.
To update the IK target on every frame, I'll find the target entity in the scene and position it.
First, I'll get the position where the hand should reach. The position should be relative to the entity that has the IKComponent on it. And change the x, y, and z position of the reachPosition so that it's slightly different each frame.
I also want to update the constraints for the left hand each frame. So I'll get the IKComponent from the entity, set the left hand's target translation, and set the weight that positions the left hand. A value of 0 means IK has no influence on the constraint, and a value of 1 means IK has full influence.
To smoothly animate the robot's arm when beginning and ending IK, I increment and decrement, over time, the weight of how much the position is driven by IK versus the base pose.
Then, commit the updated values to the component.
Let's run the code to see how this looks.
Nice! We reached our goal with that animation. After the robot reaches its arm out, I use a particle effect to water the plant. Now that the robot has finished watering the plant, let's make it return to its starting position. For this, we'll use animation actions.
Animation actions allow you to set up when actions occur and to sequence actions together over a duration of time. This is similar to a cutscene animation in a game, but with animation actions you can do it in real time. A common use case is footfall events when a character is walking: every time the character's foot lands on the ground, it makes a sound. Let's use animation actions to move the robot back to the center of the platform.
The actions API has built-in actions and a custom action. The built-in actions consist of a number of pre-built actions that you can easily configure, like SpinAction or PlayAnimationAction. This is the underlying API that supports the timelines feature in Reality Composer Pro. If you have a need outside of what the built-in actions cover, you can use a custom action. This allows you to write your own action in code and have it synchronized on a timeline. To use the built-in actions API, you first create an animation definition for an entity using a built-in action. Then, you create an animation resource from the definition, group the animation resources that you want to play at the same time, and sequence the animation resources to be played in a certain order. Last, you call playAnimation on your entity. Let's check out how to do this in code.
I have a few methods that contain logic in them which define the animations and return animation resources for rotating, moving, and aligning the robot. The logic for these animation resource instances has been defined using existing RealityKit APIs.
I'll sequence these animations together starting with a RotateAnimation, followed by a WalkAndMoveAnimation, then, an AlignAtHomeAction and finally the RobotTravelHomeCompleteAction. Then, I'll start playing these animations in that order using the existing playAnimation API.
For custom actions, you first create your own type conforming to the EntityAction protocol. Then, you create an AnimationResource from it using the makeActionAnimation API.
And group the animation resource with others or sequence it to be executed in a certain order.
Last, you will call playAnimation on your entity. Then, at runtime you can subscribe to start and end events for the custom actions.
Let's see these steps in code and make our robot return to its starting position once it's done watering the plants.
I'll create a custom action named RobotMoveToHomeComplete, which will notify us when the move-to-home procedure completes. Then, I'll create an instance of the RobotMoveToHomeComplete EntityAction and generate an AnimationResource from it using the makeActionAnimation API.
One thing to note is that you can sequence or group multiple animations with a custom action. Since we only have one animation here, we won't do any sequencing.
And play the animation resource using the playAnimation API.
I subscribe to events when the EntityAction has started. The subscription closure will be called when the action starts. When the start event fires, we know the robot has arrived at its home destination. So I'll stop the playback controller to stop the animation.
I also subscribe to events for when the RobotMoveToHomeComplete EntityAction has ended.
When the closure is called, I transition the robot to the .arrivedHome state.
Now, let's run this animation sequence.
Alright, we're on a roll now.
Let's focus on the plant animation. For this, I'll use blend shape animation. Blend shape animations allow you to smoothly transition from one pose to another, creating lifelike movements. They're commonly used for animating a character's face or body by blending between a series of different shapes. For our example, I'll animate the plants from a wilted to a healthy state and vice versa.
The blend shape animation API consists of a BlendShapeWeightsMapping, a BlendShapeWeightsComponent, and two ways to execute the animations: procedurally or via USD animation. The BlendShapeWeightsMapping allows you to set a weight associated with a blend target. You set up blend targets on an entity, and each target can have weights associated with it. You can set a weight from 0 to 1. In our case, a weight of 1 is a wilted plant, a weight of 0.5 is an in-between state, and a weight of 0 is a healthy plant. To animate the plant from wilted to healthy, I'll update the weight value over time. To use the blend shape animation API, first I'll define the BlendShapeWeightsMapping, then create a BlendShapeWeightsComponent from the mapping and add the component to an entity. Once the component is on the entity, I can query the BlendShapeWeightsComponent for the weight values and update them at any time. I'm then able to procedurally create either a FromToBy or a Sampled blend shape animation, or play a USD blend shape animation using the existing RealityKit playAnimation API. Let's walk through creating this in code.
First, find the model entity that has the model component from the entity hierarchy.
Get its mesh resource and create the BlendShapeWeightsMapping from the mesh resource. Then, create the BlendShapeWeightsComponent from the mapping.
To update the blend weights at runtime, I will get the BlendShapeWeightsComponent from the entity.
Get a copy of the BlendShapeWeightsSet so I can assign weight values into this set.
Update all the weight values by blendWeightIndex in the BlendShapeWeightsSet and set them to 0, which indicates a healthy plant. In this case, we're setting the plant to stay in a healthy state. To transition from healthy to wilted, you'd gradually increment or decrement the value with an easing function, as sketched below.
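For illustration, the easing could be as simple as a smoothstep over elapsed time (a hypothetical helper, not from the sample):

import Foundation

// Hypothetical helper: returns the blend weight for a wilting plant,
// easing from 0 (healthy) to 1 (wilted) with smoothstep over `duration`.
func wiltWeight(elapsed: TimeInterval, duration: TimeInterval) -> Float {
    let t = Float(min(max(elapsed / duration, 0), 1))
    return t * t * (3 - 2 * t)
}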
Then, assign the new weight values to the BlendShapeWeightsComponent so they will take effect.
Let's run the code and wake these flowers up.
It's so nice to see our flowers bloom back to life. Now, let's check out what our second robot is up to.
The robot is reaching its hand toward the butterfly using Inverse Kinematics. The butterfly is flying around using an animation action created on a timeline in Reality Composer Pro, together with a custom component modifying the butterfly's position. If we focus on the robot's head, we can see it following the butterfly's flight path. To achieve this movement, we'll use the new Skeletal Poses API.
A typical rigged 3D character consists of a skeletal structure made of interconnected bones. Each bone corresponds to a different part of the character or object.
To animate the character you rotate its joints to pose the object into various positions and animate them. This is commonly used for making characters walk or run.
The Skeletal Poses API adds a new SkeletalPosesComponent. You can use an animation produced by RealityKit, or you can modify the skeleton at runtime, with an interface to programmatically query the SkeletalPosesComponent for a specific joint and update its transforms.
We will use the skeletal poses API to make the robot's neck bone rotate as if it's watching the butterfly flutter around.
Any skinned mesh that's imported into RealityKit from a USD file will already have a SkeletalPosesComponent attached to the entity. So we don't need to initialize it, just grab the SkeletalPosesComponent from the entity.
Update the rotation of the joint. It's important to note that this is in local space. To have the neck rotation update on every frame, I'll call this code from a RealityKit System's update function; RealityKit calls the update function on all registered systems every frame.
And commit the updated values to the component. I can update the entire joint transform or just the translation, scale, or rotation, and this can happen as often as every frame. Let's run the code and see the robot look at the butterfly fluttering around.
Well, I don't know about you, but that one made my heart flutter.
We've covered a lot today. Let's recap everything we talked about.
We learned how to use Reality Composer Pro's timeline feature to sequence actions on a timeline and initiate them with triggers. We made the robot's arm reach out to objects using the new Full Body Inverse Kinematics API. We used built-in and custom actions to sequence actions together. We looked at how to blend between a series of different shapes using blend shape animation. And we saw how to animate an object by rotating its joints to pose it into various positions.
In addition to these features, Reality Composer Pro has added other exciting features this year, such as video docking, the ability to deploy to iOS and macOS alongside visionOS, environment authoring, and lighting.
To learn more about these topics, please check out the sessions "Enhance the immersion of media viewing in custom environments" and "Discover RealityKit APIs for iOS, macOS and visionOS." These new features will help you make your 3D content interactive, and I can't wait to see what you build with them. Thank you, and have a great WWDC.
-
20:31 - Setup IKComponent
// Setup IKComponent
import RealityKit

struct HeroRobotRuntimeComponent: Component {
    // Illustrative setup method: `modelSkeleton` is the robot's
    // MeshResource.Skeleton, and `modelComponentEntity` is the entity
    // that owns the robot's ModelComponent.
    func setUpIK(modelComponentEntity: Entity, modelSkeleton: MeshResource.Skeleton) {
        // Create the rig and tune the solver's global settings.
        guard var rig = try? IKRig(for: modelSkeleton) else { return }
        rig.maxIterations = 30
        rig.globalFkWeight = 0.02

        // The joints the solver will drive.
        let hipsJointName = "root/hips"
        let chestJointName = "root/hips/spine1/spine2/chest"
        let leftHandJointName = "root/hips/spine1/spine2/chest/…/L_arm3/L_arm4/L_arm5/L_wrist"

        // Two parent constraints (position and orientation) and one
        // point constraint (position only).
        rig.constraints = [
            .parent(named: "hips_constraint",
                    on: hipsJointName,
                    positionWeight: SIMD3(repeating: 90.0),
                    orientationWeight: SIMD3(repeating: 90.0)),
            .parent(named: "chest_constraint",
                    on: chestJointName,
                    positionWeight: SIMD3(repeating: 120.0),
                    orientationWeight: SIMD3(repeating: 120.0)),
            .point(named: "left_hand_constraint",
                   on: leftHandJointName,
                   positionWeight: SIMD3(repeating: 10.0))
        ]

        // Build the runtime resource and attach the IK component.
        guard let resource = try? IKResource(rig: rig) else { return }
        modelComponentEntity.components.set(IKComponent(resource: resource))
    }
}
-
21:33 - Update IKComponent
// Update IKComponent
import RealityKit

struct HeroRobotRuntimeComponent: Component {
    // Illustrative per-frame update: `simTime`, `totalBlendTime`, and
    // `isEnabled` come from the surrounding system.
    func updateIK(entity: Entity,
                  modelComponentEntity: Entity,
                  sceneRoot: Entity,
                  simTime: Float,
                  totalBlendTime: Float,
                  isEnabled: Bool) {
        // Find the target the hand should reach toward.
        guard let reachTarget = sceneRoot.findEntity(named: "reachTargetName") else { return }

        // The target position must be relative to the entity that has the
        // IKComponent; offset it slightly differently each frame.
        var reachPosition = reachTarget.position(relativeTo: entity)
        let time = sin(simTime)
        reachPosition.x += (20.0 + 50.0 * time)
        reachPosition.y += (40.0 + 30.0 * abs(time))
        reachPosition.z += (20.0 + 20.0 * abs(time))

        guard let ikComponent = modelComponentEntity.components[IKComponent.self] else { return }

        // Update the left-hand constraint's target.
        var leftHandConstraint = ikComponent.solvers.first?.constraints["left_hand_constraint"]
        leftHandConstraint?.target.translation = reachPosition

        // A blendValue of 0 means IK has no influence on the constraint;
        // a blendValue of 1 means full influence. Ramp it over time so the
        // arm eases in and out of IK.
        let blendValue = isEnabled ? (time / totalBlendTime) : (1.0 - time / totalBlendTime)
        leftHandConstraint?.animationOverrideWeight.position = blendValue

        // Commit the updated values to the component.
        modelComponentEntity.components.set(ikComponent)
    }
}
-
24:36 - Sequence and play animation actions
// Play Animation Actions
import RealityKit

struct HeroRobotRuntimeComponent: Component {
    // The create... helpers are defined elsewhere in this component; each
    // returns an AnimationResource built with existing RealityKit APIs.
    func playMoveHomeSequence(robotEntity: Entity) {
        let rotateAnimationResource = createRotateAnimationResource()
        let walkAndMoveAnimationGroup = createWalkAndMoveAnimationGroup()
        let alignAtHomeActionResource = createAlignAtHomeActionResource()
        let robotTravelHomeCompleteActionResource = createRobotTravelHomeCompleteAction()

        // Build a sequence of the rotate, move, and align animations/actions to play.
        guard let moveHomeSequence = try? AnimationResource.sequence(with: [
            rotateAnimationResource,
            walkAndMoveAnimationGroup,
            alignAtHomeActionResource,
            robotTravelHomeCompleteActionResource
        ]) else { return }

        // Play the move-to-home sequence.
        _ = robotEntity.playAnimation(moveHomeSequence)
    }
}
-
25:59 - Setup EntityActions
// Setup EntityActions
import RealityKit

// A custom action used to signal when the move-to-home procedure completes.
struct RobotMoveToHomeComplete: EntityAction {
    var animatedValueType: (any AnimatableData.Type)? { nil }
}

struct HeroRobotRuntimeComponent: Component {
    func signalMoveToHomeComplete(robotEntity: Entity) {
        // Create an instance of the custom action and generate an
        // AnimationResource from it.
        let travelCompleteAction = RobotMoveToHomeComplete()
        let actionResource = try! AnimationResource.makeActionAnimation(for: travelCompleteAction,
                                                                        duration: 0.1)

        // Play the action like any other animation.
        _ = robotEntity.playAnimation(actionResource)
    }
}
-
26:39 - EntityAction subscription
// EntityAction subscription
import RealityKit

struct HeroRobotRuntimeComponent: Component {
    // Possible states of the robot.
    public enum HeroRobotState: String, Codable {
        case available
        // …
        case arrivedHome
    }

    func subscribeToRobotActions() {
        // Subscribe to know when the EntityAction has started. When the start
        // event fires, the robot has arrived home, so stop the playback controller.
        RobotMoveToHomeComplete.subscribe(to: .started) { event in
            if event.playbackController.entity != nil {
                event.playbackController.stop()
            }
        }

        // Subscribe to know when the EntityAction has ended, then transition
        // the robot to the .arrivedHome state. setState is defined elsewhere
        // in this component.
        RobotMoveToHomeComplete.subscribe(to: .ended) { event in
            if let robotEntity = event.playbackController.entity,
               var component = robotEntity.components[HeroRobotRuntimeComponent.self] {
                component.setState(newState: .arrivedHome)
                // Write the mutated component back to the entity.
                robotEntity.components.set(component)
            }
        }
    }
}
-
29:17 - Setup BlendShapeWeightsComponent
// Setup BlendShapeWeightsComponent
import RealityKit

struct HeroPlantComponent: Component, Codable {
    // findModelComponentEntity is a helper defined elsewhere in the project.
    func setUpBlendShapeWeights(entity: Entity) {
        // Find the descendant entity that owns the ModelComponent.
        guard let modelComponentEntity = findModelComponentEntity(entity: entity),
              let modelComponent = modelComponentEntity.components[ModelComponent.self]
        else { return }

        // Create the mapping from the model's mesh resource.
        let blendShapeWeightsMapping = BlendShapeWeightsMapping(meshResource: modelComponent.mesh)

        // Create the blend shape weights component and add it to the entity.
        entity.components.set(BlendShapeWeightsComponent(weightsMapping: blendShapeWeightsMapping))
    }
}
-
29:38 - Update BlendShapeWeightsComponent
// Update BlendShapeWeightsComponent
import RealityKit

struct HeroPlantComponent: Component, Codable {
    // Illustrative update: `blendWeightsIndex` identifies the weight set to
    // modify and comes from the surrounding code.
    func updateBlendShapeWeights(entity: Entity, blendWeightsIndex: Int) {
        guard var blendShapeComponent = entity.components[BlendShapeWeightsComponent.self]
        else { return }

        // Get a copy of the BlendShapeWeightsSet so we can assign weight values into it.
        var blendWeightSet = blendShapeComponent.weightSet

        // Update the weights in the BlendShapeWeightsSet; 0 indicates a healthy plant.
        for weightIndex in 0..<blendWeightSet[blendWeightsIndex].weights.count {
            blendWeightSet[blendWeightsIndex].weights[weightIndex] = 0.0
        }

        // Assign the new weights to the blend shape component so they take effect.
        for index in 0..<blendWeightSet.count {
            blendShapeComponent.weightSet[index].weights = blendWeightSet[index].weights
        }
        entity.components.set(blendShapeComponent)
    }
}
-
32:01 - Setup and update Skeletal Poses
// Update Skeletal Poses
import RealityKit

struct StationaryRobotRuntimeComponent: Component {
    // Illustrative per-frame update: `calculateRotation()` and
    // `neckJointIndex` come from the surrounding system.
    func updateNeckPose(entity: Entity) {
        guard var component = entity.components[SkeletalPosesComponent.self] else { return }

        // Rotate the neck joint (in local space) to follow the butterfly.
        let neckRotation = calculateRotation()
        component.poses.default?.jointTransforms[neckJointIndex].rotation = neckRotation

        // Commit the updated values back to the entity.
        entity.components.set(component)
    }
}