RealityKit


Simulate and render 3D content for use in your augmented reality apps using RealityKit.

RealityKit Documentation

Post

Replies

Boosts

Views

Activity

AirPlay Streaming Metal-Processed Output in Real Time
Hey everyone, I'm working on an iOS app where I use AVPlayer to play videos, then process them through Metal to apply effects. The app has controls that let users tweak these effects in real time, and I want the final processed video to be streamed via AirPlay. I use a custom rendering layer that displays the processed video on screen from a Metal texture, and that works as intended.

The problem is that when I try to AirPlay the video after feeding it the processed Metal frames, it just streams the original video from AVPlayer, not the version with all the Metal effects. The final processed output is a Metal texture that gets rendered in an MTKView. I even tried capturing that texture and sending it through a new AVPlayer setup, but AirPlay still grabs the original, unprocessed video instead of the final, fully rendered output. It's also clear that the AirPlayed video has the full length of the original built in, so it's not even that it's "live streaming" the wrong feed.

I need help figuring out how to make AirPlay stream the live, processed video with all the effects, not just the raw video. Any ideas? Happy to share my code if that helps, but I'm not sure I have the right underlying approach yet. Thanks!
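One direction worth trying (not confirmed as the fix) is to treat the AirPlay destination as an external UIScreen and drive it with your own Metal view, so the processed frames are what get mirrored rather than AVPlayer's raw output. A minimal sketch follows; ProcessedVideoView is a hypothetical MTKView subclass that draws the processed texture, and a scene-based app would route the window through UIWindowScene instead of setting window.screen directly.

import UIKit
import MetalKit

// Sketch: put our own Metal-backed view on the external (AirPlay-mirrored) screen,
// so what gets streamed is the processed output, not AVPlayer's original video.
final class ExternalDisplayController {
    private var externalWindow: UIWindow?
    private var observers: [NSObjectProtocol] = []

    func startObserving() {
        observers.append(NotificationCenter.default.addObserver(
            forName: UIScreen.didConnectNotification, object: nil, queue: .main
        ) { [weak self] note in
            guard let screen = note.object as? UIScreen else { return }
            self?.attach(to: screen)
        })
        observers.append(NotificationCenter.default.addObserver(
            forName: UIScreen.didDisconnectNotification, object: nil, queue: .main
        ) { [weak self] _ in
            self?.externalWindow = nil
        })
    }

    private func attach(to screen: UIScreen) {
        let window = UIWindow(frame: screen.bounds)
        window.screen = screen
        let vc = UIViewController()
        // ProcessedVideoView is a placeholder for the poster's Metal rendering view.
        vc.view = ProcessedVideoView(frame: screen.bounds, device: MTLCreateSystemDefaultDevice())
        window.rootViewController = vc
        window.isHidden = false
        externalWindow = window
    }
}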
0
0
216
Aug ’24
Dynamic loading various USDZ files
I have various USDZ files in my visionOS app. Loading the USDZ files works quite well; I only have problems with the positioning of the 3D model. For example, I have a USDZ file that is displayed directly above me. I can't move the model or perform any other actions on it. If I sit down on a chair or stand up again, the 3D model automatically moves with me. This is my source code for loading the USDZ files:

struct ImmersiveView: View {
    @State var modelName: String
    @State private var loadedModel = Entity()

    var body: some View {
        RealityView { content in
            if let usdModel = try? await Entity(named: modelName) {
                print("====> \(modelName) : \(usdModel) <====")
                let bounds = usdModel.visualBounds(relativeTo: nil).extents
                usdModel.scale = SIMD3<Float>(1.0, 1.0, 1.0)
                usdModel.position = SIMD3<Float>(0.0, 0.0, 0.0)
                usdModel.components.set(CollisionComponent(shapes: [.generateBox(size: bounds)]))
                usdModel.components.set(HoverEffectComponent())
                usdModel.components.set(InputTargetComponent())
                loadedModel = usdModel
                content.add(usdModel)
            }
        }
    }
}

For now I only want the 3D models from the USDZ files to be displayed; later on, I want to be able to move them via gestures. Moving the models is step 2. First, I need to make sure the models are displayed correctly. What have I forgotten or done wrong?
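One thing worth checking (a guess, not a confirmed diagnosis) is the USDZ's own pivot and offset: if the asset's geometry is authored far from its origin, setting position to zero still leaves it overhead. A minimal sketch, assuming the RealityView content is world-anchored in an ImmersiveSpace, recenters the model on its visual bounds and places it a fixed distance in front:

// Sketch: recenter the loaded USDZ on its bounding-box center and place it
// about 1.5 m in front of the space origin at roughly chest height.
// This would run inside the RealityView closure shown above.
if let usdModel = try? await Entity(named: modelName) {
    let bounds = usdModel.visualBounds(relativeTo: nil)
    // Shift the model so its bounding-box center sits at its local origin.
    usdModel.position -= bounds.center
    // Wrap it in a holder entity that can later be placed and moved with gestures.
    let holder = Entity()
    holder.addChild(usdModel)
    holder.position = SIMD3<Float>(0.0, 1.0, -1.5)   // x, y (up), z (negative = in front)
    content.add(holder)
}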
1
0
369
Aug ’24
RealityView world tracking without camera feed?
Is it possible with iOS 18 to use RealityView with world tracking but without the camera feed as background? With content.camera = .worldTracking the background is always the camera feed, and with content.camera = .virtual the device's position and orientation don't affect the viewpoint. Is there a way to make a mixture of both? My use case is that my app "Encyclopedia GalacticAR" shows astronomical objects and a skybox (a huge sphere), like a VR view of planets, as you can see in the left image. Now that iOS 18 offers RealityView for iOS and iPadOS, I would like to make use of it, but I haven't found a way to display my skybox as the environment instead of the camera feed. I filed the suggestion FB14734105 but hope that somebody knows a workaround...
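One interim workaround (a sketch, not a RealityView answer) is to stay on ARView for the iOS/iPadOS side: it runs world tracking but lets you replace the camera-feed background with a color or a skybox resource.

import RealityKit
import ARKit

// Sketch: ARView gives world tracking while hiding the camera feed behind a
// solid color (or a skybox EnvironmentResource for the star field).
let arView = ARView(frame: .zero, cameraMode: .ar, automaticallyConfigureSession: false)
let config = ARWorldTrackingConfiguration()
arView.session.run(config)

arView.environment.background = .color(.black)
// Alternatively: .skybox(_:) with an EnvironmentResource, e.g. an HDR star field.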
4
1
513
Aug ’24
How can I create a laser shader?
I am trying to make a shader that resembles a laser like this: I've been experimenting with a basic Fresnel shader to start, but the Fresnel shader has a problem at high viewing angles, where the top has a very different color than the rest of the capsule. This might work for a laser shader once inverted and fine-tuned: However, when viewed from the top, it doesn't look so good anymore: Ideally, the purple edge is always ONLY on the edge, and the rest of the surface is the light pink color, no matter the viewing angle. How can I accomplish this to create something that looks like a laser?
0
0
306
Aug ’24
Game Center breaks RealityView world tracking
Has anyone come across the issue that setting GKLocalPlayer.local.authenticateHandler breaks a RealityView's world tracking on iOS / iPadOS 18 beta 5? I'm in the process of upgrading my app to make use of the much-appreciated RealityView unification, using RealityView not only on visionOS but now also on iOS and iPadOS. In my RealityView, I enable world tracking on iOS like this:

content.camera = .worldTracking

However, device position and orientation were ignored (the camera remained static) and there was no camera pass-through. Then I discovered that the issue disappears when I remove the line

GKLocalPlayer.local.authenticateHandler = { viewController, error in
    // ... some more code ...
}

So I filed FB14731139 and hope that it will be resolved before the release of iOS / iPadOS 18.
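Purely as a diagnostic workaround (not a fix for the underlying bug), deferring the Game Center authentication until after the RealityView has set up its session may sidestep the conflict; the delay below is arbitrary and assumes GameKit is imported.

// Sketch: defer Game Center authentication so it no longer races the
// RealityView's world-tracking setup. The 2-second delay is arbitrary.
import GameKit

DispatchQueue.main.asyncAfter(deadline: .now() + 2.0) {
    GKLocalPlayer.local.authenticateHandler = { viewController, error in
        // ... same handling as before ...
    }
}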
2
1
345
Aug ’24
USD animation support in RealityKit vs. macOS Preview
This question is about USD animations playing correctly in macOS Preview but not with RealityKit on visionOS. I have a USD file created with 3D Studio Max that contains a mesh-based smoke animation: https://drive.google.com/file/d/1L7Jophgvw0u0USSv-_0fPGuCuJtapmzo/view (5.6 MB) Apple's macOS 14.5 Preview app is able to play the animation correctly: However, when a visionOS app uses RealityKit to load that same USD file in visionOS 2.0 beta 4, built with Xcode 16.0 beta 3 (16A5202i), and Entity/playAnimation is called, the animation does not play as expected: This same app is able to successfully play the animation of a hierarchy of solid objects read from a different USD file. When I inspect the RealityKit entities loaded from the USD file, the ground-plane entity is a ModelEntity, as expected, but the smoke entity's type is Entity, with no associated geometry. Why is it that macOS Preview can play the animation in the file, but RealityKit cannot? Thank you for considering this question.
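To narrow down whether RealityKit imported the animation at all, one quick check (a sketch; the file and node names below are placeholders) is to walk the loaded hierarchy and print each entity's availableAnimations:

// Sketch: verify whether RealityKit imported any animation from the USD at all.
func inspectAnimations(of entity: Entity, indent: String = "") {
    print("\(indent)\(entity.name) [\(type(of: entity))] animations: \(entity.availableAnimations.count)")
    for child in entity.children {
        inspectAnimations(of: child, indent: indent + "  ")
    }
}

// Inside the RealityView closure (or another async context):
if let smoke = try? await Entity(named: "smoke") {            // placeholder file name
    inspectAnimations(of: smoke)
    // If an animation shows up somewhere in the hierarchy, play it on that entity:
    if let animated = smoke.findEntity(named: "SmokeMesh"),    // placeholder node name
       let animation = animated.availableAnimations.first {
        animated.playAnimation(animation.repeat())
    }
}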
1
0
519
Jul ’24
Camera Feed as a RealityKit Texture
Hello, I'm developing a RealityKit-based app. As part of this, I would like to have a material applied to 3D objects which essentially contains a texture that is the live camera feed from the ARSession. I have the code below, which does apply a texture of the camera feed to the box, but it essentially only shows the camera snapshot at the time the app loads and doesn't update continuously. I think the issue might be that there is some problem with how the delegate is set up, and captureOutput is only called when the app loads instead of every frame. Open to any other approach or insight that gets the job done. Thank you for the help!

class CameraTextureViewController: UIViewController {
    var arView: ARView!
    var captureSession: AVCaptureSession!
    var videoOutput: AVCaptureVideoDataOutput!
    var material: UnlitMaterial?
    var displayLink: CADisplayLink?
    var currentPixelBuffer: CVPixelBuffer?
    var device: MTLDevice!
    var commandQueue: MTLCommandQueue!
    var context: CIContext!
    var textureCache: CVMetalTextureCache!

    override func viewDidLoad() {
        super.viewDidLoad()
        setupARView()
        setupCaptureSession()
        setupMetal()
        setupDisplayLink()
    }

    func setupARView() {
        arView = ARView(frame: view.bounds)
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(arView)
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal, .vertical]
        arView.session.run(configuration)
        arView.session.delegate = self
    }

    func setupCaptureSession() {
        captureSession = AVCaptureSession()
        captureSession.beginConfiguration()
        guard let videoDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back),
              let videoDeviceInput = try? AVCaptureDeviceInput(device: videoDevice),
              captureSession.canAddInput(videoDeviceInput) else { return }
        captureSession.addInput(videoDeviceInput)
        videoOutput = AVCaptureVideoDataOutput()
        videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "cameraQueue"))
        videoOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
        guard captureSession.canAddOutput(videoOutput) else { return }
        captureSession.addOutput(videoOutput)
        captureSession.commitConfiguration()
        DispatchQueue.global(qos: .userInitiated).async { [weak self] in
            self?.captureSession.startRunning()
        }
    }

    func setupMetal() {
        device = MTLCreateSystemDefaultDevice()
        commandQueue = device.makeCommandQueue()
        context = CIContext(mtlDevice: device)
        CVMetalTextureCacheCreate(nil, nil, device, nil, &textureCache)
    }

    func setupDisplayLink() {
        displayLink = CADisplayLink(target: self, selector: #selector(updateFrame))
        displayLink?.preferredFrameRateRange = CAFrameRateRange(minimum: 60, maximum: 60, preferred: 60)
        displayLink?.add(to: .main, forMode: .default)
    }

    @objc func updateFrame() {
        guard let pixelBuffer = currentPixelBuffer else { return }
        updateMaterial(with: pixelBuffer)
    }

    func updateMaterial(with pixelBuffer: CVPixelBuffer) {
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        var tempPixelBuffer: CVPixelBuffer?
        let attrs = [
            kCVPixelBufferCGImageCompatibilityKey: kCFBooleanTrue,
            kCVPixelBufferCGBitmapContextCompatibilityKey: kCFBooleanTrue,
            kCVPixelBufferMetalCompatibilityKey: kCFBooleanTrue
        ] as CFDictionary
        CVPixelBufferCreate(kCFAllocatorDefault, CVPixelBufferGetWidth(pixelBuffer), CVPixelBufferGetHeight(pixelBuffer), kCVPixelFormatType_32BGRA, attrs, &tempPixelBuffer)
        guard let tempPixelBuffer = tempPixelBuffer else { return }
        context.render(ciImage, to: tempPixelBuffer)
        var textureRef: CVMetalTexture?
        let width = CVPixelBufferGetWidth(tempPixelBuffer)
        let height = CVPixelBufferGetHeight(tempPixelBuffer)
        CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, tempPixelBuffer, nil, .bgra8Unorm, width, height, 0, &textureRef)
        guard let metalTexture = CVMetalTextureGetTexture(textureRef!) else { return }
        let ciImageFromTexture = CIImage(mtlTexture: metalTexture, options: nil)!
        guard let cgImage = context.createCGImage(ciImageFromTexture, from: ciImageFromTexture.extent) else { return }
        guard let textureResource = try? TextureResource.generate(from: cgImage, options: .init(semantic: .color)) else { return }
        if material == nil {
            material = UnlitMaterial()
        }
        material?.baseColor = .texture(textureResource)
        guard let modelEntity = arView.scene.anchors.first?.children.first as? ModelEntity else {
            let mesh = MeshResource.generateBox(size: 0.2)
            let modelEntity = ModelEntity(mesh: mesh, materials: [material!])
            let anchor = AnchorEntity(world: [0, 0, -0.5])
            anchor.addChild(modelEntity)
            arView.scene.anchors.append(anchor)
            return
        }
        modelEntity.model?.materials = [material!]
    }
}

extension CameraTextureViewController: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        currentPixelBuffer = pixelBuffer
    }
}

extension CameraTextureViewController: ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Handle AR frame updates if necessary
    }
}
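One possible explanation for the single-frame behavior (an assumption, not a confirmed cause): ARKit already owns the camera, so a second AVCaptureSession usually stops receiving frames once the ARSession is running. A small sketch of filling the already-present ARSessionDelegate callback instead, reusing the existing currentPixelBuffer / CADisplayLink path, so setupCaptureSession() could be dropped entirely:

// Sketch: feed the material from the ARSession's own frames instead of a
// second AVCaptureSession, which conflicts with ARKit's camera ownership.
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // capturedImage is the live camera frame as a CVPixelBuffer.
    currentPixelBuffer = frame.capturedImage
}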
1
0
294
Jul ’24
Camera Aspect Ratio and Metal ARkit rendering
Hi, I'm working on a project that uses RealityKit, including the placement of 3D objects. However, I want to be able to run the background camera through Metal post-processing before it is rendered, but I haven't been able to find a working approach. I'm open to it rendering directly into the ARView, or into a separate MTKView or SwiftUI layer. I've tried using the default Xcode project of an Augmented Reality App with Metal content. However, it seems to use a 1.33 (4:3) aspect camera by default instead of the iPhone 15's standard ratio, which works by default when I use the regular RealityKit pathway, and it doesn't seem to have the proper ratio available as an option. Open to any approach that gets the job done here. Thank you, any direction would be appreciated.
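For what it's worth, the Metal template draws ARFrame.capturedImage, which comes off the sensor at roughly 4:3; ARKit's displayTransform(for:viewportSize:) is the API that crops and scales it to the actual view, which is roughly what ARView does internally. A sketch of applying it via Core Image (in the Metal template the same transform would instead feed the background quad's texture coordinates):

import ARKit
import CoreImage
import UIKit

// Sketch: map the 4:3 capturedImage to the real view size using
// ARFrame.displayTransform(for:viewportSize:).
func displayImage(for frame: ARFrame, viewportSize: CGSize, orientation: UIInterfaceOrientation) -> CIImage {
    var image = CIImage(cvPixelBuffer: frame.capturedImage)

    // The transform converts normalized image coordinates to normalized view coordinates.
    let displayTransform = frame.displayTransform(for: orientation, viewportSize: viewportSize)

    // Scale down to the unit square, apply the transform, scale back up to the viewport, crop.
    let toNormalized = CGAffineTransform(scaleX: 1.0 / image.extent.width, y: 1.0 / image.extent.height)
    let toViewport = CGAffineTransform(scaleX: viewportSize.width, y: viewportSize.height)
    image = image.transformed(by: toNormalized.concatenating(displayTransform).concatenating(toViewport))
    return image.cropped(to: CGRect(origin: .zero, size: viewportSize))
}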
1
0
315
Jul ’24
How to overlay an image on part of a 3D model?
How can I overlay an image on a 3D model in RealityKit, using code, so that it does not stretch across the entire object but has its own height and width that I can change? I have a solution for how to do this, but then it is not possible to change the height or width, or to place the image anywhere on the 3D model: that approach is to cut out a part of the object and overlay the image on the entire cutout area. How do I overlay a 2D image on a 3D model without stretching the photo over the entire 3D object? If this is possible, please give an example of how to do it in code. I could not find anything on the internet about how to do this, although in other engines this can be done, for example in Blender or Unity. If I am not mistaken, it is done there using decals.
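In the absence of built-in decal support, one common workaround (a sketch under assumptions; "logo.png" and the offsets are hypothetical) is a small textured plane parented to the model and nudged just off its surface, so its size and placement stay adjustable:

// Sketch: a "sticker" plane parented to the target entity, standing in for a decal.
func addSticker(to target: Entity) async throws {
    let texture = try await TextureResource(named: "logo.png")   // hypothetical asset
    var material = UnlitMaterial()
    material.color = .init(tint: .white, texture: .init(texture))

    // width/depth control the sticker's size independently of the model's size.
    let plane = ModelEntity(
        mesh: .generatePlane(width: 0.2, depth: 0.1),
        materials: [material]
    )

    // Position relative to the model's origin and rotate the plane to face outward;
    // a tiny offset along the surface normal avoids z-fighting with the model.
    plane.position = SIMD3<Float>(0.0, 0.5, 0.301)
    plane.orientation = simd_quatf(angle: .pi / 2, axis: SIMD3<Float>(1, 0, 0))
    target.addChild(plane)
}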
0
1
308
Jul ’24
USDZ models look broken on iOS 18 / visionOS 2 beta
I noticed that with the 4th betas of iOS 18 and visionOS 2, some USDZ models' texture mapping looks completely broken. The issue occurs only on a device, not in the Simulator. It's a regression; the models look fine with iOS 17.5.1 and visionOS 1.2. The issue occurs if I load a model as an Entity in a RealityView on iOS or visionOS, or in a SwiftUI Model3D view on visionOS. Has anyone seen this too? Is there a workaround? I filed a bug report with a minimal example project, it's FB14473756. Screenshot on Vision Pro device: Screenshot on Vision Pro Simulator:
1
2
503
Jul ’24
EnvironmentLightingConfigurationComponent not working
Has anyone gotten EnvironmentLightingConfigurationComponent to work? I tried the code from https://developer.apple.com/documentation/realitykit/environmentlightingconfigurationcomponent to prevent a planet from being lit by the environment. My goal is that the side that isn't lit by the star appears pitch black. However, the code seems to have no effect on visionOS 2 and iPadOS 18 (I tried betas 1 through 4, on device, built with Xcode 16 beta 4). No matter if there is a PointLight or no light at all in the scene, no matter if I use SimpleMaterial or PhysicallyBasedMaterial, no matter if I use a texture or a color on the sphere. I filed a bug report, it's FB14470954. Or am I doing something wrong? Here's my code:

var material = PhysicallyBasedMaterial()
if let tex = try? await TextureResource(named: "planet.jpg") {
    material.baseColor = .init(texture: .init(tex))
    material.emissiveIntensity = 0
    let sphereMesh = MeshResource.generateSphere(radius: 0.5)
    let entity = ModelEntity()
    entity.components.set(ModelComponent(mesh: sphereMesh, materials: [material]))
    entity.position = [-1, 1.0, -1.0]
    let envLightingConfig = EnvironmentLightingConfigurationComponent(environmentLightingWeight: 0)
    entity.components.set(envLightingConfig)
    content.add(entity)
}
1
1
437
Jul ’24
Particle Systems flicker when partly behind transparent objects
I am having a difficult time creating particle systems in Reality Composer Pro (visionOS beta 3). They tend to flicker: all particles disappear and reappear at semi-random intervals. I can clearly see this happening with one effect that I put inside a small box consisting of four transparent walls and a solid floor. When I change the view angle, the particle system starts to flicker when viewed from below its emission height. I tried all combinations of particle rendering: billboard->free, additive, etc., and it does not change anything. I am using the default particle image. Any help appreciated.
2
0
410
Jul ’24
Implementing a bouncing surface
I am trying to simulate a pinball game and I want to use PhysicsBody & PhysicsMotion to achieve that. I tuned the parameters in PhysicsBodyComponent, but the result is not quite ideal for now. Imagine a fully inflated basketball bouncing high off the ground (ground vs. basketball). I assign a PhysicsBodyComponent and CollisionComponent to both the basketball and the ground.

For the basketball, I set:
dynamic mode
mass 1, inertia .one
Material.Restitution 1
Angular Damping and Linear Damping to 0
addForce to make the basketball move and hit the ground

For the ground, I set:
static mode
mass 1, inertia .zero
Material.Restitution 1
Angular Damping and Linear Damping to 0

However, when the basketball hits the ground, it isn't that bouncy; the basketball behaves as if it were hitting cotton, and the linear speed just damps out fast. I wonder how I could achieve a bouncing effect like a real basketball on the ground.
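A minimal sketch of one way to set this up explicitly (ball and ground are hypothetical entities): RealityKit derives the effective restitution from both bodies' physics materials, so both need values near 1, and very low mass plus damping or collision-margin effects can also eat the bounce.

// Sketch: explicit high-restitution materials on both colliding bodies.
let bouncy = PhysicsMaterialResource.generate(friction: 0.5, restitution: 0.95)

var ballBody = PhysicsBodyComponent(
    massProperties: .init(shape: .generateSphere(radius: 0.12), mass: 0.6),  // ~0.6 kg basketball
    material: bouncy,
    mode: .dynamic
)
ballBody.linearDamping = 0
ballBody.angularDamping = 0
ball.components.set(ballBody)
ball.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.12)]))

let groundBody = PhysicsBodyComponent(
    massProperties: .default,
    material: bouncy,
    mode: .static
)
ground.components.set(groundBody)
ground.components.set(CollisionComponent(shapes: [.generateBox(size: [2, 0.05, 2])]))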
4
0
727
Jul ’24
Trying to traverse through a usdz file to copy materials from another usdz file to the traversed mesh
Hi all, I am using RealityKit along with ARKit and SwiftUI to develop an app where I am augmenting a USDZ model of a complex geometry, like that of a car. I have some other USDZ files with a simple plane geometry that have the material properties embedded within them, which I am also loading as model entities. I want to traverse through my car USDZ file so that I can pick the material from the simple USDZ file and apply it to the car as car paint. To do this, I know the name of the mesh holding the car paint as well as the name of the material applied. I have tried to traverse through the USDZ files using both RealityKit and SceneKit, but I have not been able to reach the lowest mesh and copy the material properties to it. With RealityKit, I have tried to get the instance data from the model entity as follows: "sourceModel?.model?.mesh.contents.instances". But this returns only the instance id, model name and transform. Any help will be highly appreciated. Thank You
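A small sketch of one way to do this entirely at the Entity level, without touching mesh instance data: walk the car's hierarchy, match the paint mesh by name, and swap in the materials from the simple plane model. "CarPaint" is a placeholder for the mesh name known from the asset.

// Sketch: copy materials from a source ModelEntity onto a named mesh in the car hierarchy.
func applyPaint(from paintSource: ModelEntity, to carRoot: Entity, meshName: String = "CarPaint") {
    guard let paintMaterials = paintSource.model?.materials else { return }

    func visit(_ entity: Entity) {
        if let model = entity as? ModelEntity, entity.name == meshName {
            model.model?.materials = paintMaterials
        }
        for child in entity.children { visit(child) }
    }
    visit(carRoot)
}

If the name is unique in the hierarchy, carRoot.findEntity(named: meshName) can replace the manual recursion.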
1
0
423
Jul ’24
RealityKit, DrawableQueue, and synchronizing scene updates
I have a visionOS app that uses DrawableQueue and CADisplayLink to update an Entity, a TextureResource tied to the drawable, and a Material that uses that TextureResource. The TextureResource gets updated when a video frame is ready. Material properties can get updated from the video or from other sources. Current process: when each video frame is ready, we get the next drawable, render to it, present it, and make an Entity update (e.g. transform). However, I'm experiencing jitter in the rendered content, where it seems that the updates to the entity and the drawable being presented are milliseconds off from each other. Should I be using Drawable.presentOnSceneUpdate() to ensure all updates happen in the same update cycle? And if so, do you have any additional details on how to correctly use this function (the docs are unclear)?
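For reference, a sketch of the per-frame ordering being described; queue, videoEntity, render(into:), and nextTransform() are stand-ins for the poster's existing objects, and the call into the DrawableQueue is the standard nextDrawable/present flow:

// Sketch: do the texture render, the entity update, and the presentation in the
// same callback so they land in the same frame.
func displayLinkFired() {
    guard let drawable = try? queue.nextDrawable() else { return }   // skip if the queue is full
    render(into: drawable.texture)

    // Update the entity in the same pass as the presentation...
    videoEntity.transform = nextTransform()

    // ...then present. presentOnSceneUpdate(), the method the post asks about, would
    // replace present() here if presentation should be deferred to the next scene update.
    drawable.present()
}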
0
1
372
Jul ’24