Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Pink Screen with VideoMaterial in ARKit
Hi everyone, I'm developing an ARKit app using RealityKit and encountering an issue where a video displayed on a 3D plane shows up as a pink screen instead of the actual video content. Here's a simplified version of my setup:

func createVideoScreen(video: AVPlayerItem, canvasWidth: Float, canvasHeight: Float, aspectRatio: Float, fitsWidth: Bool = true) -> ModelEntity {
    let width = (fitsWidth) ? canvasWidth : canvasHeight * aspectRatio
    let height = (fitsWidth) ? canvasWidth * (1/aspectRatio) : canvasHeight
    let screenPlane = MeshResource.generatePlane(width: width, depth: height)
    let videoMaterial: Material = createVideoMaterial(videoItem: video)
    let videoScreenModel = ModelEntity(mesh: screenPlane, materials: [videoMaterial])
    return videoScreenModel
}

func createVideoMaterial(videoItem: AVPlayerItem) -> VideoMaterial {
    let player = AVPlayer(playerItem: videoItem)
    let videoMaterial = VideoMaterial(avPlayer: player)
    player.play()
    return videoMaterial
}

Despite following the standard process, the video plane renders pink. Has anyone encountered this before, or does anyone know what might be causing it? Thanks in advance!
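For reference, one commonly suggested check for the pink placeholder is whether the AVPlayer stays alive for as long as the material is on screen. A minimal, hedged sketch (class and method names are hypothetical, not a confirmed fix):

import AVFoundation
import RealityKit

final class VideoScreenController {
    // Keep a strong reference so the AVPlayer is not deallocated while the
    // VideoMaterial is still rendering (assumption: losing the player can
    // leave the plane showing a solid placeholder color).
    private var player: AVPlayer?

    func makeVideoMaterial(for item: AVPlayerItem) -> VideoMaterial {
        let player = AVPlayer(playerItem: item)
        self.player = player
        let material = VideoMaterial(avPlayer: player)
        player.play()
        return material
    }
}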
4 replies · 0 boosts · 463 views · Aug ’24
visionOS crashes loading entities from disk bundles with EXC_BREAKPOINT
Summary: I'm working on a visionOS project where I need to dynamically load a .bundle file containing RealityKit content from the app's Application Support directory. The .bundle is saved to disk after being downloaded or retrieved as an On-Demand Resource (ODR).

Sample project with the issue: GitHub repo. Run the target test-odr to use the local bundle and reproduce the crash.

Overall problem:
Setup: Add a .bundle named RealityKitContent_RealityKitContent.bundle to the app's resources. This bundle contains a Reality file with two USDAs: "Immersive" and "Scene".
Save to disk: Save the bundle to the Application Support directory, ensuring that the file is correctly copied and saved.
Load the bundle: Load the bundle from the saved URL using Bundle(url: bundleURL) to initialize the Bundle object.
Load entity from bundle: Load a specific entity ("Scene") from the bundle. When trying to load the entity using let storedEntity = try await Entity(named: "Scene", in: bundle), the app crashes with an EXC_BREAKPOINT error.

contentsOf method issue: If I use the Entity.load(contentsOf: realityFileURL, withName: entityName) method, it always loads the first root entity found (in this case, "Immersive") rather than "Scene", even when specifying the entity name. This is why I want to use the Bundle to load entities by name more precisely.

Issue: The crash consistently occurs on the Entity(named: "Scene", in: bundle) line. I have verified that the bundle exists and is accessible at the specified path, and that it contains the expected .reality file with multiple entities ("Immersive" and "Scene"). The error code I get is EXC_BREAKPOINT (code=1, subcode=0x1d135d4d0).

What I've tried:
• Ensured the bundle is properly saved and accessible.
• Checked that the bundle is initialized correctly from the URL.
• Tested loading the entity using the contentsOf method, which works fine but always loads the "Immersive" entity, ignoring the specified name. Hence, I want to use the Bundle-based approach to load multiple USDA entities selectively.

Question: Has anyone faced a similar issue, or does anyone know why loading entities using Entity(named:in:) from a disk-based bundle causes this crash? Any advice on how to debug or resolve this, especially for managing multiple root entities in a .reality file, would be greatly appreciated.
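For context, a minimal sketch of the loading flow described above, using only the calls named in the post (the helper name and bundle file name are placeholders); the Entity(named:in:) line is the one that reportedly crashes:

import Foundation
import RealityKit

// Hypothetical helper reproducing the flow from the post: load a named
// entity from a RealityKit content bundle saved in Application Support.
func loadEntity(named entityName: String, bundleFileName: String) async throws -> Entity {
    let support = try FileManager.default.url(for: .applicationSupportDirectory,
                                              in: .userDomainMask,
                                              appropriateFor: nil,
                                              create: true)
    let bundleURL = support.appendingPathComponent(bundleFileName)
    guard let bundle = Bundle(url: bundleURL) else {
        throw CocoaError(.fileNoSuchFile)
    }
    // This is the call that reportedly crashes with EXC_BREAKPOINT.
    return try await Entity(named: entityName, in: bundle)
}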
1 reply · 0 boosts · 425 views · Sep ’24
Particle Systems flicker when partly behind transparent objects
I am having a difficult time creating particle systems in Reality Composer Pro (visionOS beta 3). They tend to flicker: all particles disappear and reappear at semi-random intervals. I can clearly see this happening with one effect that I put inside a small box consisting of 4 transparent walls and a solid floor. When I change the view angle, the particle system starts to flicker when viewed from below its emission height. I tried all combinations of particle rendering settings (billboard, free, additive, etc.) and it does not change anything. I am using the default particle image. Any help is appreciated.
2 replies · 0 boosts · 523 views · Jul ’24
Why is an OpenGL positional (spot) light interfering with a non-positional light?
I have a legacy OpenGL fixed-pipeline app which has been ported from Windows (32-bit) to macOS (64-bit). The problem is that if I have a scene with a non-positional light, everything works great. If I add a positional spotlight, the two lights interact and I get incorrect results. This problem does not occur on x86_64 Macs. It does occur on Apple Silicon, whether the app runs as x86_64 under Rosetta or as native arm64. So it's either an Apple Silicon OpenGL driver behaviour my code is triggering, or something with the on-chip Apple Silicon graphics. Here is the "normal" case: the spotlight is to the right. Here, I have moved the spotlight down (Y = 1); notice the black areas on the cube. That's incorrect. Now, I turn off the spotlight by commenting out the "makeALight" call for the spotlight (light 6), and the cube is evenly lit. Here is the test code I use to generate the lights. You will need to install glfw with brew to build it. main.cpp
0 replies · 0 boosts · 446 views · Sep ’24
Leaderboard/achievements and testing
Hi, I created a leaderboard in my application, and a method to record a new score:

GKLeaderboard.loadLeaderboards(IDs: [leaderboardID]) { (leaderboards, error) in
    if let error = error {
        print("Error loading leaderboards: \(error.localizedDescription)")
    }
    guard let leaderboard = leaderboards?.first else {
        print("Leaderboard not found")
        return
    }
    leaderboard.submitScore(score, context: 0, player: self.localPlayer) { error in
        if let error = error {
            print("Error reporting score: \(error.localizedDescription)")
        } else {
            print("Score reported successfully!")
        }
    }
}

When debugging, this method is correctly called and reports success, so I tried to test it with an internal TestFlight release. The leaderboard is never updated. Is there a way to test a leaderboard before publishing the app? I have the same question for achievements:

let achievement = GKAchievement(identifier: identifier)
achievement.percentComplete = percentComplete
GKAchievement.report([achievement]) { error in
    if let error = error {
        print("Error reporting achievement: \(error.localizedDescription)")
    }
}

Thanks!
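One thing worth double-checking when testing via TestFlight: scores and achievements are only accepted for an authenticated local player. A minimal, hedged sketch of that step (the presenting view controller is a placeholder):

import GameKit
import UIKit

// Sketch: verify that Game Center authentication succeeds before any
// leaderboard or achievement submission is attempted.
func authenticateLocalPlayer(presentingFrom viewController: UIViewController) {
    GKLocalPlayer.local.authenticateHandler = { authViewController, error in
        if let authViewController {
            // Game Center needs to show its sign-in UI.
            viewController.present(authViewController, animated: true)
        } else if let error {
            print("Authentication failed: \(error.localizedDescription)")
        } else {
            print("Authenticated as \(GKLocalPlayer.local.displayName)")
        }
    }
}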
2 replies · 0 boosts · 448 views · Sep ’24
GameCenter scores are not being posted to the leaderboard
Hello! Bear with me here, as there is a lot to explain! I am working on implementing a Game Center high score leaderboard into my game. I have looked around for examples of how to properly implement this code, but have come up short on finding much material. Therefore, I have tried implementing it myself based on information I found in Apple's documentation. Long story short, I am getting success printed when I update my score, but no scores are actually being posted (or at least no scores are showing up on the Game Center leaderboard when opened).

Before I show the code, one thing I have questioned is the fact that this game is still in development. In App Store Connect, the status of the leaderboard is "Not Live". Does this affect scores being posted?

Onto the code. I have created a GameCenter class which handles getting the leaderboards and posting scores to a specific leaderboard. I will post the code in whole, and will discuss below what is happening. PLEASE VIEW THE ATTACHED TEXT TO SEE THE GAMECENTER CLASS!

GameCenter class - https://developer.apple.com/forums/content/attachment/0dd6dca8-8131-44c8-b928-77b3578bd970

In a different GameScene, once the game is over, I request to post a new high score to Game Center with this line of code:

GameCenter.shared.submitScore(id: GameCenterLeaderboards.HighScore.rawValue)

Now onto the logic of my code. For the longest time I struggled to figure out how to submit a score. I figured out that in Xcode 12, they deprecated a lot of functions that previously worked for me. Now it seems that we have to load all leaderboards (or the ones we want). That is the purpose behind the leaderboards private variable in the GameCenter class. On startup of the app, I call authenticate player. Once this callback is reached, I call loadLeaderboards, which will load the leaderboards for each string id in an enum that I have elsewhere. Each of these leaderboards is created as a Leaderboard object and saved in the private leaderboard array. This is so I have access to these leaderboards later when I want to submit a score.

Once the game is over, I call submitScore with the leaderboard id I want to post to. Right now, I only have a high score, but in the future I may add a value parameter so it works for other leaderboards as well. Therefore, no value is passed in, since I am pulling from local storage which holds the high score. submitScore gets the leaderboard from the private leaderboard array that has the same id as the one passed in. Once I get the correct leaderboard, I submit a score to it. Once the callback is hit, I receive the output "Successfully submitted score to leaderboard". This looks promising, except for the fact that no score is actually posted.

At startup, I am calling updatePlayerHighScore, which is not complete, but for the purpose of my point, it retrieves the high score of the player from the leaderboard and prints it to the console. It is printing out (0), meaning that no score was posted.

The last thing I have questions about is the context when submitting a score. According to the documentation, this seems to just be metadata that Game Center does not care about, but rather something the developer can use. Therefore, I think I can cross this off as causing the problem.

I believe I implemented this correctly, but for some reason, nothing is posting to the leaderboard. This was a lot, but I wanted to make sure I got all my thoughts down. Any help on why this is NOT posting would be awesome!
Thanks so much! Mark
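Since the attached GameCenter class is linked rather than reproduced above, here is a minimal, hedged sketch of the pattern the post describes (load leaderboards once, cache them, submit by identifier). The class and method names are hypothetical and this is not the poster's actual code:

import GameKit

// Minimal sketch of the load-once, submit-by-id pattern described above.
final class GameCenterManager {
    static let shared = GameCenterManager()
    private var leaderboards: [GKLeaderboard] = []

    func loadLeaderboards(ids: [String]) {
        GKLeaderboard.loadLeaderboards(IDs: ids) { boards, error in
            if let error {
                print("Failed to load leaderboards: \(error.localizedDescription)")
                return
            }
            self.leaderboards = boards ?? []
        }
    }

    func submitScore(_ score: Int, to id: String) {
        guard let board = leaderboards.first(where: { $0.baseLeaderboardID == id }) else {
            print("Leaderboard \(id) not loaded")
            return
        }
        board.submitScore(score, context: 0, player: GKLocalPlayer.local) { error in
            if let error {
                print("Submit failed: \(error.localizedDescription)")
            } else {
                print("Score submitted")
            }
        }
    }
}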
7 replies · 1 boost · 4.1k views · Dec ’20
Game Center Notifications do not include GKMessageImage.png
Hello, I'm asking the following because I was unable to find answers via search on the forum or in the documentation. Invitations sent via iMessage seem to work correctly with my custom image (GKMessageImage.png); however, notifications sent to Game Center friends via invites generated in Game Center do not include the custom image (GKMessageImage.png). Questions: Is this expected behavior? Is there a different way to customize the image in the notification? Note that the Game Center notification includes the app name correctly. I also noted in a WWDC session from 2016 (I saw the video recently) that there was some mention of no longer adding friends via Game Center. Is that currently true? Thanks in advance.
1 reply · 0 boosts · 526 views · Jul ’24
Can an SDF be rendered using RealityKit?
I'm trying to ray-march an SDF inside a RealityKit surface shader. For the SDF primitive to correctly render with other primitives, the depth of the fragment needs to be set according to the ray-surface intersection point. Is there a way to do that within a RealityKit surface shader? It seems the only values I can set are within surface::surface_properties. If not, can an SDF still be rendered in RealityKit using ray-marching?
1 reply · 1 boost · 469 views · Sep ’24
RealityKit 3D texture
In my Metal-based app, I ray-march a 3D texture. I'd like to use RealityKit instead of my own code. I see there is a LowLevelTexture (beta) where I could specify a 3D texture. However, on the Metal side there doesn't seem to be any way to access a 3D texture (realitykit::texture::textures::custom returns a texture2d). Any workarounds? Could I even do something icky like cast the texture2d to a texture3d in MSL? (Is that even possible?) Could I encode the 3D texture into an argument buffer and get it in that way somehow?
1 reply · 0 boosts · 442 views · Aug ’24
Game Center breaks RealityView world tracking
Has anyone come across the issue that setting GKLocalPlayer.local.authenticateHandler breaks a RealityView's world tracking on iOS / iPadOS 18 beta 5? I'm in the process of upgrading my app to make use of the much appreciated RealityView unification, using RealityView not only on visionOS but now also on iOS and iPadOS. In my RealityView, I enable world tracking on iOS like this:

content.camera = .worldTracking

However, device position and orientation were ignored (the camera remained static) and there was no camera pass-through. Then I discovered that the issue disappeared when I removed the line

GKLocalPlayer.local.authenticateHandler = { viewController, error in
    // ... some more code ...
}

So I filed FB14731139 and hope that it will be resolved before the release of iOS / iPadOS 18.
2 replies · 1 boost · 469 views · Aug ’24
Disable Automatic Color Space conversion on Vision Pro Metal Shader
I am trying to convert a ThreeJS project to Metal for the Vision Pro. The issue is that ThreeJS doesn't do any color space conversion (when I output a color in a fragment shader and then read it using the Digital Color Meter in sRGB mode, I get the same value I put into the fragment shader). This is not the case when using Metal. When setting up my LayerRenderer, I set the colorFormat to rgba16Unorm, since it is the only non-sRGB color format supported for Vision Pro apps. However, switching between bgra8Unorm_srgb and rgba16Unorm seems to have no effect. When I set up the renderPassDescriptor, I use the drawable's color texture:

renderPassDescriptor.colorAttachments[0].texture = drawable.colorTextures[0]

and when printing its pixel format, it seems to be passed from the configuration. If there is any way to disable this behavior, or to apply an inverse function so that I get the original value out of the shader, that would be appreciated.
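Regarding the "inverse function" idea at the end: these are the standard sRGB transfer functions, sketched in Swift for illustration. Whether applying the encode at the end of the shader exactly cancels the compositor's conversion on Vision Pro is an assumption, not confirmed behavior:

import Foundation

// Standard sRGB decode (sRGB -> linear) and encode (linear -> sRGB).
// Applying the encode as a final step is one way to compensate for an
// unwanted linearization elsewhere in the pipeline (assumption).
func srgbToLinear(_ c: Double) -> Double {
    c <= 0.04045 ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4)
}

func linearToSRGB(_ c: Double) -> Double {
    c <= 0.0031308 ? 12.92 * c : 1.055 * pow(c, 1.0 / 2.4) - 0.055
}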
0 replies · 0 boosts · 381 views · Aug ’24
SteamVR working, Headset not connecting
So, I've been messing around with SteamVR on Apple Silicon, and it runs as expected under Rosetta translation; I've even got a game to run. But for some reason SteamVR cannot detect a headset, even when using one that SteamVR has drivers for, such as the 2017 Vive headset. Is there any explanation as to why this is? Because SteamVR otherwise works as expected, that leads me to believe it's something with macOS.
1 reply · 0 boosts · 339 views · Aug ’24
Elements in an Attachment do not animate
UI:

Attachment(id: "tooptip") {
    if isRecording {
        TooltipView {
            HStack(spacing: 8) {
                Image(systemName: "waveform")
                    .font(.title)
                    .frame(minWidth: 100)
            }
        }
        .transition(.opacity.combined(with: .scale))
    }
}

Trigger:

Button("Toggle") {
    withAnimation {
        isRecording.toggle()
    }
}

The above code does not show the animation effect when running. When I use isRecording to drive an element in a regular SwiftUI view, the animation works.
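One untested workaround sketch, reusing the names from the post and sitting in the same attachments builder: scope an explicit animation to the attachment content itself, rather than relying on withAnimation propagating from the trigger:

// Untested sketch: wrap the conditional content in a container and attach
// .animation(_:value:) to it so the transition has an animation in scope.
Attachment(id: "tooptip") {
    ZStack {
        if isRecording {
            TooltipView {
                HStack(spacing: 8) {
                    Image(systemName: "waveform")
                        .font(.title)
                        .frame(minWidth: 100)
                }
            }
            .transition(.opacity.combined(with: .scale))
        }
    }
    .animation(.default, value: isRecording)
}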
0 replies · 0 boosts · 306 views · Aug ’24
Taking snapshots from WKWebView on visionOS
Hi, I'm trying to capture some images from a WKWebView on visionOS. I know there's a function, takeSnapshot(), that can get an image of the web page. But I wonder whether drawHierarchy() cannot work properly on WKWebView because of GPU content; if so, are there any other methods I can call to capture images correctly? Furthermore, as I put my web view into an immersive space, is there any way I can get the texture of this UIView attachment? Thank you.
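For reference, a minimal sketch of the takeSnapshot(with:completionHandler:) call mentioned above (the wrapper function is hypothetical); it renders the web content into a UIImage and is the usual suggestion when drawHierarchy() misses GPU-composited layers:

import UIKit
import WebKit

// Sketch: capture the visible bounds of a WKWebView into a UIImage.
func captureSnapshot(of webView: WKWebView, completion: @escaping (UIImage?) -> Void) {
    let config = WKSnapshotConfiguration()
    config.rect = webView.bounds
    webView.takeSnapshot(with: config) { image, error in
        if let error {
            print("Snapshot failed: \(error.localizedDescription)")
        }
        completion(image)
    }
}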
0 replies · 0 boosts · 289 views · Aug ’24
Crash on iOS 18 beta 5 when adding an SCNAnimation animationDidStop closure with Swift 6 selected
I'm trying to update my projects to use Swift 6. If I change the project settings to use Swift 6, my app crashes when I add a closure to the SCNAnimation animationDidStop property. The error is inside the SceneKit renderingQueue and indicates that the callback is being called on the wrong queue. Maybe I need to do something in the code to fix this, but I can't seem to make it work; maybe a SceneKit bug? If you create a new game template in Xcode using SceneKit and replace the contents of GameViewController.swift with the following, you will see the app crash after it is launched.

import UIKit
import SceneKit

class GameViewController: UIViewController {

    let player: SCNAnimationPlayer = {
        let a = CABasicAnimation(keyPath: "opacity")
        return SCNAnimationPlayer(animation: SCNAnimation(caAnimation: a))
    }()

    override func viewDidLoad() {
        super.viewDidLoad()

        let scnView = self.view as! SCNView
        scnView.scene = SCNScene()

        // Change the project settings to use Swift6
        // Setting this closure will then cause a _dispatch_assert_queue_fail
        // EXC_BREAKPOINT error in the scenekit.renderingQueue.SCNView queue,
        // the only thing on the stack is:
        // "%sBlock was %sexpected to execute on queue [%s (%p)]"
        player.animation.animationDidStop = { (a: SCNAnimation, b: SCNAnimatable, c: Bool) in
            print("stopped")
        }

        scnView.scene?.rootNode.addAnimationPlayer(player, forKey: nil)
        player.play()
    }
}
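One possible, untested mitigation sketch, reusing the player property from the code above: mark the callback @Sendable so it is not inferred to be main-actor-isolated, and hop to the main actor explicitly inside it, leaving SceneKit free to invoke it on its rendering queue:

// Untested sketch (assumption: the assertion comes from the closure picking
// up main-actor isolation from the view controller while SceneKit calls it
// on its rendering queue).
player.animation.animationDidStop = { @Sendable (animation: SCNAnimation,
                                                 receiver: SCNAnimatable,
                                                 completed: Bool) in
    Task { @MainActor in
        // Touch UI or main-actor state only after hopping to the main actor.
        print("stopped, completed: \(completed)")
    }
}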
2 replies · 2 boosts · 497 views · Aug ’24
CIImageProcessorKernel using Metal Compute Pipeline error
Greetings! I have been battling with a bit of a tough issue. My use case is running a pixelwise regression model on a 2D array of images using CIImageProcessorKernel and a custom Metal shader. It mostly works great, but if the regression calculation in Metal takes too long, an error occurs and the resulting output texture has strange artifacts, for example:

The specific error is:

Error excuting command buffer = Error Domain=MTLCommandBufferErrorDomain Code=1 "Internal Error (0000000e:Internal Error)" UserInfo={NSLocalizedDescription=Internal Error (0000000e:Internal Error), NSUnderlyingError=0x60000320ca20 {Error Domain=IOGPUCommandQueueErrorDomain Code=14 "(null)"}} (com.apple.CoreImage)

There are multiple levels of concurrency: Swift Concurrency calling the Core Image code (which shouldn't have an impact) and, of course, the Metal command buffer. Is there any way to ensure the compute command encoder can complete its work?

Here is the full implementation of my CIImageProcessorKernel subclass:

class ParametricKernel: CIImageProcessorKernel {
    static let device = MTLCreateSystemDefaultDevice()!

    override class var outputFormat: CIFormat {
        return .BGRA8
    }

    override class func formatForInput(at input: Int32) -> CIFormat {
        return .BGRA8
    }

    override class func process(with inputs: [CIImageProcessorInput]?, arguments: [String : Any]?, output: CIImageProcessorOutput) throws {
        guard let commandBuffer = output.metalCommandBuffer,
              let images = arguments?["images"] as? [CGImage],
              let mask = arguments?["mask"] as? CGImage,
              let fillTime = arguments?["fillTime"] as? CGFloat,
              let betaLimit = arguments?["betaLimit"] as? CGFloat,
              let alphaLimit = arguments?["alphaLimit"] as? CGFloat,
              let errorScaling = arguments?["errorScaling"] as? CGFloat,
              let timing = arguments?["timing"],
              let TTRThreshold = arguments?["ttrthreshold"] as? CGFloat,
              let input = inputs?.first,
              let sourceTexture = input.metalTexture,
              let destinationTexture = output.metalTexture
        else {
            return
        }
        guard let kernelFunction = device.makeDefaultLibrary()?.makeFunction(name: "parametric") else {
            return
        }
        guard let commandEncoder = commandBuffer.makeComputeCommandEncoder() else {
            return
        }

        let imagesTexture = Texture.textureFromImages(images)
        let pipelineState = try device.makeComputePipelineState(function: kernelFunction)
        commandEncoder.setComputePipelineState(pipelineState)
        commandEncoder.setTexture(imagesTexture, index: 0)

        let maskTexture = Texture.textureFromImages([mask])
        commandEncoder.setTexture(maskTexture, index: 1)
        commandEncoder.setTexture(destinationTexture, index: 2)

        var errorScalingFloat = Float(errorScaling)
        let errorBuffer = device.makeBuffer(bytes: &errorScalingFloat, length: MemoryLayout<Float>.size, options: [])
        commandEncoder.setBuffer(errorBuffer, offset: 0, index: 1)

        // Other buffers omitted....

        let threadsPerThreadgroup = MTLSizeMake(16, 16, 1)
        let width = Int(ceil(Float(sourceTexture.width) / Float(threadsPerThreadgroup.width)))
        let height = Int(ceil(Float(sourceTexture.height) / Float(threadsPerThreadgroup.height)))
        let threadGroupCount = MTLSizeMake(width, height, 1)
        commandEncoder.dispatchThreadgroups(threadGroupCount, threadsPerThreadgroup: threadsPerThreadgroup)
        commandEncoder.endEncoding()
    }
}
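As a small diagnostic sketch (assuming Core Image commits the command buffer it hands to process(with:arguments:output:) itself, so the handler only observes), a completed handler can at least surface the underlying GPU error and the GPU time of the dispatch:

import Metal

// Diagnostic sketch: attach a completed handler to the command buffer
// provided by Core Image. This does not change execution; it only reports
// the GPU outcome and timing once the buffer finishes.
func attachDiagnostics(to commandBuffer: MTLCommandBuffer) {
    commandBuffer.addCompletedHandler { buffer in
        if let error = buffer.error {
            print("GPU execution failed: \(error)")
        }
        print("GPU time: \(buffer.gpuEndTime - buffer.gpuStartTime) s")
    }
}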
3 replies · 0 boosts · 555 views · Aug ’24
Is anyone still using GKVoiceChat since it has been deprecated?
Hello, I would like to know if anyone is still using the GKVoiceChat capabilities in their apps. I wanted to use it for my online game, but I am running into issues with it and wondering if there are alternatives. The documentation mentions using SharePlay instead, but that won't be possible with random online players. Any help will be appreciated!
2 replies · 0 boosts · 509 views · Aug ’24
How to access RealityRenderer
I have a RealityView in my visionOS app. I can't figure out how to access RealityRenderer. According to the documentation (https://developer.apple.com/documentation/realitykit/realityrenderer) it is available on visionOS, but I can't figure out how to access it for my RealityView. It is probably something obvious, but after reading through the documentation for RealityView, Entities, and Components, I can't find it.
1 reply · 2 boosts · 769 views · Jul ’23