Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.


RealityKit visualize the virtual depth texture from post-process callback
I am using RealityKit and ARView's PostProcessContext to get the sourceDepthTexture of the current virtual scene, using the .nonAR camera mode. My experience with Metal is limited to RealityKit's GeometryModifier and SurfaceShader for CustomMaterial, but I am excited to learn more! Having studied the Underwater sample code, I have a general idea of how I want to explore the capabilities of a proper post-processing pipeline in my RealityKit project, but right now I just want to visualize this MTLTexture to see what the virtual depth of the scene looks like. Here's my current approach, trying to create a depth UIImage from the context's sourceDepthTexture:

func postProcess(context: ARView.PostProcessContext) {
    let depthTexture = context.sourceDepthTexture
    var uiImage: UIImage? // or CGImage/CIImage

    if processPost {
        print("#P Process: Post Process BLIT")
        // UIImage from MTLTexture
        uiImage = try? createDepthUIImage(from: depthTexture)
        let blitEncoder = context.commandBuffer.makeBlitCommandEncoder()
        blitEncoder?.copy(from: context.sourceColorTexture, to: context.targetColorTexture)
        blitEncoder?.endEncoding()
        getPostProcessed()
    } else {
        print("#P No Process: Pass-Through")
        let blitEncoder = context.commandBuffer.makeBlitCommandEncoder()
        blitEncoder?.copy(from: context.sourceColorTexture, to: context.targetColorTexture)
        blitEncoder?.endEncoding()
    }
}

func createDepthUIImage(from metalTexture: MTLTexture) throws -> UIImage {
    guard let device = MTLCreateSystemDefaultDevice() else {
        throw CIMError.noDefaultDevice
    }

    // Destination texture the depth texture is copied into
    let descriptor = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: .depth32Float_stencil8,
        width: metalTexture.width,
        height: metalTexture.height,
        mipmapped: false)
    descriptor.usage = [.shaderWrite, .shaderRead]
    guard let texture = device.makeTexture(descriptor: descriptor) else {
        throw NSError(domain: "Failed to create Metal texture", code: -1, userInfo: nil)
    }

    // Blit!
    let commandQueue = device.makeCommandQueue()
    let commandBuffer = commandQueue?.makeCommandBuffer()
    let blitEncoder = commandBuffer?.makeBlitCommandEncoder()
    blitEncoder?.copy(from: metalTexture, to: texture)
    blitEncoder?.endEncoding()
    commandBuffer?.commit()
    commandBuffer?.waitUntilCompleted() // wait for the copy before reading bytes on the CPU

    // Raw pixel bytes
    let bytesPerRow = 4 * texture.width
    let dataSize = texture.height * bytesPerRow
    var bytes = [UInt8](repeating: 0, count: dataSize)
    //var depthData = [Float](repeating: 0, count: dataSize)
    bytes.withUnsafeMutableBytes { bytesPtr in
        texture.getBytes(
            bytesPtr.baseAddress!,
            bytesPerRow: bytesPerRow,
            from: .init(origin: .init(), size: .init(width: texture.width, height: texture.height, depth: 1)),
            mipmapLevel: 0
        )
    }

    // CGDataProvider from the raw bytes
    let dataProvider = CGDataProvider(data: Data(bytes: bytes, count: bytes.count) as CFData)

    // CGImage from the data provider
    let cgImage = CGImage(
        width: texture.width,
        height: texture.height,
        bitsPerComponent: 8,
        bitsPerPixel: 32,
        bytesPerRow: bytesPerRow,
        space: CGColorSpaceCreateDeviceRGB(),
        bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue),
        provider: dataProvider!,
        decode: nil,
        shouldInterpolate: true,
        intent: .defaultIntent)

    // Return as UIImage
    return UIImage(cgImage: cgImage!)
}

I have hacked together createDepthUIImage with generative aid and online research to provide some visual feedback, but it looks like I am converting the depth values incorrectly, or somehow tapping into the stencil component of the pixels in the texture. Either way I am out of my depth, and would love some help.
Ideally I would like to produce a grayscale depth image, but really any guidance on how I can visualize the depth would be greatly appreciated. As you can see from the magnified view on the right, there are some artifacts, or pixels that are processed differently than the core stencil. The empty background is transparent in the image, as expected.
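For anyone comparing approaches, here is a minimal sketch of one way to read such a texture as 32-bit floats and map them to an 8-bit grayscale image. It assumes the depth arrives as a CPU-readable .depth32Float texture (a private or combined depth-stencil texture would need an intermediate copy first); makeGrayscaleImage and the min/max normalization are illustrative choices, not an established RealityKit recipe:

import Metal
import UIKit

// Sketch: read one Float per pixel, normalize to 0...255, build a gray CGImage.
func makeGrayscaleImage(from depthTexture: MTLTexture) -> UIImage? {
    let width = depthTexture.width
    let height = depthTexture.height

    // .depth32Float stores one 32-bit float per pixel.
    var depths = [Float](repeating: 0, count: width * height)
    depths.withUnsafeMutableBytes { ptr in
        depthTexture.getBytes(ptr.baseAddress!,
                              bytesPerRow: MemoryLayout<Float>.stride * width,
                              from: MTLRegionMake2D(0, 0, width, height),
                              mipmapLevel: 0)
    }

    // Min/max normalization makes the range visible regardless of whether
    // the projection is reversed-Z (values clustered near 0 or near 1).
    let minDepth = depths.min() ?? 0
    let maxDepth = depths.max() ?? 1
    let range = max(maxDepth - minDepth, .ulpOfOne)
    let gray = depths.map { UInt8(255 * ($0 - minDepth) / range) }

    guard let provider = CGDataProvider(data: Data(gray) as CFData),
          let cgImage = CGImage(width: width, height: height,
                                bitsPerComponent: 8, bitsPerPixel: 8,
                                bytesPerRow: width,
                                space: CGColorSpaceCreateDeviceGray(),
                                bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.none.rawValue),
                                provider: provider, decode: nil,
                                shouldInterpolate: false, intent: .defaultIntent)
    else { return nil }
    return UIImage(cgImage: cgImage)
}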
Replies: 0 · Boosts: 0 · Views: 553 · Feb ’24
Metal API on visionOS?
Is it possible to use the Metal API on the Vision Pro? I noticed that MTKView is not recognized in my visionOS app, and I have also seen forum posts from months ago saying that MTKView is not yet supported. If it is still not an option, if and when will it be supported? I am also wondering about metal-cpp support, since my app involves integrating an existing C++ library with visionOS (see here: https://github.com/MinVR/MinVR). Is this possible?
Replies: 3 · Boosts: 1 · Views: 1.8k · Feb ’24
Several Mapping SDKs from Unity throwing same error
I'm testing all of the existing mapping SDKs for Unity via the PolySpatial workflow to see if any of them work on the Vision Pro. The ArcGIS and Bing SDKs both play successfully in the Editor and build successfully from Unity, but they both hit the same errors when building in Xcode (captured in the attached screenshot). Is this a common error in Xcode? I can't find much on it. Thanks!
Replies: 0 · Boosts: 0 · Views: 473 · Feb ’24
OpenGL and NSScreen.mainScreen.backingScaleFactor
I initialize OpenGL and now I want to set glPixelZoom(pixelSizeX, -pixelSizeY) to display an image of a given width and height scaled to the entire window. For this I do:

rect := window.frame;
WindowBackingRect := window.convertRectToBacking(rect);
pixelSizeX := WindowBackingRect.size.width / width / NSScreen.mainScreen.backingScaleFactor;
pixelSizeY := WindowBackingRect.size.height / height / NSScreen.mainScreen.backingScaleFactor;

Under High Sierra 10.13.6 on an Intel Mac from 2011, NSScreen.mainScreen.backingScaleFactor returns 1 because there is no Retina display. Under Sonoma 14.3 on an Intel Mac from 2020, NSScreen.mainScreen.backingScaleFactor returns 2 because of Retina; this works correctly. Under Ventura 13.6 on an aarch64 Mac from 2020, NSScreen.mainScreen.backingScaleFactor returns 2, BUT the image is only half as big as it should be. (If I leave the scale factor out on the new Intel Mac, the image is twice as big as it should be; on the new aarch64 Mac it is correct.) What should I do?
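One hedged observation: convertRectToBacking already converts points to pixels, so dividing its result by backingScaleFactor applies the scale twice; whether that matters depends on whether the GL surface itself is Retina-backed (wantsBestResolutionOpenGLSurface), which may explain the machine-to-machine differences. A Swift sketch of the pixels-only computation, for comparison:

import AppKit

// Sketch: glPixelZoom factors from the backing (pixel) size alone.
// convertRectToBacking already multiplies by the backing scale factor,
// so no further division by backingScaleFactor is applied here.
func pixelZoom(for window: NSWindow, imageWidth: Int, imageHeight: Int) -> (x: CGFloat, y: CGFloat) {
    let backingRect = window.convertRectToBacking(window.frame)
    return (backingRect.width / CGFloat(imageWidth),
            backingRect.height / CGFloat(imageHeight))
}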
Replies: 0 · Boosts: 0 · Views: 529 · Feb ’24
Gamma issue when display linear color
Hi, I'm displaying linear gray via a CAMetalLayer with the shader below.

fragment float4 fragmentShader(VertexOut in [[stage_in]],
                               texture2d<float, access::sample> BGRATexture [[texture(0)]])
{
    float color = in.texCoordinates.x;
    return float4(float3(color), 1.0);
}

And my CAMetalLayer has been set to linear sRGB:

metalLayer.colorspace = CGColorSpace(name: CGColorSpace.linearSRGB)
metalLayer.pixelFormat = .bgra8Unorm

Why does the display seem to add gamma? Apparently the middle gray is 187, not 128.
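For what it's worth, 187 is almost exactly what you would expect if the compositor is correctly converting linear content for an sRGB display: tagging the layer linearSRGB tells the window server the bytes are linear, and encoding linear 0.5 with the sRGB transfer function lands near 188. A sketch of the arithmetic (not a statement about this exact pipeline):

import Foundation

// Sketch: the sRGB encode function per IEC 61966-2-1.
func srgbEncode(_ linear: Double) -> Double {
    linear <= 0.0031308 ? 12.92 * linear
                        : 1.055 * pow(linear, 1.0 / 2.4) - 0.055
}

let encoded = srgbEncode(0.5)            // ≈ 0.735
print(UInt8((encoded * 255).rounded()))  // 188, close to the observed 187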
Replies: 1 · Boosts: 0 · Views: 876 · Feb ’24
"metallib" generates unpredictable results.
Hi, I am using metallib to generate shader-cache shaders offline, but I have noticed that for certain .air files, metallib's behavior is unpredictable. Sometimes it runs correctly, sometimes it may crash or generate an invalid .lib file.

// crash info
0x00007FF6705AA821 (0x000001C5F7218E30 0x00000084F9F8F089 0x000001C5F72B57E0 0x000001C5F7218E30)
0x00007FF6705A9062 (0x00007FF6709200E0 0x000001C5F7208CE0 0x0000000000000002 0x00007FF670920140)
0x00007FF6704C8FD6 (0x00007FF6709200E0 0x00007FF600000000 0x00007FF670920140 0x0000000000000000)
0x00007FF66FF7F1F4 (0x0000000000000000 0x000001C5F71EC210 0x000001C5F71EC210 0x0000000000000000)
0x00007FF66FF6C8D1 (0x0000000000000004 0x000001C5F71EC210 0x0000000000000000 0x0000000000000000)
0x00007FF670633974 (0x0000000000000000 0x0000000000000000 0x0000000000000000 0x0000000000000000)
0x00007FFB9C137614 (0x0000000000000000 0x0000000000000000 0x0000000000000000 0x0000000000000000), BaseThreadInitThunk() + 0x14 bytes(s)
0x00007FFB9DB826A1 (0x0000000000000000 0x0000000000000000 0x0000000000000000 0x0000000000000000), RtlUserThreadStart() + 0x21 bytes(s)

I tried the latest version (Metal Tools for Windows 4.1), but the issue still exists. I placed the input .air file at: https://drive.google.com/file/d/1MQQRbwKi-bcEZ9jy_dimRjJBovjB0ru2/view?usp=drive_link
Replies: 4 · Boosts: 0 · Views: 470 · Feb ’24
Crash in Metal Framework macOS 14.4 beta
Hello, I have a crash in the Metal framework under the Sonoma 14.4 public beta on a Mac mini M1 2020:

Thread 1 crashed with ARM Thread State (64-bit):
x0: 0x0000000000000000   x1: 0x0000000000000000   x2: 0x0000000000000000   x3: 0x0000000000000000
x4: 0x0000000000000000   x5: 0x0000000000000000   x6: 0x0000000000000000   x7: 0x0000000000000000
x8: 0x17c2770b7ca20001   x9: 0x17c2770b7ca20001   x10: 0x0000000000000025  x11: 0x0000000000000001
x12: 0x000000016bb555b2  x13: 0x0000000000000000  x14: 0x0000000104acc7e9  x15: 0x0000000207c5c5b0
x16: 0xfffffffffffffff4  x17: 0x0000000211f42c48  x18: 0x0000000000000000  x19: 0x000000016bb55898
x20: 0x0000600002901180  x21: 0x0000600003cd0e20  x22: 0x0000000000000003  x23: 0x0000000277b7e040
x24: 0x00000000000002ec  x25: 0x0000000000000001  x26: 0x0000000000000000  x27: 0x0000000000000000
x28: 0x0000000207c96b50  fp: 0x000000016bb55880   lr: 0x2d648001a439d394
sp: 0x000000016bb557b0   pc: 0x00000001a439d394   cpsr: 0x60001000
far: 0x0000000000000000  esr: 0xf2000001 (Breakpoint) brk 1

Binary Images:
0x139c00000 - 0x139c6bfff com.apple.AppleMetalOpenGLRenderer (1.0) <8b69c871-19c2-3d46-b8de-8dbc62e532cd> /System/Library/Extensions/AppleMetalOpenGLRenderer.bundle/Contents/MacOS/AppleMetalOpenGLRenderer
0x109b74000 - 0x109baffff libjogl_mobile.dylib () <9c3ef505-8828-36ab-a776-5ffdb9d4cd79> /Applications/scilab-2024.0.0.app/Contents/lib/thirdparty/libjogl_mobile.dylib
0x13b494000 - 0x13b50ffff libjogl_desktop.dylib () <543b42ae-90a4-325c-8850-84951b1fa6ee> /Applications/scilab-2024.0.0.app/Contents/lib/thirdparty/libjogl_desktop.dylib
0x108588000 - 0x10858ffff libnativewindow_macosx.dylib (*) <2c256988-735b-38b7-9712-0bfc58c3ff90> /Applications/scilab-2024.0.0.app/Contents/lib/thirdparty/libnativewindow_macosx.dylib

How can I get rid of it? S.
Replies: 2 · Boosts: 0 · Views: 587 · Feb ’24
Game not muted with ring/silent switch
Hi, I'm building a simple game for iOS with background music. The ring/silent switch is not disabling sound when switched to silent. So far I'm testing on devices through TestFlight (still internal testing, not beta). Do I need to code this behavior myself, or does iOS know it's a game and disable sound automatically? And would my game be rejected if the switch doesn't disable sound? (I have an internal setting to enable/disable sounds in the game.) Due to the way the app is coded (a Capacitor app), I can't access the ring/silent switch to enable/disable sound. Thanks!
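iOS does not mute audio automatically just because the app is a game; whether the silent switch silences you is governed by the AVAudioSession category. A minimal sketch, assuming you can reach native code from the Capacitor shell (for example in the generated AppDelegate):

import AVFoundation

// Sketch: .ambient is silenced by the ring/silent switch and mixes with
// other audio; .playback (common for music apps) ignores the switch.
func configureGameAudioSession() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.ambient)
        try session.setActive(true)
    } catch {
        print("Audio session setup failed: \(error)")
    }
}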
Replies: 0 · Boosts: 0 · Views: 588 · Feb ’24
CGImageSourceCreateThumbnailAtIndex not generating cgImage on 17.4
The CGImageSourceCreateThumbnailAtIndex function isn't generating a cgImage for the majority of images on iOS 17.4. It works if I pass in the option kCGImageSourceThumbnailMaxPixelSize, but doesn't work if this key is missing. The function works both with and without kCGImageSourceThumbnailMaxPixelSize on stable OS versions. Is this a new change in the iOS 17.4 beta versions?
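For reference, a minimal sketch of the working variant described above, with the max-pixel-size key supplied (the 1024 value is an arbitrary placeholder):

import Foundation
import ImageIO

// Sketch: thumbnail generation with an explicit max pixel size, the form
// reported to still work on iOS 17.4.
func thumbnail(from data: Data, maxPixelSize: Int = 1024) -> CGImage? {
    guard let source = CGImageSourceCreateWithData(data as CFData, nil) else { return nil }
    let options: [CFString: Any] = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ]
    return CGImageSourceCreateThumbnailAtIndex(source, 0, options as CFDictionary)
}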
Replies: 1 · Boosts: 2 · Views: 603 · Feb ’24
visionOS, bfloat is not supported on this target
Hi, I'm trying to adapt our project to run on visionOS and hit the problem from the topic title while running this command:

xcrun --sdk xros metal --target=arm64-apple-xros1.0 input.metal -c -o output.air

The full command output looks like this:

While building module 'metal_types' imported from <built-in>:1:
In file included from <built-in>:1:
In file included from /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_types:90:
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_extended_vector:121:49: error: bfloat is not supported on this target
typedef __attribute__((__ext_vector_type__(2))) bfloat bfloat2;
^
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_extended_vector:122:49: error: bfloat is not supported on this target
typedef __attribute__((__ext_vector_type__(3))) bfloat bfloat3;
^
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_extended_vector:123:49: error: bfloat is not supported on this target
typedef __attribute__((__ext_vector_type__(4))) bfloat bfloat4;
^
While building module 'metal_types' imported from <built-in>:1:
In file included from <built-in>:1:
In file included from /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_types:91:
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_packed_vector:121:52: error: bfloat is not supported on this target
typedef __attribute__((__packed_vector_type__(2))) bfloat packed_bfloat2;
^
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_packed_vector:122:52: error: bfloat is not supported on this target
typedef __attribute__((__packed_vector_type__(3))) bfloat packed_bfloat3;
^
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_packed_vector:123:52: error: bfloat is not supported on this target
typedef __attribute__((__packed_vector_type__(4))) bfloat packed_bfloat4;

I'm using Xcode 15.2 (15C500b) on a MacBook Pro 16 (M1 Pro), and xcrun --sdk xros metal --version gives me this:

Apple metal version 32023.98 (metalfe-32023.98)
Target: air64-apple-darwin23.2.0
Thread model: posix
InstalledDir: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/bin
Replies: 1 · Boosts: 0 · Views: 419 · Jan ’24
Metal stereo shader on Vision Pro
Hi there, I have some existing Metal rendering / shader views that I would like to use to present stereoscopic content on the Vision Pro. Is there a Metal shader function or variable that lets me know which eye we're currently rendering to inside my shader, something like Unity's unity_StereoEyeIndex? I know RealityKit has GeometrySwitchCameraIndex, so I want something similar (but outside of a RealityKit context). Many thanks, Rich
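Not an authoritative answer, but the Metal mechanism that usually fills this role is vertex amplification: the encoder is given an amplification count of 2, and a vertex-shader argument declared with the [[amplification_id]] attribute then identifies the view, much like unity_StereoEyeIndex. A rough Swift-side sketch (pipeline setup and draw calls elided):

import Metal

// Sketch: render both eyes in one pass via vertex amplification.
// In the vertex shader, `uint viewIndex [[amplification_id]]` tells you
// which of the two views is being rendered.
func encodeStereo(_ encoder: MTLRenderCommandEncoder) {
    let mappings = [
        MTLVertexAmplificationViewMapping(viewportArrayIndexOffset: 0,
                                          renderTargetArrayIndexOffset: 0),
        MTLVertexAmplificationViewMapping(viewportArrayIndexOffset: 1,
                                          renderTargetArrayIndexOffset: 1)
    ]
    encoder.setVertexAmplificationCount(2, viewMappings: mappings)
    // ... bind pipeline/buffers and issue draws as usual ...
}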
Replies: 2 · Boosts: 0 · Views: 1.1k · Jan ’24
Only 4 Cores reported to game
Hi, I have a 2023 MacBook Pro M3 Max with 64 GB, a 16-core CPU (4 efficiency, 12 performance cores) and a 40-core GPU. I've got a game (Cities: Skylines 2) successfully working using Whisky. However, only 4 cores are reported to the game, which leads to a situation where many calculations are batched up while my CPU's performance cores sit almost idle and the efficiency cores are well utilized. I suspect this is because the game only sees 4 cores and has some logic to batch the calculations differently depending on how many cores are available. Is there a way to override how many cores the game sees, e.g. via an environment variable or something? Thanks, Dominik
Replies: 0 · Boosts: 1 · Views: 682 · Jan ’24
Show USDZ file in RealityView
I captured my office using a 3D scanner app and got a USDZ file. The file contains a 3D model and a physically based material. I can view the file correctly, with its texture, in Xcode and Reality Composer Pro. But when I use RealityView to present the model in an immersive space, the model renders entirely black. My guess is that my material doesn't have a shader graph? Has anyone run into a similar issue? How can I solve it?
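One guess worth ruling out before suspecting the material itself: a physically based material needs light, and an immersive space does not necessarily provide any, so an unlit scene can render black without anything being broken. A sketch of attaching an image-based light in a RealityView; the entity and environment resource names ("OfficeScan", "Sunlight") are placeholders:

import SwiftUI
import RealityKit

// Sketch: give the scanned model an image-based light so its PBR
// material has illumination to respond to.
struct OfficeView: View {
    var body: some View {
        RealityView { content in
            guard let model = try? await Entity(named: "OfficeScan"),
                  let environment = try? await EnvironmentResource(named: "Sunlight")
            else { return }
            model.components.set(ImageBasedLightComponent(source: .single(environment)))
            model.components.set(ImageBasedLightReceiverComponent(imageBasedLight: model))
            content.add(model)
        }
    }
}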
Replies: 1 · Boosts: 0 · Views: 713 · Jan ’24
Multiplayer test mode
I'm really sorry if this is not the proper place for this. I'm developing a game with an online mode that works just fine over Wi-Fi but not on a mobile network. If someone with two Apple devices could try whether the multiplayer mode works over the mobile network, I would appreciate it A LOT: https://apps.apple.com/es/app/ufo-snowboard/id6474542185
Replies: 0 · Boosts: 0 · Views: 468 · Jan ’24
Apple GameKit EntryPointNotFoundException
I have built the plugins (https://github.com/apple/unityplugins) and added them to my project, but when I run the game I get these errors:

EntryPointNotFoundException: AppleCore_GetRuntimeEnvironment assembly: type: member:(null)
Apple.Core.Availability.OnApplicationStart () (at Library/PackageCache/com.apple.unityplugin.core@d07a9d20344c/Runtime/Availability.cs:40)

EntryPointNotFoundException: GKLocalPlayer_GetLocal assembly: type: member:(null)
Replies: 2 · Boosts: 0 · Views: 772 · Jan ’24