Delve into the world of graphics and game development. Discuss creating stunning visuals, optimizing game mechanics, and share resources for game developers.

FxPlug resolution change w/o scaling - FCPX AI/ML Upscale Effect via Motion?
Namaste! I'm putting together an FCPX Effect that is supposed to increase resolution via AI upscaling, but the only way to add resolution is by scaling, and scaling causes the video to clip. I want to be able to give a 480 video this "Resolution Upscale" Effect and have it output a 720 or 1080 AI-upscaled video, but neither FxPlug nor Motion Effects allows such a thing. The FxPlug always gets a 640x480 input (correct) but only a 640x480 output.

What is the FxPlug code or Motion configuration/concept for upscaling the resolution without affecting the scale? Is there a way to do this in Motion/FxPlug? Scaling up in the FxPlug effect and then scaling down in a parent Motion Group does nothing. Setting the Group's 2D Fixed Resolution doesn't output different dimensions: the debug output from the FxPlug still reports input and output as 640x480, even when the group is set to a fixed resolution of 1920x1080. A hierarchy of Groups with different settings for 2D Fixed Resolution and 3D Flatten doesn't work either; in these cases the debug output still reports 640x480 for both input and output, so the plug-in isn't aware of the Fixed Resolution change.

Does there need to be a new FxPlug property, via [properties:...], like "kFxPropertyKey_ResolutionChange", along with an API for changing the destination image resolution (without changing the dest rect size)? How do we do this?
0 replies · 0 boosts · 597 views · Feb ’24
Is it possible to change usdz objects inside a scene programmatically?
Let's say I've created a scene with three models side by side. Upon user interaction, I'd like to change these models to another model that is also in the same Reality Composer Pro project. Is that possible, and how? One way I can think of is to load all the individual models in RealityView and toggle their opacity to show/hide them, but that doesn't seem like the right way for performance/memory reasons. How do you swap usdz models in and out?
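For what it's worth, here is a minimal sketch of the load-on-demand alternative, assuming a visionOS RealityView and the default RealityKitContent package name from the Reality Composer Pro template; "ModelA" and "ModelB" are hypothetical entity names in the scene:

```swift
import SwiftUI
import RealityKit
import RealityKitContent // Reality Composer Pro package; the name is the template default

// Sketch: keep one stable parent entity in the scene and swap its child on
// demand by loading the replacement from the Reality Composer Pro bundle,
// instead of preloading every variant and toggling opacity.
struct ModelSwapView: View {
    @State private var slot = Entity() // stable parent that stays in the scene

    var body: some View {
        VStack {
            RealityView { content in
                content.add(slot)
                await swap(to: "ModelA") // hypothetical entity name
            }
            Button("Swap model") {
                Task { await swap(to: "ModelB") } // hypothetical entity name
            }
        }
    }

    private func swap(to name: String) async {
        guard let model = try? await Entity(named: name, in: realityKitContentBundle) else { return }
        slot.children.removeAll()
        slot.addChild(model)
    }
}
```

Loading on demand keeps only the visible model resident; if swap latency matters more than memory, the variants could instead be loaded once up front and re-parented on interaction.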
1 reply · 0 boosts · 694 views · Feb ’24
Metal API on visionOS?
Is it possible to use the Metal API on Vision Pro? I noticed that using MTKView in my visionOS app isn't recognized, and I've seen other forum posts from months ago saying that MTKView is not yet supported. If it's still not an option, when (if ever) will it be supported? I'm also wondering about metal-cpp support, since my app involves integrating an existing C++ library with visionOS (see here: https://github.com/MinVR/MinVR). Is this possible?
3 replies · 1 boost · 1.9k views · Feb ’24
RealityKit Target Framerate
I'm porting a SceneKit app to RealityKit, eventually offering an AR experience there. I noticed that when I run it on my iPhone 15 Pro and iPad Pro with the 120Hz screen, the framerate seems to be limited to 60fps. Is there a way to increase the target framerate to 120, like I can with SceneKit? I'm setting up my arView like so:

```swift
@IBOutlet private var arView: ARView! {
    didSet {
        arView.cameraMode = .nonAR
        arView.debugOptions = [.showStatistics]
    }
}
```
0 replies · 0 boosts · 762 views · Feb ’24
Unsupported method: -[MTLComputeCommandEncoder encodeStartWhile:offset:comparison:referenceValue:]
It appears that the Metal debugging interface does not support this method; at least, the function-hashing algorithm does not have a pattern for it in the symbol dictionary as presented. Where do we get updated C libraries and functions that sync with what is presented in the demo kits and samples that Apple puts in the user domain? Why does this stuff get out into the wild insufficiently tested? It seems that the demo kits made available to users should be included in the test domain used to verify new code releases. I came from a development environment where the six-month release cycle involved automated execution of the test suite before anything went to beta or anywhere else.
1 reply · 0 boosts · 713 views · Feb ’24
RealityKit visualize the virtual depth texture from post-process callback
I am using RealityKit and the ARView PostProcessContext to get the sourceDepthTexture of the current virtual scene, using .nonAR camera mode. My experience with Metal is limited to RealityKit's GeometryModifier and SurfaceShader for CustomMaterial, but I am excited to learn more! Having studied the Underwater sample code, I have a general idea of how I want to explore the capabilities of a proper post-processing pipeline in my RealityKit project, but right now I just want to visualize this MTLTexture to see what the virtual depth of the scene looks like. Here's my current approach, trying to create a depth UIImage from the context's sourceDepthTexture:

```swift
func postProcess(context: ARView.PostProcessContext) {
    let depthTexture = context.sourceDepthTexture
    var uiImage: UIImage? // or CG/CI

    if processPost {
        print("#P Process: Post Process BLIT")
        // UIImage from MTLTexture (try? because postProcess isn't a throwing context)
        uiImage = try? createDepthUIImage(from: depthTexture)
        let blitEncoder = context.commandBuffer.makeBlitCommandEncoder()
        blitEncoder?.copy(from: context.sourceColorTexture, to: context.targetColorTexture)
        blitEncoder?.endEncoding()
        getPostProcessed()
    } else {
        print("#P No Process: Pass-Through")
        let blitEncoder = context.commandBuffer.makeBlitCommandEncoder()
        blitEncoder?.copy(from: context.sourceColorTexture, to: context.targetColorTexture)
        blitEncoder?.endEncoding()
    }
}

func createDepthUIImage(from metalTexture: MTLTexture) throws -> UIImage {
    guard let device = MTLCreateSystemDefaultDevice() else {
        throw CIMError.noDefaultDevice
    }

    let descriptor = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: .depth32Float_stencil8,
        width: metalTexture.width,
        height: metalTexture.height,
        mipmapped: false)
    descriptor.usage = [.shaderWrite, .shaderRead]
    guard let texture = device.makeTexture(descriptor: descriptor) else {
        throw NSError(domain: "Failed to create Metal texture", code: -1, userInfo: nil)
    }

    // Blit!
    let commandQueue = device.makeCommandQueue()
    let commandBuffer = commandQueue?.makeCommandBuffer()
    let blitEncoder = commandBuffer?.makeBlitCommandEncoder()
    blitEncoder?.copy(from: metalTexture, to: texture)
    blitEncoder?.endEncoding()
    commandBuffer?.commit()

    // Raw pixel bytes
    let bytesPerRow = 4 * texture.width
    let dataSize = texture.height * bytesPerRow
    var bytes = [UInt8](repeating: 0, count: dataSize)
    //var depthData = [Float](repeating: 0, count: dataSize)
    bytes.withUnsafeMutableBytes { bytesPtr in
        texture.getBytes(
            bytesPtr.baseAddress!,
            bytesPerRow: bytesPerRow,
            from: .init(origin: .init(), size: .init(width: texture.width, height: texture.height, depth: 1)),
            mipmapLevel: 0
        )
    }

    // CGDataProvider from the raw bytes
    let dataProvider = CGDataProvider(data: Data(bytes: bytes, count: bytes.count) as CFData)

    // CGImage from the data provider
    let cgImage = CGImage(width: texture.width, height: texture.height,
                          bitsPerComponent: 8, bitsPerPixel: 32, bytesPerRow: bytesPerRow,
                          space: CGColorSpaceCreateDeviceRGB(),
                          bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue),
                          provider: dataProvider!, decode: nil,
                          shouldInterpolate: true, intent: .defaultIntent)

    // Return as UIImage
    return UIImage(cgImage: cgImage!)
}
```

I have hacked together the createDepthUIImage function with generative aid and online research to provide some visual feedback, but it looks like I am converting the depth values incorrectly, or somehow tapping into the stencil component of the pixels in the texture. Either way I am out of my depth, and would love some help.
Ideally, I would like to produce a grayscale depth image, but really any guidance on how I can visualize the depth would be greatly appreciated. As you can see from the magnified view on the right, there are some artifacts or pixels that are processed differently than the core stencil. The empty background is transparent in the image as expected.
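One observation that may explain the artifacts: sourceDepthTexture's samples are 32-bit float depth values (plus a separate stencil plane), not four 8-bit color channels, so building an RGBA8 CGImage directly from those bytes reinterprets float bit patterns as colors. Below is a minimal sketch of the grayscale step, assuming the depth floats have already been read back into a [Float] (for example after copying just the depth plane into a readable texture, which a blit can do with MTLBlitOption.depthFromDepthStencil); grayscaleImage(from:width:height:) is a hypothetical helper, not RealityKit API.

```swift
import UIKit

// Hypothetical helper: build a grayscale UIImage from raw depth floats
// (one Float per pixel, row-major). Values are normalized to 0...255 so the
// nearest and farthest samples map to the ends of the gray ramp.
func grayscaleImage(from depths: [Float], width: Int, height: Int) -> UIImage? {
    guard depths.count == width * height,
          let minD = depths.min(), let maxD = depths.max(), maxD > minD
    else { return nil }

    let pixels = depths.map { UInt8(255 * ($0 - minD) / (maxD - minD)) }

    guard let provider = CGDataProvider(data: Data(pixels) as CFData),
          let cgImage = CGImage(width: width, height: height,
                                bitsPerComponent: 8, bitsPerPixel: 8,
                                bytesPerRow: width,
                                space: CGColorSpaceCreateDeviceGray(),
                                bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.none.rawValue),
                                provider: provider, decode: nil,
                                shouldInterpolate: false, intent: .defaultIntent)
    else { return nil }
    return UIImage(cgImage: cgImage)
}
```

The min/max normalization is purely for visualization; RealityKit's depth buffer may use reverse-Z, so near and far can come out inverted.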
0 replies · 0 boosts · 560 views · Feb ’24
Several Mapping SDKs from Unity throwing same error
I'm testing all of the existing mapping SDKs from Unity via the PolySpatial workflow to see if any of them work on the Vision Pro. The ArcGIS and Bing SDKs both play successfully in the Editor and build successfully from Unity, but they both hit the same errors when building in Xcode (captured in the attached screenshot). Is this a common Xcode error? I can't find much on it. Thanks!
0 replies · 0 boosts · 479 views · Feb ’24
Crash in Metal Framework macOS 14.4 beta
Hello, I have a crash in the Metal framework under the Sonoma 14.4 public beta on a Mac mini M1 2020:

```
Thread 1 crashed with ARM Thread State (64-bit):
    x0: 0x0000000000000000   x1: 0x0000000000000000   x2: 0x0000000000000000   x3: 0x0000000000000000
    x4: 0x0000000000000000   x5: 0x0000000000000000   x6: 0x0000000000000000   x7: 0x0000000000000000
    x8: 0x17c2770b7ca20001   x9: 0x17c2770b7ca20001  x10: 0x0000000000000025  x11: 0x0000000000000001
   x12: 0x000000016bb555b2  x13: 0x0000000000000000  x14: 0x0000000104acc7e9  x15: 0x0000000207c5c5b0
   x16: 0xfffffffffffffff4  x17: 0x0000000211f42c48  x18: 0x0000000000000000  x19: 0x000000016bb55898
   x20: 0x0000600002901180  x21: 0x0000600003cd0e20  x22: 0x0000000000000003  x23: 0x0000000277b7e040
   x24: 0x00000000000002ec  x25: 0x0000000000000001  x26: 0x0000000000000000  x27: 0x0000000000000000
   x28: 0x0000000207c96b50   fp: 0x000000016bb55880   lr: 0x2d648001a439d394
    sp: 0x000000016bb557b0   pc: 0x00000001a439d394 cpsr: 0x60001000
   far: 0x0000000000000000  esr: 0xf2000001 (Breakpoint)  brk 1

Binary Images:
0x139c00000 - 0x139c6bfff com.apple.AppleMetalOpenGLRenderer (1.0) <8b69c871-19c2-3d46-b8de-8dbc62e532cd> /System/Library/Extensions/AppleMetalOpenGLRenderer.bundle/Contents/MacOS/AppleMetalOpenGLRenderer
0x109b74000 - 0x109baffff libjogl_mobile.dylib (*) <9c3ef505-8828-36ab-a776-5ffdb9d4cd79> /Applications/scilab-2024.0.0.app/Contents/lib/thirdparty/libjogl_mobile.dylib
0x13b494000 - 0x13b50ffff libjogl_desktop.dylib (*) <543b42ae-90a4-325c-8850-84951b1fa6ee> /Applications/scilab-2024.0.0.app/Contents/lib/thirdparty/libjogl_desktop.dylib
0x108588000 - 0x10858ffff libnativewindow_macosx.dylib (*) <2c256988-735b-38b7-9712-0bfc58c3ff90> /Applications/scilab-2024.0.0.app/Contents/lib/thirdparty/libnativewindow_macosx.dylib
```

How can I get rid of it?

S.
2 replies · 0 boosts · 600 views · Feb ’24
CGImageSourceCreateThumbnailAtIndex not generating cgImage on 17.4
The CGImageSourceCreateThumbnailAtIndex function isn't generating a CGImage for the majority of images on iOS 17.4. It works if I pass in the kCGImageSourceThumbnailMaxPixelSize option, but doesn't work if that key is missing. The function works both with and without kCGImageSourceThumbnailMaxPixelSize on stable OS versions. Is this a new change in the iOS 17.4 betas?
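For reference, a minimal sketch of the behavior being described (the 512 max-pixel-size is a placeholder and makeThumbnail is a hypothetical helper): per the report, on iOS 17.4 the call returns nil unless kCGImageSourceThumbnailMaxPixelSize is supplied.

```swift
import Foundation
import ImageIO

// Hypothetical repro helper: generate a thumbnail from an image file URL.
// Per the report, omitting kCGImageSourceThumbnailMaxPixelSize yields nil on 17.4.
func makeThumbnail(at url: URL, maxPixelSize: Int? = 512) -> CGImage? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }

    var options: [CFString: Any] = [
        kCGImageSourceCreateThumbnailFromImageAlways: true
    ]
    if let maxPixelSize {
        // The workaround key from the post.
        options[kCGImageSourceThumbnailMaxPixelSize] = maxPixelSize
    }
    return CGImageSourceCreateThumbnailAtIndex(source, 0, options as CFDictionary)
}
```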
1 reply · 2 boosts · 610 views · Feb ’24
OpenGL and NSScreen.mainScreen.backingScaleFactor
I initialize OpenGL and now I want to set glPixelZoom(pixelSizeX, -pixelSizeY) to display an image of a given width and height across the entire window. For this I do:

```pascal
rect := window.frame;
WindowBackingRect := window.convertRectToBacking(rect);
pixelSizeX := WindowBackingRect.size.width / width / NSScreen.mainScreen.backingScaleFactor;
pixelSizeY := WindowBackingRect.size.height / height / NSScreen.mainScreen.backingScaleFactor;
```

Under High Sierra 10.13.6 on an Intel Mac from 2011, NSScreen.mainScreen.backingScaleFactor returns 1 because there is no Retina display. Under Sonoma 14.3 on an Intel Mac from 2020, NSScreen.mainScreen.backingScaleFactor returns 2 because of Retina; this works correctly. Under Ventura 13.6 on an AArch64 Mac from 2020, NSScreen.mainScreen.backingScaleFactor returns 2, BUT the image is only half as big as it should be. (If I leave the scale factor out, the image is twice as big as it should be on the new Intel Mac, but correct on the new AArch64 Mac.) What should I do?
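A possible explanation, offered as an assumption rather than a confirmed diagnosis: glPixelZoom works in framebuffer pixels, and convertRectToBacking already returns pixel dimensions, so the extra division by backingScaleFactor is only correct when the GL surface is created at point (non-Retina) resolution. Whether it is depends on the view's wantsBestResolutionOpenGLSurface setting, which may differ across these machines. A Swift sketch of computing the zoom purely from the backing rect, for a view with wantsBestResolutionOpenGLSurface = true:

```swift
import AppKit

// Sketch: derive glPixelZoom factors from the window's backing (pixel) size.
// Assumes the GL view has wantsBestResolutionOpenGLSurface = true, so the
// framebuffer matches the backing rect and no backingScaleFactor division is needed.
func pixelZoom(for window: NSWindow, imageWidth: Int, imageHeight: Int) -> (x: CGFloat, y: CGFloat) {
    let backing = window.convertToBacking(window.frame) // in pixels, not points
    return (backing.width / CGFloat(imageWidth),
            backing.height / CGFloat(imageHeight))
}
```

Setting wantsBestResolutionOpenGLSurface explicitly on the GL view, instead of relying on its default, should at least make the three machines behave the same way.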
0 replies · 0 boosts · 541 views · Feb ’24
"metallib" generates unpredictable results.
Hi, I am using metallib to generate shader-cache shaders offline, but I have noticed that for certain .air files, metallib's behavior is unpredictable: sometimes it runs correctly, sometimes it crashes or generates an invalid .lib file.

```
// crash info
0x00007FF6705AA821 (0x000001C5F7218E30 0x00000084F9F8F089 0x000001C5F72B57E0 0x000001C5F7218E30)
0x00007FF6705A9062 (0x00007FF6709200E0 0x000001C5F7208CE0 0x0000000000000002 0x00007FF670920140)
0x00007FF6704C8FD6 (0x00007FF6709200E0 0x00007FF600000000 0x00007FF670920140 0x0000000000000000)
0x00007FF66FF7F1F4 (0x0000000000000000 0x000001C5F71EC210 0x000001C5F71EC210 0x0000000000000000)
0x00007FF66FF6C8D1 (0x0000000000000004 0x000001C5F71EC210 0x0000000000000000 0x0000000000000000)
0x00007FF670633974 (0x0000000000000000 0x0000000000000000 0x0000000000000000 0x0000000000000000)
0x00007FFB9C137614 (0x0000000000000000 0x0000000000000000 0x0000000000000000 0x0000000000000000), BaseThreadInitThunk() + 0x14 byte(s)
0x00007FFB9DB826A1 (0x0000000000000000 0x0000000000000000 0x0000000000000000 0x0000000000000000), RtlUserThreadStart() + 0x21 byte(s)
```

I tried the latest version (Metal Tools for Windows 4.1), but the issue still exists. I placed the input .air file at: https://drive.google.com/file/d/1MQQRbwKi-bcEZ9jy_dimRjJBovjB0ru2/view?usp=drive_link
4 replies · 0 boosts · 488 views · Feb ’24
Game not muted with ring/silent switch
Hi, I'm building a simple game for iOS. I have background music, and the ring/silent switch is not disabling sound when switched to silent. So far I'm testing on devices through TestFlight (still internal testing, not beta). Do I need to code this behavior myself, or does iOS know it's a game and disable sound automatically? And would my game be rejected if the switch doesn't disable sound? (I have an internal setting to enable/disable sounds in the game.) Due to the way it's coded (a Capacitor app), I can't access the ring/silent switch to disable/enable sound. Thanks!
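iOS doesn't mute an app because it's a game; the behavior is decided by the AVAudioSession category. The .ambient and default .soloAmbient categories are silenced by the ring/silent switch, while .playback is not, so an audio stack that activates a .playback session keeps playing on silent. A minimal native-side sketch, assuming Swift code can be added to the Capacitor project's iOS target:

```swift
import AVFoundation

// Minimal sketch: choose a session category that honors the ring/silent switch.
// .ambient (and the default .soloAmbient) are muted by the switch; .playback is not.
func configureGameAudioSession() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.ambient, mode: .default)
        try session.setActive(true)
    } catch {
        print("Failed to configure audio session: \(error)")
    }
}
```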
0 replies · 0 boosts · 600 views · Feb ’24
I will quit Apple development: no profit from my developer account, unlike the other companies I work with!
Hello, community. This post is to express my complete dissatisfaction with Apple, especially around developing games for Apple's platforms. There is a lack of support for it, for example for some new gaming technologies, and there is still no profit or reward for all the work and money invested in developing for it. I will close my journey with Apple very unsatisfied. I'm going to give my business's opportunities to other platforms that are really worth it and that support all the new technologies in gaming. And yes, Apple destroyed other gaming makers with its new services like Arcade, and there seems to be no future for gaming on Apple's platforms. I quit. Goodbye and good luck to everyone.
3 replies · 0 boosts · 1.7k views · Mar ’23
visionOS, bfloat is not supported on this target
Hi, I'm trying to adapt our project to run on visionOS and ran into the problem in the title while running this command:

```
xcrun --sdk xros metal --target=arm64-apple-xros1.0 input.metal -c -o output.air
```

The full command output looks like this:

```
While building module 'metal_types' imported from <built-in>:1:
In file included from <built-in>:1:
In file included from /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_types:90:
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_extended_vector:121:49: error: bfloat is not supported on this target
typedef __attribute__((__ext_vector_type__(2))) bfloat bfloat2;
                                                ^
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_extended_vector:122:49: error: bfloat is not supported on this target
typedef __attribute__((__ext_vector_type__(3))) bfloat bfloat3;
                                                ^
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_extended_vector:123:49: error: bfloat is not supported on this target
typedef __attribute__((__ext_vector_type__(4))) bfloat bfloat4;
                                                ^
While building module 'metal_types' imported from <built-in>:1:
In file included from <built-in>:1:
In file included from /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_types:91:
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_packed_vector:121:52: error: bfloat is not supported on this target
typedef __attribute__((__packed_vector_type__(2))) bfloat packed_bfloat2;
                                                   ^
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_packed_vector:122:52: error: bfloat is not supported on this target
typedef __attribute__((__packed_vector_type__(3))) bfloat packed_bfloat3;
                                                   ^
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_packed_vector:123:52: error: bfloat is not supported on this target
typedef __attribute__((__packed_vector_type__(4))) bfloat packed_bfloat4;
```

I'm using Xcode 15.2 (15C500b) on a MacBook Pro 16" (M1 Pro), and `xcrun --sdk xros metal --version` gives me this:

```
Apple metal version 32023.98 (metalfe-32023.98)
Target: air64-apple-darwin23.2.0
Thread model: posix
InstalledDir: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/bin
```
1 reply · 0 boosts · 428 views · Jan ’24