Our app encountered the following error:
Execution of the command buffer was aborted due to an error during execution. Ignored (for causing prior/excessive GPU errors) (00000004:kIOGPUCommandBufferCallbackErrorSubmissionsIgnored)
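For anyone hitting the same thing, a minimal diagnostic sketch (assuming access to the MTLCommandBuffer being submitted; commandBuffer is a placeholder): attach a completion handler and log each buffer's error, since the "submissions ignored" code means the GPU is dropping work due to prior errors, so the original fault is usually in an earlier command buffer.

import Metal

// Sketch: log why each command buffer completed, to find the original fault.
commandBuffer.addCompletedHandler { buffer in
    if let error = buffer.error as NSError? {
        print("Command buffer error \(error.code): \(error.localizedDescription)")
    }
}
commandBuffer.commit()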
I am creating a 3D model from multiple images using a photogrammetry session. When the session generates an OBJ file and I measure the distance between two points, the distance is displayed in varying units: sometimes meters, sometimes centimeters, sometimes another unit altogether. How can I tell the photogrammetry session to always create the model in millimeters?
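For reference, a minimal sketch of the kind of session setup in question (paths and detail level are placeholders). Note that the OBJ format itself stores no unit metadata, so viewing tools have to guess, which may be where the varying units come from.

import RealityKit

let session = try PhotogrammetrySession(
    input: URL(fileURLWithPath: "/path/to/images"),  // placeholder image folder
    configuration: PhotogrammetrySession.Configuration()
)
// A directory output URL requests OBJ output, as I understand the docs.
try session.process(requests: [
    .modelFile(url: URL(fileURLWithPath: "/path/to/output/"), detail: .full)
])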
How many 32-bit variables can I use concurrently in a single thread of a Metal compute kernel without worrying about the variables getting spilled into the device memory? Alternatively: how many 32-bit registers does a single thread have available for itself?
Let's say that each thread of my compute kernel needs to store and work with its own array of N float variables, where N can be 128, 256, 512 or more. To achieve the maximum possible performance, I do not want the local thread variables to get spilled into the slow device memory. I want all N variables to be stored "on-chip", in the thread memory space.
To make my question more concrete, let's say there is an array thread float localArray[N]. Assuming an unrealistic hypothetical scenario where localArray is the only variable in the whole kernel, what is the maximum value of N for which no portion of localArray would get spilled into device memory?
I searched in the Metal feature set tables, but I could not find any details.
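In the absence of documented numbers, here is an indirect probe that can help (a sketch; "myKernel" is a placeholder): compile the pipeline and inspect maxTotalThreadsPerThreadgroup, which the compiler lowers as per-thread register usage grows, so watching it while increasing N hints at where spilling begins.

import Metal

let device = MTLCreateSystemDefaultDevice()!
let library = device.makeDefaultLibrary()!
let function = library.makeFunction(name: "myKernel")!
let pipeline = try device.makeComputePipelineState(function: function)
// Values below the device maximum (1024 on recent Apple GPUs) suggest the
// compiler is rationing registers for this kernel.
print("maxTotalThreadsPerThreadgroup:", pipeline.maxTotalThreadsPerThreadgroup)
print("threadExecutionWidth:", pipeline.threadExecutionWidth)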
Minecraft Launcher gives the message "minecraft launcher quit unexpectedly" when opened. This began happening after I updated to macOS Sequoia Beta 15.0 (24A5327a).
Anyone know a fix?
Starting with Xcode beta 4+, any ModelEntity I load from a .usdz that contains a skeletal pose has no pins. The pins used to be accessible from a ModelEntity, so you could use alignment with other pins.
Per the documentation, any ModelEntity with a skeletal pose should have automatically generated pins available on the entity.pins object itself.
https://developer.apple.com/documentation/RealityKit/Entity/pins
Is this a bug with the later Xcode betas or is the documentation wrong?
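A minimal repro sketch of what I'm describing (asset name is a placeholder):

import RealityKit

// Load a .usdz that contains a skeletal pose, then inspect its pins.
let entity = try await ModelEntity(named: "character")
// Per the docs this should contain automatically generated pins;
// in Xcode beta 4+ it comes back empty.
print(entity.pins)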
I'm trying to clone an entity that's somewhere deeper in the hierarchy, and I want it together with a transform that takes its parents into account.
Initially I made something that would walk back through the parents, get their transforms, and reduce them to a single one. Then I realized that what I'm doing is the same as .transformMatrix(relativeTo: rootEntity), but to validate that my version gives the same results I started printing both, and I noticed that for some reason the last row, instead of a stable (0, 0, 0, 1), is sometimes (0, 0, 0, 0.9999...). I know there are rounding errors, but I'd have assumed that 0 and 1 are "magical" in the floating-point world.
The only way I can try to explain it is that .transformMatrix uses some fancy accelerated matrix multiplication that produces somewhat larger rounding errors. That would explain the slight differences in the other fields between my version and the function call, but still, the 1 seems weird.
Here's the function I'm using to compare:
func cloneFlattened(entity: Entity, withChildren recursive: Bool) -> Entity {
    let clone = entity.clone(recursive: recursive)

    // Collect the transform of the entity and of every ancestor up to the root.
    var transforms = [entity.transform.matrix]
    var parent: Entity? = entity.parent
    var rootEntity: Entity = entity
    while parent != nil {
        rootEntity = parent!
        transforms.append(parent!.transform.matrix)
        parent = parent!.parent
    }

    if transforms.count > 1 {
        // Multiply root-to-leaf, starting from the identity matrix.
        clone.transform.matrix = transforms.reversed().reduce(simd_diagonal_matrix(simd_float4(repeating: 1)), *)
        print("QWE CLONE FLATTENED: \(clone.transform.matrix)")
        print("QWE CLONE RELATIVE : \(entity.transformMatrix(relativeTo: rootEntity))")
    } else {
        print("QWE CLONE SINGLE   : \(clone.transform.matrix)")
    }
    return clone
}
Sometimes the last element is not 1:
QWE CLONE FLATTENED: [
[0.00042261832, 0.0009063079, 0.0, 0.0],
[-0.0009063079, 0.00042261832, 0.0, 0.0],
[0.0, 0.0, 0.0010000002, 0.0],
[-0.0013045187, -0.009559666, -0.04027118, 1.0]
]
QWE CLONE RELATIVE : [
[0.00042261826, 0.0009063076, -4.681872e-12, 0.0],
[-0.0009063076, 0.00042261826, 3.580335e-12, 0.0],
[3.4256328e-12, 1.8047965e-13, 0.0009999998, 0.0],
[-0.0013045263, -0.009559661, -0.040271178, 0.9999997]
]
Sometimes it is:
QWE CLONE FLATTENED: [
[0.0009980977, -6.1623556e-05, -1.7382005e-06, 0.0],
[-6.136851e-05, -0.0009958588, 6.707259e-05, 0.0],
[-5.8642554e-06, -6.683835e-05, -0.0009977464, 0.0],
[-1.761913e-06, -0.002, 0.0, 1.0]
]
QWE CLONE RELATIVE : [
[0.0009980979, -6.1623556e-05, -1.7382023e-06, 0.0],
[-6.136855e-05, -0.0009958589, 6.707254e-05, 0.0],
[-5.864262e-06, -6.6838256e-05, -0.0009977465, 0.0],
[-1.758337e-06, -0.0019999966, -3.7252903e-09, 1.0]
]
The 0s in the last row seem to be stable.
It happens both for entities that are a few levels deep and for those that have only an anchor as a parent.
So far I've never seen a value that would not be "technically a 1", but my hierarchies are not very deep, and it makes me wonder whether this rounding could get worse.
Or is it just me doing something stupid? :)
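For what it's worth, a small helper sketch for comparing the two matrices with an explicit tolerance instead of expecting exact 0s and 1s (the tolerance value is arbitrary):

import simd

func almostEqual(_ a: float4x4, _ b: float4x4, tolerance: Float = 1e-5) -> Bool {
    // Compare column by column; abs() is element-wise on SIMD4 and
    // max() reduces to the largest component.
    for c in 0..<4 {
        if abs(a[c] - b[c]).max() > tolerance { return false }
    }
    return true
}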
LogUnrealMathTest: FAILED: VectorCos: Ref vs Vec
LogUnrealMathTest: Bad(0.000000): (0.707107 0.500000 0.342019 0.173648) (0.000000 0.000000 0.000000 0.000000)
FAILED: VectorCos: Ref vs Vec
LogUnrealMathTest: Bad(0.000000): (-0.707107 -0.500000 -0.342021 -0.173648) (-0.000000 -0.000000 -0.000000 -0.000000)
LogMac: Error: appError called: Fatal error: [File:/Users/enginej3/Desktop/UE4/Engine/Source/Runtime/Core/Private/Tests/Math/UnrealMathTest.cpp] [Line: 1652]
VectorIntrinsics Failed.
This error occurs at runtime after a successful Xcode build, and it causes the Unreal Editor to crash.
I'm trying to understand the view matrix.
The relevant part of the original template code is below:
private func updateGameState() {
    /// Update any game state before rendering
    uniforms[0].projectionMatrix = projectionMatrix
    let rotationAxis = SIMD3<Float>(1, 1, 0)
    let modelMatrix = matrix4x4_rotation(radians: rotation, axis: rotationAxis)
    let viewMatrix = matrix4x4_translation(0.0, 0.0, -8.0)
    uniforms[0].modelViewMatrix = simd_mul(viewMatrix, modelMatrix)
    rotation += 0.01
}
If the view matrix is initialized with x = -0.5, as in let viewMatrix = matrix4x4_translation(-0.5, 0.0, -8.0), the cube in the MetalView moves left.
I think it should move to the right-hand side, because the view matrix represents the camera position. Am I wrong?
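For what it's worth, the usual convention in these Metal templates is that the view matrix is the inverse of the camera's world transform. Under that convention, matrix4x4_translation(-0.5, 0.0, -8.0) used directly as the view matrix corresponds to a camera sitting at (0.5, 0.0, 8.0), which is why the cube appears to move left. A sketch of the relationship (translation-only camera assumed):

import simd

func viewMatrix(forCameraAt position: SIMD3<Float>) -> float4x4 {
    // The inverse of a pure translation is translation by the negated vector.
    var m = matrix_identity_float4x4
    m.columns.3 = SIMD4<Float>(-position.x, -position.y, -position.z, 1)
    return m
}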
I'm trying to load up a virtual skybox, different from the built-in default, for a simple macOS rendering of RealityKit content.
I was following the details at https://developer.apple.com/documentation/realitykit/environmentresource and created a folder called "light.skybox" with a single file in it ("prairie.hdr"). Then I try to load that and set it as the environment on the ARView when it's created:
let ar = ARView(frame: .zero)
do {
    let resource = try EnvironmentResource.load(named: "prairie")
    ar.environment.lighting.resource = resource
} catch {
    print("Unable to load resource: \(error)")
}
The loading always fails when I launch the sample app, reporting "Unable to load resource ...", and when I look in the app bundle, the resource included there as Contents/Resources/light.realityenv is an entirely different size, appearing to be the default lighting.
I've tried explicitly adding the "light.skybox" folder to the app target, but I don't see it get embedded whether I toggle target membership that way or leave the default.
Is there anything I need to do to get Xcode to process and include the lighting I'm providing?
(This is inspired from https://stackoverflow.com/questions/77332150/realitykit-how-to-disable-default-lighting-in-nonar-arview, which shows an example for UIKit)
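A quick diagnostic sketch that might help narrow it down: enumerate what actually got compiled into the bundle.

import Foundation

// List every compiled environment resource in the app bundle.
let urls = Bundle.main.urls(forResourcesWithExtension: "realityenv", subdirectory: nil) ?? []
for url in urls {
    print("bundled environment:", url.lastPathComponent)
}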
I'm using an iPhone 14 Pro Max on iOS 18.0 beta 5.
My AirPods are on the latest firmware.
When I play Arena Breakout (暗区突围), my teammates say my headset produces a loud crackling/buzzing noise. Sometimes it's absent, and sometimes it suddenly appears.
When I use the phone's microphone instead, the problem doesn't occur.
The AirPods are not within the recall program's scope, but they are still under warranty.
I have a 3x3 matrix which I need to apply to a UIImage and save the result in the Documents folder. I successfully converted the 3x3 matrix (represented as [[Double]]) to a CATransform3D, and then racked my brain trying to figure out how to apply it to a UIImage.
The only property where I can apply it is a UIView's (or UIImageView's, when working with a UIImage) transform property. But that has nothing to do with the UIImage itself: I can't save a UIImage from the transformed UIImageView with all the transformations applied.
And the Core Graphics methods (like concatenate for CGContext) only work with affine transformations, which don't suit my case.
Please give me a hint about which direction I should look in.
Does Apple provide native methods, or do I have to use third-party frameworks for this functionality?
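One direction that may help (a sketch, not a confirmed native path): a 3x3 homography is fully determined by where it sends four points, so you can map the image corners through your matrix and hand them to Core Image's CIPerspectiveTransform, which accepts non-affine warps. Here h is assumed to be the row-major [[Double]] matrix:

import UIKit
import CoreImage.CIFilterBuiltins

func apply(homography h: [[Double]], to image: UIImage) -> UIImage? {
    guard let input = CIImage(image: image) else { return nil }

    // Project a point through the 3x3 matrix (divide by the w row).
    func map(_ p: CGPoint) -> CGPoint {
        let x = h[0][0] * Double(p.x) + h[0][1] * Double(p.y) + h[0][2]
        let y = h[1][0] * Double(p.x) + h[1][1] * Double(p.y) + h[1][2]
        let w = h[2][0] * Double(p.x) + h[2][1] * Double(p.y) + h[2][2]
        return CGPoint(x: x / w, y: y / w)
    }

    let r = input.extent
    let filter = CIFilter.perspectiveTransform()
    filter.inputImage = input
    filter.topLeft = map(CGPoint(x: r.minX, y: r.maxY))
    filter.topRight = map(CGPoint(x: r.maxX, y: r.maxY))
    filter.bottomLeft = map(CGPoint(x: r.minX, y: r.minY))
    filter.bottomRight = map(CGPoint(x: r.maxX, y: r.minY))

    guard let output = filter.outputImage,
          let cg = CIContext().createCGImage(output, from: output.extent)
    else { return nil }
    return UIImage(cgImage: cg)
}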
Hello, I'm writing to report an issue (or a documentation error).
I am using the entity-component architecture provided by the GameplayKit framework. Additionally, I want to take advantage of the user interface provided by the Scene Editor; this is essential for me if I want to involve more people in the project.
The issue occurs when linking the user interface data with the GKScene of the aforementioned framework.
The first issue arises when adding a component through the interface, as shown in the attached image. Then, at that moment:

if let scene = GKScene(fileNamed: "GameScene") {
    // Get the SKScene from the loaded GKScene
    if let sceneNode = scene.rootNode as? GameScene {

scene.rootNode is nil, and the scene is not presented.
However, I can work around this issue by initializing the scene separately:

if let scene = GKScene(fileNamed: "GameScene") {
    // Load the SKScene separately
    if let sceneNode = SKScene(fileNamed: "GameScene") as? GameScene {
But from here, two issues arise.

First, the node contains a component, yet with the scene loaded separately, accessing a specific entity through its SKSpriteNode fails:

self.node?.entity // is nil

So it becomes very difficult to access a specific entity, even though adding a component through the editor automatically creates one, as the screenshot demonstrates.

Second, the only remaining way to access this entity is by index, and since there is only one, that's easy:

sceneNode.entities[0]

But even so, it's not very useful, because when I try to access its components, it turns out they don't exist.

I just wanted to mention this because it would be very helpful for me if this issue could be resolved.
Thank you very much in advance.
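For comparison, Apple's SpriteKit game template wires the GKScene and the scene node together roughly like this (from memory; entities and graphs are properties the template declares on GameScene itself, not SKScene API):

import GameplayKit
import SpriteKit

if let scene = GKScene(fileNamed: "GameScene"),
   let sceneNode = scene.rootNode as? GameScene {
    // Copy the GKScene's entities and graphs onto the SKScene so the
    // components remain reachable after the scene is presented.
    sceneNode.entities = scene.entities
    sceneNode.graphs = scene.graphs
    // ...then present sceneNode in the SKView as usual.
}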
Hello Guys,
I'm looking for a way to look at a 3D object around me, pinch it from a distance (with eyes and two fingers, not by touching it), and display a little information on top of it (like a short text).
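For context, the kind of setup I imagine this needs, as a hedged sketch (the component choices and label text are assumptions, not a confirmed solution):

import SwiftUI
import RealityKit

struct PinchInfoView: View {
    var body: some View {
        RealityView { content in
            let model = ModelEntity(mesh: .generateBox(size: 0.2))
            // Both components are required for look-and-pinch targeting.
            model.components.set(InputTargetComponent())
            model.components.set(CollisionComponent(shapes: [.generateBox(size: [0.2, 0.2, 0.2])]))
            content.add(model)
        }
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // Show a small text label above the pinched entity.
                    let text = ModelEntity(mesh: .generateText("Info",
                        extrusionDepth: 0.001, font: .systemFont(ofSize: 0.05)))
                    text.position.y = 0.15
                    value.entity.addChild(text)
                }
        )
    }
}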
If you have the solution you will make me happy (and a bit less stupid too :) )
Cheers
Mathis
Using Reality Composer Pro 2.0, I created a simple shader graph that displays a texture on an unlit surface:
On visionOS 2 beta, I can successfully use ShaderGraphMaterial(named:from:in:) to load that shader graph material and assign it to a model entity.
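For concreteness, the call looks roughly like this (the scene file and material path are placeholders standing in for my project's names):

import RealityKit
import RealityKitContent

let material = try await ShaderGraphMaterial(
    named: "/Root/UnlitTextureMaterial",  // placeholder material path
    from: "Scene.usda",                   // placeholder scene file
    in: realityKitContentBundle
)
modelEntity.model?.materials = [material]  // modelEntity: the target ModelEntity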
However, on visionOS 1.2 and earlier, either in Simulator or on the device, ShaderGraphMaterial(named:from:in:) fails and I see the following logged to the console:
If, using Reality Composer Pro 1.0, I experimentally open the same project and delete and recreate exactly the same nodes above, then ShaderGraphMaterial(named:from:in:) works as expected on visionOS 1.2.
Is it a known issue that Reality Composer Pro 2 can't be used with visionOS 1?
Is this intentional behavior?
I've submitted feedback as FB14828873, including a sample project and repro steps.
If possible, I would appreciate guidance from an Apple engineer, like "This is a known issue for [list of node types]" or "Reality Composer Pro 2 is not supported for visionOS 1 development, please refer to [documentation]" or "We recommend [workaround]."
Thank you.
Hello,
I would like to know if anyone has used, or is still using, the GKVoiceChat capabilities in their apps. I wanted to use it for my online game, but I am running into issues with it and wondering if there are alternatives. The documentation mentions using SharePlay instead, but that won't be possible with random online players. Any help will be appreciated!
Hello,
I'm playing around with making a fully immersive multiplayer, air-to-air dogfighting game, but I'm having trouble figuring out how to attach a camera to an entity.
I have a plane that's controlled with a gamepad, and I want the camera's position to be pinned to that entity as it moves about space, while maintaining the user's ability to look around.
Is this possible?
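On platforms that have ARView (so not on Vision Pro), the camera-pinned-to-an-entity part would look something like this sketch, where planeEntity stands in for the gamepad-controlled plane:

import RealityKit

// Parent a camera to the plane so it inherits the plane's motion.
// In a visionOS immersive space the camera is always the device itself,
// so this approach doesn't carry over there.
let camera = PerspectiveCamera()
camera.position = [0, 0.5, 2]  // made-up offset behind and above the plane
planeEntity.addChild(camera)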
--
From my understanding, the current state of SceneKit, ARKit, and RealityKit is a bit confusing as to what can and cannot be done.

SceneKit
- Full control of the camera
- Not sure if it can use RealityKit's ECS system
- 2D window only; missing full immersion

ARKit
- Full control of the camera, but only on non-Vision Pro devices, since visionOS doesn't have an ARView
- Has RealityKit's ECS system
- 2D window only; missing full immersion

RealityKit
- Camera is pinned to the device's position and orientation
- Has RealityKit's ECS system
- Allows full immersion
I have tested my application on iOS 15, 16, and 17. In those versions, VisionKit reads values in the horizontal direction. Once I updated my device to the iOS 18.0 beta, the values were read in the vertical direction.
The build was generated with Xcode 13.4.1.
Team, please help me understand why this happens and whether anything needs to change at the code level.
Hello, I'm trying to invite a friend to play my app; however, when the friend presses the invite link component in iMessage, it shows "Retrieving" and then disappears, and nothing happens.
It doesn't redirect to my app. What am I missing or doing wrong? I can share some of my code:
import Foundation
import GameKit

extension RealTimeGame: GKLocalPlayerListener {
    /// Handles when the local player sends requests to start a match with other players.
    func player(_ player: GKPlayer, didRequestMatchWithRecipients recipientPlayers: [GKPlayer]) {
        print("\n\nSending invites to other players.")
    }

    /// Presents the matchmaker interface when the local player accepts an invitation from another player.
    func player(_ player: GKPlayer, didAccept invite: GKInvite) {
        // Present the matchmaker view controller in the invitation state.
        if let viewController = GKMatchmakerViewController(invite: invite) {
            viewController.matchmakerDelegate = self
            rootViewController?.present(viewController, animated: true) { }
        }
    }
}
Also, I don't have "<key>CFBundleURLTypes</key>" in my Info.plist; I don't know whether I need that or not...
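One thing worth double-checking (a sketch of the setup invites depend on; rootViewController is a placeholder): the local player must be authenticated and the listener registered, otherwise player(_:didAccept:) is never called.

import GameKit

// Run early in the app's lifecycle, e.g. inside the RealTimeGame class:
GKLocalPlayer.local.authenticateHandler = { viewController, error in
    if let viewController {
        // Show Game Center's sign-in UI if it asks for one.
        rootViewController?.present(viewController, animated: true)
        return
    }
    guard GKLocalPlayer.local.isAuthenticated else { return }
    // Without this registration, invite callbacks are never delivered.
    GKLocalPlayer.local.register(self)
}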
I'm building an iOS/iPadOS app for iOS 18+ using the new RealityView in SwiftUI. (I may add visionOS, but I'm not focusing on it right now.) The 3D scene I'm rendering is fairly simple (just a few dozen vertices and a couple of textures), and I'd like to render it at 120 fps on ProMotion devices if possible. I tried setting CADisableMinimumFrameDurationOnPhone to true in the Info.plist, but it had no effect. The frame rate in the GPU report in Xcode stays capped at 60 fps, and the gauge even tops out at 60.
My question is kind of the opposite of this post, which asks how to limit the frame rate of a RealityView.
I'm on Xcode 16 beta 5 on macOS Sonoma and iOS 18.0 beta 6 on my iPhone 15 Pro.
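One quick sanity check, as a sketch: confirm the display itself reports 120 Hz at runtime (this can drop under Low Power Mode, for instance).

import UIKit

print("maximumFramesPerSecond:", UIScreen.main.maximumFramesPerSecond)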
Why is the GameController framework loaded?
I am checking the launch time of our app using Instruments -> App Launch, and I was confused to find the GameController framework loaded.
I checked the project: in the plist file there is no GCSupportedGameControllers or GCSupportsControllerUserInteraction key configured.
What else causes the GameController framework to load?