Hi team,
We have an iOS app. Since July 15, 2022, our users have been hitting a crash caused by an invalid memory access. That is the date Apple officially released the iOS 16 beta. After September 12, when Apple released iOS 16 officially, the crash count started to increase drastically.
Crash backtrace can be seen as follows.
Thread 14 Crashed:
0 libsystem_platform.dylib 0x00000001f8810930 _platform_memmove + 96
1 CoreGraphics 0x00000001adb64104 CGDataProviderCreateWithCopyOfData + 20
2 CoreGraphics 0x00000001adb4cdb4 CGBitmapContextCreateImage + 172
3 VisionKitCore 0x00000001ed813f10 -[VKCRemoveBackgroundResult _createCGImageFromBGRAPixelBuffer:cropRect:] + 348
4 VisionKitCore 0x00000001ed813cc0 -[VKCRemoveBackgroundResult createCGImage] + 156
5 VisionKitCore 0x00000001ed8ab6f8 __vk_cgImageRemoveBackgroundWithDownsizing_block_invoke + 64
6 VisionKitCore 0x00000001ed881474 __63-[VKCRemoveBackgroundRequestHandler performRequest:completion:]_block_invoke.5 + 436
7 MediaAnalysisServices 0x00000001eec58968 __92-[MADService performRequests:onPixelBuffer:withOrientation:andIdentifier:completionHandler:]_block_invoke.38 + 400
8 CoreFoundation 0x00000001abff0a14 __invoking___ + 148
9 CoreFoundation 0x00000001abf9cf2c -[NSInvocation invoke] + 428
10 Foundation 0x00000001a6464d38 __NSXPCCONNECTION_IS_CALLING_OUT_TO_REPLY_BLOCK__ + 16
11 Foundation 0x00000001a64362fc -[NSXPCConnection _decodeAndInvokeReplyBlockWithEvent:sequence:replyInfo:] + 520
12 Foundation 0x00000001a6a10f44 __88-[NSXPCConnection _sendInvocation:orArguments:count:methodSignature:selector:withProxy:]_block_invoke_5 + 188
13 libxpc.dylib 0x00000001f89053e4 _xpc_connection_reply_callout + 124
14 libxpc.dylib 0x00000001f88f8580 _xpc_connection_call_reply_async + 88
15 libdispatch.dylib 0x00000001b340205c _dispatch_client_callout3 + 20
16 libdispatch.dylib 0x00000001b341ff58 _dispatch_mach_msg_async_reply_invoke + 344
17 libdispatch.dylib 0x00000001b340956c _dispatch_lane_serial_drain + 376
18 libdispatch.dylib 0x00000001b340a214 _dispatch_lane_invoke + 436
19 libdispatch.dylib 0x00000001b3414e10 _dispatch_workloop_worker_thread + 652
20 libsystem_pthread.dylib 0x00000001f88a4df8 _pthread_wqthread + 288
21 libsystem_pthread.dylib 0x00000001f88a4b98 start_wqthread + 8
Last but not least: the users who hit this crash are all on iOS 16+, so we believe it is related to the iOS 16 SDK. We would appreciate any clues on how to fix this kind of crash.
Hi,
I want to begin by saying thank you Apple for making the Spatial framework! Please add a million more features ;-)
I'm using the following code to make an object "look at" another point, but at a particular rotation the object "flips" its rotation.
See a video here: https://www.dropbox.com/s/5irxt0gxou4c2j6/QuaternionFlip.mov?dl=0
I shake the mouse cursor when it happens to make it obvious to you.
import Spatial
let lookAtRotation = Rotation3D(eye: Point3D(position), target: Point3D(x: 0, y: 0, z: 0), up: Vector3D(x: 0, y: 1, z: 0))
myObj.quaternion = lookAtRotation.quaternion
So my question is why is this happening, and how can I fix it?
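One likely explanation: unit quaternions double-cover rotations, so q and -q encode the same orientation, and a look-at construction is free to return either one; when it switches hemispheres between frames, anything interpolating or animating the value appears to "flip". A common mitigation is to compare each new quaternion against the previous frame's and negate it when the dot product is negative. A minimal, framework-free sketch of that check (the `Quat` type and `continuous` name are mine, standing in for `simd_quatd` / `Rotation3D.quaternion`):

```swift
// Minimal quaternion as (x, y, z, w); stands in for simd_quatd.
struct Quat {
    var x, y, z, w: Double

    // The dot product tells us whether two unit quaternions lie in the
    // same hemisphere of the double cover.
    func dot(_ other: Quat) -> Double {
        x * other.x + y * other.y + z * other.z + w * other.w
    }

    // q and -q encode the same rotation.
    var negated: Quat { Quat(x: -x, y: -y, z: -z, w: -w) }
}

// Before assigning a freshly computed look-at quaternion, flip its sign if
// it landed in the opposite hemisphere from last frame's value. The rotation
// is unchanged, but frame-to-frame continuity is preserved.
func continuous(_ new: Quat, relativeTo previous: Quat) -> Quat {
    new.dot(previous) < 0 ? new.negated : new
}
```

In the Spatial code above, you would keep the previous frame's lookAtRotation.quaternion around and run each new value through a check like this before assigning it to myObj.quaternion.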
thx
I'm trying to create an Apple Watch game using Xcode 14.2 and watchOS 9. Getting started creating a watchOS app seems pretty straightforward, and getting started creating a game project via the starter template seems easy enough. Putting the two together, though, doesn't seem to work (or is not straightforward). The documentation specifies limitations on which libraries can be used with watchOS, noting WKInterfaceSKScene, but doesn't give any specific examples of how to start a watchOS project using it. Additionally, nearly every online tutorial I'm able to find uses Storyboards to create a watchOS game, which does not seem to be supported in the latest version of watchOS or Xcode. Can anyone provide example starter code, using the watchOS App project starter, that loads a small colored square on the screen and moves it from left to right using WKInterfaceSKScene?
I've tried the Apple documentation, asking ChatGPT for a sample or reference links, and various tutorials on YouTube and elsewhere.
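Not an official template, but one route that may sidestep the storyboard-only WKInterfaceSKScene path entirely: SwiftUI's SpriteView (available on watchOS 7+) can host an SKScene directly. A minimal sketch of a square sliding left to right, assuming the stock SwiftUI watchOS App template (the `SquareScene` name and sizes are mine):

```swift
import SwiftUI
import SpriteKit

// A tiny SKScene that slides a colored square from left to right, forever.
class SquareScene: SKScene {
    override func didMove(to view: SKView) {
        backgroundColor = .black
        let square = SKSpriteNode(color: .red, size: CGSize(width: 20, height: 20))
        square.position = CGPoint(x: 0, y: size.height / 2)
        addChild(square)
        // Slide across, snap back, repeat.
        let slide = SKAction.sequence([
            SKAction.moveTo(x: size.width, duration: 2),
            SKAction.moveTo(x: 0, duration: 0),
        ])
        square.run(SKAction.repeatForever(slide))
    }
}

// SwiftUI entry point; SpriteView hosts the scene without any storyboard.
struct ContentView: View {
    var scene: SKScene {
        let scene = SquareScene()
        scene.size = CGSize(width: 180, height: 180)
        scene.scaleMode = .resizeFill
        return scene
    }

    var body: some View {
        SpriteView(scene: scene)
            .ignoresSafeArea()
    }
}
```

This `ContentView` drops into the `WindowGroup` the watchOS App template generates; I have not verified it on every watch size.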
We are using the Apple Unity plugin (Gamekit) to authorize players, using a Game Center account. To get player info we run a task from the plugin,
var fetchItemsResponse = await GKLocalPlayer.Local.FetchItems();
But when this code runs, Xcode reports the following error:
Thread 1: EXC_BAD_ACCESS (code=257, address=0x2)
macOS 12.5 Monterey
Unity 2021.3.4f1
Xcode 14.2
AppleCore Unity Package - 1.0.2
AppleGameKit Unity Package - 1.0.3
Crashes when calling FetchItems in Unity
Installed in iPhone XR (iOS 15.2.1)
Hello,
How do we support behaviors in our custom parameters in FxPlug?
A simple example would be recreating the float parameter.
Once we have done that we would like to support multi-dimensional vectors of floats.
Thanks,
Nikki
Hello everyone! After some time thinking about which graphics API to proceed with, I figured OpenGL will be my first, since I'm completely new to graphics programming. As you may find in my last post, I was considering MoltenVK and might just use Metal instead, along with the Metal demos I found. I know it has been said many times that Apple deprecated OpenGL, but I still wish to use it because I'm new to graphics programming and want to develop an app (a rendering engine, really) for the iPhone 14 Pro Max and macOS Ventura 13.2 (I think this is the latest). So what do you guys think? Can I still use OpenGL ES on the 14 Pro Max, along with OpenGL 4+ on the latest macOS, even though they are deprecated?
It seems that something must have changed in Reality Composer's USDZ export feature. Importing any animated .usdz file into Reality Composer and then exporting it reduces the playback frame rate to about 30%. The same file imported and then exported as a .reality file plays back just fine.
Is anyone else experiencing this issue? It's happening for every .usdz file imported, and across two different Apple laptops running the software.
Hello,
In one of my apps, I'm trying to modify the pixel buffer from a ProRAW capture to then write the modified DNG.
This is what I try to do:
After capturing a ProRAW photo, I work in the delegate function func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) { ... }
In here I can access the photo.pixelBuffer and get its base address:
guard let buffer = photo.pixelBuffer else { return }
CVPixelBufferLockBaseAddress(buffer, [])
let pixelFormat = CVPixelBufferGetPixelFormatType(buffer)
// I check that the pixel format corresponds with ProRAW . This is successful, the code enters the if block
if (pixelFormat == kCVPixelFormatType_64RGBALE) {
guard let pointer = CVPixelBufferGetBaseAddress(buffer) else { return }
// We have 16bits per component, 4 components
let count = CVPixelBufferGetWidth(buffer) * CVPixelBufferGetHeight(buffer) * 4
let mutable = pointer.bindMemory(to: UInt16.self, capacity: count)
// As a test, I want to replace all pixels with 65000 to get a white image
let finalBufferArray : [Float] = Array.init(repeating: 65000, count: count)
vDSP_vfixu16(finalBufferArray, 1, mutable, 1, vDSP_Length(finalBufferArray.count))
// I create an vImage Pixel buffer. Note that I'm referencing the photo.pixelBuffer to be sure that I modified the underlying pixelBuffer of the AVCapturePhoto object
let imageBuffer = vImage.PixelBuffer<vImage.Interleaved16Ux4>(referencing: photo.pixelBuffer!, planeIndex: 0)
// Inspect the CGImage
let cgImageFormat = vImage_CGImageFormat(bitsPerComponent: 16, bitsPerPixel: 64, colorSpace: CGColorSpace(name: CGColorSpace.displayP3)!, bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.last.rawValue | CGBitmapInfo.byteOrder16Little.rawValue))!
let cgImage = imageBuffer.makeCGImage(cgImageFormat: cgImageFormat)!
// I send the CGImage to the main view controller. This is successful, I can see a white image when rendering the CGImage into a UIImage. This lets me think that I successfully modified the photo.pixelBuffer
firingFrameDelegate?.didSendCGImage(image: cgImage)
}
// Now I try to write data. Unfortunately, this does not work. The photo.fileDataRepresentation() writes the data corresponding to the original, unmodified pixelBuffer
if let photoData = photo.fileDataRepresentation() {
// Sending the data to the view controller and rendering it in the UIImage displays the original photo, not the modified pixelBuffer
firingFrameDelegate?.didSendData(data: photoData)
thisPhotoData = photoData
}
CVPixelBufferUnlockBaseAddress(buffer, [])
The same happens if I try to write the data to disk. The DNG file displays the original photo and not the data corresponding to the modified photo.pixelBuffer.
Do you know why this code should not work? Do you have any ideas on how I can modify the ProRAW pixel buffer so that I can write the modified buffer into a DNG file?
My goal is to write a modified file, so I'm not sure I can use Core Image or vImage to output a ProRAW file.
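For what it's worth, fileDataRepresentation() appears to be generated from the photo's original captured data, so in-place edits to photo.pixelBuffer would not be picked up. AVFoundation exposes a hook that looks designed for exactly this case: AVCapturePhotoFileDataRepresentationCustomizer, whose replacementPhotoPixelBuffer(for:) lets you hand back a substitute buffer when the file data is serialized. A sketch, untested, assuming `modifiedBuffer` holds your edited pixels (the class name is mine; double-check the retain semantics of the returned Unmanaged value against the docs):

```swift
import AVFoundation

// Sketch: supply a replacement pixel buffer at serialization time.
class ModifiedBufferCustomizer: NSObject, AVCapturePhotoFileDataRepresentationCustomizer {
    let modifiedBuffer: CVPixelBuffer

    init(modifiedBuffer: CVPixelBuffer) {
        self.modifiedBuffer = modifiedBuffer
        super.init()
    }

    func replacementPhotoPixelBuffer(for photo: AVCapturePhoto) -> Unmanaged<CVPixelBuffer>? {
        // Return the edited buffer instead of the captured one; verify the
        // expected ownership convention for the Unmanaged return in the docs.
        return Unmanaged.passUnretained(modifiedBuffer)
    }
}
```

You would then call photo.fileDataRepresentation(with: ModifiedBufferCustomizer(modifiedBuffer: buffer)) instead of the plain fileDataRepresentation(), and write that data to the DNG.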
Hello community. This post is to express my complete dissatisfaction with Apple, especially regarding developing games for Apple's platforms. There is a lack of support for it, for example for some new gaming technologies, and still there is no profit or return on all the work and money invested to develop for it. I will close my journey with Apple very unsatisfied. I'm going to give my business's opportunities to other platforms that are really worth it and that support all the new gaming technologies. And yes, Apple hurt other game makers with new services like Arcade, and there seems to be no future for gaming on Apple's platforms. I quit. Goodbye and good luck to everyone.
I am running the RoomPlan Demo app and keep getting the above error and when I try to find someplace to get the archive in the Metal Libraries my searches come up blank. There are no files that show up in a search that contain such identifiers. A number of messages are displayed about "deprecated" interfaces also. Is it normal to send out demo apps that are hobbled in this way?
I'm trying to resize NSImages on macOS. I'm doing so with an extension like this.
extension NSImage {
// MARK: Resizing
/// Resize the image to the given size.
///
/// - Parameter size: The size to resize the image to.
/// - Returns: The resized image.
func resized(toSize targetSize: NSSize) -> NSImage? {
let frame = NSRect(x: 0, y: 0, width: targetSize.width, height: targetSize.height)
guard let representation = self.bestRepresentation(for: frame, context: nil, hints: nil) else {
return nil
}
let image = NSImage(size: targetSize, flipped: false, drawingHandler: { (_) -> Bool in
return representation.draw(in: frame)
})
return image
}
}
The problem is, as far as I can tell, the image that comes out of the drawing handler has lost the color profile of the original image rep. I'm testing it with a wide-color-gamut image, attached.
It becomes pure red when examining the resulting image.
If this were UIKit, I guess I'd use UIGraphicsImageRenderer and select the right UIGraphicsImageRendererFormat.Range, so I suspect I need to use NSGraphicsContext here to do the rendering, but I can't see what I would set on it to make it use wide color, or how I'd use it.
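One route that may be worth trying: draw into an NSBitmapImageRep that is explicitly tagged with a wide-gamut color space, using NSGraphicsContext(bitmapImageRep:), then wrap the rep in a new NSImage. A sketch under those assumptions (the `resizedWide(toSize:)` name, Display P3 choice, and 16-bit depth are mine, not from the original code):

```swift
import AppKit

extension NSImage {
    /// Resize by drawing into a bitmap rep explicitly tagged with a
    /// wide-gamut color space (Display P3, 16 bits per channel here).
    func resizedWide(toSize targetSize: NSSize) -> NSImage? {
        let frame = NSRect(origin: .zero, size: targetSize)
        guard let rep = NSBitmapImageRep(
            bitmapDataPlanes: nil,
            pixelsWide: Int(targetSize.width),
            pixelsHigh: Int(targetSize.height),
            bitsPerSample: 16,          // deep storage to preserve wide color
            samplesPerPixel: 4,
            hasAlpha: true,
            isPlanar: false,
            colorSpaceName: .calibratedRGB,
            bytesPerRow: 0,
            bitsPerPixel: 0
        ), let p3Rep = rep.retagging(with: .displayP3) else { return nil }

        // Draw the receiver into a context backed by the P3-tagged rep.
        NSGraphicsContext.saveGraphicsState()
        NSGraphicsContext.current = NSGraphicsContext(bitmapImageRep: p3Rep)
        draw(in: frame, from: .zero, operation: .copy, fraction: 1.0)
        NSGraphicsContext.restoreGraphicsState()

        let image = NSImage(size: targetSize)
        image.addRepresentation(p3Rep)
        return image
    }
}
```

I have not verified this preserves every profile exactly; the idea is just that controlling the rep's color space yourself removes the guesswork the drawingHandler initializer does on your behalf.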
We are using AVAssetWriter to write videos using both HEVC and H.264 encoding. Occasionally, we get reports of choppy footage in which frames appear out of order when played back on a Mac (QuickTime) or iOS device (stock Photos app). This occurs extremely unpredictably, often not starting until 20+ minutes of filming, but occasionally happening as soon as filming starts.
Interestingly, users have reported the issue goes away while editing or viewing on a different platform (e.g. Linux) or in the built-in Google Drive player, but comes back as soon as the video is exported or downloaded again. When this occurs in an HEVC file, converting to H.264 seems to resolve it. I haven't found a similar fix for H.264 files.
I suspect an AVAssetWriter encoding issue but haven't been able to uncover the source.
Running a stream analyzer on HEVC files with this issue reveals the following error:
Short-term reference picture with POC = [some number] seems to have been removed or not correctly decoded.
However, running a stream analyzer on H.264 files with the same playback issue seems to show nothing wrong.
At a high level, our video pipeline looks something like this:
Grab a sample buffer in captureOutput(_ captureOutput: AVCaptureOutput!, didOutputVideoSampleBuffer sampleBuffer: CMSampleBuffer!)
Perform some Metal rendering on that buffer
Pass the resulting CVPixelBuffer to the AVAssetWriterInputPixelBufferAdaptor associated with our AVAssetWriter
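Frames appearing out of order often points at non-monotonic presentation timestamps reaching the writer, so before chasing the encoder it may be worth asserting that invariant at step 3. A framework-free sketch of the check, using plain seconds instead of CMTime (the function name is mine):

```swift
// Returns the indices at which a presentation timestamp fails to be
// strictly greater than its predecessor, i.e. frames that would be
// appended out of order to an AVAssetWriterInputPixelBufferAdaptor.
func nonMonotonicIndices(_ presentationTimes: [Double]) -> [Int] {
    guard presentationTimes.count > 1 else { return [] }
    var bad: [Int] = []
    for i in 1..<presentationTimes.count
    where presentationTimes[i] <= presentationTimes[i - 1] {
        bad.append(i)
    }
    return bad
}
```

In the real pipeline you would compare CMSampleBufferGetPresentationTimeStamp(sampleBuffer) across calls with CMTimeCompare, and log or drop any offending frame before it reaches the adaptor.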
Example files can be found here: https://drive.google.com/drive/folders/1OjDZ3XaC-ubD5hyDiNvMQGl2NVqZbWnR?usp=sharing
This includes a video file suffering this issue, the same file fixed after converting to mp4, and a screen recording of the distorted playback in QuickTime.
Can anyone help point me in the right direction to solving this issue? I can provide more details as necessary.
Hi All!
I'm being asked to migrate an app that utilizes iCloud KVS (Key-Value Storage). Transferring such apps is a new-ish capability, and the documentation about it is sparse [1]. Honestly, the entire documentation about the new iCloud transfer functionality seems to be missing. Same with Game Center / GameKit. While the docs say that it should work, I'd like to understand the process in more detail.
Has anyone migrated an iCloud KVS app? What happens after the transfer goes through, but before the first release? Do I need to do anything special? I see that the Entitlements file has the TeamID in the Key Value store - is that fine?
<key>com.apple.developer.ubiquity-kvstore-identifier</key>
<string>$(TeamIdentifierPrefix)$(CFBundleIdentifier)</string>
Can someone please share their experience?
Thank you!
[1] https://developer.apple.com/help/app-store-connect/transfer-an-app/overview-of-app-transfer
Hi! I am currently trying to upload my iOS app to App Store Connect. Unfortunately, code signing fails with the following error: "Code object is not signed at all.", referencing a binary Metallib (created with metal-tt and an mtlp-json script). I am using Xcode's automatically managed signing and the binary metallib is located inside the "Resources" directory of a framework that I am including with "Embed and sign" in the app. Could anyone give some guidance on what I need to change to make code signing work? Thank you.
Will visionOS support SceneKit?
I'm really excited about the Object Capture APIs being moved to iOS, and the complex UI shown in the WWDC session.
I have a few unanswered questions:
Where is the sample code available from?
Are the new Object Capture APIs on iOS limited to certain devices?
Can we capture images from the front facing cameras?
After building jaxlib as per the instructions and installing jax-metal, when testing an existing model that works fine on the CPU (and on the GPU on Linux), I get the following error.
jax._src.traceback_util.UnfilteredStackTrace: jaxlib.xla_extension.XlaRuntimeError: UNKNOWN: /Users/adam/Developer/Pycharm Projects/gpy_flow_test/sparse_gps.py:66:0: error: failed to legalize operation 'mhlo.cholesky'
/Users/adam/Developer/Pycharm Projects/gpy_flow_test/sparse_gps.py:66:0: note: called from
/Users/adam/Developer/Pycharm Projects/gpy_flow_test/sparse_gps.py:66:0: note: see current operation: %406 = "mhlo.cholesky"(%405) {lower = true} : (tensor<50x50xf32>) -> tensor<50x50xf32>
I have tried to reproduce this with the following minimal example, but it works fine.
from jax import jit
import jax.numpy as jnp
import jax.random as jnr
import jax.scipy as jsp

key = jnr.PRNGKey(0)
A = jnr.normal(key, (100, 100))

def calc_cholesky_decomp(test_matrix):
    psd_test_matrix = test_matrix @ test_matrix.T
    col_decomp = jsp.linalg.cholesky(psd_test_matrix, lower=True)
    return col_decomp

calc_cholesky_decomp(A)
jitted_calc_cholesky_decomp = jit(calc_cholesky_decomp)
jitted_calc_cholesky_decomp(A)
I am unable to attach the full error message as it exceeds the restrictions placed on uploads attached to a post.
I am more than happy to try a more complex model if you have any suggestions.
Hello,
I’ve started testing the Metal Shader Converter to convert my HLSL shaders to metallib directly, and I was wondering if the option ’-frecord-sources’ was supported in any way?
Usually I’m compiling my shaders as follows (from Metal):
xcrun -sdk macosx metal -c -frecord-sources shaders/shaders.metal -o shaders/shaders.air
xcrun -sdk macosx metallib shaders/shaders.air -o shaders/shaders.metallib
The -frecord-sources option allows me to see the source when debugging and profiling a Metal frame.
Now with DXC we have a similar option, I can compile a typical HLSL shader with embedded debug symbols with:
dxc -T vs_6_0 -E VSMain shaders/triangle.hlsl -Fo shaders/triangle.dxil -Zi -O0 -Qembed_debug
The important options here are ’-Zi` and ’-Qembed_debug’, as they make sure debug symbols are embedded in the DXIL.
It seems that right now Metal Shader Converter doesn’t pass through the DXIL debug information, and I was wondering if it was possible. I’ve looked at all the options in the utility and haven’t seen anything that looked like it.
Right now, debug symbols in my shaders are a must-have, so I'll explore other routes to convert my HLSL shaders to Metal (I've been testing SPIRV-Cross to do the conversion; I haven't actually tested the debug symbols yet, I'll report back later).
Thank you for your time!
Is MTKView intentionally unavailable on visionOS or is this an issue with the current beta?