I have received a lot of crash logs, but only on iOS 16.
The crash occurs when I call:
[[PHImageManager defaultManager] requestImageDataForAsset:asset options:options resultHandler:resultHandler]
Here is the crash log:
Exception Type: NSInternalInconsistencyException
ExtraInfo:
Code Type: arm64
OS Version: iPhone OS 16.0 (20A5328h)
Hardware Model: iPhone14,3
Launch Time: 2022-07-30 18:43:25
Date/Time: 2022-07-30 18:49:17
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason:Unhandled error (NSCocoaErrorDomain, 134093) occurred during faulting and was thrown: Error Domain=NSCocoaErrorDomain Code=134093 "(null)"
Last Exception Backtrace:
0 CoreFoundation 0x00000001cf985dc4 0x1cf97c000 + 40388
1 libobjc.A.dylib 0x00000001c8ddfa68 0x1c8dc8000 + 96872
2 CoreData 0x00000001d56d2358 0x1d56cc000 + 25432
3 CoreData 0x00000001d56fa19c 0x1d56cc000 + 188828
4 CoreData 0x00000001d5755be4 0x1d56cc000 + 564196
5 CoreData 0x00000001d57b0508 0x1d56cc000 + 935176
6 PhotoLibraryServices 0x00000001df1783e0 0x1df0ed000 + 570336
7 Photos 0x00000001df8aa88c 0x1df85d000 + 317580
8 PhotoLibraryServices 0x00000001df291de0 0x1df0ed000 + 1723872
9 CoreData 0x00000001d574e518 0x1d56cc000 + 533784
10 libdispatch.dylib 0x00000001d51fc0fc 0x1d51f8000 + 16636
11 libdispatch.dylib 0x00000001d520b634 0x1d51f8000 + 79412
12 CoreData 0x00000001d574e0a0 0x1d56cc000 + 532640
13 PhotoLibraryServices 0x00000001df291d94 0x1df0ed000 + 1723796
14 PhotoLibraryServices 0x00000001df291434 0x1df0ed000 + 1721396
15 Photos 0x00000001df8a8380 0x1df85d000 + 308096
16 Photos 0x00000001df89d050 0x1df85d000 + 262224
17 Photos 0x00000001df87f62c 0x1df85d000 + 140844
18 Photos 0x00000001df87ee94 0x1df85d000 + 138900
19 Photos 0x00000001df87e594 0x1df85d000 + 136596
20 Photos 0x00000001df86b5c8 0x1df85d000 + 58824
21 Photos 0x00000001df86d938 0x1df85d000 + 67896
22 Photos 0x00000001dfa37a64 0x1df85d000 + 1944164
23 Photos 0x00000001dfa37d18 0x1df85d000 + 1944856
24 youavideo -[YouaImageManager requestImageDataForAsset:options:resultHandler:] (in youavideo) (YouaImageManager.m:0) 27
25 youavideo -[YouaAlbumTransDataController requstTransImageHandler:] (in youavideo) (YouaAlbumTransDataController.m:0) 27
26 youavideo -[YouaAlbumTransDataController requstTransWithHandler:] (in youavideo) (YouaAlbumTransDataController.m:77) 11
27 youavideo -[YouaUploadTransDataOperation startTrans] (in youavideo) (YouaUploadTransDataOperation.m:102) 19
28 Foundation 0x00000001c9e78038 0x1c9e3c000 + 245816
29 Foundation 0x00000001c9e7d704 0x1c9e3c000 + 268036
30 libdispatch.dylib 0x00000001d51fa5d4 0x1d51f8000 + 9684
31 libdispatch.dylib 0x00000001d51fc0fc 0x1d51f8000 + 16636
32 libdispatch.dylib 0x00000001d51ff58c 0x1d51f8000 + 30092
33 libdispatch.dylib 0x00000001d51febf4 0x1d51f8000 + 27636
34 libdispatch.dylib 0x00000001d520db2c 0x1d51f8000 + 88876
35 libdispatch.dylib 0x00000001d520e338 0x1d51f8000 + 90936
36 libsystem_pthread.dylib 0x00000002544b9dbc 0x2544b9000 + 3516
37 libsystem_pthread.dylib 0x00000002544b9b98 0x2544b9000 + 2968
I can't find the definition of error code 134093, and I don't know what's going wrong on iOS 16.
Would anyone have a hint as to why this could happen and how to resolve it?
Thanks very much.
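For context, this is roughly what the call looks like when moved to the newer Swift API, with the info dictionary's PHImageErrorKey checked so failures surface as NSErrors (a sketch only, using the same asset and options as above; it does not catch the NSInternalInconsistencyException, which is thrown inside Photos/CoreData during the request):
import Photos

// Sketch: same request via the non-deprecated API, surfacing any NSError the
// Photos framework reports through the info dictionary.
let options = PHImageRequestOptions()
options.isNetworkAccessAllowed = true

PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, dataUTI, orientation, info in
    if let error = info?[PHImageErrorKey] as? NSError {
        print("Image request failed: \(error)") // domain/code for logging
        return
    }
    // Use `data` here.
}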
Hi,
I'm trying to wrap my head around FxPlug. I already sell Final Cut Pro titles for a company; these titles were built in Motion.
However, they want me to move them into an app, and I'm looking for any help on how to accomplish this.
What the app should do is:
Allow users with an active subscription to our website to access the titles within FCPX, and deny access if they are not an active subscriber.
Hello, this is building off of another post in which several other posters and I had already attempted solving the issue in hacky ways.
I am using MPMusicPlayerController.applicationQueuePlayer.
My end goal here is to dynamically add items to the queue when it has ended based on my application's business logic. There is no way for me to know what these items will be when I am initially setting the queue.
I have an updated implementation that seems to cover most edge cases, except for a glaringly obvious one: if there is just one item in the queue, and the user skips the track via MPRemoteCommandCenter (e.g. from the lock screen), then it does not work.
Currently, when I receive a MPMusicPlayerControllerPlaybackStateDidChange notification, I run this block:
if player.playbackState == .paused,
   player.currentPlaybackTime == 0,
   player.indexOfNowPlayingItem == 0 {
    EndOfQueueManager.handle()
}
In the absence of a mechanism to detect the end of the queue from the framework, I would love to add the ability to add a target to MPRemoteCommand, like you can do for AVPlayer. I have tried to do exactly that, but it does not work:
MPRemoteCommandCenter.shared().nextTrackCommand.addTarget { (event) -> MPRemoteCommandHandlerStatus in
    if queue.count == 1 {
        EndOfQueueManager.handle()
    }
    return .success
}
I already have a functioning AVPlayer implementation that achieves my goal without any compromises or edge cases. I would be very disappointed if there is no way to do this with MPMusicPlayerController – being notified about the queue ending feels like a fairly rudimentary API hook.
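For completeness, the other signal I've experimented with is the now-playing-item notification, roughly like this (a sketch that reuses the player and EndOfQueueManager from above; it still misses the single-item lock-screen skip case):
// Sketch: also watch the now-playing item; when playback has stopped and there
// is no current item, treat it as the end of the queue.
player.beginGeneratingPlaybackNotifications()
let endOfQueueObserver = NotificationCenter.default.addObserver(
    forName: .MPMusicPlayerControllerNowPlayingItemDidChange,
    object: player,
    queue: .main
) { _ in
    if player.nowPlayingItem == nil, player.playbackState != .playing {
        EndOfQueueManager.handle()
    }
}
// Keep a reference to `endOfQueueObserver` for as long as the observation is needed.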
I'm creating an app that listens to another app's sound. In this use case, screen data is not needed.
But if I don't call SCStream#addStreamOutput(_, type: .screen, ...), the console shows this error:
[ERROR] _SCStream_RemoteVideoQueueOperationHandlerWithError:701 stream output NOT found. Dropping frame
Currently I'm setting SCStreamConfiguration#minimumFrameInterval to a large value (e.g. 0.1 fps) as a workaround, but it would be good if I could completely disable screen capture for best performance.
Is there any way to disable screen capture and capture only an app's audio?
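For reference, my current workaround looks roughly like this (a sketch; filter and audioOutput are my own objects, and I haven't found a supported way to drop the .screen output entirely):
import ScreenCaptureKit
import CoreMedia

// Sketch: audio-focused configuration. As far as I can tell, the video path
// can only be made cheap, not disabled.
let config = SCStreamConfiguration()
config.capturesAudio = true
config.excludesCurrentProcessAudio = true
config.width = 2   // keep the unused video frames tiny
config.height = 2
config.minimumFrameInterval = CMTime(value: 10, timescale: 1) // ~0.1 fps

let stream = SCStream(filter: filter, configuration: config, delegate: nil)
try stream.addStreamOutput(audioOutput, type: .audio, sampleHandlerQueue: .global())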
Title says it all.
I have an AVPlayerViewController in my app playing video and my largest source of errors is NSURLErrorDomain Code=-1008. The underlying error they provide is Error Domain=CoreMediaErrorDomain Code=-12884 "(null)". I couldn't find what the error implies or is caused by - osstatus.com also does not have a reference to it.
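For anyone digging into this, one way to pull a bit more detail than the top-level -1008 is the player item's error log, along these lines (a sketch; playerItem stands for my AVPlayerItem):
// Sketch: dump AVPlayerItem error log entries whenever a new one is added.
let errorLogObserver = NotificationCenter.default.addObserver(
    forName: .AVPlayerItemNewErrorLogEntry,
    object: playerItem,
    queue: .main
) { _ in
    guard let events = playerItem.errorLog()?.events else { return }
    for event in events {
        print("domain: \(event.errorDomain), code: \(event.errorStatusCode), comment: \(event.errorComment ?? "-")")
    }
}
// Keep a reference to `errorLogObserver` while the item is playing.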
Hi there,
I have a related forum thread here and a Feedback Assistant ticket open, but this issue seems different.
Sometime within the last 2-3 weeks, code related to MusicLibrary has stopped working. None of my code has changed.
For example, the below two snippets used to work fine:
for track in newTracks {
    try await MusicLibrary.shared.add(track, to: targetPlaylist)
}
try await MusicLibrary.shared.edit(targetPlaylist, items: items)
newTracks and items are both fetched using:
try await targetPlaylist.with(.tracks, preferredSource: .catalog).tracks
Using preferredSource: .catalog was a workaround used to address the issue in the aforementioned post above.
All functions that require iOS 16 are decorated with:
@available(iOS 16, *)
or in an if block:
if #available(iOS 16, *) {...
What's happening is that the following is showing up in the console:
2022-11-28 23:31:11.279648+0700 MyApp[38653:6736450] [core] Attempted to register account monitor for types client is not authorized to access: {(
"com.apple.account.iTunesStore"
)}
2022-11-28 23:31:11.279718+0700 MyApp[38653:6736450] [Default] <ICUserIdentityStoreACAccountBackend: 0x282adb520> Failed to register for account monitoring. err=Error Domain=com.apple.accounts Code=7 "(null)"
2022-11-28 23:31:11.279758+0700 MyApp[38653:6736450] [Default] ICUserIdentity - Unable to retrieve DSID for userIdentity=<ICUserIdentity 0x2806eb120: [Active Account: <unresolved>]> - error=Error Domain=com.apple.accounts Code=7 "(null)"
These errors are not caught by a do/catch block, but I assume they are related to the issue, and I believe they have to do with MyApp trying to access things that MusicKit thinks it's not supposed to. For example, if MyApp attempted to work with a playlist that it did not create, errors would be expected, but they would be thrown errors.
The thing is that I know I'm working with resources that are created by MyApp.
In fact, in trying to test this, I just tried to create a playlist with the below, and the same behavior is occurring:
@available(iOS 16, *)
func createPlaylist2(name: String, description: String) async -> MusicKit.Playlist? {
    do {
        Logger.log(.info, "Creating Playlist: \(name)")
        Logger.log(.shrug, "Does this work?")
        let newPlaylist = try await MusicLibrary.shared.createPlaylist(name: name, description: description) // <= Things stop here!
        Logger.log(.success, "New playlist created: \(newPlaylist)") // <= this isn't logged.
        return newPlaylist // <= nothing is returned
    } catch {
        Logger.log(.error, "Could not create new playlist: \(error)") // <= no error logged.
    }
    return nil
}
The result is:
2022-11-29 00:15:01.875064+0700 MyApp[38794:6760471] [core] Attempted to register account monitor for types client is not authorized to access: {(
"com.apple.account.iTunesStore"
)}
2022-11-29 00:15:01.875372+0700 MyApp[38794:6760471] [Default] <ICUserIdentityStoreACAccountBackend: 0x283005720> Failed to register for account monitoring. err=Error Domain=com.apple.accounts Code=7 "(null)"
2022-11-29 00:15:01.876677+0700 MyApp[38794:6760323] [EntityQuery] Finished executing query in 0.000999928s
2022-11-29 00:15:01.889055+0700 MyApp[38794:6760323] [EntityQuery] Finished fetching results in 0.0120001s
2022-11-29 00:15:01.891235+0700 MyApp[38794:6760329] [core] Attempted to register account monitor for types client is not authorized to access: {(
"com.apple.account.iTunesStore"
)}
2022-11-29 00:15:01.891684+0700 MyApp[38794:6760329] [Default] <ICUserIdentityStoreACAccountBackend: 0x283005720> Failed to register for account monitoring. err=Error Domain=com.apple.accounts Code=7 "(null)"
📘 Creating Playlist: TEST PLAYLIST
🤷🏻♀️ Does this work?
2022-11-29 00:15:06.697374+0700 MyApp[38794:6760329] [] nw_path_necp_check_for_updates Failed to copy updated result (22)
What's really nasty is that errors are not thrown, so they can't be caught and handled in a catch block.
I know that iOS 16.1 got released around the end of October, but I really don't know what's going on here.
The behavior is showing up in both prod and when testing locally.
Any help would be most appreciated.
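For reference, a minimal sketch of the standard MusicKit authorization gate I'd expect to be sufficient before these calls (assuming nothing else is required on iOS 16):
import MusicKit

@available(iOS 16, *)
func makePlaylistIfAuthorized(name: String, description: String) async -> MusicKit.Playlist? {
    // Sketch: request MusicKit authorization before touching MusicLibrary.
    let status = await MusicAuthorization.request()
    guard status == .authorized else {
        print("MusicKit not authorized: \(status)")
        return nil
    }
    return try? await MusicLibrary.shared.createPlaylist(name: name, description: description)
}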
@JoeKhun: Did I miss the memo?
Hi all,
Apple dropping ongoing development for FireWire devices that were supported with the Core Audio driver standard is a catastrophe for a lot of struggling musicians. They need to keep up to date on the security updates that come with new OS releases, while continuing to use their hard-earned investments in very expensive and still pristine audio devices that have now been reduced to e-waste by Apple's seemingly tone-deaf ignorance of the cries for ongoing support.
I have one of said audio devices, and I'd like to keep using it while keeping my 2019 Intel Mac Book Pro up to date with the latest security updates and OS features.
Probably not the first time you gurus have had someone make the logical leap leading to a request like this, but I was wondering whether it might somehow be possible to shoehorn the code from previous versions of macOS, which let the Mac speak to the audio features of such devices, into the Ventura version of the OS.
Would it be possible? Would it involve a lot of work? I don't think I'd be the only person willing to pay for a third-party application or utility that restored this functionality.
There have to be hundreds of thousands of people who would be happy to spare some cash to stop their multi-thousand-dollar investment in gear from being so thoughtlessly consigned to the scrap heap.
Any comments or layman-friendly explanations as to why this couldn’t happen would be gratefully received!
Thanks,
em
Can I officially use the ScreenCaptureKit framework without worrying that it contains some bugs or may be exposed to future changes?
I'm making a request to get 10 artists with their top songs at once, but for some artists it will always fail with a 504. The response is also in HTML, which leads to a decoding error. This is my code:
var request = MusicCatalogResourceRequest<Artist>(matching: \.id, memberOf: ids)
request.properties = properties
let response = try await request.response()
where ids is a collection of MusicItemID values. Below I have an input which will always fail 100% of the time, even when retried.
10 elements
- 0 : "51639"
- 1 : "331584"
- 2 : "120199"
- 3 : "45058"
- 4 : "284786497"
- 5 : "44984"
- 6 : "37299"
- 7 : "518462"
- 8 : "39525"
- 9 : "73568"
Example response:
[DataRequesting] Failed to parse body of response with status code Unknown (504):
<!DOCTYPE html>
<html lang="en">
<head>
<style>
body {
font-family: "Helvetica Neue", "HelveticaNeue", Helvetica, Arial, sans-serif;
font-size: 15px;
font-weight: 200;
line-height: 20px;
color: #4c4c4c;
text-align: center;
}
.section {
margin-top: 50px;
}
</style>
</head>
<body>
<div class="section">
<h1></h1>
<h3>Gateway Timeout</h3>
<p>Correlation Key: WFRI6Q5HXAUJYXGNRKQ6YTBYIM</p>
</div>
</body>
</html>
I have also tried batching these into 2 requests of 5 artists instead of 1 request of 10 artists, which still fails. However, I do have sets of 10 artists that work fine. Does anyone know why?
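The batching I mentioned looks roughly like this (a sketch; the chunk size is arbitrary and ids/properties are the values shown above):
// Sketch of the batching attempt: split the ids and issue smaller requests,
// collecting whichever chunks succeed.
var artists: [Artist] = []
let chunkSize = 5
for start in stride(from: 0, to: ids.count, by: chunkSize) {
    let chunk = Array(ids[start..<min(start + chunkSize, ids.count)])
    do {
        var request = MusicCatalogResourceRequest<Artist>(matching: \.id, memberOf: chunk)
        request.properties = properties
        let response = try await request.response()
        artists.append(contentsOf: response.items)
    } catch {
        print("Chunk starting at \(start) failed: \(error)") // still 504s for some chunks
    }
}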
I take a picture using the iPhone's camera. The resulting resolution is 3024 x 4032. I then have to apply a watermark to this image. After a bunch of trial and error, the method I decided on was taking a snapshot of a watermark UIView and drawing that over the image, like so:
// Create the watermarked photo.
let result: UIImage=UIGraphicsImageRenderer(size: image.size).image(actions: { _ in
image.draw(in: .init(origin: .zero, size: image.size))
let watermark: Watermark = .init(
size: image.size,
scaleFactor: image.size.smallest / self.frame.size.smallest
)
watermark.drawHierarchy(in: .init(origin: .zero, size: image.size), afterScreenUpdates: true)
})
Then, with the final image (because the client wanted it to have a filename when viewed from within the Photos app and exported from it, and also with much trial and error), I save it to a file in a temporary directory and then save it to the user's photo library using that file. The difference compared with saving the image directly is that, when saved from the file, the filename is used as the filename within the Photos app; in the other case it's just a default photo name generated by Apple.
The problem is that in the image saving code I'm getting the following error:
[Metal] 9072 by 12096 iosurface is too large for GPU
And when I view the saved photo it's basically just a completely black image. This problem only started when I changed the AVCaptureSession preset to .photo. Before then there were no errors.
Now, the worst problem is that the app just completely crashes on drawing of the watermark view in the first place. When using .photo the resolution is significantly higher, so the image size is larger, so the watermark size has to be commensurately larger as well. iOS appears to be okay with the size of the watermark UIView. However, when I try to draw it over the image the app crashes with this message from Xcode:
So there's that problem. But I figured that could be resolved by taking a more manual approach to drawing the watermark than using a UIView snapshot. So it's not the most pressing problem. What is, is that even after the drawing code is commented out, I still get the iosurface is too large error.
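One detail that stands out: 9072 x 12096 is exactly 3x 3024 x 4032, and UIGraphicsImageRenderer defaults to the device's screen scale (3x on this hardware), so the backing surface ends up nine times larger than the photo itself. A sketch of forcing a 1x format (my assumption, not a confirmed fix):
// Sketch: render the watermarked photo at 1x so the surface matches the
// photo's pixel size instead of being multiplied by the screen scale.
let format = UIGraphicsImageRendererFormat()
format.scale = 1
let result: UIImage = UIGraphicsImageRenderer(size: image.size, format: format).image { _ in
    image.draw(in: .init(origin: .zero, size: image.size))
    // Watermark drawing would go here, as in the snippet above.
}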
Here's the code that saves the image to the file and then to the Photos library:
extension UIImage {
/// Save us with the given name to the user's photo album.
/// - Parameters:
/// - filename: The filename to be used for the saved photo. Behavior is undefined if the filename contains characters other than what is represented by this regular expression [A-Za-z0-9-_]. A decimal point for the file extension is permitted.
/// - location: A GPS location to save with the photo.
fileprivate func save(_ filename: String, _ location: Optional<Coordinates>) throws {
// Create a path to a temporary directory. Adding filenames to the Photos app form of images is accomplished by first creating an image file on the file system, saving the photo using the URL to that file, and then deleting that file on the file system.
// A documented way of adding filenames to photos saved to Photos was never found.
// Furthermore, we save everything to a `tmp` directory as if we just tried deleting individual photos after they were saved, and the deletion failed, it would be a little more tricky setting up logic to ensure that the undeleted files are eventually
// cleaned up. But by using a `tmp` directory, we can save all temporary photos to it, and delete the entire directory following each taken picture.
guard
let tmpUrl: URL=try {
guard let documentsDirUrl=NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true).first else {
throw GeneralError("Failed to create URL to documents directory.")
}
let url: Optional<URL> = .init(string: documentsDirUrl + "/tmp/")
return url
}()
else {
throw GeneralError("Failed to create URL to temporary directory.")
}
// A path to the image file.
let filePath: String=try {
// Reduce the likelihood of photos taken in quick succession from overwriting each other.
let collisionResistantPath: String="\(tmpUrl.path(percentEncoded: false))\(UUID())/"
// Make sure all directories required by the path exist before trying to write to it.
try FileManager.default.createDirectory(atPath: collisionResistantPath, withIntermediateDirectories: true, attributes: nil)
// Done.
return collisionResistantPath + filename
}()
// Create `CFURL` analogue of file path.
guard let cfPath: CFURL=CFURLCreateWithFileSystemPath(nil, filePath as CFString, CFURLPathStyle.cfurlposixPathStyle, false) else {
throw GeneralError("Failed to create `CFURL` analogue of file path.")
}
// Create image destination object.
//
// You can change your exif type here.
// This is a note from original author. Not quite exactly sure what they mean by it. Link in method documentation can be used to refer back to the original context.
guard let destination=CGImageDestinationCreateWithURL(cfPath, UTType.jpeg.identifier as CFString, 1, nil) else {
throw GeneralError("Failed to create `CGImageDestination` from file url.")
}
// Metadata properties.
let properties: CFDictionary={
// Place your metadata here.
// Keep in mind that metadata follows a standard. You can not use custom property names here.
let tiffProperties: Dictionary<String, Any>=[:]
return [
kCGImagePropertyExifDictionary as String: tiffProperties
] as CFDictionary
}()
// Create image file.
guard let cgImage=self.cgImage else {
throw GeneralError("Failed to retrieve `CGImage` analogue of `UIImage`.")
}
CGImageDestinationAddImage(destination, cgImage, properties)
CGImageDestinationFinalize(destination)
// Save to the photo library.
PHPhotoLibrary.shared().performChanges({
guard let creationRequest: PHAssetChangeRequest = .creationRequestForAssetFromImage(atFileURL: URL(fileURLWithPath: filePath)) else {
return
}
// Add metadata to the photo.
creationRequest.creationDate = .init()
if let location=location {
creationRequest.location = .init(latitude: location.latitude, longitude: location.longitude)
}
}, completionHandler: { _, _ in
try? FileManager.default.removeItem(atPath: tmpUrl.absoluteString)
})
}
}
If anyone can provide some insight as to what's causing the iosurface is too large error and what can be done to resolve it, that'd be awesome.
Hi,
I'm having trouble saving user presets in the plugin for Audio Units. This works well for saving the user presets in the Host, but I get an error when trying to save them in the plugin.
I'm not using a parameter tree, but instead using the fullState's getter and setter for saving and retrieving a dictionary with the state.
With some simplified parameters it looks something like this:
var gain: Double = 0.0
var frequency: Double = 440.0
private var currentState: [String: Any] = [:]
override var fullState: [String: Any]? {
    get {
        // Save the current state
        currentState["gain"] = gain
        currentState["frequency"] = frequency
        // Return the preset state
        return ["myPresetKey": currentState]
    }
    set {
        // Extract the preset state
        currentState = newValue?["myPresetKey"] as? [String: Any] ?? [:]
        // Set the Audio Unit's properties
        gain = currentState["gain"] as? Double ?? 0.0
        frequency = currentState["frequency"] as? Double ?? 440.0
    }
}
This works perfectly well for storing user presets when saved in the host. When trying to save them in the plugin to be able to reuse them across hosts, I get the following error in the interface: "Missing key in preset state map". Note that I am testing mostly in AUM.
I could not find any documentation about what the missing key is, or how I can get around this. Any ideas?
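One guess as to what the plug-in UI is complaining about (an assumption on my part, not something I've found documented): when saving a user preset from the plug-in side, the state map may be expected to contain the bookkeeping keys that AUAudioUnit itself stores in fullState (e.g. preset name/number), which the getter above replaces wholesale. A sketch of merging with super instead:
override var fullState: [String: Any]? {
    get {
        // Start from the base class dictionary so any keys AUAudioUnit expects
        // are preserved, then add our own state on top.
        var state = super.fullState ?? [:]
        state["gain"] = gain
        state["frequency"] = frequency
        return state
    }
    set {
        super.fullState = newValue
        gain = newValue?["gain"] as? Double ?? 0.0
        frequency = newValue?["frequency"] as? Double ?? 440.0
    }
}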
I have a custom app running on a Mac Studio with Ventura that grabs a snapshot image from a network camera. It then adds some extra information into the EXIF "MakerNote" field. However, the metadata cannot be read back out of the image when running Ventura; it can, however, be read out of the same image file on a Mac that is not running Ventura.
It would appear Apple has removed support for reading MakerNote in Ventura but still supports writing MakerNote in Ventura.
This code is about 7 years old and written in ObjC and has worked with no issue until Ventura came along.
Calls used
CGImageDestinationAddImageFromSource(); // used to write the image to disk with the extra metadata - Works on Ventura
CGImageSourceCopyPropertiesAtIndex(); // used to read the metadata from an image - does not return "MakerNote" data
Is there a new way to read EXIF "MakerNote" data from image files that was introduced with Ventura?
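For reference, the read side boils down to something like this (shown in Swift for brevity; the real code is ObjC). On Ventura the returned EXIF dictionary no longer contains the maker note entry:
import Foundation
import ImageIO

// Sketch: read back the image properties and inspect the EXIF dictionary,
// which is where the MakerNote data used to appear before Ventura.
func dumpExif(for url: URL) {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any] else {
        print("Could not read image properties")
        return
    }
    let exif = props[kCGImagePropertyExifDictionary] as? [CFString: Any]
    print(exif ?? "no EXIF dictionary")
}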
I am calling AVSampleBufferDisplayLayer.flush from a background queue but this seems to occasionally crash the app. I am calling it from the same thread I pass to - (void)requestMediaDataWhenReadyOnQueue:(dispatch_queue_t)queue usingBlock:(void (^)(void))block;.
My question is, is this API threadsafe, or do I need to call flush from the main thread? Or is there another issue that I am not considering? It seems strange to me that this API would trigger an autolayout pass.
0 CoreFoundation 0x00000001bb384e38 __exceptionPreprocess + 164
1 libobjc.A.dylib 0x00000001b451b8d8 objc_exception_throw + 59
2 CoreAutoLayout 0x00000001d7e09e84 _AssertAutoLayoutOnAllowedThreadsOnly + 327
3 CoreAutoLayout 0x00000001d7e00e60 -[NSISEngine withBehaviors:performModifications:] + 35
4 UIKitCore 0x00000001be58fd40 -[UIView _postMovedFromSuperview:] + 671
5 UIKitCore 0x00000001bd56dfec -[UIView(Internal) _addSubview:positioned:relativeTo:] + 1903
6 UIKitCore 0x00000001bda57ccc -[_UITextLayoutCanvasView textViewportLayoutController:configureRenderingSurfaceForTextLayoutFragment:] + 455
7 UIFoundation 0x00000001c588bc9c __48-[NSTextViewportLayoutController layoutViewport]_block_invoke_4 + 151
8 UIFoundation 0x00000001c5836b50 __80-[NSTextLayoutManager enumerateViewportElementsFromLocation:options:usingBlock:]_block_invoke + 43
9 UIFoundation 0x00000001c580e158 __83-[NSTextLayoutManager enumerateTextLayoutFragmentsFromLocation:options:usingBlock:]_block_invoke_2 + 535
10 CoreFoundation 0x00000001bb385350 __NSARRAY_IS_CALLING_OUT_TO_A_BLOCK__ + 23
11 CoreFoundation 0x00000001bb3b24dc -[__NSSingleObjectArrayI enumerateObjectsWithOptions:usingBlock:] + 91
12 UIFoundation 0x00000001c580de28 __83-[NSTextLayoutManager enumerateTextLayoutFragmentsFromLocation:options:usingBlock:]_block_invoke + 775
13 UIFoundation 0x00000001c57f7504 -[NSTextLayoutManager enumerateTextLayoutFragmentsFromLocation:options:usingBlock:] + 659
14 UIFoundation 0x00000001c57f7264 -[NSTextLayoutManager enumerateViewportElementsFromLocation:options:usingBlock:] + 99
15 UIFoundation 0x00000001c57f6d7c -[NSTextViewportLayoutController layoutViewport] + 1299
16 UIKitCore 0x00000001bd580a3c +[UIView(Animation) performWithoutAnimation:] + 75
17 UIKitCore 0x00000001bd5582d0 -[_UITextLayoutCanvasView layoutSubviews] + 139
18 UIKitCore 0x00000001bd5544c8 -[UIView(CALayerDelegate) layoutSublayersOfLayer:] + 1979
19 QuartzCore 0x00000001bca277fc CA::Layer::layout_if_needed(CA::Transaction*) + 499
20 QuartzCore 0x00000001bca3aeb0 CA::Layer::layout_and_display_if_needed(CA::Transaction*) + 147
21 QuartzCore 0x00000001bca4c234 CA::Context::commit_transaction(CA::Transaction*, double, double*) + 443
22 QuartzCore 0x00000001bca81630 CA::Transaction::commit() + 651
23 MediaToolbox 0x00000001ca8d0da0 videoQueueRemote_SetProperty + 367
24 AVFCore 0x00000001cad191b4 __63-[AVSampleBufferVideoRenderer _setContentLayerOnFigVideoQueue:]_block_invoke + 179
25 libdispatch.dylib 0x00000001c299cf88 _dispatch_client_callout + 19
26 libdispatch.dylib 0x00000001c29ac574 _dispatch_lane_barrier_sync_invoke_and_complete + 55
27 AVFCore 0x00000001cad190d0 -[AVSampleBufferVideoRenderer _setContentLayerOnFigVideoQueue:] + 167
28 AVFCore 0x00000001cad14674 -[AVSampleBufferVideoRenderer _createVideoQueue:errorStep:] + 195
29 AVFCore 0x00000001cad14ac8 -[AVSampleBufferVideoRenderer createVideoQueue:] + 55
30 AVFCore 0x00000001cad179cc -[AVSampleBufferVideoRenderer flushWithRemovalOfDisplayedImage:completionHandler:] + 439
31 App 0x0000000102370214 -[AVSampleBufferDisplayLayer flush] + 51
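The workaround I'm considering in the meantime is simply hopping to the main queue for the flush, roughly like this (a sketch; displayLayer is my AVSampleBufferDisplayLayer, and whether this is actually required is exactly my question):
// Sketch: defensively flush on the main queue, since the exception comes from
// CoreAutoLayout's allowed-threads assertion during the flush.
DispatchQueue.main.async { [weak self] in
    self?.displayLayer.flush()
}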
My current app implements a custom video player, based on an AVSampleBufferRenderSynchronizer synchronising two renderers:
an AVSampleBufferDisplayLayer receiving decoded CVPixelBuffer-based video CMSampleBuffers,
and an AVSampleBufferAudioRenderer receiving decoded lpcm-based audio CMSampleBuffers.
The AVSampleBufferRenderSynchronizer is started when the first image (in presentation order) is decoded and enqueued, using avSynchronizer.setRate(_ rate: Float, time: CMTime), with rate = 1 and time the presentation timestamp of the first decoded image.
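For illustration, the wiring described above boils down to this (a sketch; firstDecodedFrame stands for the first CMSampleBuffer in presentation order):
// Sketch of the player setup: one synchronizer driving a video renderer and
// an audio renderer, started at the first frame's presentation timestamp.
let synchronizer = AVSampleBufferRenderSynchronizer()
let videoRenderer = AVSampleBufferDisplayLayer()
let audioRenderer = AVSampleBufferAudioRenderer()
synchronizer.addRenderer(videoRenderer)
synchronizer.addRenderer(audioRenderer)

let firstPTS = firstDecodedFrame.presentationTimeStamp
synchronizer.setRate(1, time: firstPTS)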
Presentation timestamps of video and audio sample buffers are consistent, and on most streams, the audio and video are correctly synchronized.
However on some network streams, on iOS, the audio and video aren't synchronized, with a time difference that seems to increase with time.
On the other hand, with the same player code and network streams on macOS, the synchronization always works fine.
This reminds me of something I've read, about cases where an AVSampleBufferRenderSynchronizer could not synchronize audio and video, causing them to run with independent and potentially drifting clocks, but I cannot find it again.
So, any help / hints on this sync problem will be greatly appreciated! :)
ApplicationMusicPlayer is available on the Mac! 🎉🎉🎉 Enormous thanks to @JoeKun and the team. I've already gotten my app up and running through Catalyst, and I've successfully played music! I also got some timeouts, but that was happening on my phone a lot that day too, so maybe my local CDN was just having a bad day.
I wanted to ask this question in a lab this week, but the timing didn't work out: Do you expect the experience to be the same using ApplicationMusicPlayer on a Catalyst vs a macOS target? I'm hoping to reuse much of my iPad app and go the Catalyst route, but I wanted to double check that the new support wasn't just for macOS.
I'm building a UIKit app that reads the user's Apple Music library and displays it. In MusicKit there is the Artwork structure, which I need to use to display artwork images in the app. Since I'm not using SwiftUI, I cannot use the ArtworkImage view that is the recommended way of displaying those images, but the Artwork structure has a method that returns a URL for the image, which can be used to load the image.
The way I have it set up is really simple:
extension MusicKit.Song {
    func imageURL(for cgSize: CGSize) -> URL? {
        return artwork?.url(
            width: Int(cgSize.width),
            height: Int(cgSize.height)
        )
    }

    func localImage(for cgSize: CGSize) -> UIImage? {
        guard let url = imageURL(for: cgSize),
              url.scheme == "musicKit",
              let data = try? Data(contentsOf: url) else {
            return nil
        }
        return .init(data: data)
    }
}
Now, every time I access the .artwork property (so, a lot of times), the main thread gets blocked and the console output gets bombarded with messages like these:
2023-07-26 11:49:47.317195+0200 Plum[998:297199] [Artwork] Failed to create color analysis for artwork: <MPMediaLibraryArtwork: 0x289591590> with error; Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service named com.apple.mediaartworkd.xpc was invalidated: failed at lookup with error 159 - Sandbox restriction." UserInfo={NSDebugDescription=The connection to service named com.apple.mediaartworkd.xpc was invalidated: failed at lookup with error 159 - Sandbox restriction.}
2023-07-26 11:49:47.317262+0200 Plum[998:297199] [Artwork] Failed to create color analysis for artwork: file:///var/mobile/Media/iTunes_Control/iTunes/Artwork/Originals/4b/48d7b8d349d2de858413ae4561b6ba1b294dc7
2023-07-26 11:49:47.323099+0200 Plum[998:297013] [Plum] IIOImageWriteSession:121: cannot create: '/var/mobile/Media/iTunes_Control/iTunes/Artwork/Caches/320x320/4b/48d7b8d349d2de858413ae4561b6ba1b294dc7.sb-f9c7943d-6ciLNp'error = 1 (Operation not permitted)
My guess is that the most performance-heavy task here is performing the color analysis for each artwork, but IMO backgroundColor should not be a stored property if that's the case. I am not planning to use it anywhere, and it should rather be a computed async property so it doesn't block the caller.
I know I can move the call to a background thread, and that fixes the blocking of the main thread, but the loading times for each artwork are still terribly slow, and that impacts the UX.
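For reference, the background-thread variant I mean is roughly this (a sketch):
import MusicKit
import UIKit

// Sketch: same lookup as localImage(for:), but performed off the main thread.
extension MusicKit.Song {
    func localImageAsync(for cgSize: CGSize) async -> UIImage? {
        guard let url = imageURL(for: cgSize) else { return nil }
        return await Task.detached(priority: .userInitiated) {
            guard let data = try? Data(contentsOf: url) else { return nil }
            return UIImage(data: data)
        }.value
    }
}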
SwiftUI's ArtworkImage loads the artworks much quicker and without the errors so there must be a better way to do it.
I just upgraded to iOS 17 and it looks like AVSpeechSynthesizer is now broken.
I noticed that when feeding certain strings to AVSpeechUtterance, it just flat-out stops after speaking only a portion of the string.
For example, I fed it a string of approx. 1200 words and it speaks up to around 300 words or so and then just stops. The synthesizer delegate method -speechSynthesizer:didFinishSpeechUtterance: is called when this happens, as if this were supposed to be the end, even though it is not even close to being finished.
Was working fine on iOS 16.
FWIW I create the AVSpeechUtterance with -initWithString:
When you initialize AVSpeechSynthesizer as a View property in a SwiftUI project in Xcode 15 with the iOS 17 simulator, you get some messages in the console:
Failed to get sandbox extensions
Query for com.apple.MobileAsset.VoiceServicesVocalizerVoice failed: 2
#FactoryInstall Unable to query results, error: 5
Unable to list voice folder
Query for com.apple.MobileAsset.VoiceServices.GryphonVoice failed: 2
Unable to list voice folder
Unable to list voice folder
Query for com.apple.MobileAsset.VoiceServices.GryphonVoice failed: 2
Unable to list voice folder
When you try to run an utterance inside a Button, as synthesizer.speak(AVSpeechUtterance(string: "iOS 17 broke TextToSpeech")), you get an endless stream of warnings that repeats on and on in the console, like this:
AddInstanceForFactory: No factory registered for id <CFUUID 0x60000024f200> F8BB1C28-BAE8-11D6-9C31-00039315CD46
Cannot find executable for CFBundle 0x600003b2cd20 </Library/Developer/CoreSimulator/Volumes/iOS_21A328/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS 17.0.simruntime/Contents/Resources/RuntimeRoot/System/Library/PrivateFrameworks/TextToSpeechMauiSupport.framework> (not loaded)
Failed to load first party audio unit from TextToSpeechMauiSupport.framework
Could not instantiate audio unit. Error=Error Domain=NSOSStatusErrorDomain Code=-3000 "(null)"
Could not instantiate audio unit. Error=Error Domain=NSOSStatusErrorDomain Code=-3000 "(null)"
Could not instantiate audio unit. Error=Error Domain=NSOSStatusErrorDomain Code=-3000 "(null)"
Could not instantiate audio unit. Error=Error Domain=NSOSStatusErrorDomain Code=-3000 "(null)"
Could not instantiate audio unit. Error=Error Domain=NSOSStatusErrorDomain Code=-3000 "(null)"
Couldn't find audio unit for request SSML Length: 40, Voice: [AVSpeechSynthesisProviderVoice 0x600002127e30] Name: Samantha, Identifier: com.apple.voice.compact.en-US.Samantha, Supported Languages (
"en-US"
), Age: 0, Gender: 0, Size: 0, Version: (null)
VoiceProvider: Could not start synthesis for request SSML Length: 40, Voice: [AVSpeechSynthesisProviderVoice 0x600002127e30] Name: Samantha, Identifier: com.apple.voice.compact.en-US.Samantha, Supported Languages (
"en-US"
), Age: 0, Gender: 0, Size: 0, Version: (null), converted from tts request [TTSSpeechRequest 0x600003709680] iOS 17 broke TextToSpeech language: en-US footprint: compact rate: 0.500000 pitch: 1.000000 volume: 1.000000
Failed to speak request with error: Error Domain=TTSErrorDomain Code=-4010 "(null)". Attempting to speak again with fallback identifier: com.apple.voice.compact.en-US.Samantha
The CPU is under pressure (more than 100%), and AVSpeechSynthesizer doesn't speak.
All works fine on iOS 16.
The code of View:
import SwiftUI
import AVFoundation

struct ContentView: View {
    let synthesizer = AVSpeechSynthesizer()

    var body: some View {
        VStack {
            Button {
                synthesizer.speak(AVSpeechUtterance(string: "iOS 17 broke TextToSpeech"))
            } label: {
                Text("speak")
            }
            .buttonStyle(.borderedProminent)
        }
        .padding()
    }
}

#Preview {
    ContentView()
}
On a real device, nothing at all happens.
The same happens in my production app. I have so many crashes related to TextToSpeech and iOS 17. What's going on?
Hi there, I'm having some trouble with AVAudioMixerNode only working when there is a single input, and outputting silence or very quiet buzzing when >1 input node is connected. My setup has voice processing enabled, input going to a sink, and N source nodes going to the main mixer node, going to the output node. In all cases I am connecting nodes in the graph with the same declared format: 48kHz 1 channel Float32 PCM.
This is working great for 1 source node, but as soon as I add a second it breaks. I can reproduce this behaviour in the SignalGenerator sample, when the same format is used everywhere. Again, it'll work fine with 1 source node even in this configuration, but add another and there's silence.
Am I doing something wrong with formats here? Is this expected? As I understood it, with voice processing on and a mixer node in use, I should be able to use my own format essentially everywhere in my graph?
My SignalGenerator modified repro example follows:
import Foundation
import AVFoundation
// True replicates my real app's behaviour, which is broken.
// You can remove one source node connection
// to make it work even when this is true.
let showBrokenState: Bool = true
// SignalGenerator constants.
let frequency: Float = 440
let amplitude: Float = 0.5
let duration: Float = 5.0
let twoPi = 2 * Float.pi
let sine = { (phase: Float) -> Float in
return sin(phase)
}
let whiteNoise = { (phase: Float) -> Float in
return ((Float(arc4random_uniform(UINT32_MAX)) / Float(UINT32_MAX)) * 2 - 1)
}
// My "application" format.
let format: AVAudioFormat = .init(commonFormat: .pcmFormatFloat32,
sampleRate: 48000,
channels: 1,
interleaved: true)!
// Engine setup.
let engine = AVAudioEngine()
let mainMixer = engine.mainMixerNode
let output = engine.outputNode
try! output.setVoiceProcessingEnabled(true)
let outputFormat = engine.outputNode.inputFormat(forBus: 0)
let sampleRate = Float(format.sampleRate)
let inputFormat = format
var currentPhase: Float = 0
let phaseIncrement = (twoPi / sampleRate) * frequency
let srcNodeOne = AVAudioSourceNode { _, _, frameCount, audioBufferList -> OSStatus in
let ablPointer = UnsafeMutableAudioBufferListPointer(audioBufferList)
for frame in 0..<Int(frameCount) {
let value = sine(currentPhase) * amplitude
currentPhase += phaseIncrement
if currentPhase >= twoPi {
currentPhase -= twoPi
}
if currentPhase < 0.0 {
currentPhase += twoPi
}
for buffer in ablPointer {
let buf: UnsafeMutableBufferPointer<Float> = UnsafeMutableBufferPointer(buffer)
buf[frame] = value
}
}
return noErr
}
let srcNodeTwo = AVAudioSourceNode { _, _, frameCount, audioBufferList -> OSStatus in
let ablPointer = UnsafeMutableAudioBufferListPointer(audioBufferList)
for frame in 0..<Int(frameCount) {
let value = whiteNoise(currentPhase) * amplitude
currentPhase += phaseIncrement
if currentPhase >= twoPi {
currentPhase -= twoPi
}
if currentPhase < 0.0 {
currentPhase += twoPi
}
for buffer in ablPointer {
let buf: UnsafeMutableBufferPointer<Float> = UnsafeMutableBufferPointer(buffer)
buf[frame] = value
}
}
return noErr
}
engine.attach(srcNodeOne)
engine.attach(srcNodeTwo)
engine.connect(srcNodeOne, to: mainMixer, format: inputFormat)
engine.connect(srcNodeTwo, to: mainMixer, format: inputFormat)
engine.connect(mainMixer, to: output, format: showBrokenState ? inputFormat : outputFormat)
// Put the input node to a sink just to match the formats and make VP happy.
let sink: AVAudioSinkNode = .init { timestamp, numFrames, data in
.zero
}
engine.attach(sink)
engine.connect(engine.inputNode, to: sink, format: showBrokenState ? inputFormat : outputFormat)
mainMixer.outputVolume = 0.5
try! engine.start()
CFRunLoopRunInMode(.defaultMode, CFTimeInterval(duration), false)
engine.stop()