I haven't found any really thorough documentation or guidance on the use of CIRAWFilter.linearSpaceFilter. The API documentation calls it
An optional filter you can apply to the RAW image while it’s in linear space.
Can someone provide insight into what this means and what the linear space filter is useful for? When would we use this linear space filter instead of a filter on the output of CIRAWFilter?
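For reference, this is how I understand it would be wired up, as a rough sketch (the RAW file URL and the exposure/vignette filters are placeholders of mine, not from the docs):
import CoreImage
import CoreImage.CIFilterBuiltins

let rawURL = URL(fileURLWithPath: "/path/to/photo.dng") // placeholder path

if let rawFilter = CIRAWFilter(imageURL: rawURL) {
    // This filter runs while pixel values are still scene-referred linear values,
    // i.e. before the RAW pipeline's tone mapping and gamma are applied.
    let linearExposure = CIFilter.exposureAdjust()
    linearExposure.ev = 0.5
    rawFilter.linearSpaceFilter = linearExposure

    // By contrast, a filter applied to outputImage operates on the finished,
    // display-referred (tone-mapped) result.
    if let developed = rawFilter.outputImage {
        let vignette = CIFilter.vignette()
        vignette.inputImage = developed
        _ = vignette.outputImage
    }
}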
Thank you.
I take a picture using the iPhone's camera. The captured resolution is 3024.0 x 4032. I then have to apply a watermark to this image. After a bunch of trial and error, the method I decided to use was taking a snapshot of a watermark UIView and drawing that over the image, like so:
// Create the watermarked photo.
let result: UIImage = UIGraphicsImageRenderer(size: image.size).image(actions: { _ in
    image.draw(in: .init(origin: .zero, size: image.size))
    let watermark: Watermark = .init(
        size: image.size,
        scaleFactor: image.size.smallest / self.frame.size.smallest
    )
    watermark.drawHierarchy(in: .init(origin: .zero, size: image.size), afterScreenUpdates: true)
})
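One thing worth noting, though it's only my guess and not something established above: UIGraphicsImageRenderer defaults to the screen's scale factor, so a 3024 x 4032 image rendered at 3x produces a 9072 x 12096 bitmap, which matches the size named in the error below. A minimal sketch that pins the renderer to scale 1 (the helper name is made up):
import UIKit

// Hypothetical helper: render at the image's own pixel size instead of the screen scale.
func watermarked(_ image: UIImage, drawWatermark: (CGSize) -> Void) -> UIImage {
    let format = UIGraphicsImageRendererFormat()
    format.scale = 1 // 1 point == 1 pixel, so the bitmap stays 3024 x 4032.
    return UIGraphicsImageRenderer(size: image.size, format: format).image { _ in
        image.draw(in: CGRect(origin: .zero, size: image.size))
        drawWatermark(image.size) // e.g. the watermark view's drawHierarchy call above.
    }
}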
Then, because the client wanted the final image to have a filename when viewed from within the Photos app and exported from it (again after much trial and error), I save it to a file in a temporary directory. I then save it to the user's Photo library using that file. The difference between saving the image directly and saving it from the file is that, when saved from the file, the filename is used as the filename within the Photos app; in the other case it's just a default photo name generated by Apple.
The problem is that in the image saving code I'm getting the following error:
[Metal] 9072 by 12096 iosurface is too large for GPU
And when I view the saved photo it's basically just a completely black image. This problem only started when I changed the AVCaptureSession preset to .photo. Before then there were no errors.
Now, the worst problem is that the app just completely crashes when drawing the watermark view in the first place. When using .photo the resolution is significantly higher, so the image size is larger, so the watermark size has to be commensurately larger as well. iOS appears to be okay with the size of the watermark UIView. However, when I try to draw it over the image, the app crashes with this message from Xcode:
So there's that problem. But I figured that could be resolved by taking a more manual approach to drawing the watermark rather than using a UIView snapshot, so it's not the most pressing problem. What is pressing is that even after the drawing code is commented out, I still get the iosurface is too large error.
Here's the code that saves the image to the file and then to the Photos library:
extension UIImage {
    /// Save us with the given name to the user's photo album.
    /// - Parameters:
    ///   - filename: The filename to be used for the saved photo. Behavior is undefined if the filename contains characters other than what is represented by the regular expression [A-Za-z0-9-_]. A decimal point for the file extension is permitted.
    ///   - location: A GPS location to save with the photo.
    fileprivate func save(_ filename: String, _ location: Optional<Coordinates>) throws {
        // Create a path to a temporary directory. Adding filenames to the Photos app form of images is accomplished by first creating an image file on the file system, saving the photo using the URL to that file, and then deleting that file on the file system.
        // A documented way of adding filenames to photos saved to Photos was never found.
        // Furthermore, we save everything to a `tmp` directory: if we just tried deleting individual photos after they were saved, and the deletion failed, it would be a little more tricky setting up logic to ensure that the undeleted files are eventually cleaned up. But by using a `tmp` directory, we can save all temporary photos to it, and delete the entire directory following each taken picture.
        guard
            let tmpUrl: URL = try {
                guard let documentsDirUrl = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true).first else {
                    throw GeneralError("Failed to create URL to documents directory.")
                }
                let url: Optional<URL> = .init(string: documentsDirUrl + "/tmp/")
                return url
            }()
        else {
            throw GeneralError("Failed to create URL to temporary directory.")
        }
        // A path to the image file.
        let filePath: String = try {
            // Reduce the likelihood of photos taken in quick succession from overwriting each other.
            let collisionResistantPath: String = "\(tmpUrl.path(percentEncoded: false))\(UUID())/"
            // Make sure all directories required by the path exist before trying to write to it.
            try FileManager.default.createDirectory(atPath: collisionResistantPath, withIntermediateDirectories: true, attributes: nil)
            // Done.
            return collisionResistantPath + filename
        }()
        // Create `CFURL` analogue of file path.
        guard let cfPath: CFURL = CFURLCreateWithFileSystemPath(nil, filePath as CFString, CFURLPathStyle.cfurlposixPathStyle, false) else {
            throw GeneralError("Failed to create `CFURL` analogue of file path.")
        }
        // Create image destination object.
        //
        // You can change your exif type here.
        // This is a note from the original author. Not quite exactly sure what they mean by it. The link in the method documentation can be used to refer back to the original context.
        guard let destination = CGImageDestinationCreateWithURL(cfPath, UTType.jpeg.identifier as CFString, 1, nil) else {
            throw GeneralError("Failed to create `CGImageDestination` from file url.")
        }
        // Metadata properties.
        let properties: CFDictionary = {
            // Place your metadata here.
            // Keep in mind that metadata follows a standard. You cannot use custom property names here.
            let tiffProperties: Dictionary<String, Any> = [:]
            return [
                kCGImagePropertyExifDictionary as String: tiffProperties
            ] as CFDictionary
        }()
        // Create image file.
        guard let cgImage = self.cgImage else {
            throw GeneralError("Failed to retrieve `CGImage` analogue of `UIImage`.")
        }
        CGImageDestinationAddImage(destination, cgImage, properties)
        CGImageDestinationFinalize(destination)
        // Save to the photo library.
        PHPhotoLibrary.shared().performChanges({
            guard let creationRequest: PHAssetChangeRequest = .creationRequestForAssetFromImage(atFileURL: URL(fileURLWithPath: filePath)) else {
                return
            }
            // Add metadata to the photo.
            creationRequest.creationDate = .init()
            if let location = location {
                creationRequest.location = .init(latitude: location.latitude, longitude: location.longitude)
            }
        }, completionHandler: { _, _ in
            try? FileManager.default.removeItem(atPath: tmpUrl.absoluteString)
        })
    }
}
If anyone can provide some insight as to what's causing the iosurface is too large error and what can be done to resolve it, that'd be awesome.
Hello,
I'm playing with ScreenCaptureKit. My scenario is very basic: I capture frames and assign them to a CALayer to display in a view (for previewing), basically what Apple's sample app does. I have two screens, and I capture only one of them, which has no app windows on it. My app also excludes itself from capture. When I place my app on the screen that is not being captured, I observe that most didOutputSampleBuffer calls receive frames with Idle status (which is expected for a screen that is not updating). However, if I bring my capturing app (which, to remind, is excluded from capture) to the captured screen, the frames start arriving with Complete status (i.e. holding pixel buffers). This is not what I expect: from a capture perspective the screen is still not updating, since the app is excluded. The preview the app displays proves that, showing an empty desktop. So it seems like updates to a CALayer trigger screen capture frames regardless of whether the app is included or excluded. This looks like a bug to me, and it can lead to a noticeable waste of resources and battery when frames are not just displayed on screen but also processed somehow and/or sent over a network. I'm also observing another issue due to this behavior, where capture hangs when queueDepth is set to 3 in this same scenario (but I'll describe it separately).
Please advise whether I should file a bug somewhere, or maybe there is a rational explanation of this behavior.
Thank you
Is there a way to retrieve a list of all the favorited artists for a given user via the Apple Music API? I see endpoints for playback history and resources added to the library. But I want to retrieve the full list of favorited artists for the given user and I don't see any obvious choices in the documentation.
Thanks in advance!
I have an app that allows users to send messages to each other via a link. I'm receiving many requests from my users to add songs. Can I use the Apple Music API to add 30-second song previews along with the user's message? If users want to listen to the full song, there will be a link to the full song on Apple Music. My app contains ads and subscriptions.
hdiutil bug?
When making a DMG image of the whole content of the user1 profile (meaning srcFolder = /Users/user1) using hdiutil, the program fails, indicating:
/Users/user1/Library/VoiceTrigger: Operation not permitted hdiutil: create failed - Operation not permitted
The complete command used was: "sudo hdiutil create -srcfolder /Users/user1 -skipunreadable -format UDZO /Volumes/testdmg/test.dmg"
And, of course, the user had local admin rights. I was using Sonoma 14.2.1 and a MacBook Pro (Intel, T2).
What I would have expected, assuming that /VoiceTrigger cannot be copied for whatever reason, would be for it to skip that file or folder and continue the process, then, at the end, produce a log listing the files/folders not included and the reason for their exclusion. The fact that hdiutil just ended immediately looks to me like a bug. Or what else could explain the problem described?
I'm building a UIKit app that reads the user's Apple Music library and displays it. In MusicKit there is the Artwork structure, which I need to use to display artwork images in the app. Since I'm not using SwiftUI, I cannot use the ArtworkImage view that is the recommended way of displaying those images, but the Artwork structure has a method that returns a URL for the image, which can be used to read the image.
The way I have it set up is really simple:
extension MusicKit.Song {
    func imageURL(for cgSize: CGSize) -> URL? {
        return artwork?.url(
            width: Int(cgSize.width),
            height: Int(cgSize.height)
        )
    }

    func localImage(for cgSize: CGSize) -> UIImage? {
        guard let url = imageURL(for: cgSize),
              url.scheme == "musicKit",
              let data = try? Data(contentsOf: url) else {
            return nil
        }
        return .init(data: data)
    }
}
Now, every time I access the .artwork property (so a lot of times) the main thread gets blocked and the console output gets bombarded with messages like these:
2023-07-26 11:49:47.317195+0200 Plum[998:297199] [Artwork] Failed to create color analysis for artwork: <MPMediaLibraryArtwork: 0x289591590> with error; Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service named com.apple.mediaartworkd.xpc was invalidated: failed at lookup with error 159 - Sandbox restriction." UserInfo={NSDebugDescription=The connection to service named com.apple.mediaartworkd.xpc was invalidated: failed at lookup with error 159 - Sandbox restriction.}
2023-07-26 11:49:47.317262+0200 Plum[998:297199] [Artwork] Failed to create color analysis for artwork: file:///var/mobile/Media/iTunes_Control/iTunes/Artwork/Originals/4b/48d7b8d349d2de858413ae4561b6ba1b294dc7
2023-07-26 11:49:47.323099+0200 Plum[998:297013] [Plum] IIOImageWriteSession:121: cannot create: '/var/mobile/Media/iTunes_Control/iTunes/Artwork/Caches/320x320/4b/48d7b8d349d2de858413ae4561b6ba1b294dc7.sb-f9c7943d-6ciLNp'error = 1 (Operation not permitted)
My guess is that the most performance-heavy task here is performing the color analysis for each artwork, but IMO backgroundColor should not be a stored property if that's the case. I am not planning to use it anywhere, and it should instead be a computed async property so it doesn't block the caller.
I know I can move the call to a background thread, and that fixes the issue of blocking the main thread, but the loading times for each artwork are still terribly slow, and that impacts the UX.
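For what it's worth, this is the workaround I mean, as a rough sketch (just my own idea, not a confirmed best practice): both the artwork/URL access and the blocking Data(contentsOf:) read happen off the main thread. Usage would be something like let image = await song.loadLocalImage(for: CGSize(width: 320, height: 320)).
import MusicKit
import UIKit

extension MusicKit.Song {
    // Hypothetical helper: do the artwork lookup and the blocking read on a detached task.
    func loadLocalImage(for size: CGSize) async -> UIImage? {
        let song = self
        return await Task.detached(priority: .userInitiated) { () -> UIImage? in
            guard let url = song.artwork?.url(width: Int(size.width), height: Int(size.height)),
                  url.scheme == "musicKit",
                  let data = try? Data(contentsOf: url) else {
                return nil
            }
            return UIImage(data: data)
        }.value
    }
}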
SwiftUI's ArtworkImage loads the artworks much quicker and without the errors so there must be a better way to do it.
My app is consistently crashing for a specific user on 14.3 (iMac, 24-inch, M1, 2021) when their library is being retrieved in full. The user says they have 36k+ songs in their library, which includes purchased music.
This is the code making the call:
var request = MusicLibraryRequest<Album>()
request.limit = 10000
let response = try await request.response()
I’m aware of a similar (?) crash, FB13094022 (https://forums.developer.apple.com/forums/thread/736717), that was claimed fixed for 14.2. Not sure if this is a separate issue or linked.
I’ve submitted new FB13573268 for it.
CrashReporter Key: 0455323d871db6008623d9288ecee16c676248c6
Hardware Model: iMac21,1
Process: Music Flow
Identifier: com.third.musicflow
Version: 1.2
Role: Foreground
OS Version: Mac OS 14.3
NSInternalInconsistencyException: No identifiers for model class: MPModelSong from source: (null)
0 CoreFoundation +0xf2530 __exceptionPreprocess
1 libobjc.A.dylib +0x19eb0 objc_exception_throw
2 Foundation +0x10f398 -[NSAssertionHandler handleFailureInMethod:object:file:lineNumber:description:]
3 MediaPlayer +0xd59f0 -[MPBaseEntityTranslator _objectForPropertySet:source:context:]
4 MediaPlayer +0xd574c -[MPBaseEntityTranslator _objectForRelationshipKey:propertySet:source:context:]
5 MediaPlayer +0xd5cd4 __63-[MPBaseEntityTranslator _objectForPropertySet:source:context:]_block_invoke_2
6 CoreFoundation +0x40428 __NSDICTIONARY_IS_CALLING_OUT_TO_A_BLOCK__
7 CoreFoundation +0x402f0 -[__NSDictionaryI enumerateKeysAndObjectsWithOptions:usingBlock:]
8 MediaPlayer +0xd5c1c __63-[MPBaseEntityTranslator _objectForPropertySet:source:context:]_block_invoke
9 MediaPlayer +0x11296c -[MPModelObject initWithIdentifiers:block:]
10 MediaPlayer +0xd593c -[MPBaseEntityTranslator _objectForPropertySet:source:context:]
11 MediaPlayer +0xd66c4 -[MPBaseEntityTranslator objectForPropertySet:source:context:]
12 MediaPlayer +0x1a7744 __47-[MPModeliTunesLibraryRequestOperation execute]_block_invoke
13 iTunesLibrary +0x16d84 0x1b4e1cd84 (0x1b4e1cd30 + 84)
14 CoreFoundation +0x5dec0 __invoking___
15 CoreFoundation +0x5dd38 -[NSInvocation invoke]
16 Foundation +0x1e874 __NSXPCCONNECTION_IS_CALLING_OUT_TO_REPLY_BLOCK__
17 Foundation +0x1cef4 -[NSXPCConnection _decodeAndInvokeReplyBlockWithEvent:sequence:replyInfo:]
18 Foundation +0x1c850 __88-[NSXPCConnection _sendInvocation:orArguments:count:methodSignature:selector:withProxy:]_block_invoke_3
19 libxpc.dylib +0x10020 _xpc_connection_reply_callout
20 libxpc.dylib +0xff18 _xpc_connection_call_reply_async
21 libdispatch.dylib +0x398c _dispatch_client_callout3
22 libdispatch.dylib +0x21384 _dispatch_mach_msg_async_reply_invoke
23 libdispatch.dylib +0xad24 _dispatch_lane_serial_drain
24 libdispatch.dylib +0xba04 _dispatch_lane_invoke
25 libdispatch.dylib +0x16618 _dispatch_root_queue_drain_deferred_wlh
26 libdispatch.dylib +0x15e8c _dispatch_workloop_worker_thread
27 libsystem_pthread.dylib +0x3110 _pthread_wqthread
28 libsystem_pthread.dylib +0x1e2c start_wqthread
Not sure when it happened but I can no longer play explicit songs in my app using MK v3.
I've turned off restrictions/made sure I have access to explicit content in...
My phone (Screen Time)
My computer (Screen Time)
My iPad (Screen Time)
music.apple.com (Settings)
and I still get this error in the console when I try to play a song:
CONTENT_RESTRICTED: Content restricted
at set isRestricted (https://js-cdn.music.apple.com/musickit/v3/musickit.js:28:296791)
at SerialPlaybackController._prepareQueue (https://js-cdn.music.apple.com/musickit/v3/musickit.js:28:318357)
at SerialPlaybackController._prepareQueue (https://js-cdn.music.apple.com/musickit/v3/musickit.js:28:359408)
at set queue (https://js-cdn.music.apple.com/musickit/v3/musickit.js:28:308934)
at https://js-cdn.music.apple.com/musickit/v3/musickit.js:28:357429
at Generator.next (<anonymous>)
at asyncGeneratorStep$j (https://js-cdn.music.apple.com/musickit/v3/musickit.js:28:351481)
at _next (https://js-cdn.music.apple.com/musickit/v3/musickit.js:28:351708)
I use a data cable to connect my Nikon camera to my iPhone. In my project, I use the ImageCaptureCore framework. I can read the photos on the camera's memory card, but when I press the camera's shutter to take a picture, the camera does not respond, even though the connection between the camera and the phone is fine. The camera's screen then shows a picture of a laptop. I don't know why that is. I hope someone can help me.
I am trying to retrieve the same information about a song as is available on its music.apple.com page (see screenshot).
It seems that neither MusicKit nor the Apple Music API delivers that depth of information; for example, the names listed under Production and Techniques.
Am I correct?
Hi all!
As the title states, I'm looking to integrate with Apple Music's REST API. I'm wondering if there are any OpenAPI 3.0 YAML or JSON specs available anywhere?
I'd like to avoid transcribing the types found in the developer docs manually.
Here's a link to the Apple Music API docs and to the OpenAPI 3.0 spec:
https://developer.apple.com/documentation/applemusicapi
https://spec.openapis.org/oas/latest.html
OpenAPI was previously known as "Swagger", too.
Many thanks in advance!
I found that the app reported a pure virtual function call crash, which I could not reproduce.
A third-party library is referenced:
https://github.com/lincf0912/LFPhotoBrowser
It implements smearing, blurring, and mosaic processing of images.
Crash code:
if (![LFSmearBrush smearBrushCache]) {
    [_edit_toolBar setSplashWait:YES index:LFSplashStateType_Smear];
    CGSize canvasSize = AVMakeRectWithAspectRatioInsideRect(self.editImage.size, _EditingView.bounds).size;
    [LFSmearBrush loadBrushImage:self.editImage canvasSize:canvasSize useCache:YES complete:^(BOOL success) {
        [weakToolBar setSplashWait:NO index:LFSplashStateType_Smear];
    }];
}
- (UIImage *)LFBB_patternGaussianImageWithSize:(CGSize)size orientation:(CGImagePropertyOrientation)orientation filterHandler:(CIFilter *(^ _Nullable)(CIImage *ciimage))filterHandler
{
    CIContext *context = LFBrush_CIContext;
    NSAssert(context != nil, @"This method must be called using the LFBrush class.");
    CIImage *midImage = [CIImage imageWithCGImage:self.CGImage];
    midImage = [midImage imageByApplyingTransform:[self LFBB_preferredTransform]];
    midImage = [midImage imageByApplyingTransform:CGAffineTransformMakeScale(size.width / midImage.extent.size.width, size.height / midImage.extent.size.height)];
    if (orientation > 0 && orientation < 9) {
        midImage = [midImage imageByApplyingOrientation:orientation];
    }
    // Begin processing the image.
    CIImage *result = midImage;
    if (filterHandler) {
        CIFilter *filter = filterHandler(midImage);
        if (filter) {
            result = filter.outputImage;
        }
    }
    CGImageRef outImage = [context createCGImage:result fromRect:[midImage extent]];
    UIImage *image = [UIImage imageWithCGImage:outImage];
    CGImageRelease(outImage);
    return image;
}
This line triggers the crash:
CGImageRef outImage = [context createCGImage:result fromRect:[midImage extent]];
b9c90c7bbf8940e5aabed7f3f62a65a2-symbolicated.crash
- (void)cameraDevice:(ICCameraDevice *)camera
  didReceiveMetadata:(NSDictionary * _Nullable)metadata
             forItem:(ICCameraItem *)item
               error:(NSError * _Nullable)error API_AVAILABLE(ios(13.0)) {
    NSLog(@"metadata = %@", metadata);
    if (item) {
        ICCameraFile *file = (ICCameraFile *)item;
        NSURL *downloadsDirectoryURL = [[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory inDomains:NSUserDomainMask].firstObject;
        downloadsDirectoryURL = [downloadsDirectoryURL URLByAppendingPathComponent:@"Downloads"];
        NSDictionary *downloadOptions = @{ ICDownloadsDirectoryURL: downloadsDirectoryURL,
                                           ICSaveAsFilename: item.name,
                                           ICOverwrite: @YES,
                                           ICDownloadSidecarFiles: @YES };
        [self.cameraDevice requestDownloadFile:file options:downloadOptions downloadDelegate:self didDownloadSelector:@selector(didDownloadFile:error:options:contextInfo:) contextInfo:nil];
    }
}

- (void)didDownloadFile:(ICCameraFile *)file
                  error:(NSError * _Nullable)error
                options:(NSDictionary<NSString *, id> *)options
            contextInfo:(void * _Nullable)contextInfo API_AVAILABLE(ios(13.0)) {
    if (error) {
        NSLog(@"Download failed with error: %@", error);
    } else {
        NSLog(@"Download completed for file: %@", file);
    }
}
I don't know what's wrong, or whether this is the right way to get the camera's pictures. I hope someone can help me.
Is it possible using the MusicKit APIs to access the About information displayed on an artist page in Apple Music?
I hoped Artist.editorialNotes would give me the information, but there is scarce information in there. Even for Taylor Swift, only editorialNotes.short displays brief info: "The genre-defying singer-songwriter is the voice of a generation."
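For reference, this is the kind of check I'm doing, as a minimal sketch (the function name is mine); only short tends to come back populated, as described above:
import MusicKit

// Hypothetical helper: fetch an artist from the catalog and print whatever editorial notes exist.
func printEditorialNotes(artistID: MusicItemID) async throws {
    let request = MusicCatalogResourceRequest<Artist>(matching: \.id, equalTo: artistID)
    let response = try await request.response()
    guard let artist = response.items.first else { return }
    print("short:", artist.editorialNotes?.short ?? "nil")
    print("standard:", artist.editorialNotes?.standard ?? "nil")
}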
If currently not possible, are there plans for it in the future?
Also, with the above in mind, and having never seen any editorialNotes for a song, is it safe to assume editorialNotes are mainly used for albums?
struct AlbumDetails: Hashable {
    let artistId: String?
}

func fetchAlbumDetails(upc: String) async throws -> AlbumDetails {
    let request = MusicCatalogResourceRequest<Album>(matching: \.upc, equalTo: upc)
    let response = try await request.response()
    guard let album = response.items.first else {
        throw NSError(domain: "AlbumNotFound", code: 0, userInfo: nil)
    }
    do {
        let artistID = try await fetchAlbumDetails(upc: upc)
        print("Artist ID: \(artistID)")
    } catch {
        print("Error fetching artist ID: \(error)")
    }
    return AlbumDetails(artistId: album.artists?.first?.id)
}
With this function, I can return nearly everything except the artist ID, so I know it's not a problem with the request. But I know there has to be a way to get the artistID. If anyone has a solution to this, I would really appreciate it.
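One direction I haven't fully verified (just an assumption on my part) is that the artists relationship may need to be loaded explicitly before it's populated; a sketch using with(_:):
import MusicKit

// Sketch: load the album's artists relationship before reading the artist ID.
func fetchArtistID(upc: String) async throws -> MusicItemID? {
    let request = MusicCatalogResourceRequest<Album>(matching: \.upc, equalTo: upc)
    let response = try await request.response()
    guard let album = response.items.first else { return nil }
    let detailedAlbum = try await album.with([.artists])
    return detailedAlbum.artists?.first?.id
}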
Is it necessary to enroll in the Apple Developer Program to get into Apple News Publisher and obtain my Apple News API credentials? Also, I need guidance on how to publish articles on Apple News using the News API. A detailed explanation, up through getting the Apple News API credentials (key, secret key, channel ID), is much appreciated!
As the title states, I would like to use MusicKit for Web instead of the Swift integration.
I'm currently working on a live screen broadcasting app which allows users to record their screen and save an MP4 video. I write the video file with AVAssetWriter, and it works fine. But when there is 1 GB-2 GB of storage space remaining on the device, errors such as "Attempted to start an invalid broadcast session" frequently occur, and video files cannot be played because assetWriter.finishWriting() is never called.
This occurs on these devices:
iPhone se3
iPhone 12 pro max
iPhone 13
iPad 19
iPad air 5
I have tried setting AVAssetWriter's movieFragmentInterval to write movie fragments, and setting shouldOptimizeForNetworkUse to true/false, but neither works. The video cannot be played.
I want to know how to observe or catch this error. Thanks!
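For context, this is roughly how I would try to surface the failure, as a sketch of my own (the helper names are placeholders, and I haven't verified this inside a broadcast extension): check the writer's status and error after each append, and query free space up front.
import AVFoundation
import CoreMedia

// Hypothetical helper: append a buffer and report a failed writer (e.g. out of disk space).
func appendAndCheck(_ buffer: CMSampleBuffer, to input: AVAssetWriterInput, writer: AVAssetWriter) {
    if input.isReadyForMoreMediaData, !input.append(buffer) {
        // append(_:) returning false usually means the writer has failed;
        // writer.error carries the underlying reason.
        if writer.status == .failed {
            print("Asset writer failed:", writer.error?.localizedDescription ?? "unknown error")
        }
    }
}

// Hypothetical helper: free space available for "important" usage, checked before recording.
func availableCapacityBytes() -> Int64? {
    let url = URL(fileURLWithPath: NSHomeDirectory())
    let values = try? url.resourceValues(forKeys: [.volumeAvailableCapacityForImportantUsageKey])
    return values?.volumeAvailableCapacityForImportantUsage
}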
I found that didOutputSampleBuffer is not called for a long time when the screen does not change. That sometimes makes me wonder whether something is wrong.
In my design, it switches to another screenshot method, such as CGWindowListCreateImage, when no data has arrived for a long time. But this is not what I expected.
I set the minimumFrameInterval to 1/30 second but it seems not to work.
[config setMinimumFrameInterval:CMTimeMake(1, 30)];
Is there any setting that would let me get a didOutputSampleBuffer call, even with a CMSampleBufferRef whose status is SCFrameStatusIdle, at least once per second? That would let me be confident it is working fine without any exception.
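For reference, the fallback I have in mind is roughly this, as a sketch (the class and callback names are placeholders of mine, and it is not thread-safe): a watchdog that fires when no sample buffer, idle or otherwise, has arrived within the last second.
import Foundation

final class FrameWatchdog {
    private var lastFrame = Date()
    private var timer: Timer?

    // Start checking once per second; onStall fires if no frame arrived in the last second.
    func start(onStall: @escaping () -> Void) {
        timer = Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { [weak self] _ in
            guard let self else { return }
            if Date().timeIntervalSince(self.lastFrame) > 1.0 {
                onStall()
            }
        }
    }

    // Call this from didOutputSampleBuffer, regardless of the frame status.
    func noteFrame() { lastFrame = Date() }

    deinit { timer?.invalidate() }
}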