Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.


CoreML Image Classification Model - What Preprocessing Is Required For Static Images
I have trained a model to classify some symbols using Create ML. In my app I use VNImageRequestHandler and VNCoreMLRequest to classify image data.

If I use a CVPixelBuffer obtained from an AVCaptureSession, the classifier runs as I would expect: if I point it at the symbols it works fairly accurately, so I know the model is trained correctly and works in my app.

If I try to use a CGImage obtained by cropping a section out of a larger image (from the gallery), the classifier does not work. It always seems to return the same result (although the confidence is not 1.0 and varies for each image, it is within several decimal places of it, e.g. 0.9999).

If I pause the app when I have the cropped image, use the debugger to obtain the cropped image (via the little eye icon, then open in Preview), and drop the image into the Preview section of the MLModel file or into Create ML, the model correctly classifies the image.

If I scale the cropped image to the same size I get from my camera, and convert the CGImage to a CVPixelBuffer with the same size and colour space as the camera (1504 × 1128, kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange), then I get some difference in output. It's not accurate, but it returns different results if I specify the 'centerCrop' or 'scaleFit' options. So I know that 'something' is happening, but it's not the correct thing.

I was under the impression that passing a CGImage to the VNImageRequestHandler would perform the necessary conversions, but experimentation shows this is not the case. However, when using the preview tool on the model or in Create ML, this conversion is obviously being done behind the scenes, because the cropped part is detected. What am I doing wrong?

tl;dr:

- My model works, as backed up by using video input directly and by dropping cropped images into the preview sections.
- Passing the cropped images directly to the VNImageRequestHandler does not work.
- Modifying the cropped images can produce different results, but I cannot see what I should be doing to get reliable results.

I'd like my app to behave the same way the preview does: I give it a cropped part of an image, it does some processing, it goes to the classifier, and it returns the same result as in Create ML.
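A minimal sketch of the classification path being described, assuming a Create ML-generated model class named `SymbolClassifier` (a placeholder name). Vision does scale and convert a CGImage for you, but the crop/scale strategy and the orientation must match what the model saw in training, which is a common cause of "always the same class":

```swift
import CoreML
import Vision

// Sketch only: `SymbolClassifier` is a placeholder for the generated model class.
func classifySymbol(in cgImage: CGImage) throws {
    let mlModel = try SymbolClassifier(configuration: MLModelConfiguration()).model
    let model = try VNCoreMLModel(for: mlModel)
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let top = (request.results as? [VNClassificationObservation])?.first else { return }
        print("\(top.identifier): \(top.confidence)")
    }
    // Try .centerCrop and .scaleFill here; a mismatch with the training
    // pipeline skews every prediction toward one class.
    request.imageCropAndScaleOption = .centerCrop
    // A CGImage cropped out of a photo may not actually be .up; pass the
    // real orientation if you have it.
    let handler = VNImageRequestHandler(cgImage: cgImage, orientation: .up, options: [:])
    try handler.perform([request])
}
```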
2 replies · 0 boosts · 921 views · Feb ’24
NSInternalInconsistencyException: No identifiers for model class: MPModelSong from source: (null)
My app is consistently crashing for a specific user on macOS 14.3 (iMac 24-inch, M1, 2021) when their library is being retrieved in full. The user says they have 36k+ songs in their library, which includes purchased music. This is the code making the call:

```swift
var request = MusicLibraryRequest<Album>()
request.limit = 10000
let response = try await request.response()
```

I'm aware of a similar (?) crash, FB13094022 (https://forums.developer.apple.com/forums/thread/736717), that was claimed fixed for 14.2. Not sure if this is a separate issue or linked. I've submitted new FB13573268 for it.

```
CrashReporter Key:  0455323d871db6008623d9288ecee16c676248c6
Hardware Model:     iMac21,1
Process:            Music Flow
Identifier:         com.third.musicflow
Version:            1.2
Role:               Foreground
OS Version:         Mac OS 14.3

NSInternalInconsistencyException: No identifiers for model class: MPModelSong from source: (null)

0  CoreFoundation          +0xf2530  __exceptionPreprocess
1  libobjc.A.dylib         +0x19eb0  objc_exception_throw
2  Foundation              +0x10f398 -[NSAssertionHandler handleFailureInMethod:object:file:lineNumber:description:]
3  MediaPlayer             +0xd59f0  -[MPBaseEntityTranslator _objectForPropertySet:source:context:]
4  MediaPlayer             +0xd574c  -[MPBaseEntityTranslator _objectForRelationshipKey:propertySet:source:context:]
5  MediaPlayer             +0xd5cd4  __63-[MPBaseEntityTranslator _objectForPropertySet:source:context:]_block_invoke_2
6  CoreFoundation          +0x40428  __NSDICTIONARY_IS_CALLING_OUT_TO_A_BLOCK__
7  CoreFoundation          +0x402f0  -[__NSDictionaryI enumerateKeysAndObjectsWithOptions:usingBlock:]
8  MediaPlayer             +0xd5c1c  __63-[MPBaseEntityTranslator _objectForPropertySet:source:context:]_block_invoke
9  MediaPlayer             +0x11296c -[MPModelObject initWithIdentifiers:block:]
10 MediaPlayer             +0xd593c  -[MPBaseEntityTranslator _objectForPropertySet:source:context:]
11 MediaPlayer             +0xd66c4  -[MPBaseEntityTranslator objectForPropertySet:source:context:]
12 MediaPlayer             +0x1a7744 __47-[MPModeliTunesLibraryRequestOperation execute]_block_invoke
13 iTunesLibrary           +0x16d84  0x1b4e1cd84 (0x1b4e1cd30 + 84)
14 CoreFoundation          +0x5dec0  __invoking___
15 CoreFoundation          +0x5dd38  -[NSInvocation invoke]
16 Foundation              +0x1e874  __NSXPCCONNECTION_IS_CALLING_OUT_TO_REPLY_BLOCK__
17 Foundation              +0x1cef4  -[NSXPCConnection _decodeAndInvokeReplyBlockWithEvent:sequence:replyInfo:]
18 Foundation              +0x1c850  __88-[NSXPCConnection _sendInvocation:orArguments:count:methodSignature:selector:withProxy:]_block_invoke_3
19 libxpc.dylib            +0x10020  _xpc_connection_reply_callout
20 libxpc.dylib            +0xff18   _xpc_connection_call_reply_async
21 libdispatch.dylib       +0x398c   _dispatch_client_callout3
22 libdispatch.dylib       +0x21384  _dispatch_mach_msg_async_reply_invoke
23 libdispatch.dylib       +0xad24   _dispatch_lane_serial_drain
24 libdispatch.dylib       +0xba04   _dispatch_lane_invoke
25 libdispatch.dylib       +0x16618  _dispatch_root_queue_drain_deferred_wlh
26 libdispatch.dylib       +0x15e8c  _dispatch_workloop_worker_thread
27 libsystem_pthread.dylib +0x3110   _pthread_wqthread
28 libsystem_pthread.dylib +0x1e2c   start_wqthread
```
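A hedged workaround sketch, not a confirmed fix: fetch the library in smaller pages rather than one 10,000-item request, in case the crash is triggered somewhere deep inside one huge response. This assumes library collections support batching via `hasNextBatch`/`nextBatch()` the way catalog collections do:

```swift
import MusicKit

// Sketch: page through the user's album library in smaller batches.
func fetchAllLibraryAlbums() async throws -> [Album] {
    var request = MusicLibraryRequest<Album>()
    request.limit = 500                         // smaller pages
    let response = try await request.response()
    var albums = Array(response.items)
    var batch: MusicItemCollection<Album>? = response.items
    while let current = batch, current.hasNextBatch {
        batch = try await current.nextBatch()   // fetch the next page, if any
        albums.append(contentsOf: batch ?? [])
    }
    return albums
}
```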
0 replies · 0 boosts · 576 views · Feb ’24
MusicKit v3: CONTENT_RESTRICTED: Content restricted. How do I fix it?
Not sure when it happened, but I can no longer play explicit songs in my app using MusicKit v3. I've turned off restrictions / made sure I have access to explicit content in:

- My phone (Screen Time)
- My computer (Screen Time)
- My iPad (Screen Time)
- music.apple.com (Settings)

and I still get this error in the console when I try to play a song:

```
CONTENT_RESTRICTED: Content restricted
    at set isRestricted (https://js-cdn.music.apple.com/musickit/v3/musickit.js:28:296791)
    at SerialPlaybackController._prepareQueue (https://js-cdn.music.apple.com/musickit/v3/musickit.js:28:318357)
    at SerialPlaybackController._prepareQueue (https://js-cdn.music.apple.com/musickit/v3/musickit.js:28:359408)
    at set queue (https://js-cdn.music.apple.com/musickit/v3/musickit.js:28:308934)
    at https://js-cdn.music.apple.com/musickit/v3/musickit.js:28:357429
    at Generator.next (<anonymous>)
    at asyncGeneratorStep$j (https://js-cdn.music.apple.com/musickit/v3/musickit.js:28:351481)
    at _next (https://js-cdn.music.apple.com/musickit/v3/musickit.js:28:351708)
```
0 replies · 0 boosts · 590 views · Jan ’24
How do I get photos taken by my camera in real time?
I use a data cable to connect my Nikon camera to my iPhone. In my project I use the ImageCaptureCore framework. I can now read the photos on the camera's memory card, but when I press the camera's shutter to take a picture, the camera does not respond, even though the connection between the camera and the phone is normal. The camera screen then shows a picture of a laptop. I don't understand why that is. I hope someone can help me.
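A hedged sketch, not verified against a Nikon body: some cameras only push new shots to the connected host while tethering is enabled, so the app has to opt in and then listen for the delegate's didAdd callback. The remaining required delegate methods are shown as empty stubs:

```swift
import ImageCaptureCore

// Sketch: enable tethered capture and receive new shots as they are taken.
final class TetheredCaptureListener: NSObject, ICCameraDeviceDelegate {
    func startListening(to camera: ICCameraDevice) {
        camera.delegate = self
        camera.requestOpenSession()
        camera.requestEnableTethering()   // ask the camera to report new shots
    }

    // New items (e.g. the photo just taken with the shutter) arrive here.
    func cameraDevice(_ camera: ICCameraDevice, didAdd items: [ICCameraItem]) {
        for item in items { print("New item on camera:", item.name ?? "unnamed") }
    }

    // Minimal stubs for the other required delegate methods.
    func cameraDevice(_ camera: ICCameraDevice, didRemove items: [ICCameraItem]) {}
    func cameraDevice(_ camera: ICCameraDevice, didRenameItems items: [ICCameraItem]) {}
    func cameraDevice(_ camera: ICCameraDevice, didReceiveThumbnail thumbnail: CGImage?, for item: ICCameraItem, error: Error?) {}
    func cameraDevice(_ camera: ICCameraDevice, didReceiveMetadata metadata: [AnyHashable: Any]?, for item: ICCameraItem, error: Error?) {}
    func cameraDevice(_ camera: ICCameraDevice, didReceivePTPEvent eventData: Data) {}
    func cameraDeviceDidChangeCapability(_ camera: ICCameraDevice) {}
    func cameraDeviceDidRemoveAccessRestriction(_ device: ICDevice) {}
    func cameraDeviceDidEnableAccessRestriction(_ device: ICDevice) {}
    func deviceDidBecomeReady(withCompleteContentCatalog device: ICCameraDevice) {}
    func device(_ device: ICDevice, didOpenSessionWithError error: Error?) {}
    func device(_ device: ICDevice, didCloseSessionWithError error: Error?) {}
    func didRemove(_ device: ICDevice) {}
}
```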
0 replies · 0 boosts · 479 views · Jan ’24
Apple Music API Open API 3.0 / Swagger spec?
Hi all! As the title states, I'm looking to integrate with Apple Music's REST API. I'm wondering if there are any OpenAPI 3.0 YAML or JSON specs available anywhere? I'd like to avoid manually transcribing the types found in the developer docs. Here are links to the Apple Music API docs and to the OpenAPI 3.0 spec: https://developer.apple.com/documentation/applemusicapi https://spec.openapis.org/oas/latest.html OpenAPI was previously known as "Swagger", too. Many thanks in advance!
0 replies · 0 boosts · 535 views · Jan ’24
How to visualize 16bit raw image data
I'm working on a very simple app where I need to visualize an image on the screen of an iPhone. However, the image has some special properties: it's a 16-bit, yuv422_yuy2 encoded image. I already have all the raw bytes saved in a Data object. After googling for a long time, I still have not figured out the correct way to do this. My current understanding is to first create a CVPixelBuffer to properly represent the encoding information, then convert the CVPixelBuffer to a UIImage. The following is my current implementation:

```swift
public func YUV422YUY2ToUIImage(data: Data, height: Int, width: Int, bytesPerRow: Int) -> UIImage {
    var data = data
    return data.withUnsafeMutableBytes { rawPointer in
        let baseAddress = rawPointer.baseAddress!
        let tempBufferPointer = UnsafeMutablePointer<CVPixelBuffer?>.allocate(capacity: 1)
        CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                     width,
                                     height,
                                     kCVPixelFormatType_422YpCbCr16,
                                     baseAddress,
                                     bytesPerRow,
                                     nil, nil, nil,
                                     tempBufferPointer)
        let ciImage = CIImage(cvPixelBuffer: tempBufferPointer.pointee!)
        return UIImage(ciImage: ciImage)
    }
}
```

However, when I execute the code, I get the following error:

```
-[CIImage initWithCVPixelBuffer:options:] failed because its pixel format v216 is not supported.
```

So it seems CIImage is unhappy. I think I need to convert the encoding from yuv422_yuy2 to something like plain ARGB, but after a long time googling I didn't find a way to do that. The closest function I could find is https://developer.apple.com/documentation/accelerate/1533015-vimageconvert_422cbypcryp16toarg, but that function is too complex for me to understand how to use. Any help is appreciated. Thank you!
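A hedged note rather than a verified fix: "yuv422_yuy2" usually means 16 bits per pixel with 8 bits per component (ordered Y0 Cb Y1 Cr), whereas kCVPixelFormatType_422YpCbCr16 ('v216') is a 16-bits-per-component format. If the data really is YUY2, the matching CoreVideo constant should be kCVPixelFormatType_422YpCbCr8_yuvs, which Core Image can usually ingest directly, so no vImage conversion would be needed:

```swift
import CoreImage
import CoreVideo
import UIKit

// Sketch assuming the bytes are YUY2: 8-bit 4:2:2, Y0 Cb Y1 Cr ordering.
func imageFromYUY2(data: Data, width: Int, height: Int, bytesPerRow: Int) -> UIImage? {
    data.withUnsafeBytes { raw -> UIImage? in
        var pixelBuffer: CVPixelBuffer?
        let status = CVPixelBufferCreateWithBytes(
            kCFAllocatorDefault, width, height,
            kCVPixelFormatType_422YpCbCr8_yuvs,
            UnsafeMutableRawPointer(mutating: raw.baseAddress!),
            bytesPerRow,
            nil, nil, nil,          // no release callback: the buffer wraps the bytes
            &pixelBuffer)
        guard status == kCVReturnSuccess, let buffer = pixelBuffer else { return nil }
        // Render to a CGImage while the wrapped bytes are still valid.
        let ciImage = CIImage(cvPixelBuffer: buffer)
        guard let cgImage = CIContext().createCGImage(ciImage, from: ciImage.extent) else {
            return nil
        }
        return UIImage(cgImage: cgImage)
    }
}
```

If CIImage still rejects that format on some OS version, the 4:2:2-to-ARGB conversion routines in Accelerate/vImage remain the fallback.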
2 replies · 0 boosts · 759 views · Jan ’24
CoreImage createCGImage Crash
I found that the app reported a crash from a pure virtual function call, which I could not reproduce. A third-party library is referenced: https://github.com/lincf0912/LFPhotoBrowser, used to implement smearing, blurring, and mosaic processing of images.

Crashing code:

```objc
if (![LFSmearBrush smearBrushCache]) {
    [_edit_toolBar setSplashWait:YES index:LFSplashStateType_Smear];
    CGSize canvasSize = AVMakeRectWithAspectRatioInsideRect(self.editImage.size, _EditingView.bounds).size;
    [LFSmearBrush loadBrushImage:self.editImage canvasSize:canvasSize useCache:YES complete:^(BOOL success) {
        [weakToolBar setSplashWait:NO index:LFSplashStateType_Smear];
    }];
}

- (UIImage *)LFBB_patternGaussianImageWithSize:(CGSize)size orientation:(CGImagePropertyOrientation)orientation filterHandler:(CIFilter *(^ _Nullable)(CIImage *ciimage))filterHandler {
    CIContext *context = LFBrush_CIContext;
    NSAssert(context != nil, @"This method must be called using the LFBrush class.");
    CIImage *midImage = [CIImage imageWithCGImage:self.CGImage];
    midImage = [midImage imageByApplyingTransform:[self LFBB_preferredTransform]];
    midImage = [midImage imageByApplyingTransform:CGAffineTransformMakeScale(size.width / midImage.extent.size.width, size.height / midImage.extent.size.height)];
    if (orientation > 0 && orientation < 9) {
        midImage = [midImage imageByApplyingOrientation:orientation];
    }
    // Begin processing the image
    CIImage *result = midImage;
    if (filterHandler) {
        CIFilter *filter = filterHandler(midImage);
        if (filter) {
            result = filter.outputImage;
        }
    }
    CGImageRef outImage = [context createCGImage:result fromRect:[midImage extent]];
    UIImage *image = [UIImage imageWithCGImage:outImage];
    CGImageRelease(outImage);
    return image;
}
```

This line triggers the crash:

```objc
CGImageRef outImage = [context createCGImage:result fromRect:[midImage extent]];
```

b9c90c7bbf8940e5aabed7f3f62a65a2-symbolicated.crash
1 reply · 0 boosts · 723 views · Jan ’24
Camera image download fails when using the ICCameraDeviceDownloadDelegate method
```objc
- (void)cameraDevice:(ICCameraDevice *)camera didReceiveMetadata:(NSDictionary * _Nullable)metadata forItem:(ICCameraItem *)item error:(NSError * _Nullable)error API_AVAILABLE(ios(13.0)) {
    NSLog(@"metadata = %@", metadata);
    if (item) {
        ICCameraFile *file = (ICCameraFile *)item;
        NSURL *downloadsDirectoryURL = [[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory inDomains:NSUserDomainMask].firstObject;
        downloadsDirectoryURL = [downloadsDirectoryURL URLByAppendingPathComponent:@"Downloads"];
        NSDictionary *downloadOptions = @{
            ICDownloadsDirectoryURL: downloadsDirectoryURL,
            ICSaveAsFilename: item.name,
            ICOverwrite: @YES,
            ICDownloadSidecarFiles: @YES
        };
        [self.cameraDevice requestDownloadFile:file
                                       options:downloadOptions
                              downloadDelegate:self
                           didDownloadSelector:@selector(didDownloadFile:error:options:contextInfo:)
                                   contextInfo:nil];
    }
}

- (void)didDownloadFile:(ICCameraFile *)file error:(NSError * _Nullable)error options:(NSDictionary<NSString *, id> *)options contextInfo:(void * _Nullable)contextInfo API_AVAILABLE(ios(13.0)) {
    if (error) {
        NSLog(@"Download failed with error: %@", error);
    } else {
        NSLog(@"Download completed for file: %@", file);
    }
}
```

I don't know what's wrong, or whether this is even the right way to get the camera's pictures. I hope someone can help me.
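A hedged guess at one possible failure mode, not a confirmed diagnosis: the directory passed as ICDownloadsDirectoryURL must already exist, and a "Downloads" folder under Documents is not created automatically. A minimal Swift sketch that creates it before issuing the download request:

```swift
import Foundation

// Sketch: make sure Documents/Downloads exists before the download is requested.
func ensureDownloadsDirectory() throws -> URL {
    let documents = FileManager.default.urls(for: .documentDirectory,
                                             in: .userDomainMask)[0]
    let downloads = documents.appendingPathComponent("Downloads", isDirectory: true)
    try FileManager.default.createDirectory(at: downloads,
                                            withIntermediateDirectories: true)
    return downloads
}
```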
0 replies · 0 boosts · 528 views · Jan ’24
Access "About Artist" information and editorialNotes in general.
Is it possible, using the MusicKit APIs, to access the About information displayed on an artist page in Apple Music? I hoped Artist.editorialNotes would give me that information, but there is scarce information in there. Even for Taylor Swift, only editorialNotes.short displays brief info: "The genre-defying singer-songwriter is the voice of a generation." If it's not currently possible, are there plans for it in the future? Also, with the above in mind, and having never seen any editorialNotes for a song, is it safe to assume editorialNotes are mainly used for albums?
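For reference, a minimal sketch of the request being described; the artist ID is a placeholder, and as the post notes, in practice only `short` (and occasionally `standard`) seems to come back populated for artists:

```swift
import MusicKit

// Sketch: fetch an artist from the catalog and print whatever notes exist.
func printEditorialNotes(for artistID: MusicItemID) async throws {
    let request = MusicCatalogResourceRequest<Artist>(matching: \.id, equalTo: artistID)
    let response = try await request.response()
    guard let artist = response.items.first else { return }
    print(artist.editorialNotes?.short ?? "no short notes")
    print(artist.editorialNotes?.standard ?? "no standard notes")
}
```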
0 replies · 0 boosts · 462 views · Jan ’24
Is it possible to get an artistId from MusicKit just off the albumID
```swift
struct AlbumDetails: Hashable {
    let artistId: String?
}

func fetchAlbumDetails(upc: String) async throws -> AlbumDetails {
    let request = MusicCatalogResourceRequest<Album>(matching: \.upc, equalTo: upc)
    let response = try await request.response()
    guard let album = response.items.first else {
        throw NSError(domain: "AlbumNotFound", code: 0, userInfo: nil)
    }
    return AlbumDetails(artistId: album.artists?.first?.id.rawValue)
}

// Call site:
do {
    let details = try await fetchAlbumDetails(upc: upc)
    print("Artist ID: \(details.artistId ?? "nil")")
} catch {
    print("Error fetching artist ID: \(error)")
}
```

With this function I can return nearly everything except the artist ID, so I know it's not a problem with the request. But there has to be a way to get the artistID. If anyone has a solution to this, I would really appreciate it.
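A hedged sketch of what may be missing rather than a confirmed answer: in MusicKit, `Album.artists` is a relationship, and relationships come back nil until they are loaded explicitly with `.with(_:)`:

```swift
import MusicKit

// Sketch: fetch the album, then load its artists relationship before reading it.
func fetchArtistID(upc: String) async throws -> MusicItemID? {
    let request = MusicCatalogResourceRequest<Album>(matching: \.upc, equalTo: upc)
    let response = try await request.response()
    guard let album = response.items.first else { return nil }
    let detailed = try await album.with([.artists])   // load the relationship
    return detailed.artists?.first?.id
}
```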
1 reply · 0 boosts · 535 views · Jan ’24
Apple News API
Is it necessary to enroll in the Apple Developer Program to get into Apple News Publisher and obtain my Apple News API credentials? I also need guidance on how to publish articles on Apple News using the News API. A detailed explanation of the steps up to getting the Apple News API credentials (key, secret key, channel ID) would be much appreciated!
0 replies · 0 boosts · 687 views · Jan ’24
ReplayKit broadcast finishing unexpectedly: Attempted to start an invalid broadcast session
I'm currently working on a live screen broadcasting app which allows users to record their screen and save an mp4 video. I write the video file with AVAssetWriter, and it works fine. But when there is only 1GB–2GB of storage space remaining on the device, errors such as "Attempted to start an invalid broadcast session" frequently occur, and the video files cannot be played because assetWriter.finishWriting() is never called. This occurs on these devices: iPhone SE 3, iPhone 12 Pro Max, iPhone 13, iPad 19, iPad Air 5. I have tried the movieFragmentInterval of AVAssetWriter to write movie fragments, and set shouldOptimizeForNetworkUse to true/false; neither works, and the video cannot be played. I want to know how to observe or catch this error. Thanks!
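Not a fix for the broadcast error itself, but a hedged mitigation sketch: query the free space before and during recording, and call finishWriting() while enough space remains to close the file cleanly. The 2 GB threshold below is an arbitrary illustration, not a documented limit:

```swift
import Foundation

// Sketch: report the space the system will allow an app to use for
// important (user-initiated) data.
func availableDiskSpace() -> Int64? {
    let home = URL(fileURLWithPath: NSHomeDirectory())
    let values = try? home.resourceValues(
        forKeys: [.volumeAvailableCapacityForImportantUsageKey])
    return values?.volumeAvailableCapacityForImportantUsage
}

// Example policy: finish the writer early when space drops below ~2 GB.
let lowSpaceThreshold: Int64 = 2_000_000_000
```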
0 replies · 0 boosts · 574 views · Jan ’24
No output for long time
I found that didOutputSampleBuffer is not called for a long time when the screen does not change, which sometimes leaves me unsure whether something has gone wrong. In my design, the app switches to another screenshot method, such as CGWindowListCreateImage, when no data has arrived for a long time, but this is not what I expected. I set the minimumFrameInterval to 1/30 of a second, but it seems to have no effect:

```objc
[config setMinimumFrameInterval:CMTimeMake(1, 30)];
```

Is there any setting that can get didOutputSampleBuffer called at least once per second, even with a CMSampleBufferRef whose status is SCFrameStatusIdle? That would let me trust that capture is working fine without any exception.
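A sketch of reading the per-frame status that ScreenCaptureKit attaches to each sample buffer. One caveat, offered as my understanding rather than documented fact: minimumFrameInterval caps the maximum frame rate; it does not guarantee a minimum delivery rate, so a static screen can legitimately go quiet:

```swift
import ScreenCaptureKit
import CoreMedia

// Sketch: extract SCFrameStatus from a delivered sample buffer's attachments.
func frameStatus(of sampleBuffer: CMSampleBuffer) -> SCFrameStatus? {
    guard let attachments = CMSampleBufferGetSampleAttachmentsArray(
              sampleBuffer, createIfNecessary: false) as? [[SCStreamFrameInfo: Any]],
          let rawStatus = attachments.first?[.status] as? Int else {
        return nil
    }
    return SCFrameStatus(rawValue: rawStatus)
}
```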
0 replies · 0 boosts · 526 views · Jan ’24
Music User Token into Python via MusicKit JS or API?
Hi! I've been working on a project in Python that pulls in a bunch of my personal Apple Music playback history, library, etc. I can't find a single good, functional example of how to pull the Music User Token via the Android method or MusicKit JS (web). I've spent a lot of hours on this today, and no permutation of the existing examples/documentation has worked. Any guidance would be much appreciated! If you have a web app that pulls the Music User Token, I just need help understanding how to get to the token itself. Thank you!
2 replies · 0 boosts · 856 views · Jan ’24
Apple Music user token expiration
I'm writing an Android app that uses the Apple MusicKit SDK for Android. I'm trying to understand how to handle the Apple Music user token once I've got it from the authentication flow. I don't know when the token will expire; it is not a regular JWT token, so I cannot check the expiration date. And I don't want to run the auth flow on every app launch, as that would be annoying for users. Any guidance on how to handle and invalidate Apple Music user tokens?
0 replies · 0 boosts · 526 views · Jan ’24
Compilation error in Apple MusicKit SDK for Android
I have downloaded the official Apple MusicKit SDK for Android and integrated the 2 AARs it contains into my app (musickitauth-release-1.1.2.aar and mediaplayback-release-1.1.1.aar). When I try to build my app I get an error:

```
Manifest merger failed : android:exported needs to be explicitly specified for element <activity#com.apple.android.sdk.authentication.SDKUriHandlerActivity>. Apps targeting Android 12 and higher are required to specify an explicit value for android:exported when the corresponding component has an intent filter defined. See https://developer.android.com/guide/topics/manifest/activity-element#exported for details.
```

Which makes sense, since when I look into the AAR's AndroidManifest.xml I see that this attribute is missing on SDKUriHandlerActivity. Can this be fixed?
1 reply · 0 boosts · 486 views · Jan ’24
ApplicationMusicPlayer on macOS doesn’t work with AirPlay (AVRoutePickerView)
I'm using the new ApplicationMusicPlayer support on macOS 14 and playing items from my Apple Music library. I wanted to play this music from my app to an AirPlay destination, so I added an AVRoutePickerView. However, selecting any destination via this view doesn't make a difference to the playback: it continues to play on my Mac's speakers no matter which AirPlay destination I choose. Also submitted as FB13521393.
0 replies · 1 boost · 717 views · Jan ’24