2 days and I am frustrated. I've crossed my T's and dotted my I's.
Using
MusicKit
Error
Attempted to register account monitor for types client is not authorized to access: {(
"com.apple.account.iTunesStore"
)}
Offending Code
var request = MusicLibraryRequest<MusicKit.Playlist>()
request.sort(by: .lastPlayedDate, ascending: false)
let response = try await request.response()
Verified
Custom iOS Target Properties
Privacy - Media Library Usage Description
Correct Bundle Identifier
Checkbox App Services/MusicKit for App ID
Please help!
2 days of racking my brain, and I just can't get past this error.
MusicKit does ask me to authorize
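For completeness, here's the authorization call I make first (standard MusicAuthorization flow; a minimal sketch):
import MusicKit

// Request Apple Music authorization up front; the library request
// should only run once this returns .authorized.
let status = await MusicAuthorization.request()
guard status == .authorized else { return }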
Other code works
let request = MusicRecentlyPlayedContainerRequest()
let response = try await request.response()
See Image
Hey there, I'm trying to display all of the user's albums using the MediaPlayer library. Many albums return nil artwork, but I know the artwork exists because it shows up in the default Music app. There doesn't seem to be much rhyme or reason to what shows up and what doesn't. All downloaded albums display artwork, and some cloud album artwork displays as well. Here's the code I'm using to debug this.
let query = MPMediaQuery.albums()
if let albumCollections = query.collections {
    albums = albumCollections
}
for album in albums {
    // artwork is nil for many cloud albums, even though the Music app shows it
    let artwork = album.representativeItem?.artwork
    print(artwork, artwork?.image(at: CGSize(width: 100, height: 100)))
}
Any help would be greatly appreciated. Thanks!
I notice that macOS Sonoma System Settings has a "Screen & System Audio Recording" pane. I'm a macOS app developer and want to request only the audio permission.
I've browsed the documentation and the WWDC code demos for a while, but I still have no idea how to request a "System Audio Recording Only" permission.
All the demos and docs I can find request "Screen Recording & System Audio" together.
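For reference, a minimal sketch of the standard ScreenCaptureKit setup I've been trying (macOS 13+ API names from the docs; assumes self conforms to SCStreamOutput). Even with only an audio output attached, it still appears to go through the combined permission:
import ScreenCaptureKit

let content = try await SCShareableContent.excludingDesktopWindows(false, onScreenWindowsOnly: true)
guard let display = content.displays.first else { return }
let filter = SCContentFilter(display: display, excludingWindows: [])

let config = SCStreamConfiguration()
config.capturesAudio = true  // audio is all I actually want

let stream = SCStream(filter: filter, configuration: config, delegate: nil)
// Attach only an audio output; no video output is added.
try stream.addStreamOutput(self, type: .audio, sampleHandlerQueue: .main)
try await stream.startCapture()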
When making a library sectioned request, using some MusicLibraryRequestable types results in a MusicKit.MusicLibraryRequestError being thrown.
When Playlist is used as the MusicLibrarySectionRequestable type, no MusicLibraryRequestable type other than Track can be used for the request (see the sketch below). For other section types, Artist & Genre cannot be used.
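A minimal sketch of the combinations as I understand them (generic parameter order per the MusicLibrarySectionedRequest docs):
import MusicKit

// Sectioning the library by playlist: Track works as the item type...
let request = MusicLibrarySectionedRequest<Playlist, Track>()
let response = try await request.response()
print(response)

// ...but these throw MusicLibraryRequestError.invalidType for me:
// MusicLibrarySectionedRequest<Playlist, Album>()
// MusicLibrarySectionedRequest<Playlist, Artist>()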
Is there a way to work around this issue? The (seemingly) equivalent functionality in MediaPlayer (MPMediaQuery and MPMediaGrouping) was very consistent and reliable.
Full error info: MusicKit.MusicLibraryRequestError.invalidType, The operation couldn’t be completed. (MusicKit.MusicLibraryRequestError error 1.)
Device and OS: iPhone 13 Pro, iOS 17.2.1
I'm using the new ApplicationMusicPlayer support on macOS 14 and playing items from my Apple Music library. I wanted to play this music from my app to an AirPlay destination, so I added an AVRoutePickerView. However, selecting any destination via this view doesn't make a difference to the playback. It continues to play on my Mac's speakers no matter which AirPlay destination I choose.
Also submitted as FB13521393.
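For reference, the picker is just the stock AVKit view (a sketch). One thing I noticed: on macOS, AVRoutePickerView targets a specific AVPlayer via its player property, and I don't see a way to associate it with ApplicationMusicPlayer, which may be related:
import AVKit

let picker = AVRoutePickerView()
picker.delegate = self
// picker.player = somePlayer  // expects an AVPlayer; ApplicationMusicPlayer isn't one
view.addSubview(picker)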
I have downloaded the official Apple MusicKit SDK for Android and integrated the two AARs it ships with into my app (musickitauth-release-1.1.2.aar and mediaplayback-release-1.1.1.aar). When I try to build my app I'm getting an error:
Manifest merger failed : android:exported needs to be explicitly specified for element <activity#com.apple.android.sdk.authentication.SDKUriHandlerActivity>. Apps targeting Android 12 and higher are required to specify an explicit value for android:exported when the corresponding component has an intent filter defined. See https://developer.android.com/guide/topics/manifest/activity-element#exported for details.
Which makes sense, since when I look into the AAR's AndroidManifest.xml I see that this is missing in SDKUriHandlerActivity. Can this be fixed?
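As a possible workaround (a sketch, not verified against the SDK's auth flow), the entry can be overridden from the app's own AndroidManifest.xml; android:exported="true" is an assumption based on the activity having an intent filter for the auth redirect:
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools">
    <application>
        <!-- Merge with the SDK's declaration and supply the missing attribute -->
        <activity
            android:name="com.apple.android.sdk.authentication.SDKUriHandlerActivity"
            android:exported="true"
            tools:node="merge" />
    </application>
</manifest>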
I'm writing an Android app that uses the Apple MusicKit SDK for Android. I am trying to understand how to handle the Apple Music user token once I've obtained it from the authentication flow. I don't know when the token will expire; it is not a regular JWT, so I cannot check the expiration date. And I don't want to run the auth flow on every app launch, as that would be annoying for users. Any guidance on how to handle and invalidate Apple Music user tokens?
Hi! I've been working on a project in Python that pulls in a bunch of my personal Apple Music playback history and library, etc.
I can't find a single good, functional example of how to pull the Music User Token via the Android method or MusicKit JS (web). I've spent a lot of hours on this today, and no permutation of the existing examples/documentation has worked.
Any guidance would be much appreciated!! If you have a web app that pulls the music user token, I just need help understanding how to get to the token itself.
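For context, this is my understanding of the MusicKit JS flow (a sketch; assumes a valid developer token and that authorize() resolves with the Music User Token):
document.addEventListener('musickitloaded', async () => {
  MusicKit.configure({
    developerToken: 'YOUR_DEVELOPER_TOKEN_JWT',
    app: { name: 'MyApp', build: '1.0' },
  });
  // authorize() prompts the user to sign in and resolves with the token
  const musicUserToken = await MusicKit.getInstance().authorize();
  console.log(musicUserToken);
});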
Thank you!
I have applied some filters (like applyingGaussianBlur) to a CIImage that was converted from a UIImage. The resulting image data gets corrupted, but only on lower-end devices. What could be the reason?
I found that didOutputSampleBuffer is not called for a long time when the screen doesn't change, which sometimes leaves me wondering whether something is wrong.
In my design, the app switches to another screenshot method, such as CGWindowListCreateImage, after a long period with no data, but this is not what I expected.
I set the minimumFrameInterval to 1/30 of a second, but it seems to have no effect.
[config setMinimumFrameInterval:CMTimeMake(1, 30)];
Is there any setting that would get didOutputSampleBuffer called at least once per second, even if it only delivers a CMSampleBufferRef with status SCFrameStatusIdle? That would let me trust that everything is working without any exception.
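For context, this is how I check each buffer's status when callbacks do arrive (a sketch; key and enum names from the ScreenCaptureKit headers):
CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, NO);
if (attachments != NULL && CFArrayGetCount(attachments) > 0) {
    NSDictionary *attachment = (__bridge NSDictionary *)CFArrayGetValueAtIndex(attachments, 0);
    NSNumber *status = attachment[SCStreamFrameInfoStatus];
    if (status.integerValue == SCFrameStatusIdle) {
        // Idle frame: nothing on screen changed.
    }
}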
I'm currently working on a live screen-broadcasting app that lets users record their screen and save an mp4 video. I write the video file with AVAssetWriter, and it works fine. But when there is only 1GB-2GB of storage space remaining on the device, errors such as "Attempted to start an invalid broadcast session" frequently occur, and the video files cannot be played because assetWriter.finishWriting() never gets called.
This occurs on these devices:
iPhone SE 3
iPhone 12 Pro Max
iPhone 13
iPad 19
iPad Air 5
I have tried setting the movieFragmentInterval of AVAssetWriter to write movie fragments, and setting shouldOptimizeForNetworkUse to true/false; it's not working, and the video cannot be played.
I want to know how to observe or catch this error. Thanks!
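Two checks I've been considering (a sketch of an assumed approach, not a verified fix):
// 1) Check free space before starting, using the "important usage"
//    capacity key, and refuse to start a broadcast that can't finish.
let home = URL(fileURLWithPath: NSHomeDirectory())
if let values = try? home.resourceValues(forKeys: [.volumeAvailableCapacityForImportantUsageKey]),
   let free = values.volumeAvailableCapacityForImportantUsage,
   free < 2_000_000_000 {
    // Warn the user instead of starting an unwritable session.
}

// 2) Watch the writer itself: after a failed append, status becomes
//    .failed and error carries the underlying reason (e.g. disk full).
if assetWriter.status == .failed {
    print("writer failed:", assetWriter.error ?? "unknown error")
}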
As the title states, I would like to use MusicKit for Web instead of the Swift integration.
Is it necessary to enroll in the Apple Developer Program to become an Apple News publisher and get my Apple News API credentials? Also, I need guidance on how to publish articles on Apple News using the News API. A detailed explanation of everything up to getting the Apple News API credentials (key, secret key, channel ID) would be much appreciated!
struct AlbumDetails: Hashable {
    let artistId: String?
}

func fetchAlbumDetails(upc: String) async throws -> AlbumDetails {
    var request = MusicCatalogResourceRequest<Album>(matching: \.upc, equalTo: upc)
    // Note: artists is a relationship; I believe it has to be requested
    // explicitly (or loaded afterwards with album.with([.artists])).
    request.properties = [.artists]
    let response = try await request.response()
    guard let album = response.items.first else {
        throw NSError(domain: "AlbumNotFound", code: 0, userInfo: nil)
    }
    return AlbumDetails(artistId: album.artists?.first?.id.rawValue)
}

// Call site:
do {
    let details = try await fetchAlbumDetails(upc: upc)
    print("Artist ID: \(details.artistId ?? "nil")")
} catch {
    print("Error fetching artist ID: \(error)")
}
With this function I can return nearly everything except the artist ID, so I know it's not a problem with the request. But I know there has to be a way to get the artist ID; there has to be. If anyone has a solution to this, I would really appreciate it.
Is it possible, using the MusicKit APIs, to access the About information displayed on an artist page in Apple Music?
I hoped Artist.editorialNotes would give me the information, but there is scarce information in there. Even for Taylor Swift, only editorialNotes.short displays brief info: "The genre-defying singer-songwriter is the voice of a generation."
If currently not possible, are there plans for it in the future?
Also, with the above in mind, and having never seen editorialNotes for a song, is it safe to assume editorialNotes are mainly used for albums?
- (void)cameraDevice:(ICCameraDevice *)camera
  didReceiveMetadata:(NSDictionary * _Nullable)metadata
             forItem:(ICCameraItem *)item
               error:(NSError * _Nullable)error API_AVAILABLE(ios(13.0)) {
    NSLog(@"metadata = %@", metadata);
    if (item) {
        ICCameraFile *file = (ICCameraFile *)item;
        NSURL *downloadsDirectoryURL = [[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory inDomains:NSUserDomainMask].firstObject;
        downloadsDirectoryURL = [downloadsDirectoryURL URLByAppendingPathComponent:@"Downloads"];
        NSDictionary *downloadOptions = @{
            ICDownloadsDirectoryURL: downloadsDirectoryURL,
            ICSaveAsFilename: item.name,
            ICOverwrite: @YES,
            ICDownloadSidecarFiles: @YES
        };
        [self.cameraDevice requestDownloadFile:file
                                       options:downloadOptions
                              downloadDelegate:self
                           didDownloadSelector:@selector(didDownloadFile:error:options:contextInfo:)
                                   contextInfo:nil];
    }
}

- (void)didDownloadFile:(ICCameraFile *)file
                  error:(NSError * _Nullable)error
                options:(NSDictionary<NSString *, id> *)options
            contextInfo:(void * _Nullable)contextInfo API_AVAILABLE(ios(13.0)) {
    if (error) {
        NSLog(@"Download failed with error: %@", error);
    } else {
        NSLog(@"Download completed for file: %@", file);
    }
}
I don't know what's wrong, and I don't know whether this is the right way to get the camera's pictures. I hope someone can help me.
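For reference, my assumption is that before any download the device session must be open and the delegate set (sketch):
// Assumed prerequisite, done before requesting downloads:
self.cameraDevice.delegate = self;
[self.cameraDevice requestOpenSession];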
I found that the app reported a crash caused by a pure virtual function call, which I could not reproduce.
A third-party library is referenced:
https://github.com/lincf0912/LFPhotoBrowser
It implements smearing, blurring, and mosaic processing of images.
Crash code:
if (![LFSmearBrush smearBrushCache]) {
    [_edit_toolBar setSplashWait:YES index:LFSplashStateType_Smear];
    CGSize canvasSize = AVMakeRectWithAspectRatioInsideRect(self.editImage.size, _EditingView.bounds).size;
    [LFSmearBrush loadBrushImage:self.editImage canvasSize:canvasSize useCache:YES complete:^(BOOL success) {
        [weakToolBar setSplashWait:NO index:LFSplashStateType_Smear];
    }];
}
- (UIImage *)LFBB_patternGaussianImageWithSize:(CGSize)size orientation:(CGImagePropertyOrientation)orientation filterHandler:(CIFilter *(^ _Nullable)(CIImage *ciimage))filterHandler
{
    CIContext *context = LFBrush_CIContext;
    NSAssert(context != nil, @"This method must be called using the LFBrush class.");
    CIImage *midImage = [CIImage imageWithCGImage:self.CGImage];
    midImage = [midImage imageByApplyingTransform:[self LFBB_preferredTransform]];
    midImage = [midImage imageByApplyingTransform:CGAffineTransformMakeScale(size.width / midImage.extent.size.width, size.height / midImage.extent.size.height)];
    if (orientation > 0 && orientation < 9) {
        midImage = [midImage imageByApplyingOrientation:orientation];
    }
    // Begin processing the image
    CIImage *result = midImage;
    if (filterHandler) {
        CIFilter *filter = filterHandler(midImage);
        if (filter) {
            result = filter.outputImage;
        }
    }
    CGImageRef outImage = [context createCGImage:result fromRect:[midImage extent]];
    UIImage *image = [UIImage imageWithCGImage:outImage];
    CGImageRelease(outImage);
    return image;
}
This line triggers the crash:
CGImageRef outImage = [context createCGImage:result fromRect:[midImage extent]];
b9c90c7bbf8940e5aabed7f3f62a65a2-symbolicated.crash
I'm working on a very simple app where I need to visualize an image on the screen of an iPhone. However, the image has some special properties: it's a 16-bit, yuv422_yuy2-encoded image. I already have all the raw bytes saved in a Data object.
After googling for a long time, I still have not figured out the correct way to do this. My current understanding is to first create a CVPixelBuffer to properly represent the encoding information, then convert the CVPixelBuffer to a UIImage. The following is my current implementation.
public func YUV422YUY2ToUIImage(data: Data, height: Int, width: Int, bytesPerRow: Int) -> UIImage {
    var data = data
    return data.withUnsafeMutableBytes { rawPointer in
        let baseAddress = rawPointer.baseAddress!
        var pixelBuffer: CVPixelBuffer?
        // Note: yuy2 is 8 bits per component (16 bits per pixel), so
        // kCVPixelFormatType_422YpCbCr8_yuvs may be the matching format;
        // kCVPixelFormatType_422YpCbCr16 (v216) is 16 bits per component.
        CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                     width,
                                     height,
                                     kCVPixelFormatType_422YpCbCr16,
                                     baseAddress,
                                     bytesPerRow,
                                     nil,
                                     nil,
                                     nil,
                                     &pixelBuffer)
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer!)
        return UIImage(ciImage: ciImage)
    }
}
However, when I execute the code, I get the following error:
-[CIImage initWithCVPixelBuffer:options:] failed because its pixel format v216 is not supported.
So it seems CIImage is unhappy. I think I need to convert the encoding from yuv422_yuy2 to something like plain ARGB, but after a long time googling, I didn't find a way to do that. The closest function I can find is https://developer.apple.com/documentation/accelerate/1533015-vimageconvert_422cbypcryp16toarg
But the function is too complex for me to understand how to use it.
Any help is appreciated. Thank you!
Hi all!
As the title states, I'm looking to integrate with Apple Music's REST API. I'm wondering if there are any OpenAPI 3.0 YAML or JSON specs available anywhere?
I'd like to avoid transcribing the types found in the developer docs manually.
Here's a link to the Apple Music API docs and to the OpenAPI 3.0 spec:
https://developer.apple.com/documentation/applemusicapi
https://spec.openapis.org/oas/latest.html
OpenAPI was previously known as "Swagger", too.
Many thanks in advance!
I am trying to retrieve the same information about a song as is available on its music.apple.com page (see screenshot).
It seems that neither MusicKit nor the Apple Music API delivers that depth of information; for example, the names listed under Production and Techniques.
Am I correct?