I really love Quartz Composer from Apple, which is quite an old app that hasn't been updated for years. It works well on my mid-2015 MacBook Pro, but not on my new M1 iMac. Does anyone know how to run this great app on my new machine? Thank you!
Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.
I have applied some filters (like applyingGaussianBlur) to a CIImage that was converted from a UIImage. The resulting image data gets corrupted only on lower-end devices. What could be the reason?
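For reference, a minimal sketch of the kind of pipeline I'm describing (the names are illustrative, not my actual code): I convert the UIImage to a CIImage, clamp and blur it, then render it back through a CIContext.

import UIKit
import CoreImage

// Illustrative sketch of the pipeline described above, not the original code.
func gaussianBlurred(_ uiImage: UIImage, sigma: Double = 10) -> UIImage? {
    // UIImage -> CIImage (can fail, e.g. if the UIImage has no backing image data).
    guard let input = CIImage(image: uiImage) else { return nil }

    // Clamp so the blur doesn't fade out at the edges, blur, then crop back to the original extent.
    let output = input
        .clampedToExtent()
        .applyingGaussianBlur(sigma: sigma)
        .cropped(to: input.extent)

    // Render back to a UIImage through a CIContext.
    let context = CIContext()
    guard let cgImage = context.createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}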
Hi! I've been working on a project in Python that pulls in a bunch of my personal Apple Music playback history, library, etc.
I can't find a single good/functional example of how to pull the Music User Token via the Android method or MusicKit JS (web). I've spent a lot of hours on this today, and no permutation of the existing examples/documentation has worked.
Any guidance would be much appreciated! If you have a web app that pulls the music user token, I just need help understanding how to get to the token itself.
Thank you!
I'm writing an Android app that uses the Apple MusicKit SDK for Android. I am trying to understand how to handle the Apple Music user token once I get it from the authentication flow. I don't know when the token will expire; it is not a regular JWT, so I cannot check the expiration date. And I don't want to run the auth flow on every app launch, as that would be annoying for users. Any guidance on how to handle and invalidate Apple Music user tokens?
I have downloaded the official Apple MusicKit SDK for Android and integrated the two AARs it ships with into my app (musickitauth-release-1.1.2.aar and mediaplayback-release-1.1.1.aar). When I try to build my app, I get an error:
Manifest merger failed : android:exported needs to be explicitly specified for element <activity#com.apple.android.sdk.authentication.SDKUriHandlerActivity>. Apps targeting Android 12 and higher are required to specify an explicit value for android:exported when the corresponding component has an intent filter defined. See https://developer.android.com/guide/topics/manifest/activity-element#exported for details.
Which makes sense, since when I look into the AAR's AndroidManifest.xml I see that this attribute is missing on SDKUriHandlerActivity. Can this be fixed?
I'm using the new ApplicationMusicPlayer support on macOS 14 and playing items from my Apple Music library. I wanted to play music from my app to an AirPlay destination, so I added an AVRoutePickerView. However, selecting any destination via this view doesn't make a difference to the playback. It continues to play on my Mac speakers no matter which AirPlay destination I choose.
Also submitted as FB13521393.
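For context, a rough sketch of the setup I'm describing, assuming an AppKit container view and a Song already fetched from the library (names are illustrative):

import MusicKit
import AVKit
import AppKit

// Rough sketch of the described setup on macOS 14 (illustrative names).
@MainActor
func playAndShowRoutePicker(song: Song, in containerView: NSView) async throws {
    // Queue the library item on ApplicationMusicPlayer and start playback.
    let player = ApplicationMusicPlayer.shared
    player.queue = ApplicationMusicPlayer.Queue(for: [song])
    try await player.play()

    // Offer an AirPlay destination picker; selecting a route here has no effect on playback.
    let picker = AVRoutePickerView(frame: NSRect(x: 0, y: 0, width: 24, height: 24))
    containerView.addSubview(picker)
}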
When making a library sectioned request, some MusicLibraryRequestable types used result in a MusicKit.MusicLibraryRequestError being thrown.
When Playlist is used as the MusicLibrarySectionRequestable type, no MusicLibraryRequestable type other than Track can be used for the request. For other section types, Artist & Genre cannot be used.
Is there a way to work around this issue? The (seemingly) equivalent functionality in MediaPlayer (MPMediaQuery and MPMediaGrouping) was very consistent and reliable.
Full error info: MusicKit.MusicLibraryRequestError.invalidType, The operation couldn’t be completed. (MusicKit.MusicLibraryRequestError error 1.)
Device and OS: iPhone 13 Pro, iOS 17.2.1
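For reference, a rough sketch of the sectioned request I mean, assuming the MusicLibrarySectionedRequest generics behave as described above (Playlist sections with Track items is the combination that works for me):

import MusicKit

// Illustrative sketch of the sectioned requests described above.
func loadPlaylistSections() async throws {
    // Playlist sections containing Track items – the combination that works.
    let request = MusicLibrarySectionedRequest<Playlist, Track>()
    let response = try await request.response()
    for section in response.sections {
        print("\(section): \(section.items.count) items")
    }

    // Other item types throw MusicKit.MusicLibraryRequestError.invalidType, e.g.:
    // let failing = MusicLibrarySectionedRequest<Playlist, Album>()
    // _ = try await failing.response()
}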
I notice that in macOS Sonoma System Settings we have "Screen & System Audio Recording". I'm a macOS app developer and want to request only the audio permission.
I've browsed the documentation and the WWDC code demos for a while, but I still have no idea how to request a "System Audio Recording Only" permission.
All the demos and docs I can find request "Screen Recording & System Audio".
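For context, the pattern the samples I can find demonstrate looks roughly like the sketch below: the audio arrives through an SCStream whose content filter still references a display, which seems to be why the combined permission comes up. Treat this as an assumption-laden sketch, not an official recipe.

import ScreenCaptureKit
import CoreMedia
import Foundation

// Rough sketch of the capture setup the existing samples show (illustrative names).
final class SystemAudioTap: NSObject, SCStreamOutput {
    func stream(_ stream: SCStream, didOutputSampleBuffer sampleBuffer: CMSampleBuffer, of type: SCStreamOutputType) {
        guard type == .audio else { return }
        // Handle the audio sample buffer here.
    }
}

struct CaptureSetupError: Error {}

func startSystemAudioCapture(output: SystemAudioTap) async throws -> SCStream {
    let content = try await SCShareableContent.excludingDesktopWindows(false, onScreenWindowsOnly: true)
    guard let display = content.displays.first else { throw CaptureSetupError() }

    // The filter still references a display even though only audio is wanted.
    let filter = SCContentFilter(display: display, excludingWindows: [])

    let configuration = SCStreamConfiguration()
    configuration.capturesAudio = true
    configuration.excludesCurrentProcessAudio = true

    let stream = SCStream(filter: filter, configuration: configuration, delegate: nil)
    try stream.addStreamOutput(output, type: .audio, sampleHandlerQueue: DispatchQueue(label: "audio.capture"))
    try await stream.startCapture()
    return stream // Keep strong references to both the stream and the output.
}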
Hello Apple Community,
First, I'm asking this over the holidays, so happy holidays. I now have some "fun coding time" because I have some time off.
I'm new to this area and would greatly appreciate your expertise and guidance. Have mercy.
I'm attempting to develop a simple application on macOS to browse my playlists. However, I've encountered a few roadblocks that I'm struggling to navigate. I understand I need to implement two-factor authentication to access my playlists, for which an OAuth setup is required. This involves obtaining an Apple ID and a service ID and dealing with other complex elements.
One particular challenge I'm facing is with the redirect URI in the OAuth process. It seems that it needs to be a valid domain, and I'm unsure if using a local server address like https://localhost:5000 would work. My goal is to create a basic Flask application that can interact with Apple's web authentication system, but I'm uncertain about the feasibility of this approach, given the domain restrictions.
I would appreciate any advice or step-by-step guidance on the following points.
What would be the simplest way to create an application (Swift, Python, or JavaScript) that can authenticate and enable browsing through my playlists?
Any insights, tips, or examples you could share would be immensely helpful. I am eager to learn and look forward to your valuable suggestions. Anything step-by-step would be great, but I like to dream.
Thank you so much for your time and assistance.
2 days and I am frustrated. I've crossed my T's and dotted my I's.
Using
musickit
Error
Attempted to register account monitor for types client is not authorized to access: {(
"com.apple.account.iTunesStore"
)}
Offending Code
var request = MusicLibraryRequest<MusicKit.Playlist>()
request.sort(by: .lastPlayedDate, ascending: false)
let response = try await request.response()
Verified
Custom iOS Target Properties
Privacy - Media Library Usage Description
Correct Bundle Identifier
Checkbox App Services/MusicKit for App ID
Please help!
2 days of racking my brain, and I just can't get past the error
MusicKit does ask me to authorize
Other code works
let request = MusicRecentlyPlayedContainerRequest()
let response = try await request.response()
See Image
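For completeness, the authorization step that does succeed looks roughly like this (a sketch, not my exact code):

import MusicKit

// Illustrative: the authorization request that MusicKit does present successfully
// before the MusicLibraryRequest above is made.
func ensureMusicAuthorization() async -> Bool {
    let status = await MusicAuthorization.request()
    return status == .authorized
}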
I'm trying to test the MusicMarathon and MusicAlbums tutorial/demo apps for MusicKit and half the endpoints do not work.
As an example, the MusicMarathon call to MusicRecentlyPlayedContainerRequest() just returns a 401.
Everything I've done seems correct. I've got a fully authorized session and I have all the development keys successfully setup.
Also, it's not all APIs: I can access the user's library, but none of the recommendation and search endpoints seem to be working correctly.
I'm running iOS 17.2 and Xcode 15.1.
I'm pretty certain this is easily repeatable by just running the demo applications from the MusicKit documentation.
When I try to initialize SCContentFilter(desktopIndependentWindow: window), I get a very weird error:
Assertion failed: (did_initialize), function CGS_REQUIRE_INIT, file CGInitialization.c, line 44.
This is a brand new CLI project on Sonoma 14.2.1. Any ideas?
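For reference, the initialization looks roughly like this (the window lookup is illustrative); the assertion fires on the SCContentFilter line when this runs in the plain command-line target:

import ScreenCaptureKit
import Foundation

// Illustrative sketch of the failing initialization in a command-line target.
func makeWindowFilter(titled title: String) async throws -> SCContentFilter? {
    let content = try await SCShareableContent.excludingDesktopWindows(false, onScreenWindowsOnly: true)
    guard let window = content.windows.first(where: { $0.title == title }) else { return nil }
    // Asserts here: "Assertion failed: (did_initialize), function CGS_REQUIRE_INIT, file CGInitialization.c".
    return SCContentFilter(desktopIndependentWindow: window)
}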
I want to add a small play button to play a song preview on my site. I don't want to use the embedded player because I want custom styling. If I use the MusicKit Web library, do I need to add Apple branding legally? I don't see anywhere that says it's required, and I want to confirm.
I recently downloaded Apple Music for Windows 10 and started a new music library. Then I downloaded all my purchased music albums to my computer. However, the album art is not embedded in any of the m4a files; it is stored in a separate artwork folder. I have Visual Studio 2022 and was wondering how I might go about accessing the Apple Music app using an API. I want the script to go through all my purchased music, take the album art from the artwork folder, and embed it in the purchased m4a files. I found some other programs/scripts online, but they are all for iTunes or for Mac. None of the old scripts will work because they don't recognize the new Apple Music for Windows music library. I can't go back to iTunes because Apple removed all the music options from the iTunes for Windows app. Any ideas on how to get started writing the app/script would be greatly appreciated. Can I download something from Apple to make it easy to access the API in Visual Studio 2022 for Windows?
Hi, my team is developing an iOS app which connects to Apple Music (currently most of it is via the API, but some tests have been done with MusicKit). In a separate development, we were using a service which competes with A.M., and when their (web) API is used, if the user doesn't interact with the app somehow (scrolling, buttons, etc.), the connection will time out after 30 seconds and force them to reconnect.
So, questions I have are:
Is there any timeout regarding stopping tracks in Apple Music or does the connection to the A.M. app stay active?
If the connection drops on iOS as well, is there any limitation regarding streaming A.M. content via the API (i.e., are there licensing issues as well)?
Hi,
In iOS 17.0, Apple introduced a favorite button for the Music app. This favorite button is available everywhere, including Control Center and CarPlay.
My question is whether there is a way to show/use this favorite button in my own app. I couldn't find it in the documentation.
Thank you!
Hello,
I develop an application called MiniMeters and am using ScreenCaptureKit to get the desktop audio on macOS. As of macOS 14.2, a few users began noticing that the volume of the incoming audio differed depending on the audio device connected.
One user's Apogee Symphony Desktop is trimmed -14dB, another user's UAD Apollo Twin is -14dB as well, my MOTU M4 is -6dB, and my MacBook Pro's internal speakers show 0dB.
Changing the output volume on the interface does not affect this (obviously), nor does changing it digitally in the system.
I cannot seem to find anything documenting this change. It also affects other applications that use ScreenCaptureKit, such as OBS. Does anyone have any idea what that could correlate to, so I could potentially compensate?
Thanks.
I am detecting problems with the volume level over the Bluetooth connection after the iOS 17.2 update. Before, this problem occurred on both the iPhone 11 and the iPhone 15 Pro; after the 17.2 update it seems the problem was fixed on the iPhone 11, but it still persists on the iPhone 15 Pro. I have never had problems with the volume level in my car, but something Apple has changed continues to affect it. How can it be corrected? Thank you very much for your support. I did a test with the same song and the same volume level (maximum volume on the smartphone and volume 12 on my Suzuki Swift), and these were the decibel results obtained. Both the iPhone 11 and the 15 Pro have been updated to iOS 17.2.
Greetings Fellow Humans,
My player uses the v3 musickit-js library. I am trying to handle situations where a user tries to play explicit content in my player with an account that has content restrictions enabled. I don't see a mechanism to know whether the toggle is set on the account. The only mechanism I see is to respond to a CONTENT_RESTRICTED error via the function I provide as a callback for the mediaPlaybackError event.
I have attached many callbacks (like bufferedProgressDidChange) and those all work, but this one never fires.
music.addEventListener("mediaPlaybackError", onPlaybackError);
Or
music.addEventListener(MusicKit.Events.mediaPlaybackError, onPlaybackError);
My onPlaybackError function, at least for debugging purposes, is:
function onPlaybackError(e) {
  console.log("onPlaybackError");
  console.log(e);
}
There are so many error conditions that are meant to be handled in this way, but the callback never happens. Am I missing something? Why doesn't this callback fire?
Thanks!
Hello everyone,
I'm currently facing a challenging issue with my macOS application that involves HEIF image processing. The application uses an OperationQueue to handle HEIF compression tasks. However, I've observed a significant delay in processing when a screen recording is active. This delay doesn't occur under normal circumstances.
Here's a brief overview of the implementation:
The HEIF processing task is encapsulated within an Operation added to an OperationQueue.
The task involves using CIContext for image processing.
When screen recording is initiated, the operation's execution becomes unusually slow or gets delayed extensively.
After some research and community feedback, I learned that screen recording might be affecting the system's resource allocation, particularly impacting tasks that utilize GPU resources, like CIContext operations in my case.
To address this, I tried the following (roughly as sketched after this list):
Switching to a custom dispatch queue with a .userInitiated QoS.
Using GCD instead of OperationQueue.
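A minimal sketch of what that attempt looked like (queue label and function names are illustrative, not my exact code):

import Foundation

// Illustrative sketch of the GCD attempt: a dedicated serial queue with
// .userInitiated QoS instead of an OperationQueue.
let heifQueue = DispatchQueue(label: "com.example.heif-processing", qos: .userInitiated)

func enqueueHEIFCompression(from sourceURL: URL, to destinationURL: URL) {
    heifQueue.async {
        // Calls the heif(at:to:as:) function shown further below.
        let result = heif(at: sourceURL, to: destinationURL, as: 75)
        print(result)
    }
}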
Despite these attempts, the issue persists during screen recording. It seems like the screen recording process is given higher priority by macOS, leading to resource reallocation and thus affecting my application's performance.
I'm looking for insights or suggestions on how to handle this scenario more effectively. Specifically, I am interested in:
Understanding how screen recording impacts resource allocation in macOS.
Exploring ways to ensure that my HEIF processing task is not severely impacted by other system processes like screen recording.
Any best practices or alternative approaches for handling image processing tasks that are sensitive to system resource availability.
Here's a snippet of the HEIF processing function for reference:
import CoreImage
import ImageIO
import Foundation

struct CommandResult: CustomStringConvertible {
    let output: String
    let error: Process.TerminationReason
    let status: Int32

    var description: String {
        return "error:\(error.rawValue), output:\(output), status:\(status)"
    }
}

func heif(at sourceURL: URL, to destinationURL: URL, as quality: Int = 75) -> CommandResult {
    let compressionQuality = CGFloat(quality) / 100.0

    // Load the source image.
    guard let ciImage = CIImage(contentsOf: sourceURL) else {
        return CommandResult(output: "Load heic image failed \(sourceURL)", error: .exit, status: -1)
    }

    let context = CIContext(options: nil)
    let heifOptions = [kCGImageDestinationLossyCompressionQuality: compressionQuality] as! [CIImageRepresentationOption: Any]

    // Encode and write the HEIF representation.
    do {
        try context.writeHEIFRepresentation(of: ciImage,
                                            to: destinationURL,
                                            format: .RGBA8,
                                            colorSpace: ciImage.colorSpace!,
                                            options: heifOptions)
    } catch {
        return CommandResult(output: "Compress and write heic image failed \(sourceURL)", error: .exit, status: -1)
    }

    return CommandResult(output: "Compress and write heic image successfully \(sourceURL)", error: .exit, status: 0)
}
Thank you for your time and any assistance you can provide!