Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

All subtopics

Post · Replies · Boosts · Views · Activity

MusicKit UPCs changing and handling that
I use Universal Product Codes (UPC) in my app to reliably identify albums, after having used album IDs for a time. Album IDs can change over time for no obvious reason (see here for song IDs), so I switched to UPCs because I believed they cannot change. Well, apparently they can. A few days ago I populated a JSON with UPCs, including 196871067713. Today, performing a MusicCatalogResourceRequest for that UPC returns nothing. Putting that UPC into an Apple Music link redirects to https://music.apple.com/de/album/folge-89-im-geistergarten/1683337782?l=en-GB, so I assume the identifier changed from 196871067713 to 1683337782. Apple Music can handle that and redirects to the new ID both in the app and on the website, but a MusicCatalogResourceRequest cannot. I filed a suggestion for that (FB15167146), but I need a solution sooner. Can I somehow detect where the URL is redirecting to? Is there a way MusicCatalogResourceRequest can do this? Performing a MusicCatalogSearchRequest could be an option, but it seems unreliable when using the title as the search term. Other ideas? Thank you
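One possible direction, sketched below: let URLSession follow the redirect and read the album ID off the final URL, then fetch the album by ID rather than by UPC. The URL shape and the assumption that the final path component is the catalog album ID are untested guesses, not confirmed behavior.

import Foundation

// Untested sketch: resolve the Apple Music redirect for a UPC-based link and
// return the last path component of the final URL, which appears to be the
// current catalog album ID.
func resolveAlbumID(forUPC upc: String, storefront: String = "de") async throws -> String? {
    // Assumption: linking by UPC under /album/ redirects to the canonical album URL.
    guard let url = URL(string: "https://music.apple.com/\(storefront)/album/\(upc)") else { return nil }
    var request = URLRequest(url: url)
    request.httpMethod = "HEAD" // we only care where it redirects, not the page body
    let (_, response) = try await URLSession.shared.data(for: request)
    // URLSession follows redirects by default, so response.url is the final URL.
    return response.url?.lastPathComponent
}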
Replies: 1 · Boosts: 1 · Views: 305 · Activity: Sep ’24
H.264 License
I’m using AVFoundation in my iPhone application to encode video in MP4 format with H.264, which can then be shared or exported. Do I need to pay license fees to MPEG LA for using the H.264 format, or are these fees already covered by Apple? I’ve read articles suggesting that Apple covers these fees when encoding is done through its native APIs (or via its dedicated encoding hardware), but I haven’t found any explicit confirmation of this point in the various documentation or contracts... Did I miss something?
Replies: 1 · Boosts: 0 · Views: 219 · Activity: 3w
Playlist IDs from MusicKit not working with Apple Music API
I am building an app for macOS and trying to implement code to add songs to a library playlist (included below). The issue: if I use MusicKit to load a user's library playlists, the playlist ID (which is just a string of numbers) does not work with the Add Tracks to a Library Playlist endpoint of the Apple Music API. If I instead retrieve the playlists from the Apple Music API and use that playlist ID (which is different from the ID I get from MusicKit), my code works fine and adds the song to the playlist.

The problem is that getting a user's library playlists from the Apple Music API does not return all of the library playlists I get through MusicKit, and it also does not provide artwork for playlists that have the collage of album covers, so I would prefer to use MusicKit to get the playlists. I have also tried retrieving a single playlist through the Apple Music API using the playlist ID from MusicKit, and it does not work; I get an error that the resource cannot be found. Since this is a macOS app, I cannot use MusicKit itself to add songs to library playlists.

Does anyone know a way to resolve this, or a possible workaround? Ideally I want to use MusicKit to get the library playlists and have some way to use the playlist ID to add songs to that playlist. Below is my code for adding a song to a playlist using the Apple Music API, which works correctly only if I originally get the library playlist's ID from a playlist retrieved from the Apple Music API. Also, does anyone know why the playlist IDs are not universal and differ between MusicKit and the Apple Music API? For songs and tracks it does not seem to matter whether I use MusicKit or the Apple Music API; the IDs are in the correct format for the Apple Music API and work with my code. Thanks, everyone, for any and all help!

func addToPlaylist(songs: [Track], playlist: Playlist, alert: Binding<AlertItem?>) async {
    let tracks = AppleMusicPlaylistPostRequestBody(data: songs.compactMap {
        AppleMusicPlaylistPostRequestItem(id: $0.id.rawValue, type: "songs") // or "library-songs"
    })
    let playlistID = playlist.id

    // Build the request URL for adding a song to a playlist
    guard let url = URL(string: "https://api.music.apple.com/v1/me/library/playlists/\(playlistID)/tracks") else {
        alert.wrappedValue = AlertItem(title: "Error", message: "Invalid URL for the playlist.")
        return
    }

    // Authorization header
    guard let musicUserToken = try? await MusicUserTokenProvider().getUserMusicToken() else {
        alert.wrappedValue = AlertItem(title: "Error", message: "Unable to retrieve Music User Token.")
        return
    }

    do {
        var request = URLRequest(url: url)
        request.httpMethod = "POST"
        request.setValue("Bearer \(musicUserToken)", forHTTPHeaderField: "Authorization")
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")

        let encoder = JSONEncoder()
        let data = try encoder.encode(tracks)
        request.httpBody = data

        let musicRequest = MusicDataRequest(urlRequest: request)
        let musicRequestResponse = try await musicRequest.response()

        // Check if the request was successful (status 201)
        if musicRequestResponse.urlResponse.statusCode == 201 {
            alert.wrappedValue = AlertItem(title: "Success", message: "Song successfully added to the playlist.")
        } else {
            print("Status Code: \(musicRequestResponse.urlResponse.statusCode)")
            print("Response Data: \(String(data: musicRequestResponse.data, encoding: .utf8) ?? "No Data")")

            // Attempt to decode the error response into the AppleMusicErrorResponse model
            if let appleMusicError = try? JSONDecoder().decode(AppleMusicErrorResponse.self, from: musicRequestResponse.data) {
                let errorMessage = appleMusicError.errors.first?.detail ?? "Unknown error occurred."
                alert.wrappedValue = AlertItem(title: "Error", message: errorMessage)
            } else {
                alert.wrappedValue = AlertItem(title: "Error", message: "Failed to add song to the playlist.")
            }
        }
    } catch {
        alert.wrappedValue = AlertItem(title: "Error", message: "Network error: \(error.localizedDescription)")
    }
}
Replies: 0 · Boosts: 0 · Views: 189 · Activity: 3w
AVPlayer HLS: Restrict Stream Resolution
Hello, we have an HLS streaming app and we use AVPlayer for HLS playback. We want to implement a resolution selection feature: for example, if the user selects 1080p, they should watch only the 1080p variant. We have tried the preferredMaximumResolution and preferredPeakBitRate parameters, but AVPlayer does not enforce the selection; with preferredMaximumResolution = CGSize(width: 1920, height: 1080) the player does not stay on the 1080p profile and drops to 720p, and we do not want a 720p stream when the user selected 1080p. Is there any method to force a variant even if the stream stalls? Thank you in advance
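For reference, a minimal sketch of the two properties discussed above. Both are advisory ceilings rather than floors, so they cannot pin playback to exactly 1080p; the bitrate figure is an assumption to be replaced with the BANDWIDTH of the real 1080p variant. If an exact variant must be guaranteed, the usual workaround is to serve the player a filtered multivariant playlist that contains only the selected variant.

import AVFoundation

// Sketch: cap playback at 1080p. These are upper bounds only; AVPlayer may
// still step down to lower variants when throughput drops.
let item = AVPlayerItem(url: URL(string: "https://example.com/stream.m3u8")!) // placeholder URL
item.preferredMaximumResolution = CGSize(width: 1920, height: 1080)
item.preferredPeakBitRate = 8_000_000 // assumption: just above the 1080p variant's BANDWIDTH
let player = AVPlayer(playerItem: item)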
Replies: 0 · Boosts: 0 · Views: 129 · Activity: 3w
Capturing system audio no longer works with macOS Sequoia
Our capture application records system audio via a HAL plugin; however, with the latest macOS 15 Sequoia, all audio buffer values are zero. I am attaching sample code that replicates the problem. Compile it as a Command Line Tool application with Xcode.

STEPS TO REPRODUCE
1. Install the BlackHole 2ch audio driver: https://existential.audio/blackhole/download/?code=1579271348
2. Start some system audio, e.g. YouTube.
3. Compile and run the sample application.

On macOS up to Sonoma, you will hear audio via loopback and see audio values in the debug/console window. On macOS Sequoia, you will not hear audio and the audio values are 0.

#import <AVFoundation/AVFoundation.h>
#import <CoreAudio/CoreAudio.h>

#define BLACKHOLE_UID @"BlackHole2ch_UID"
#define DEFAULT_OUTPUT_UID @"BuiltInSpeakerDevice"

@interface AudioCaptureDelegate : NSObject <AVCaptureAudioDataOutputSampleBufferDelegate>
@end

void setDefaultAudioDevice(NSString *deviceUID);

@implementation AudioCaptureDelegate

// Receive samples from the CoreAudio/HAL driver and print amplitude values for testing.
// This is where samples would normally be copied and passed downstream for further
// processing, which is not needed in this simple sample application.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Access the audio data in the sample buffer
    CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
    if (!blockBuffer) {
        NSLog(@"No audio data in the sample buffer.");
        return;
    }

    size_t length;
    char *data;
    CMBlockBufferGetDataPointer(blockBuffer, 0, NULL, &length, &data);

    // Process the audio samples to calculate the average amplitude
    int16_t *samples = (int16_t *)data;
    size_t sampleCount = length / sizeof(int16_t);
    int64_t sum = 0;
    for (size_t i = 0; i < sampleCount; i++) {
        sum += abs(samples[i]);
    }

    // Calculate and log the average amplitude
    float averageAmplitude = (float)sum / sampleCount;
    NSLog(@"Average Amplitude: %f", averageAmplitude);
}

@end

// Set the default audio device to BlackHole while testing, or the speakers when done.
// Called by main.
void setDefaultAudioDevice(NSString *deviceUID) {
    AudioObjectPropertyAddress address;
    AudioDeviceID deviceID = kAudioObjectUnknown;
    UInt32 size;
    CFStringRef uidString = (__bridge CFStringRef)deviceUID;

    // Gets the device corresponding to the given UID.
    AudioValueTranslation translation;
    translation.mInputData = &uidString;
    translation.mInputDataSize = sizeof(uidString);
    translation.mOutputData = &deviceID;
    translation.mOutputDataSize = sizeof(deviceID);
    size = sizeof(translation);
    address.mSelector = kAudioHardwarePropertyDeviceForUID;
    address.mScope = kAudioObjectPropertyScopeGlobal; // ????
    address.mElement = kAudioObjectPropertyElementMain;

    OSStatus status = AudioObjectGetPropertyData(kAudioObjectSystemObject, &address, 0, NULL, &size, &translation);
    if (status != noErr) {
        NSLog(@"Error: Could not retrieve audio device ID for UID %@. Status code: %d", deviceUID, (int)status);
        return;
    }

    AudioObjectPropertyAddress propertyAddress;
    propertyAddress.mSelector = kAudioHardwarePropertyDefaultOutputDevice;
    propertyAddress.mScope = kAudioObjectPropertyScopeGlobal;

    status = AudioObjectSetPropertyData(kAudioObjectSystemObject, &propertyAddress, 0, NULL, sizeof(AudioDeviceID), &deviceID);
    if (status == noErr) {
        NSLog(@"Default audio device set to %@", deviceUID);
    } else {
        NSLog(@"Failed to set default audio device: %d", status);
    }
}

// Sets the BlackHole device as default and configures it as an AVCaptureDeviceInput,
// sets the speakers as loopback so we can hear what is being captured,
// sets up a queue to receive capture samples,
// runs the session for 30 seconds, then restores the speakers as default output.
int main(int argc, const char * argv[]) {
    @autoreleasepool {
        // Create the capture session
        AVCaptureSession *session = [[AVCaptureSession alloc] init];

        // Select the audio device
        AVCaptureDevice *audioDevice = nil;
        NSString *audioDriverUID = nil;
        audioDriverUID = BLACKHOLE_UID;
        setDefaultAudioDevice(audioDriverUID);
        audioDevice = [AVCaptureDevice deviceWithUniqueID:audioDriverUID];
        if (!audioDevice) {
            NSLog(@"Audio device %s not found!", [audioDriverUID UTF8String]);
            return -1;
        } else {
            NSLog(@"Using audio device: %s", [audioDriverUID UTF8String]);
        }

        // Configure the audio input with the selected device (BlackHole)
        NSError *error = nil;
        AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
        if (error || !audioInput) {
            NSLog(@"Failed to create audio input: %@", error);
            return -1;
        }
        [session addInput:audioInput];

        // Configure the audio data output
        AVCaptureAudioDataOutput *audioOutput = [[AVCaptureAudioDataOutput alloc] init];
        AudioCaptureDelegate *delegate = [[AudioCaptureDelegate alloc] init];
        dispatch_queue_t queue = dispatch_queue_create("AudioCaptureQueue", NULL);
        [audioOutput setSampleBufferDelegate:delegate queue:queue];
        [session addOutput:audioOutput];

        // Set audio settings
        NSDictionary *audioSettings = @{
            AVFormatIDKey: @(kAudioFormatLinearPCM),
            AVSampleRateKey: @48000,
            AVNumberOfChannelsKey: @2,
            AVLinearPCMBitDepthKey: @16,
            AVLinearPCMIsFloatKey: @NO,
            AVLinearPCMIsNonInterleaved: @NO
        };
        [audioOutput setAudioSettings:audioSettings];

        // Route a preview/loopback output to the speakers so we can hear what is captured
        AVCaptureAudioPreviewOutput *loopback_output = [[AVCaptureAudioPreviewOutput alloc] init];
        loopback_output.volume = 1.0;
        loopback_output.outputDeviceUniqueID = DEFAULT_OUTPUT_UID;
        [session addOutput:loopback_output];
        const char *deviceID = loopback_output.outputDeviceUniqueID ? [loopback_output.outputDeviceUniqueID UTF8String] : "nil";
        NSLog(@"session addOutput for preview/loopback: %s", deviceID);

        // Start the session
        [session startRunning];
        NSLog(@"Capturing audio data for 30 seconds...");
        [[NSRunLoop currentRunLoop] runUntilDate:[NSDate dateWithTimeIntervalSinceNow:30.0]];

        // Stop the session
        [session stopRunning];
        NSLog(@"Capture session stopped.");
        setDefaultAudioDevice(DEFAULT_OUTPUT_UID);
    }
    return 0;
}
Replies: 4 · Boosts: 0 · Views: 303 · Activity: 3w
CGImageMetadataTagCreate error when create gain map tag
I extracted the gain map info from an image using:

let url = Bundle.main.url(forResource: "IMG_1181", withExtension: "HEIC")
let source = CGImageSourceCreateWithURL(url! as CFURL, nil)
let portraitData = CGImageSourceCopyAuxiliaryDataInfoAtIndex(source!, 0, kCGImageAuxiliaryDataTypeHDRGainMap) as! [AnyHashable: Any]
let metaData = portraitData[kCGImageAuxiliaryDataInfoMetadata] as! CGImageMetadata

Then I printed all the metadata tags:

func printMetadataProperties(from metadata: CGImageMetadata) {
    guard let tags = CGImageMetadataCopyTags(metadata) as? [CGImageMetadataTag] else { return }
    for tag in tags {
        if let prefix = CGImageMetadataTagCopyPrefix(tag) as String?,
           let namespace = CGImageMetadataTagCopyNamespace(tag) as String?,
           let key = CGImageMetadataTagCopyName(tag) as String?,
           let value = CGImageMetadataTagCopyValue(tag) {
            print("Namespace: \(namespace), Key: \(key), Prefix: \(prefix), value: \(value)")
        }
    }
}

// Namespace: http://ns.apple.com/ImageIO/1.0/, Key: hasXMP, Prefix: iio, value: True
// Namespace: http://ns.apple.com/HDRGainMap/1.0/, Key: HDRGainMapVersion, Prefix: HDRGainMap, value: 131072
// Namespace: http://ns.apple.com/HDRGainMap/1.0/, Key: HDRGainMapHeadroom, Prefix: HDRGainMap, value: 3.586325

I want to create a new CGImageMetadata and tags, but when it comes to the HDR tags it always fails to add them to the metadata:

let tag = CGImageMetadataTagCreate(
    "http://ns.apple.com/HDRGainMap/1.0/" as CFString,
    "HDRGainMap" as CFString,
    "HDRGainMapHeadroom" as CFString,
    .default,
    3.56 as CFNumber
)
let path = "HDRGainMap:HDRGainMapHeadroom" as CFString
let success = CGImageMetadataSetTagWithPath(metadata, nil, path, tag) // always false

The hasXMP tag works fine. Is HDR a private dictionary for Apple?
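Two things worth ruling out, sketched below: CGImageMetadataSetTagWithPath operates on a mutable metadata object, and a custom namespace/prefix generally has to be registered before path-based setters accept it. Whether Apple's HDRGainMap namespace is writable at all I can't confirm; treat this as an experiment rather than a fix.

import ImageIO

// Experiment: create a mutable metadata object, register the HDRGainMap
// namespace for its prefix, then try the path-based set again.
let mutable = CGImageMetadataCreateMutable()
let ns = "http://ns.apple.com/HDRGainMap/1.0/" as CFString
let prefix = "HDRGainMap" as CFString
var regError: Unmanaged<CFError>?
let registered = CGImageMetadataRegisterNamespaceForPrefix(mutable, ns, prefix, &regError)
print("namespace registered: \(registered)")

if let tag = CGImageMetadataTagCreate(ns, prefix, "HDRGainMapHeadroom" as CFString, .default, 3.56 as CFNumber) {
    let ok = CGImageMetadataSetTagWithPath(mutable, nil, "HDRGainMap:HDRGainMapHeadroom" as CFString, tag)
    print("tag set: \(ok)")
}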
Replies: 0 · Boosts: 0 · Views: 153 · Activity: 3w
Crash in iOS 18 regarding [AVPlayerController _observeValueForKeyPath:oldValue:newValue:]
There are significant crash reports coming from iOS 18 users regarding the AVKit framework, starting from this line: [AVPlayerController _observeValueForKeyPath:oldValue:newValue:], which seems to come from the iOS internal SDK. There are two kinds of crash we found.

1. UI modification on a background thread. From the stack trace, it seems that when AVPictureInPictureController is being deallocated and its view is being removed from its superview, the code somehow executes on a background thread, because the line _AssertAutoLayoutOnAllowedThreadsOnly is highlighted right before the crash. But I’ve checked our code around AVPictureInPictureController, and in the locations where we deallocate the object it is always called on the main thread, inside viewDidLoad and deinit in the UIViewController class. From the log, the crash seems to happen when the user opens another content item while the PiP player is active, so the current PiP instance is replaced with a new one. My suspicion is that the observation logic inside AVPlayerController could be the hint to this issue; probably something broke there, since this issue happens only for iOS 18 users, across our app versions. Unfortunately, I have been unable to reproduce this issue yet; one of my colleagues reproduced it once but hasn’t been able to do it again since. The reports keep rising each day, up to 1.3k events in the last 30 days now.

2. Over-released object. This one has fewer reports than the first, but I decided to include it since it might hold relevant information about the first crash; the starting stack trace is similar. The crash timing also seems similar: we deallocate an existing AVPictureInPictureController and later replace it with a new one. It is likewise found only on iOS 18 and also refers to [AVPlayerController _observeValueForKeyPath:oldValue:newValue:]. I was unable to reproduce this issue so far either.

Both issues happen on both iPhone and iPad. We’d appreciate any advice on what we can do to avoid this in the future, and any hint on why it could happen. I have reported this issue with bug number FB15620734. I also attached one sample crash report for each of the crashes: non ui thread access.crash, over release.crash.
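Until the underlying issue is understood, a defensive teardown pattern like the sketch below may reduce the crash surface: stop PiP and release the old controller strictly on the main actor before creating its replacement. This targets the suspected background-thread KVO teardown; it is a guess, not a confirmed fix.

import AVKit

// Hypothetical defensive replacement of a PiP controller on the main actor.
@MainActor
func replacePiPController(old: inout AVPictureInPictureController?,
                          playerLayer: AVPlayerLayer) {
    if let controller = old, controller.isPictureInPictureActive {
        controller.stopPictureInPicture() // end the old session first
    }
    old = nil // release on the main thread, before the new instance exists
    old = AVPictureInPictureController(playerLayer: playerLayer)
}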
Replies: 6 · Boosts: 12 · Views: 507 · Activity: 3w
LockedCameraCaptureExtension can not get access to photo library
I've requested authorization in my main app:

PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in }

and added the privacy description in both the main app and the extension. But no matter whether the device is locked or unlocked, when I call

let fetchResult = PHAsset.fetchAssets(with: .image, options: nil)
let count = fetchResult.count

the count is always zero, even after a new photo is saved to the album in the same session.
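One sanity check worth adding inside the extension itself, since the extension and the containing app can report different authorization states: log what the extension actually sees before fetching. A small sketch:

import Photos

// Check the status as seen from the LockedCameraCapture extension; a zero
// fetch count alongside .notDetermined or .denied here would point at the
// extension's authorization rather than the fetch itself.
let status = PHPhotoLibrary.authorizationStatus(for: .readWrite)
print("Extension photo library status: \(status.rawValue)")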
Replies: 1 · Boosts: 0 · Views: 206 · Activity: 4w
Big ColorSync Bug? macOS Interprets Rec 709 Profile Differently Between 15 and 14
ColorSync throughout the system interprets the Rec. ITU-R BT.709-5 color profile differently on macOS 15 than it did on macOS 14, leading to severe color differences. This is a breaking, problem-causing color change. Here's a comparison of one way the results can appear different.

Steps to Reproduce
1. Open the image above (named here as sRGB_Bars.png) in ColorSync Utility. (This should be an sRGB-profiled image, if the forums don't mess with that.) With Digital Color Meter open and set to "Display in sRGB" [0-255], you can see the gray bars progress left-to-right as 0, 23, 46, 69, 92, 115, etc. Those are the correct reference values.
2. At the bottom of the window in ColorSync, use [Match to Profile] [Display -> Rec. ITU-R BT.709-5], then click the Apply button. You can verify the image now uses the Rec 709 profile with the (i) Get Info button in the toolbar.
3. Use "Save As…" to save the image with a different name.
4. Do the steps above on macOS 14 and macOS 15 to create two images: Rec709_CreatedOnMacOS14 and Rec709_CreatedOnMacOS15.
5. Compare the two images on BOTH operating system versions, and you'll see they are significantly different from each other.

Rec709_CreatedOnMacOS14.png when viewed on macOS 15 has gray values of [0, 1, 18, 43, 67, ...]. Rec709_CreatedOnMacOS15.png when viewed on macOS 15 has gray values of [0, 51, 72, 94, 115, ...]. These are MASSIVE differences.

Significance
This is not just a problem that affects ColorSync Utility or Preview, etc. The same gamma-interpretation difference affects the Core Video, Core Image, and Video Toolbox pipelines as well. That gets complicated to talk about, so this example is the simplest I can boil it down to.

Conclusion
Significant bug? What's going on?
Replies: 2 · Boosts: 0 · Views: 263 · Activity: Oct ’24
Green Line Appears in Thumbnails in Photos but Not in Actual Photo
Hello, I’ve encountered an issue where a green line appears only in the thumbnail view of an image in the iPhone photo gallery. The green line is not present in the actual image itself. Here are some additional details:

• The issue occurs only when saving the image as a JPEG. When saved as PNG, the green line does not appear.
• The green line also shows up when using the PHAsset method requestImage(for:targetSize:). Depending on the targetSize, the resulting image may contain the green line.
• Interestingly, this issue does not appear on an iPhone Xs Max running iOS 15.2.1. However, the green line does appear on an iPhone 15 Pro Max running iOS 18.0.1 when viewing the same image.

I have attached the problematic image for your reference. The following images are screen captures that show the issue occurring on my iPhone: iPhone 15 Pro Max (iOS 18.0.1), iPhone Xs Max (iOS 15.2.1). Could this be related to a display or gallery app issue on iOS? Any advice or solutions would be greatly appreciated. Thank you!
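To narrow down whether the artifact comes from the encoded JPEG or from a particular decode size, a probe like the sketch below may help: request the same asset at several target sizes and note which ones show the line. asset stands in for the problematic PHAsset.

import Photos
import UIKit

// Request the same asset at several sizes to find where the green line appears.
func probeThumbnailSizes(for asset: PHAsset) {
    let sizes = [CGSize(width: 256, height: 256),
                 CGSize(width: 1024, height: 1024),
                 PHImageManagerMaximumSize]
    let options = PHImageRequestOptions()
    options.deliveryMode = .highQualityFormat
    for size in sizes {
        PHImageManager.default().requestImage(for: asset,
                                              targetSize: size,
                                              contentMode: .aspectFit,
                                              options: options) { image, _ in
            print("target \(size) -> \(String(describing: image?.size))")
        }
    }
}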
Replies: 2 · Boosts: 0 · Views: 315 · Activity: Oct ’24
Inquiry Regarding File Size Differences When Migrating from ALAssetsLibrary to PHPhotoLibrary
We are currently in the process of migrating our application from ALAssetsLibrary to PHPhotoLibrary to ensure compatibility with the latest versions of iOS. However, we have noticed a discrepancy in the file sizes of images obtained using PHPhotoLibrary compared to those obtained using ALAssetsLibrary. Specifically, we would like to understand the following points:

1. Reason for file size differences: What are the reasons for the difference in file sizes between images obtained using ALAssetsLibrary and those obtained using PHPhotoLibrary? Could you provide detailed information on the settings and options in PHPhotoLibrary that affect the size and quality of the images?

2. Optimal settings: What are the optimal settings in PHPhotoLibrary to obtain images with the same quality and file size as those obtained using ALAssetsLibrary? If possible, could you provide code examples or recommended option settings? (See the sketch below.)
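On point 2, a hedged sketch: requesting the original, unadjusted bytes through PHImageManager is the closest PHPhotoLibrary analogue to ALAssetRepresentation's raw data, so its byte count should match the on-disk asset most closely. Whether this exactly reproduces ALAssetsLibrary sizes is not guaranteed.

import Photos

// Fetch the original bytes of an asset (no edits, no re-rendering).
func fetchOriginalData(for asset: PHAsset) {
    let options = PHImageRequestOptions()
    options.version = .original           // skip edits and rendered versions
    options.isNetworkAccessAllowed = true // allow fetching iCloud originals
    PHImageManager.default().requestImageDataAndOrientation(for: asset,
                                                            options: options) { data, uti, _, _ in
        print("bytes: \(data?.count ?? 0), uti: \(uti ?? "unknown")")
    }
}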
Replies: 3 · Boosts: 0 · Views: 248 · Activity: Oct ’24
Play audio data coming from serial port
Hi. I want to read ADPCM-encoded audio data coming from an external device to my Mac via serial port (/dev/cu.usbserial-0001) in 256-byte chunks, and feed it into an audio player. So far I am using Swift and SwiftSerial (GitHub - mredig/SwiftSerial: A Swift Linux and Mac library for reading and writing to serial ports) to get the data via serialPort.asyncBytes() into an AsyncStream, but I am struggling to understand how to feed the stream to an AVAudioPlayer or similar. I am new to Swift and macOS audio development, so any help to get me on the right track is greatly appreciated. Thx
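AVAudioPlayer expects complete files or Data in a known container, so a continuous chunk stream is a better fit for AVAudioEngine with an AVAudioPlayerNode. Below is a rough sketch that assumes the ADPCM has already been decoded to float PCM; the sample rate and channel count are placeholders for the device's actual format, and the ADPCM decode step itself is out of scope here.

import AVFoundation

// Queue decoded PCM chunks onto a player node; scheduled buffers play back
// gaplessly as long as you stay ahead of playback.
final class StreamPlayer {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    private let format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                       sampleRate: 16_000, channels: 1, interleaved: false)!

    func start() throws {
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: format)
        try engine.start()
        player.play()
    }

    // Call once per decoded chunk of samples.
    func enqueue(_ samples: [Float]) {
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                            frameCapacity: AVAudioFrameCount(samples.count)) else { return }
        buffer.frameLength = AVAudioFrameCount(samples.count)
        samples.withUnsafeBufferPointer {
            buffer.floatChannelData![0].update(from: $0.baseAddress!, count: samples.count)
        }
        player.scheduleBuffer(buffer)
    }
}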
Replies: 0 · Boosts: 0 · Views: 138 · Activity: 3w
Get audio volume from microphone
Hello. We are trying to get the audio volume from the microphone. We have two questions.

1. Can anyone tell me about AVAudioEngine.InputNode.volume? We expected it to return 0 in silence and a float value up to 1.0 depending on the volume, but it looks like 1.0 (the default value) is returned at all times. In which case does it return 0.5, or 0? Sample code is below; the microphone works correctly.

// instance members
private var engine: AVAudioEngine!
private var node: AVAudioInputNode!

// start method
self.engine = .init()
self.node = engine.inputNode
engine.prepare()
try! engine.start()

// volume getter
print("\(self.node.volume)")

2. What is the best practice to get the audio volume from the microphone? Requirements: without AVAudioRecorder (we use the engine for streaming audio), and it should withstand high-frequency access.

Testing info: device iPhone XR, OS version iOS 18.

Best Regards.
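As far as I can tell, InputNode.volume is a mixing gain you set, not a live level meter, which would explain why it always reads 1.0. The standard approach for metering is to install a tap on the input node and compute RMS (or peak) per buffer, as in this sketch:

import AVFoundation

// Meter the microphone by computing RMS per tap buffer: ~0 in silence,
// rising toward 1.0 for loud input.
func startMetering(engine: AVAudioEngine, onLevel: @escaping (Float) -> Void) throws {
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)
    input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        guard let channel = buffer.floatChannelData?[0] else { return }
        let n = Int(buffer.frameLength)
        var sum: Float = 0
        for i in 0..<n { sum += channel[i] * channel[i] }
        onLevel(sqrt(sum / Float(max(n, 1))))
    }
    engine.prepare()
    try engine.start()
}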
Replies: 2 · Boosts: 0 · Views: 303 · Activity: Oct ’24
Phone turns black
My iPhone 15 Plus suddenly turns black and a loading icon keeps spinning. Then it turns off and I can use the phone again; the whole thing lasts only a few seconds. I have updated to the iOS 18.1 beta; could this be the issue? Is my phone broken? I have tried restarting my phone.
Replies: 0 · Boosts: 0 · Views: 153 · Activity: 4w
arm64 Logic Leaking Plugins (Not Calling AP_Close)
I'm running into an issue where in some cases, when the AUHostingServiceXPC_arrow process is shut down by Logic, the process is terminated abruptly without calling AP_Close on all of the plugins hosted in the process. In our case, we have filesystem resources we need to clean up, and having stale files around from the last run can cause issues in new sessions, so this leak is having some pretty gnarly effects. I can reproduce the issue using only Apple sample plugins, and it seems to be triggered by a timeout. If I have two different AU plugins in the session, and I add a 1 second sleep to the destructor of one of the sample plugins, Logic will force terminate the process and the remaining destructors are not called (even for the plugins without the 1 second sleep). Is there a way to avoid this behavior? Or to safely clean up our plugin even if other plugins in the session take a second to tear down?
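If the host cannot be trusted to run the destructors, one mitigation is to reclaim stale state on the next launch instead of at teardown. A hypothetical sketch (the lock-file scheme and paths are invented for illustration): write a lock file containing the host PID when resources are created, and on plugin initialization delete any resources whose owning process no longer exists.

import Foundation

// Reclaim resources left behind by a host process that was force-terminated.
func reclaimStaleResources(in directory: URL) {
    let fm = FileManager.default
    guard let files = try? fm.contentsOfDirectory(at: directory, includingPropertiesForKeys: nil) else { return }
    for lockFile in files where lockFile.pathExtension == "lock" {
        if let pidString = try? String(contentsOf: lockFile, encoding: .utf8),
           let pid = Int32(pidString.trimmingCharacters(in: .whitespacesAndNewlines)),
           kill(pid, 0) != 0 { // signal 0: existence check; nonzero means the process is gone
            try? fm.removeItem(at: lockFile)
            try? fm.removeItem(at: lockFile.deletingPathExtension()) // the stale resource itself
        }
    }
}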
Replies: 1 · Boosts: 1 · Views: 175 · Activity: Oct ’24
Alternative for crashing API MPMediaItemArtwork
When setting the now playing info for playing media in MPNowPlayingInfoCenter, we can set artwork. But it seems the Apple API for creating the artwork crashes on iOS 18 (FB15145734). On iOS 17 this gave a warning that the completion handler was not run on the main thread. I've tried to seek help here: https://stackoverflow.com/questions/78989543/swift-data-race-with-appkit-mpmediaitemartwork-function/78990231?noredirect=1#comment139277425_78990231 but it seems it's not possible to override the completion handler, and therefore it's up to Apple to fix this issue.

.task {
    await MainActor.run {
        let nowPlayingInfoCenter = MPNowPlayingInfoCenter.default()
        var nowPlayingInfo = [String: Any]()
        let image = NSImage(named: "image")!

        // warning: data race detected: @MainActor function at
        // MPMediaItemArtwork/ContentView.swift:22 was not called on the main thread
        nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: image.size, requestHandler: { _ in
            // Not on main thread here!
            return image
        })
        nowPlayingInfoCenter.nowPlayingInfo = nowPlayingInfo
    }
}

I'm wondering if there is an alternative method to set the now playing artwork?
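Not a fix for the underlying crash, but one mitigation sketch while waiting on Apple: give the request handler its own immutable copy of the image so the off-main-thread callback never touches state shared with the UI. AppKit documents NSImage as safe for read-only use across threads, so this addresses the data-race symptom, not the main-thread contract itself.

import AppKit
import MediaPlayer

// Hand the handler a private, never-mutated copy of the artwork image.
func makeArtwork(from image: NSImage) -> MPMediaItemArtwork {
    let copy = image.copy() as! NSImage
    return MPMediaItemArtwork(boundsSize: copy.size) { _ in copy }
}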
Replies: 3 · Boosts: 0 · Views: 366 · Activity: Sep ’24
High / prohibitive memory usage for AVAssetExportSession
I am making an app that can take two videos and stitch them together on the screen (one video on the top half and the other on the bottom half). This is achieved with AVMutableComposition, and then I am using AVAssetExportSession to export an mp4 file:

guard let export = AVAssetExportSession(asset: composition, presetName: AVAssetExportPreset1920x1080) else { return }
export.exportAsynchronously { ....

When the two input videos are around 1 GB each, starting the export session immediately increases memory usage by ~2 GB, as if it moves the input files into memory immediately (my guess), and at some point my app is killed for using too much memory. Is there a way to avoid this upfront memory usage and/or avoid getting killed?
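If the preset-based export keeps spiking memory, one alternative worth trying is a streaming re-encode with AVAssetReader and AVAssetWriter, which moves one sample buffer at a time, so peak memory stays bounded regardless of input size. The sketch below handles video only and uses placeholder output settings; error handling, audio, and the composition's video composition are omitted.

import AVFoundation

// Stream samples from a (composed) asset into an MP4 without loading whole
// files into memory.
func streamExport(asset: AVAsset, to outputURL: URL) async throws {
    let reader = try AVAssetReader(asset: asset)
    guard let videoTrack = try await asset.loadTracks(withMediaType: .video).first else { return }
    let readerOutput = AVAssetReaderTrackOutput(
        track: videoTrack,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String:
                         kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange])
    reader.add(readerOutput)

    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
    let writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: 1920,   // placeholder dimensions
        AVVideoHeightKey: 1080,
    ])
    writer.add(writerInput)

    reader.startReading()
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    // Pull one sample buffer at a time; memory stays bounded.
    await withCheckedContinuation { (done: CheckedContinuation<Void, Never>) in
        writerInput.requestMediaDataWhenReady(on: DispatchQueue(label: "export")) {
            while writerInput.isReadyForMoreMediaData {
                if let buffer = readerOutput.copyNextSampleBuffer() {
                    writerInput.append(buffer)
                } else {
                    writerInput.markAsFinished()
                    done.resume()
                    return
                }
            }
        }
    }
    await writer.finishWriting()
}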
Replies: 1 · Boosts: 3 · Views: 161 · Activity: Oct ’24