Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

LockedCameraCaptureExtension cannot access the photo library
I've requested authorization in my main app: PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in } and added the privacy description to both the main app and the extension. But whether the device is locked or unlocked, when I call let fetchResult = PHAsset.fetchAssets(with: .image, options: nil); let count = fetchResult.count, the count is always zero, even after a new photo is saved to the album in the same session.
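A minimal sketch of the same fetch with an explicit status check: it's worth logging the status the extension process itself reports, since .notDetermined or .denied in that process would also produce a zero count. All names here are illustrative.

```swift
import Photos

// Minimal sketch: log the status seen inside the extension process before
// fetching, since .notDetermined or .denied also yields a zero count.
func fetchImageCount(completion: @escaping (Int) -> Void) {
    PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
        guard status == .authorized || status == .limited else {
            print("Photo library status in extension: \(status.rawValue)")
            completion(0)
            return
        }
        let fetchResult = PHAsset.fetchAssets(with: .image, options: nil)
        completion(fetchResult.count)
    }
}
```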
Replies: 1 · Boosts: 0 · Views: 144 · Activity: 1w
Big ColorSync Bug? macOS Interprets Rec 709 Profile Differently Between 15 and 14
ColorSync throughout the system interprets the Rec. ITU-R BT.709-5 color profile differently on macOS 15 than it did on macOS 14, leading to severe color differences. This is a breaking, problem-causing color change. Here's a comparison of one way the results can appear different.
Steps to Reproduce
1. Open the image above (named here as sRGB_Bars.png) in ColorSync Utility. (This should be an sRGB-profiled image, if the forums don't mess with that.) With Digital Color Meter open and set to "Display in sRGB" [0-255], you can see the gray bars progress left to right as 0, 23, 46, 69, 92, 115, etc. Those are the correct reference values.
2. At the bottom of the window in ColorSync, use [Match to Profile] [Display -> Rec. ITU-R BT.709-5], then click the Apply button.
3. You can verify the image now uses the Rec 709 profile with the (i) Get Info button in the toolbar.
4. Use "Save As…" to save the image with a different name.
5. Do the steps above on macOS 14 and macOS 15 to create two images: Rec709_CreatedOnMacOS14 and Rec709_CreatedOnMacOS15.
6. Compare the two images on BOTH operating system versions, and you'll see they are significantly different from each other.
Rec709_CreatedOnMacOS14.png, when viewed on macOS 15, has gray values of [0, 1, 18, 43, 67, ...]. Rec709_CreatedOnMacOS15.png, when viewed on macOS 15, has gray values of [0, 51, 72, 94, 115, ...]. These are MASSIVE differences.
Significance: This is not just a problem that affects ColorSync Utility or Preview. The same gamma interpretation difference affects the Core Video, Core Image, and Video Toolbox pipelines as well. That gets complicated to talk about, and this example is the simplest I can boil it down to.
Conclusion: Significant bug? What's going on?
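A hedged way to take ColorSync Utility out of the loop when comparing the two OS versions: render the sRGB test image into the BT.709 color space with Core Image and diff the saved files. This is not necessarily byte-identical to the utility's Match to Profile, and the file paths are placeholders.

```swift
import CoreImage

// Hedged programmatic reproduction: convert the sRGB bars image to BT.709,
// save as PNG, and compare the outputs produced on macOS 14 vs. macOS 15.
func matchToRec709(input inputURL: URL, output outputURL: URL) throws {
    guard let image = CIImage(contentsOf: inputURL) else { return }
    let rec709 = CGColorSpace(name: CGColorSpace.itur_709)!
    try CIContext().writePNGRepresentation(
        of: image,
        to: outputURL,
        format: .RGBA8,
        colorSpace: rec709
    )
}
```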
Replies: 2 · Boosts: 0 · Views: 152 · Activity: 2w
Green Line Appears in Thumbnails in Photos but Not in Actual Photo
Hello, I’ve encountered an issue where a green line appears only in the thumbnail view of an image in the iPhone photo gallery. The green line is not present in the actual image itself. Here are some additional details:
• The issue occurs only when saving the image as a JPEG. When saved as PNG, the green line does not appear.
• The green line also shows up when using the PHAsset method requestImage(for:targetSize:). Depending on the targetSize, the resulting image may contain the green line.
• Interestingly, this issue does not appear on an iPhone Xs Max running iOS 15.2.1. However, the green line does appear on an iPhone 15 Pro Max running iOS 18.0.1 when viewing the same image.
I have attached the problematic image for your reference. The following screen captures show the issue occurring on my iPhones: iPhone 15 Pro Max (iOS 18.0.1), iPhone Xs Max (iOS 15.2.1). Could this be related to a display or gallery app issue on iOS? Any advice or solutions would be greatly appreciated. Thank you!
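For reproducing the thumbnail artifact programmatically, here is a sketch of the requestImage(for:targetSize:) call the report mentions; the delivery options are assumptions.

```swift
import Photos
import UIKit

// Sketch of the thumbnail request described above; varying `side` reproduces
// the artifact at some sizes, per the report.
func requestThumbnail(for asset: PHAsset, side: CGFloat,
                      completion: @escaping (UIImage?) -> Void) {
    let options = PHImageRequestOptions()
    options.deliveryMode = .highQualityFormat // assumption; .opportunistic also worth testing
    options.isSynchronous = false
    PHImageManager.default().requestImage(
        for: asset,
        targetSize: CGSize(width: side, height: side),
        contentMode: .aspectFit,
        options: options
    ) { image, _ in
        completion(image)
    }
}
```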
Replies: 2 · Boosts: 0 · Views: 211 · Activity: 3w
Inquiry Regarding File Size Differences When Migrating from ALAssetsLibrary to PHPhotoLibrary
We are currently migrating our application from ALAssetsLibrary to PHPhotoLibrary to ensure compatibility with the latest versions of iOS. However, we have noticed a discrepancy in the file sizes of images obtained using PHPhotoLibrary compared to those obtained using ALAssetsLibrary. Specifically, we would like to understand the following points:
1. Reason for the file size differences: Why do the file sizes differ between images obtained using ALAssetsLibrary and those obtained using PHPhotoLibrary? Could you provide detailed information on the settings and options in PHPhotoLibrary that affect image size and quality?
2. Optimal settings: What are the optimal settings in PHPhotoLibrary to obtain images with the same quality and file size as those obtained using ALAssetsLibrary? If possible, could you provide code examples or recommended option settings?
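Not an authoritative answer, but a sketch of one relevant knob: requesting the original, unadjusted bytes via PHImageManager, which is the closest analogue to the data ALAssetsLibrary returned. The option values shown are the ones that commonly affect returned size.

```swift
import Photos

// Hedged sketch: request the original file data rather than a rendered or
// adjusted version, which usually matches the on-disk size ALAssetsLibrary saw.
func fetchOriginalData(for asset: PHAsset,
                       completion: @escaping (Data?) -> Void) {
    let options = PHImageRequestOptions()
    options.version = .original            // .current returns the edited/re-encoded rendition
    options.isNetworkAccessAllowed = true  // allow fetching iCloud originals
    PHImageManager.default().requestImageDataAndOrientation(
        for: asset, options: options
    ) { data, _, _, _ in
        completion(data)
    }
}
```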
Replies: 3 · Boosts: 0 · Views: 188 · Activity: 2w
Play audio data coming from serial port
Hi. I want to read ADPCM-encoded audio data coming from an external device to my Mac via serial port (/dev/cu.usbserial-0001) in 256-byte chunks and feed it into an audio player. So far I am using Swift and SwiftSerial (github.com/mredig/SwiftSerial) to get the data via serialPort.asyncBytes() into an AsyncStream, but I am struggling to understand how to feed the stream to an AVAudioPlayer or similar. I am new to Swift and macOS audio development, so any help to get me on the right track is greatly appreciated. Thanks.
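A sketch of the playback side under two stated assumptions: the ADPCM chunks are decoded to float PCM elsewhere first (the decoder is not shown), and the device outputs 16 kHz mono (a guess). AVAudioPlayerNode queues scheduled buffers back to back, which suits a chunked stream better than AVAudioPlayer.

```swift
import AVFoundation

// Sketch: an engine with a player node; decoded chunks are scheduled as
// PCM buffers and play seamlessly one after another.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let format = AVAudioFormat(standardFormatWithSampleRate: 16_000, channels: 1)! // assumed device format

func startPlayback() throws {
    engine.attach(player)
    engine.connect(player, to: engine.mainMixerNode, format: format)
    try engine.start()
    player.play()
}

// Call once per decoded 256-byte chunk, after ADPCM -> Float decoding.
func enqueue(_ samples: [Float]) {
    guard let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                        frameCapacity: AVAudioFrameCount(samples.count)) else { return }
    buffer.frameLength = AVAudioFrameCount(samples.count)
    samples.withUnsafeBufferPointer {
        buffer.floatChannelData![0].update(from: $0.baseAddress!, count: samples.count)
    }
    player.scheduleBuffer(buffer)
}
```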
Replies: 0 · Boosts: 0 · Views: 96 · Activity: 1w
Get audio volume from microphone
Hello. We are trying to get the audio volume from the microphone. We have two questions.
1. Can anyone tell me about AVAudioEngine.InputNode.volume? We expected it to return 0 in silence and a float value up to 1.0 depending on the input level, but it appears to return 1.0 (the default value) at all times. In which cases does it return 0.5 or 0? Sample code is below; the microphone works correctly.
// instance members
private var engine: AVAudioEngine!
private var node: AVAudioInputNode!
// start method
self.engine = .init()
self.node = engine.inputNode
engine.prepare()
try! engine.start()
// volume getter
print("\(self.node.volume)")
2. What is the best practice to get the audio volume from the microphone? Requirements: without AVAudioRecorder (we use that for streaming audio), and it should withstand high-frequency access.
Testing info — device: iPhone XR, OS version: iOS 18. Best regards.
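One note that may explain question 1: volume comes from the AVAudioMixing protocol, so it is a gain you set rather than a measured level, which is consistent with it always reading 1.0. For question 2, a common approach is to install a tap on the input node and compute RMS per buffer. A minimal sketch:

```swift
import AVFoundation

// Sketch: tap the input and compute an RMS level per buffer.
// The tap callback runs at buffer rate, so it tolerates frequent reads.
func startMetering(engine: AVAudioEngine) throws {
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)
    input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        guard let channel = buffer.floatChannelData?[0] else { return }
        let frames = Int(buffer.frameLength)
        var sum: Float = 0
        for i in 0..<frames { sum += channel[i] * channel[i] }
        let rms = sqrt(sum / Float(max(frames, 1)))
        let dB = 20 * log10(max(rms, .leastNonzeroMagnitude))
        print("level: \(dB) dBFS") // 0 dBFS = full scale; silence is strongly negative
    }
    engine.prepare()
    try engine.start()
}
```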
Replies: 2 · Boosts: 0 · Views: 232 · Activity: 2w
How to edit gain map image to preserve HDR in Live Photo
I have an app that allows you to edit your photos. To preserve HDR, I edit both the SDR image and the gain map image, like so:
let sdrImage = CIImage(data: data, options: [.applyOrientationProperty: true])
let gainMapImage = CIImage(data: data, options: [.applyOrientationProperty: true, .auxiliaryHDRGainMap: true])
// edit them...
try CIContext().writeHEIFRepresentation(of: sdrImage, to: url, format: .RGBA8, colorSpace: colorSpace, options: [.hdrGainMapImage: gainMapImage])
I also support editing the still photo in Live Photos. To do this you create a PHLivePhotoEditingContext, set the frameProcessor block (which gives you a CIImage that I edit when the frame.type is .photo), then you create a PHContentEditingOutput and call saveLivePhoto. I’m not seeing any way to preserve HDR here. Interestingly, the frame processor is called twice with the .photo frame type, but I don’t see any difference between those images. How can I edit a gain map image to preserve HDR in the still photo of a Live Photo?
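For readers unfamiliar with the flow, a structural sketch of the Live Photo editing path described above; it does not answer the HDR question, and the saturation filter stands in for whatever edit the app applies.

```swift
import Photos
import CoreImage

// Sketch of the Live Photo editing flow (structure only, no HDR handling).
// `input` is assumed to be a PHContentEditingInput for the Live Photo asset.
func editLivePhoto(input: PHContentEditingInput,
                   completion: @escaping (Bool, Error?) -> Void) {
    guard let context = PHLivePhotoEditingContext(livePhotoEditingInput: input) else { return }
    context.frameProcessor = { frame, _ in
        var image = frame.image
        if frame.type == .photo {
            // Placeholder edit for the still photo.
            image = image.applyingFilter("CIColorControls",
                                         parameters: [kCIInputSaturationKey: 1.1])
        }
        return image
    }
    let output = PHContentEditingOutput(contentEditingInput: input)
    // A real flow also sets output.adjustmentData before saving.
    context.saveLivePhoto(to: output) { success, error in
        completion(success, error)
    }
}
```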
Replies: 0 · Boosts: 0 · Views: 135 · Activity: 1w
Phone turns black
My iPhone 15 Plus suddenly turns black and a loading icon keeps spinning. Then it turns off, and afterwards I can use it again; the whole episode lasts only a few seconds. I have updated to the iOS 18.1 beta; could this be the issue? Is my phone broken? I have tried restarting my phone.
Replies: 0 · Boosts: 0 · Views: 102 · Activity: 1w
arm64 Logic Leaking Plugins (Not Calling AP_Close)
I'm running into an issue where, in some cases, when the AUHostingServiceXPC_arrow process is shut down by Logic, the process is terminated abruptly without AP_Close being called on all of the plugins hosted in the process. In our case, we have filesystem resources we need to clean up, and having stale files around from the last run can cause issues in new sessions, so this leak is having some pretty gnarly effects. I can reproduce the issue using only Apple sample plugins, and it seems to be triggered by a timeout. If I have two different AU plugins in the session and I add a 1-second sleep to the destructor of one of the sample plugins, Logic will force-terminate the process and the remaining destructors are not called (even for the plugins without the 1-second sleep). Is there a way to avoid this behavior? Or to safely clean up our plugin even if other plugins in the session take a second to tear down?
Replies: 1 · Boosts: 1 · Views: 133 · Activity: 2w
Alternative for crashing API MPMediaItemArtwork
When setting the now-playing info for playing media in MPNowPlayingInfoCenter, we can set artwork. But it seems the Apple API for creating the artwork crashes on iOS 18 (FB15145734). On iOS 17 this gave a warning that the completion handler was not run on the main thread. I've sought help here: https://stackoverflow.com/questions/78989543/swift-data-race-with-appkit-mpmediaitemartwork-function/78990231?noredirect=1#comment139277425_78990231 but it seems it's not possible to override the completion handler, and therefore it's up to Apple to fix this issue.
.task {
  await MainActor.run {
    let nowPlayingInfoCenter = MPNowPlayingInfoCenter.default()
    var nowPlayingInfo = [String: Any]()
    let image = NSImage(named: "image")!
    // warning: data race detected: @MainActor function at MPMediaItemArtwork/ContentView.swift:22 was not called on the main thread
    nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: image.size, requestHandler: { _ in
      // Not on main thread here!
      return image
    })
    nowPlayingInfoCenter.nowPlayingInfo = nowPlayingInfo
  }
}
I'm wondering if there is an alternative method to set the now-playing artwork?
Replies: 3 · Boosts: 0 · Views: 311 · Activity: Sep ’24
High / prohibitive memory usage for AVAssetExportSession
I am making an app that can take two videos and then stitch them together on screen (one video on the top half and the other on the bottom half). This is achieved with AVMutableComposition, and then I use AVAssetExportSession to export an mp4 file:
guard let export = AVAssetExportSession(asset: composition, presetName: AVAssetExportPreset1920x1080) else { return }
export.exportAsynchronously { ...
When the two input videos are around 1 GB each, starting the export session immediately increases memory usage by ~2 GB, as if it loads the input files into memory up front (my guess), and at some point my app is killed for using too much memory. Is there a way to avoid this upfront memory usage and/or avoid getting killed?
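Not the poster's code: one commonly used alternative that keeps memory bounded is re-encoding sample by sample with AVAssetReader/AVAssetWriter instead of AVAssetExportSession. A video-only sketch (audio handling and error checks omitted):

```swift
import AVFoundation

// Hedged alternative: pull one sample buffer at a time from the composition
// and append it to the writer, so memory stays at roughly one buffer's worth.
func transcode(composition: AVComposition, to url: URL) throws {
    let reader = try AVAssetReader(asset: composition)
    guard let videoTrack = composition.tracks(withMediaType: .video).first else { return }
    let readerOutput = AVAssetReaderTrackOutput(
        track: videoTrack,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String:
                            kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange])
    reader.add(readerOutput)

    let writer = try AVAssetWriter(outputURL: url, fileType: .mp4)
    let writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: 1920,
        AVVideoHeightKey: 1080
    ])
    writer.add(writerInput)

    reader.startReading()
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    let queue = DispatchQueue(label: "transcode")
    writerInput.requestMediaDataWhenReady(on: queue) {
        while writerInput.isReadyForMoreMediaData {
            if let sample = readerOutput.copyNextSampleBuffer() {
                writerInput.append(sample)
            } else {
                writerInput.markAsFinished()
                writer.finishWriting { } // completion runs asynchronously
                break
            }
        }
    }
}
```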
Replies: 1 · Boosts: 3 · Views: 106 · Activity: 2w
Video Hardware Acceleration on Mac
I observe significant performance differences when encoding a video in mp4 format (H.264). The code I use is standard (AVAssetWriter, AVAssetWriterInput, ...). Here is what I notice when I run the same code on different platforms: On an iPhone, the video is encoded in 3 seconds (iPhone 13, 14, 15, 16, Pro, ...). On a Mac with an M2 Pro, the video is encoded in 50 seconds. On a Mac with an Intel processor (2.3 GHz Intel Xeon W, 18 cores), the video is encoded in 2 minutes. Encoding on an iPhone is very fast due to hardware acceleration. However, I don't understand why I don't get similar performance with a Mac M2 Pro, which has a dedicated component for hardware acceleration (the H.264 media engine). Is hardware acceleration disabled on the Mac?
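A hedged diagnostic: VideoToolbox can be asked for an H.264 session that requires the hardware encoder, so if creation fails on the Mac in question, encodes are falling back to software. This only checks availability; it does not prove AVAssetWriter is using the media engine.

```swift
import VideoToolbox

// Sketch: try to create a compression session that *requires* hardware H.264.
var session: VTCompressionSession?
let spec = [kVTVideoEncoderSpecification_RequireHardwareAcceleratedVideoEncoder: true] as CFDictionary
let status = VTCompressionSessionCreate(
    allocator: nil,
    width: 1920, height: 1080,
    codecType: kCMVideoCodecType_H264,
    encoderSpecification: spec,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: nil,
    refcon: nil,
    compressionSessionOut: &session
)
print(status == noErr ? "hardware H.264 encoder available" : "no hardware encoder: \(status)")
```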
Replies: 0 · Boosts: 0 · Views: 133 · Activity: 2w
Swift fails to fetch Photo Albums Synced using Windows iTunes from External Drive
See configuration details at the end of this message. Despite numerous attempts, I have been unable to determine the correct syntax to fetch photo albums from my iPad Pro 13-inch using Xcode and Swift. All the photo albums were synced to the iPad Pro 13-inch using the latest version of Apple iTunes for Windows from an external Western Digital G-Drive hard drive (no iCloud). All synced albums appear under "From My Mac" on the iPad. I only want to access each album's photo and video count. See the sample code snippet below. I have tried multiple subtype options and album types without success. Zero albums are always returned, despite there being around 3,900 albums in the iPad Pro's photo library. Authorization to the photo library does not appear to be the problem.
PHPhotoLibrary.requestAuthorization { status in
  if status == .authorized {
    let result = PHAssetCollection.fetchAssetCollections(with: .album, subtype: .any, options: nil)
    if result.count == 0 {
      print("No albums found.")
      return
    }
  }
}
Any help or suggestions would be greatly appreciated.
Configuration details: iPad Pro 13-inch (M4) (iPad16,6); iPadOS 17.7; iCloud turned off; photo library with 3,900 albums, 118,000 photos, and 4,800 videos; macOS Sonoma 14.6.1; Xcode 16.0; Swift 5.0 (Xcode default); Microsoft Windows 10 Pro 22H2; Apple iTunes for Windows 12.13.4.4.
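One hedged variation worth trying: iTunes-synced albums have their own collection subtype, so enumerating .albumSyncedAlbum explicitly (and requesting authorization with an access level) may behave differently from .any.

```swift
import Photos

// Sketch: fetch specifically the iTunes-synced album subtype and print
// each album's item count.
PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
    guard status == .authorized || status == .limited else { return }
    let synced = PHAssetCollection.fetchAssetCollections(
        with: .album, subtype: .albumSyncedAlbum, options: nil)
    print("Synced albums: \(synced.count)")
    synced.enumerateObjects { collection, _, _ in
        let assets = PHAsset.fetchAssets(in: collection, options: nil)
        print("\(collection.localizedTitle ?? "?"): \(assets.count) items")
    }
}
```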
Replies: 1 · Boosts: 0 · Views: 93 · Activity: 2w
What format for writeHEIFRepresentation preserves HDR?
In the WWDC24 session "Use HDR for dynamic image experiences in your app," it's noted this is how you save edits for Adaptive HDR:
SDR + HDR: writeHEIFRepresentation(of: sdrImage, to: url, colorSpace: p3Space, options: [.hdrImage: hdrImage])
SDR + Gain: writeHEIFRepresentation(of: sdrImage, to: url, colorSpace: p3Space, options: [.hdrGainMapImage: gainImage])
This won't compile because the format argument is missing. What format should be used? In the WWDC23 session "Support HDR images in your app," RGBAf, RGBAh, RGBA16, and RGB10 were mentioned, but I'm not sure which one to use. If relevant, I'm editing photos from the user's photo library, so the image was probably taken on an iPhone, but perhaps not. Thanks!
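A hedged guess consistent with the gain map post elsewhere on this page: pass a format describing the SDR base image, e.g. .RGBA8 for an 8-bit base. A sketch:

```swift
import CoreImage

// Sketch under the stated assumption (8-bit SDR base plus gain map);
// the format argument describes the SDR base image here.
func saveAdaptiveHDR(sdrImage: CIImage, gainImage: CIImage, to url: URL) throws {
    let p3Space = CGColorSpace(name: CGColorSpace.displayP3)!
    try CIContext().writeHEIFRepresentation(
        of: sdrImage,
        to: url,
        format: .RGBA8,
        colorSpace: p3Space,
        options: [.hdrGainMapImage: gainImage]
    )
}
```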
Replies: 1 · Boosts: 0 · Views: 228 · Activity: Oct ’24
Upstream Service Error on Apple Music Feed API
Hi, I'm working on an integration with the Apple Music Feed, but over the last day I've been getting a 500 Upstream Service Error on 99% of the API calls I make. Remarkably, retrying the same endpoint 20-30 times sometimes gives the correct response, but mostly it's a 500 error. To give an example, this:
GET https://api.media.apple.com/v1/feed/album/latest
returns this generic error:
{
  "errors": [
    {
      "id": "U4ARRA2QDCGYKYRI2IPEJVBTHY",
      "title": "Upstream Service Error",
      "detail": "Call to get metadata for album feed failed",
      "status": "500",
      "code": "50001"
    }
  ]
}
The same goes for other feeds such as song, artist, and so on. Before today I did get the same error message occasionally, but a few retries would solve the issue. Any insight into what's happening and/or an ETA for a fix? Thank you.
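Not an official workaround: while the upstream error persists, a retry wrapper with exponential backoff smooths over the intermittent 500s. The request is assumed to already carry a valid developer token.

```swift
import Foundation

// Sketch: retry on HTTP 500 with exponential backoff, returning the first
// non-500 response (or the final attempt's body).
func fetchWithRetry(_ request: URLRequest, attempts: Int = 5) async throws -> Data {
    var delay: UInt64 = 500_000_000 // 0.5 s in nanoseconds
    for _ in 0..<(attempts - 1) {
        let (data, response) = try await URLSession.shared.data(for: request)
        if (response as? HTTPURLResponse)?.statusCode != 500 { return data }
        try await Task.sleep(nanoseconds: delay)
        delay *= 2
    }
    let (data, _) = try await URLSession.shared.data(for: request)
    return data
}
```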
Replies: 0 · Boosts: 0 · Views: 124 · Activity: 2w
SFCustomLanguageModelData.CustomPronunciation and X-SAMPA string conversion
Can anyone please guide me on how to use SFCustomLanguageModelData.CustomPronunciation? I am following the example from WWDC23: https://wwdcnotes.com/documentation/wwdcnotes/wwdc23-10101-customize-ondevice-speech-recognition/ For this kind of custom pronunciation we need the X-SAMPA string of the specific word. There are tools available on the web for this: Word to IPA: https://openl.io/ IPA to X-SAMPA: https://tools.lgm.cl/xsampa.html But these tools do not seem to produce the same kind of X-SAMPA strings used in the demo. For example, "Winawer" is converted to "w I n aU @r" in the demo, while the online tools give "/wI"nA:w@r/".
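For reference, a sketch of the WWDC23 pattern with the session's example values; note the X-SAMPA string there is space-separated symbols, not the slash-delimited IPA the online converters emit.

```swift
import Speech

// Sketch following the WWDC23 session: a custom pronunciation entry whose
// phonemes are space-separated X-SAMPA symbols. Identifier is a placeholder.
let data = SFCustomLanguageModelData(
    locale: Locale(identifier: "en_US"),
    identifier: "com.example.app",
    version: "1.0"
) {
    SFCustomLanguageModelData.CustomPronunciation(
        grapheme: "Winawer",
        phonemes: ["w I n aU @r"]
    )
}
// The data is then exported with `try await data.export(to:)` and passed to
// SFSpeechRecognizer configuration, per the session.
```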
Replies: 0 · Boosts: 0 · Views: 131 · Activity: 2w
Is there a recommended approach to safeguarding against audio recording losses via app crash?
AVAudioRecorder leaves a completely useless chunk of file if a crash happens while recording. I need to be able to recover. I'm thinking of streaming the recording to disk. I know that is possible with AVAudioEngine, but I also know that API is a headache that will lead to unexpected crashes unless you're lucky or you're the person who built it. Does Apple have a recommended strategy for failsafe audio recordings? I'm thinking of chunking recordings using many instances of AVAudioRecorder and then stitching those chunks together.
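A minimal sketch of the streaming-to-disk idea with AVAudioEngine: each buffer is written as it arrives, so a crash loses at most the final buffer rather than the whole take. The file location is illustrative.

```swift
import AVFoundation

// Sketch: tap the input node and append every buffer to an AVAudioFile,
// so the on-disk file grows incrementally and survives a crash.
func startCrashSafeRecording(engine: AVAudioEngine) throws {
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)
    let url = FileManager.default.temporaryDirectory.appendingPathComponent("take.caf")
    let file = try AVAudioFile(forWriting: url, settings: format.settings)

    input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
        do { try file.write(from: buffer) } catch { print("write failed: \(error)") }
    }
    engine.prepare()
    try engine.start()
}
```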
Replies: 1 · Boosts: 0 · Views: 146 · Activity: 2w
Reality Composer file size and iOS 18
Hi folks: I've been creating .reality files in Reality Composer for over a year. Some of the files are up to 500 MB and, prior to the last month, they opened fine as AR-projected experiences on even basic iPhones and iPads. Now, I think since iOS 18, a 64 MB file will open as an AR experience, but files from about 350 MB up don't open. Files just opens a window displaying the name of the file, that it's a .reality file, and the file size; it no longer opens into either an AR or Object display of the .reality scene. Has a new file-size limit been put on the .reality files that Files will open, or what else is going on here? I have a client who was about to launch an experience based on a .reality file I can no longer open. Please help.
Replies: 0 · Boosts: 0 · Views: 119 · Activity: 2w