Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

AVPlayer HLS: Restrict Stream Resolution
Hello, we have an HLS streaming app and we use AVPlayer for HLS playback. We want to implement a dynamic resolution feature based on the user's selection. For example, if the user selects 1080p, they should only get the 1080p rendition. We have tried the "preferredMaximumResolution" and "preferredPeakBitRate" parameters, but AVPlayer does not enforce them: setting preferredMaximumResolution = CGSize(width: 1920, height: 1080) does not force the player to stay on the 1080p profile, and the player still drops to 720p, which we do not want when the user has selected 1080p. Is there any method to force this, even if the stream stalls? Thank you in advance.
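A minimal sketch of the two properties mentioned above, assuming a standard AVPlayerItem built from the HLS URL. Both properties are documented as upper bounds, so adaptive bitrate switching can still select lower renditions when bandwidth drops; the function name and values here are illustrative only.

import AVFoundation

// Sketch: cap the variants AVPlayer may choose (illustrative name and values).
func makeCappedPlayer(streamURL: URL) -> AVPlayer {
    let item = AVPlayerItem(url: streamURL)
    // Upper bound on variant resolution; the player may still pick lower ones.
    item.preferredMaximumResolution = CGSize(width: 1920, height: 1080)
    // Upper bound on bit rate in bits per second; 0 means no limit.
    item.preferredPeakBitRate = 8_000_000
    return AVPlayer(playerItem: item)
}

If a rendition truly must be pinned, a common workaround is to serve (or construct) a playlist that contains only the desired 1080p variant, since the public AVPlayer API only exposes ceilings, not exact variant selection.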
0 replies · 0 boosts · 128 views · 3w
CGImageMetadataTagCreate error when creating a gain map tag
I extracted the gain map info from an image using

let url = Bundle.main.url(forResource: "IMG_1181", withExtension: "HEIC")
let source = CGImageSourceCreateWithURL(url! as CFURL, nil)
let portraitData = CGImageSourceCopyAuxiliaryDataInfoAtIndex(source!, 0, kCGImageAuxiliaryDataTypeHDRGainMap) as! [AnyHashable: Any]
let metaData = portraitData[kCGImageAuxiliaryDataInfoMetadata] as! CGImageMetadata

Then I printed all the metadata tags:

func printMetadataProperties(from metadata: CGImageMetadata) {
    guard let tags = CGImageMetadataCopyTags(metadata) as? [CGImageMetadataTag] else { return }
    for tag in tags {
        if let prefix = CGImageMetadataTagCopyPrefix(tag) as String?,
           let namespace = CGImageMetadataTagCopyNamespace(tag) as String?,
           let key = CGImageMetadataTagCopyName(tag) as String?,
           let value = CGImageMetadataTagCopyValue(tag) {
            print("Namespace: \(namespace), Key: \(key), Prefix: \(prefix), value: \(value)")
        }
    }
}

// Namespace: http://ns.apple.com/ImageIO/1.0/, Key: hasXMP, Prefix: iio, value: True
// Namespace: http://ns.apple.com/HDRGainMap/1.0/, Key: HDRGainMapVersion, Prefix: HDRGainMap, value: 131072
// Namespace: http://ns.apple.com/HDRGainMap/1.0/, Key: HDRGainMapHeadroom, Prefix: HDRGainMap, value: 3.586325

I want to create a new CGImageMetadata and tags, but when it comes to the HDR tags, it always fails to add them to the metadata:

let tag = CGImageMetadataTagCreate(
    "http://ns.apple.com/HDRGainMap/1.0/" as CFString,
    "HDRGainMap" as CFString,
    "HDRGainMapHeadroom" as CFString,
    .default,
    3.56 as CFNumber
)
let path = "\(HDRGainMap):\(HDRGainMapHeadroom)" as CFString
let success = CGImageMetadataSetTagWithPath(metadata, nil, path, tag) // always false

The hasXMP tag works fine. Is the HDR dictionary private to Apple?
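One thing worth checking, as a sketch under the assumption that the failure comes from an unregistered namespace rather than an Apple-private restriction: CGImageMetadataSetTagWithPath has to resolve the prefix in the path against a mutable metadata object, so registering the HDRGainMap prefix on a CGMutableImageMetadata first may let the tag be added.

import ImageIO
import Foundation

// Sketch: register the prefix before setting a tag via a prefixed path.
let mutableMetadata = CGImageMetadataCreateMutable()
var regError: Unmanaged<CFError>?
let registered = CGImageMetadataRegisterNamespaceForPrefix(
    mutableMetadata,
    "http://ns.apple.com/HDRGainMap/1.0/" as CFString,
    "HDRGainMap" as CFString,
    &regError
)
if registered,
   let tag = CGImageMetadataTagCreate(
       "http://ns.apple.com/HDRGainMap/1.0/" as CFString,
       "HDRGainMap" as CFString,
       "HDRGainMapHeadroom" as CFString,
       .default,
       NSNumber(value: 3.56)
   ) {
    // Note the literal prefixed path; the original snippet interpolated
    // undefined variables into the path string.
    let added = CGImageMetadataSetTagWithPath(
        mutableMetadata, nil, "HDRGainMap:HDRGainMapHeadroom" as CFString, tag
    )
    print("tag added:", added)
}

Also note that CGImageMetadataSetTagWithPath requires a CGMutableImageMetadata; calling it on the immutable metadata copied out of the auxiliary data dictionary would fail regardless of the namespace.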
0 replies · 0 boosts · 153 views · 3w
Crash in iOS 18 regarding [AVPlayerController _observeValueForKeyPath:oldValue:newValue:]
There are significant crash reports coming from iOS 18 users regarding the AVKit framework, starting from this line: [AVPlayerController _observeValueForKeyPath:oldValue:newValue:], which seems to come from the internal iOS SDK. There are 2 kinds of crash we found:

1. UI modification on a background thread. From the stack trace it seems that when AVPictureInPictureController is being deallocated and its view is being removed from its superview, the code is somehow executed on a background thread, because _AssertAutoLayoutOnAllowedThreadsOnly is highlighted just before the crash. But I've checked our code that interacts with AVPictureInPictureController, and in the locations where we deallocate the object it is always called on the main thread, inside viewDidLoad and deinit of a UIViewController class. From the log, the crash seems to happen when the user opens another content while the PiP player is active, so the current PiP instance is replaced with a new one. My suspicion is that the observation logic inside AVPlayerController could be the hint to this issue; probably something broke over there, since this happens across our app versions but only for iOS 18 users. Unfortunately, I have been unable to reproduce this issue yet; one of my colleagues reproduced it once but hasn't been able to do it again since. The reports keep rising each day, up to 1.3k events in the last 30 days now.

2. Over-released object. This one has fewer reports than the first, but I decided to include it since it might contain relevant information regarding the first crash, as the starting stack trace is similar. The crash timing also looks similar: we deallocate the existing AVPictureInPictureController and later replace it with a new one. It is also found only on iOS 18 and also refers to [AVPlayerController _observeValueForKeyPath:oldValue:newValue:]. I have been unable to reproduce this issue so far either.

Both issues happen on both iPhone and iPad. We'd appreciate any advice on what we can do to avoid this in the future, and any hint on why it could have happened. I have reported this issue with bug number FB15620734. I have also attached one sample crash report for each crash: non ui thread access.crash, over release.crash
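For what it's worth, a defensive sketch, assuming the teardown really can be reached off the main thread somewhere (for example from a playback or network completion handler): funnel the PiP controller's replacement through the main queue so the old instance always deallocates there. This is a workaround idea, not a confirmed fix for the iOS 18 behavior.

import AVKit

final class PiPHolder {
    private var pipController: AVPictureInPictureController?

    // Sketch: replace the current PiP controller, forcing teardown onto the main thread.
    func replaceController(with newController: AVPictureInPictureController?) {
        let work = { [weak self] in
            self?.pipController?.stopPictureInPicture()
            self?.pipController = newController   // old instance is released here, on main
        }
        if Thread.isMainThread { work() } else { DispatchQueue.main.async(execute: work) }
    }
}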
6 replies · 12 boosts · 507 views · 3w
Capturing system audio no longer works with macOS Sequoia
Our capture application records system audio via a HAL plugin; however, with the latest macOS 15 Sequoia, all audio buffer values are zero. I am attaching sample code that replicates the problem. Compile it as a Command Line Tool application with Xcode.

STEPS TO REPRODUCE
Install the BlackHole 2ch audio driver: https://existential.audio/blackhole/download/?code=1579271348
Start some system audio, e.g. YouTube.
Compile and run the sample application.
On macOS up to Sonoma, you will hear audio via loopback and see audio values in the debug/console window. On macOS Sequoia, you will not hear audio and the audio values are 0.

#import <AVFoundation/AVFoundation.h>
#import <CoreAudio/CoreAudio.h>

#define BLACKHOLE_UID @"BlackHole2ch_UID"
#define DEFAULT_OUTPUT_UID @"BuiltInSpeakerDevice"

@interface AudioCaptureDelegate : NSObject <AVCaptureAudioDataOutputSampleBufferDelegate>
@end

void setDefaultAudioDevice(NSString *deviceUID);

@implementation AudioCaptureDelegate

// Receive samples from the CoreAudio/HAL driver and print amplitude values for testing.
// This is where samples would normally be copied and passed downstream for further
// processing, which is not needed in this simple sample application.
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Access the audio data in the sample buffer
    CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
    if (!blockBuffer) {
        NSLog(@"No audio data in the sample buffer.");
        return;
    }
    size_t length;
    char *data;
    CMBlockBufferGetDataPointer(blockBuffer, 0, NULL, &length, &data);
    // Process the audio samples to calculate the average amplitude
    int16_t *samples = (int16_t *)data;
    size_t sampleCount = length / sizeof(int16_t);
    int64_t sum = 0;
    for (size_t i = 0; i < sampleCount; i++) {
        sum += abs(samples[i]);
    }
    // Calculate and log the average amplitude
    float averageAmplitude = (float)sum / sampleCount;
    NSLog(@"Average Amplitude: %f", averageAmplitude);
}
@end

// Set the default audio device to BlackHole while testing, or the speakers when done.
// Called by main.
void setDefaultAudioDevice(NSString *deviceUID) {
    AudioObjectPropertyAddress address;
    AudioDeviceID deviceID = kAudioObjectUnknown;
    UInt32 size;
    CFStringRef uidString = (__bridge CFStringRef)deviceUID;

    // Gets the device corresponding to the given UID.
    AudioValueTranslation translation;
    translation.mInputData = &uidString;
    translation.mInputDataSize = sizeof(uidString);
    translation.mOutputData = &deviceID;
    translation.mOutputDataSize = sizeof(deviceID);
    size = sizeof(translation);
    address.mSelector = kAudioHardwarePropertyDeviceForUID;
    address.mScope = kAudioObjectPropertyScopeGlobal; //????
    address.mElement = kAudioObjectPropertyElementMain;
    OSStatus status = AudioObjectGetPropertyData(kAudioObjectSystemObject, &address, 0, NULL, &size, &translation);
    if (status != noErr) {
        NSLog(@"Error: Could not retrieve audio device ID for UID %@. Status code: %d", deviceUID, (int)status);
        return;
    }

    AudioObjectPropertyAddress propertyAddress;
    propertyAddress.mSelector = kAudioHardwarePropertyDefaultOutputDevice;
    propertyAddress.mScope = kAudioObjectPropertyScopeGlobal;
    status = AudioObjectSetPropertyData(kAudioObjectSystemObject, &propertyAddress, 0, NULL, sizeof(AudioDeviceID), &deviceID);
    if (status == noErr) {
        NSLog(@"Default audio device set to %@", deviceUID);
    } else {
        NSLog(@"Failed to set default audio device: %d", status);
    }
}

// Sets the BlackHole device as default and configures it as an AVCaptureDeviceInput.
// Sets the speakers as loopback so we can hear what is being captured.
// Sets up a queue to receive capture samples.
// Runs the session for 30 seconds, then restores the speakers as the default output.
int main(int argc, const char * argv[]) {
    @autoreleasepool {
        // Create the capture session
        AVCaptureSession *session = [[AVCaptureSession alloc] init];

        // Select the audio device
        AVCaptureDevice *audioDevice = nil;
        NSString *audioDriverUID = nil;
        audioDriverUID = BLACKHOLE_UID;
        setDefaultAudioDevice(audioDriverUID);
        audioDevice = [AVCaptureDevice deviceWithUniqueID:audioDriverUID];
        if (!audioDevice) {
            NSLog(@"Audio device %s not found!", [audioDriverUID UTF8String]);
            return -1;
        } else {
            NSLog(@"Using Audio device: %s", [audioDriverUID UTF8String]);
        }

        // Configure the audio input with the selected device (BlackHole)
        NSError *error = nil;
        AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
        if (error || !audioInput) {
            NSLog(@"Failed to create audio input: %@", error);
            return -1;
        }
        [session addInput:audioInput];

        // Configure the audio data output
        AVCaptureAudioDataOutput *audioOutput = [[AVCaptureAudioDataOutput alloc] init];
        AudioCaptureDelegate *delegate = [[AudioCaptureDelegate alloc] init];
        dispatch_queue_t queue = dispatch_queue_create("AudioCaptureQueue", NULL);
        [audioOutput setSampleBufferDelegate:delegate queue:queue];
        [session addOutput:audioOutput];

        // Set audio settings
        NSDictionary *audioSettings = @{
            AVFormatIDKey: @(kAudioFormatLinearPCM),
            AVSampleRateKey: @48000,
            AVNumberOfChannelsKey: @2,
            AVLinearPCMBitDepthKey: @16,
            AVLinearPCMIsFloatKey: @NO,
            AVLinearPCMIsNonInterleaved: @NO
        };
        [audioOutput setAudioSettings:audioSettings];

        AVCaptureAudioPreviewOutput *loopback_output = nil;
        loopback_output = [[AVCaptureAudioPreviewOutput alloc] init];
        loopback_output.volume = 1.0;
        loopback_output.outputDeviceUniqueID = DEFAULT_OUTPUT_UID;
        [session addOutput:loopback_output];
        const char *deviceID = loopback_output.outputDeviceUniqueID ? [loopback_output.outputDeviceUniqueID UTF8String] : "nil";
        NSLog(@"session addOutput for preview/loopback: %s", deviceID);

        // Start the session
        [session startRunning];
        NSLog(@"Capturing audio data for 30 seconds...");
        [[NSRunLoop currentRunLoop] runUntilDate:[NSDate dateWithTimeIntervalSinceNow:30.0]];

        // Stop the session
        [session stopRunning];
        NSLog(@"Capture session stopped.");
        setDefaultAudioDevice(DEFAULT_OUTPUT_UID);
    }
    return 0;
}
4 replies · 0 boosts · 303 views · 3w
Play audio data coming from serial port
Hi. I want to read ADPCM-encoded audio data coming from an external device to my Mac via serial port (/dev/cu.usbserial-0001) as 256-byte chunks and feed it into an audio player. So far I am using Swift and SwiftSerial (GitHub - mredig/SwiftSerial, a Swift Linux and Mac library for reading and writing to serial ports) to get the data via serialPort.asyncBytes() into an AsyncStream, but I am struggling to understand how to feed the stream to an AVAudioPlayer or similar. I am new to Swift and macOS audio development, so any help to get me on the right track is greatly appreciated. Thx
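A minimal sketch of one way to play raw PCM from an async byte stream with AVAudioEngine, assuming the ADPCM chunks have already been decoded to float PCM (AVAudioPlayerNode cannot play ADPCM directly); the class name and sample rate are illustrative:

import AVFoundation

// Sketch: schedule decoded PCM chunks on an AVAudioPlayerNode as they arrive.
final class StreamPlayer {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    // Assumed format of the decoded audio: 16 kHz, mono, Float32.
    private let format = AVAudioFormat(standardFormatWithSampleRate: 16_000, channels: 1)!

    init() throws {
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: format)
        try engine.start()
        player.play()
    }

    // `samples` are already-decoded PCM samples in the range -1...1.
    func enqueue(_ samples: [Float]) {
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                            frameCapacity: AVAudioFrameCount(samples.count)),
              let channel = buffer.floatChannelData?[0] else { return }
        buffer.frameLength = AVAudioFrameCount(samples.count)
        for i in 0..<samples.count { channel[i] = samples[i] }
        player.scheduleBuffer(buffer, completionHandler: nil)
    }
}

The ADPCM decoding itself would still have to happen before enqueueing, e.g. with AVAudioConverter or a hand-rolled IMA-ADPCM decoder, depending on the exact variant the device sends.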
0 replies · 0 boosts · 138 views · 3w
How to edit gain map image to preserve HDR in Live Photo
I have an app that allows you to edit your photos. To preserve HDR, I edit both the SDR image and the gain map image, like so:

let sdrImage = CIImage(data: data, options: [.applyOrientationProperty: true])
let gainMapImage = CIImage(data: data, options: [.applyOrientationProperty: true, .auxiliaryHDRGainMap: true])
// edit them...
try CIContext().writeHEIFRepresentation(of: sdrImage, to: url, format: .RGBA8, colorSpace: colorSpace, options: [.hdrGainMapImage: gainMapImage])

I also support editing the still photo in Live Photos. To do this you create a PHLivePhotoEditingContext, set the frameProcessor block, which gives you a CIImage that I edit when the frame.type is .photo, then you create a PHContentEditingOutput and call saveLivePhoto. I'm not seeing any way to preserve HDR here. Interestingly, the frame processor is called twice with the .photo frame type, but I don't see any difference between these images. How can I edit a gain map image to preserve HDR in the still photo of a Live Photo?
1 reply · 0 boosts · 212 views · 3w
Phone turns black
My iPhone 15 Plus suddenly turns black and a loading icon keeps spinning. Then it turns off and I can use it again; it only lasts a few seconds. I have updated to the iOS 18.1 beta; could this be the issue? Is my phone broken? I have tried restarting my phone.
0 replies · 0 boosts · 152 views · 4w
LockedCameraCaptureExtension can not get access to photo library
I've requested photo library authorization in my main app:

PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in }

and added the privacy description in both the main app and the extension. But no matter whether the device is locked or unlocked, when I call

let fetchResult = PHAsset.fetchAssets(with: .image, options: nil)
let count = fetchResult.count

the count is always zero, even after a new photo is saved to the album in the same session.
1 reply · 0 boosts · 206 views · 4w
writeImageAtIndex:1012: ⭕️ ERROR: 'App' is trying to save an opaque image (5712x4284) with 'AlphaLast'.
I have an app that edits photos in your library. When I call

try CIContext().writeHEIFRepresentation(of: editedImage, to: fileURL, format: .RGBA8, colorSpace: originalImage.colorSpace!)

the following is logged to the console:

writeImageAtIndex:1012: ⭕️ ERROR: 'App' is trying to save an opaque image (5712x4284) with 'AlphaLast'. This would unnecessarily increase the file size and will double (!!!) the required memory when decoding the image --> ignoring alpha.

What does that mean and how can I resolve it? Xcode Version 16.0 (16A242d), iOS 18.1 (22B82)
3 replies · 6 boosts · 442 views · 4w
Linking to iTunesLibrary requires access every launch?
Hello, I have a command line application that uses iTunesLibrary to "save" the state of what I have listened to. I have it run every night via a LaunchAgent. You can see the source here: https://github.com/bolsinga/itunes_json

Prior to Sequoia it would run nightly. I'd just have to grant it access to the Music library once, and it would be fine thereafter. However, with Sequoia it requires UI interaction to grant it access every time. This makes it no longer run unattended overnight, defeating its purpose.

I have the console logs of when this happens. You can see it in my issue tracking it here: https://github.com/bolsinga/itunes_json/issues/410

One thing that makes me wonder is that it is a command line application, not a bundle. How do I make a command line application get access to MusicKit / iTunesLibrary, and keep it thereafter? I'd like to get my pre-Sequoia behavior back. I've filed FB15592660 too. I've granted it access to run in the background, as well as access to my Music library (please see attached screenshots).

AMPLibraryAgent 10:48:29.489944-0700 xpc Connection from framework client invalidated pid:57606 clientname:iTunesLibrary(itunes_json)
AMPLibraryAgent 10:48:29.492763-0700 service Unloading domains(14) for ClientID:iTunesLibrary(itunes_json)-1229 previous open:15 new open:1
itunes_json 10:48:59.980864-0700 connection [0x157f05800] activating connection: mach=true listener=false peer=false name=com.apple.amp.library.framework
tccd 10:48:59.982568-0700 access AUTHREQ_ATTRIBUTION: msgID=1795.214, attribution={accessing={TCCDProcess: identifier=itunes_json, pid=57652, auid=501, euid=501, binary_path=/Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json}, requesting={TCCDProcess: identifier=com.apple.AMPLibraryAgent, pid=1795, auid=501, euid=501, binary_path=/System/Library/PrivateFrameworks/AMPLibrary.framework/Versions/A/Support/AMPLibraryAgent}, },
tccd 10:48:59.982651-0700 access requestor: TCCDProcess: identifier=com.apple.AMPLibraryAgent, pid=1795, auid=501, euid=501, binary_path=/System/Library/PrivateFrameworks/AMPLibrary.framework/Versions/A/Support/AMPLibraryAgent is checking access for accessor TCCDProcess: identifier=itunes_json, pid=57652, auid=501, euid=501, binary_path=/Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json
tccd 10:48:59.995636-0700 access AUTHREQ_SUBJECT: msgID=1795.214, subject=/Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json,
tccd 10:48:59.996283-0700 access -[TCCDAccessIdentity staticCode]: static code for: identifier /Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json, type: 1: 0xc00341b00 at /Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json
tccd 10:49:00.018205-0700 access Failed to match existing code requirement for subject /Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json and service kTCCServiceMediaLibrary cdhash H"6bc380972f4df49b337a2a05308fb7b98fbe6473" or cdhash H"0708bcaabbfbab8770522050f7e2642d4d864f31" cdhash H"6bc380972f4df49b337a2a05308fb7b98fbe6473" or cdhash H"0708bcaabbfbab8770522050f7e2642d4d864f31"
tccd 10:49:00.018997-0700 access AUTHREQ_PROMPTING: msgID=1795.214, service=kTCCServiceMediaLibrary, subject=Sub:{/Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json}Resp:{TCCDProcess: identifier=itunes_json, pid=57652, auid=501, euid=501, binary_path=/Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json},
AMPLibraryAgent 10:49:02.489170-0700 xpc ampld> register framework ClientName:iTunesLibrary(itunes_json)
tccd 10:49:02.488189-0700 events Publishing <TCCDEvent: type=Create, service=kTCCServiceMediaLibrary, identifier_type=Path, identifier=/Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json> to 4 subscribers: { 633 = "<TCCDEventSubscriber: token=633, state=Initial, csid=(null)>"; 628 = "<TCCDEventSubscriber: token=628, state=Passed, csid=com.apple.chronod>"; 464 = "<TCCDEventSubscriber: token=464, state=Passed, csid=com.apple.cloudd>"; 513 = "<TCCDEventSubscriber: token=513, state=Passed, csid=com.apple.photolibraryd>"; }
AMPLibraryAgent 10:49:02.490391-0700 xpc ampld> registered framework ClientName:iTunesLibrary(itunes_json) with clientID:1230
itunes_json 10:49:02.792084-0700 connection [0x147e04340] activating connection: mach=true listener=false peer=false name=com.apple.amp.artworkd
itunes_json 10:49:02.801482-0700 <Missing Description> openDatabase 0xe4af30f4493e5ef5 artwork folder Y '<private>'
itunes_json 10:49:02.805087-0700 <Missing Description> openDatabase 0xf2db6e8d7672edc9 artwork folder Y '<private>'
itunes_json 10:49:02.806736-0700 <Missing Description> openDatabase 0xfb2acd898c951851 artwork folder Y '<private>'
itunes_json 10:49:02.813286-0700 <Missing Description> openDatabase 0xf0f4919c5ff0e88 artwork folder Y '<private>'
itunes_json 10:49:09.634928-0700 connection [0x600002b6a0d0] activating connection: mach=true listener=false peer=false name=com.apple.cfprefsd.daemon
itunes_json 10:49:09.635019-0700 connection [0x600002b78000] activating connection: mach=true listener=false peer=false name=com.apple.cfprefsd.agent
AMPLibraryAgent 10:49:12.382878-0700 xpc Connection from framework client invalidated pid:57652 clientname:iTunesLibrary(itunes_json)
AMPLibraryAgent 10:49:12.383474-0700 service Unloading domains(14) for ClientID:iTunesLibrary(itunes_json)-1230 previous open:15 new open:1

itunes_json.log
6 replies · 0 boosts · 340 views · 4w
Get audio volume from microphone
Hello. We are trying to get the audio volume from the microphone. We have 2 questions.

1. Can anyone tell me about AVAudioEngine.InputNode.volume? We expected it to return 0 in silence and a float value up to 1.0 depending on the volume, but it looks like 1.0 (the default value) is returned at all times. In which cases does it return 0.5 or 0? Sample code is below; the microphone works correctly.

// instance members
private var engine: AVAudioEngine!
private var node: AVAudioInputNode!

// start method
self.engine = .init()
self.node = engine.inputNode
engine.prepare()
try! engine.start()

// volume getter
print("\(self.node.volume)")

2. What is the best practice to get the audio volume from the microphone? Requirements: without AVAudioRecorder (we use the audio for streaming), and it should withstand high-frequency access.

Testing info: device iPhone XR, OS version iOS 18. Best regards.
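Regarding question 2, a minimal sketch of the usual approach (assuming an already-configured AVAudioSession): install a tap on the input node and compute the RMS level of each buffer yourself, since inputNode.volume is a mixing volume on the node rather than a measured input level.

import AVFoundation

let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)

// Compute an RMS level (roughly 0...1) for every captured buffer.
input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
    guard let channel = buffer.floatChannelData?[0] else { return }
    let frames = Int(buffer.frameLength)
    var sum: Float = 0
    for i in 0..<frames { sum += channel[i] * channel[i] }
    let rms = sqrt(sum / Float(max(frames, 1)))
    print("input level (RMS):", rms)
}

try engine.start()

Converting the RMS to decibels (20 * log10(rms)) is common if a dB-style meter is needed, and the tap callback is cheap enough to handle frequent access.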
2 replies · 0 boosts · 302 views · Oct ’24
Big ColorSync Bug? macOS Interprets the Rec. 709 Profile Differently Between 15 and 14
ColorSync throughout the system interprets the Rec. ITU-R BT.709-5 color profile differently in macOS 15 than it did on macOS 14, leading to severe color differences. This is a breaking, problem-causing color change. Here's a comparison of one way the results can appear different.

Steps to Reproduce
1. Open the image above (named here as sRGB_Bars.png) in ColorSync Utility. (This should be an sRGB-profiled image if the forums don't mess with that.) With Digital Color Meter open and set to "Display in sRGB" [0-255], you can see the gray bars progress left-to-right as 0, 23, 46, 69, 92, 115, etc. Those are the correct reference values.
2. At the bottom of the window in ColorSync use [Match to Profile] [Display -> Rec. ITU-R BT.709-5], then click the Apply button. You can verify the image now uses the Rec 709 profile with the (i) Get Info button in the toolbar.
3. Use "Save As…" to save the image with a different name.
4. Do the steps above on macOS 14 and macOS 15 to create two images: Rec709_CreatedOnMacOS14 and Rec709_CreatedOnMacOS15.
5. Compare the two images on BOTH operating system versions, and you'll see they are significantly different from each other.

Rec709_CreatedOnMacOS14.png when viewed on macOS 15 has gray values of [0, 1, 18, 43, 67, ...].
Rec709_CreatedOnMacOS15.png when viewed on macOS 15 has gray values of [0, 51, 72, 94, 115, ...].

These are MASSIVE differences.

Significance
This is not just a problem that affects ColorSync Utility or Preview, etc. The same gamma interpretation difference is affecting the Core Video, Core Image, and Video Toolbox pipelines as well. That gets complicated to talk about, and this example is the simplest I can boil it down to.

Conclusion
Significant bug? What's going on?
2 replies · 0 boosts · 260 views · Oct ’24
Video Hardware Acceleration on Mac
I observe significant performance differences when encoding a video in MP4 format (H.264). The code I use is standard (using AVAssetWriter, AVAssetWriterInput, ...). Here is what I notice when I run the same code on different platforms:

On an iPhone, the video is encoded in 3 seconds (iPhone 13, 14, 15, 16, Pro...).
On a Mac with an M2 Pro, the video is encoded in 50 seconds.
On a Mac with an Intel processor (2.3 GHz 18-core Intel Xeon W), the video is encoded in 2 minutes.

The encoding on an iPhone is very fast due to hardware acceleration. However, I don't understand why I don't get similar performance with the Mac M2 Pro, which has a dedicated component for hardware acceleration (the H.264 media engine). Is hardware acceleration disabled on a Mac?
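One way to narrow this down (a sketch, not a diagnosis of this specific project): create a VideoToolbox compression session that requires the hardware encoder and check whether it is actually in use. If the creation fails or the property reports false, the AVAssetWriter path is likely falling back to the software encoder on that Mac.

import VideoToolbox

// Sketch: check whether a hardware H.264 encoder is available and in use.
func hardwareH264EncoderInUse(width: Int32 = 1920, height: Int32 = 1080) -> Bool {
    var session: VTCompressionSession?
    let spec: [CFString: Any] = [
        kVTVideoEncoderSpecification_RequireHardwareAcceleratedVideoEncoder: true
    ]
    let status = VTCompressionSessionCreate(
        allocator: nil,
        width: width,
        height: height,
        codecType: kCMVideoCodecType_H264,
        encoderSpecification: spec as CFDictionary,
        imageBufferAttributes: nil,
        compressedDataAllocator: nil,
        outputCallback: nil,
        refcon: nil,
        compressionSessionOut: &session
    )
    guard status == noErr, let session else { return false }
    defer { VTCompressionSessionInvalidate(session) }

    var usingHardware: CFTypeRef?
    VTSessionCopyProperty(session,
                          key: kVTCompressionPropertyKey_UsingHardwareAcceleratedVideoEncoder,
                          allocator: nil,
                          valueOut: &usingHardware)
    return (usingHardware as? Bool) == true
}

If this reports that hardware is in use, the slowdown is more likely elsewhere in the pipeline (for example pixel-format conversions or how frames are fed to the writer input) than in the encoder itself.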
0 replies · 0 boosts · 187 views · Oct ’24
Swift fails to fetch Photo Albums Synced using Windows iTunes from External Drive
See Configuration Details at the end of this message. Despite numerous attempts, I have been unable to determine the correct syntax to fetch photo albums from my iPad Pro 13.0 using Xcode and Swift. All the photo albums were synced to the iPad Pro 13-inch using the latest version of Apple iTunes for Windows from an external Western Digital G-Drive hard drive (no iCloud). All synced albums appear under "From My Mac" on the iPad. I only want to access each album's photo and video count. See the sample code snippet below. I have tried multiple subtype options and album types without success. Zero albums are always returned despite having around 3900 albums in the iPad Pro 13.0 photo library. Authorization to the photo library does not appear to be the problem.

PHPhotoLibrary.requestAuthorization { status in
    if status == .authorized {
        let result = PHAssetCollection.fetchAssetCollections(with: .album, subtype: .any, options: nil)
        if result.count == 0 {
            print("No albums found.")
            return
        }
    }
}

Any help or suggestions would be greatly appreciated.

Configuration Details
iPad Pro 13-inch (M4) (iPad16,6)
iPadOS = 17.7
iCloud = Turned Off
iPad Pro Photo Library Albums = 3900
iPad Pro Photo Library Photos = 118000
iPad Pro Photo Library Videos = 4800
macOS = Sonoma 14.6.1
Xcode Version = 16.0
Swift Version = 5.0 (Xcode Default)
Microsoft Windows 10 Pro Version = 22H2
Apple iTunes for Windows = 12.13.4.4
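A sketch worth trying (hedged, since I can't verify it against this exact library): iTunes/Finder-synced albums are not regular user albums, so they may only show up when fetched with the .albumSyncedAlbum subtype rather than .album with .any.

import Photos

// Sketch: explicitly ask for iTunes/Finder-synced albums.
func fetchSyncedAlbums() {
    PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
        guard status == .authorized || status == .limited else { return }
        let synced = PHAssetCollection.fetchAssetCollections(with: .album,
                                                             subtype: .albumSyncedAlbum,
                                                             options: nil)
        synced.enumerateObjects { collection, _, _ in
            let assets = PHAsset.fetchAssets(in: collection, options: nil)
            print("\(collection.localizedTitle ?? "Untitled"): \(assets.count) items")
        }
    }
}

If only counts are needed, estimatedAssetCount on each collection is cheaper than a full asset fetch, though it can return NSNotFound for some album types.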
1 reply · 0 boosts · 145 views · Oct ’24
Upstream Service Error on Apple Music Feed API
Hi, I'm working on an integration with the Apple Music Feed, but over the last day I've been getting a 500 Upstream Service Error on 99% of the API calls I make. Remarkably, retrying the same endpoint 20-30 times sometimes gives the correct response, but mostly it's a 500 error. Just to give an example, this:

GET https://api.media.apple.com/v1/feed/album/latest

returns this generic error:

{
  "errors": [
    {
      "id": "U4ARRA2QDCGYKYRI2IPEJVBTHY",
      "title": "Upstream Service Error",
      "detail": "Call to get metadata for album feed failed",
      "status": "500",
      "code": "50001"
    }
  ]
}

The same goes for other feeds like song, artist, and so on. Before today, I did get the same error message sometimes, but a few retries would solve the issue. Any insight on what's happening and/or an ETA on fixing it? Thank you.
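Until the upstream issue is resolved, a small retry-with-backoff wrapper around the request can at least smooth over the intermittent 500s described above; this is a generic URLSession sketch, not anything specific to the Apple Music Feed API.

import Foundation

// Sketch: retry a request a few times with exponential backoff on HTTP 5xx.
func fetchWithRetry(_ request: URLRequest, maxAttempts: Int = 5) async throws -> Data {
    var delay: UInt64 = 500_000_000  // 0.5 s in nanoseconds
    for attempt in 1...maxAttempts {
        let (data, response) = try await URLSession.shared.data(for: request)
        let code = (response as? HTTPURLResponse)?.statusCode ?? 0
        if (200..<300).contains(code) { return data }
        guard (500..<600).contains(code), attempt < maxAttempts else {
            throw URLError(.badServerResponse)
        }
        try await Task.sleep(nanoseconds: delay)
        delay *= 2
    }
    throw URLError(.badServerResponse)
}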
0 replies · 0 boosts · 180 views · Oct ’24
SFCustomLanguageModelData.CustomPronunciation and X-SAMPA string conversion
Can anyone please guide me on how to use SFCustomLanguageModelData.CustomPronunciation? I am following the example from WWDC23: https://wwdcnotes.com/documentation/wwdcnotes/wwdc23-10101-customize-ondevice-speech-recognition/

When using this kind of custom pronunciation we need the X-SAMPA string of the specific word. There are tools available on the web to do this:

Word to IPA: https://openl.io/
IPA to X-SAMPA: https://tools.lgm.cl/xsampa.html

But these tools do not seem to produce the same kind of X-SAMPA strings used in the demo. For example, the demo converts "Winawer" to "w I n aU @r", while the online tools give "/wI"nA:w@r/".
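For reference, a sketch of how the WWDC23 session wires a custom pronunciation into the language model data (the identifier, locale, and export URL here are placeholders). The phonemes are space-separated X-SAMPA symbols without the surrounding /.../, stress marks, or length marks that IPA-oriented converters tend to emit, which is likely why the online tools' output looks different.

import Speech

// Sketch: build and export custom language model data containing a pronunciation.
func exportCustomModel(to url: URL) async throws {
    let data = SFCustomLanguageModelData(locale: Locale(identifier: "en_US"),
                                         identifier: "com.example.app.chess",
                                         version: "1.0") {
        // Space-separated X-SAMPA phonemes, as shown in the WWDC23 demo.
        SFCustomLanguageModelData.CustomPronunciation(grapheme: "Winawer",
                                                      phonemes: ["w I n aU @r"])
        SFCustomLanguageModelData.PhraseCount(phrase: "Play the Winawer variation", count: 10)
    }
    try await data.export(to: url)
}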
0 replies · 1 boost · 186 views · Oct ’24
Reality Composer file size and iOS 18
Hi folks: I've been creating .reality files out of Reality Composer for over a year. Some of the files are up to 500 MB and, prior to the last month, they opened fine as AR projected experiences on even basic iPhones and iPads. Now, I think since iOS 18, a 64 MB file will open as an AR experience, but files from about 350 MB up don't open. Files just opens a window displaying the name of the file, that it's a .reality file, and the file size, but it no longer opens into either an AR or Object display of the .reality scene. Has there been a new file size limit put on .reality files that Files will open, or what else is going on here? I have a client who was about to launch an experience based on a .reality file I can no longer open. Please help.
0 replies · 0 boosts · 176 views · Oct ’24
Some problems using AVCaptureControl
Set 3 controls on the AVCaptureSession and then remove them all. The number of controls in the session is indeed 0, but the Camera Control button still shows the previous 3 controls. If it is only 3->2 or 3->1, it updates normally; 3->0 does not work, while 0->3 works.

if (self.captureControl.zoom) {
    if (self.zoomScaleControl) {
        self.zoomScaleControl.enabled = false;
        [_session removeControl:self.zoomScaleControl];
    }
    AVCaptureSlider *zoomSlider = [self.captureControl.zoom fetchCaptureSlider];
    [zoomSlider setActionQueue:dispatch_get_main_queue() action:^(float zoomFactor) {
        @strongify(self);
        if ([self.dataOutputDelegate respondsToSelector:@selector(videoCaptureSession:tryChangeZoomScale:)]) {
            [self.dataOutputDelegate videoCaptureSession:self tryChangeZoomScale:zoomFactor];
        }
    }];
    self.zoomScaleControl = zoomSlider;
} else {
    if (self.zoomScaleControl) {
        self.zoomScaleControl.enabled = false;
        [_session removeControl:self.zoomScaleControl];
    }
    self.zoomScaleControl = nil;
}

if (self.captureControl.exposure) {
    if (self.exposureBiasControl) {
        self.exposureBiasControl.enabled = false;
        [_session removeControl:self.exposureBiasControl];
    }
    AVCaptureSlider *exposureSlider = [self.captureControl.exposure fetchCaptureSlider];
    [exposureSlider setActionQueue:dispatch_get_main_queue() action:^(float bias) {
        @strongify(self);
        if ([self.dataOutputDelegate respondsToSelector:@selector(videoCaptureSession:tryChangeExposureBias:)]) {
            [self.dataOutputDelegate videoCaptureSession:self tryChangeExposureBias:bias];
        }
    }];
    self.exposureBiasControl = exposureSlider;
} else {
    if (self.exposureBiasControl) {
        self.exposureBiasControl.enabled = false;
        [_session removeControl:self.exposureBiasControl];
    }
    self.exposureBiasControl = nil;
}

if (self.captureControl.len) {
    if (self.lenControl) {
        self.lenControl.enabled = false;
        [_session removeControl:self.lenControl];
    }
    ORLenCaptureControlCustomModel *len = self.captureControl.len;
    AVCaptureIndexPicker *picker = [len fetchCaptureSlider];
    [picker setActionQueue:dispatch_get_main_queue() action:^(NSInteger selectedIndex) {
        @strongify(self);
        if ([self.dataOutputDelegate respondsToSelector:@selector(videoCaptureSession:didChangeLenIndex:datas:)]) {
            [self.dataOutputDelegate videoCaptureSession:self didChangeLenIndex:selectedIndex datas:self.captureControl.len.indexDatas];
        }
    }];
    self.lenControl = picker;
} else {
    if (self.lenControl) {
        self.lenControl.enabled = false;
        [_session removeControl:self.lenControl];
    }
    self.lenControl = nil;
}

if ([_session canAddControl:self.zoomScaleControl]) {
    [_session addControl:self.zoomScaleControl];
} else {
    self.zoomScaleControl = nil;
}
if ([_session canAddControl:self.lenControl]) {
    [_session addControl:self.lenControl];
} else {
    self.lenControl = nil;
}
if ([_session canAddControl:self.exposureBiasControl]) {
    [_session addControl:self.exposureBiasControl];
} else {
    self.exposureBiasControl = nil;
}
if (_session.controlsDelegate == nil) {
    [_session setControlsDelegate:self queue:GetCaptureControlQueue()];
}
0 replies · 0 boosts · 181 views · Oct ’24