Dive into the world of video on Apple platforms, exploring ways to integrate video functionality within your iOS, iPadOS, macOS, tvOS, visionOS, or watchOS app.

Video Documentation


Accessing Events from Video Device
I have an intra-**** video device that's supported by Apple's AVCaptureDevice. I can use the AV classes to connect to the device and get video. However, this device has a button that's used to acquire still images from the video stream. I can't use IOUSBDeviceInterface to do an asynchronous read, because the Apple driver has the device opened exclusively. How do I go about receiving the button event in this scenario? I know which pipe to read, based on a bus-analyzer capture taken when I run this on Windows; I just need to know how to access that pipe when the device is opened by another process.
0 replies · 0 boosts · 39 views · 5h
Drawing shapes and interacting with them
I have an app that lets the user draw a circle or line over a live video feed. There's a view for circles and a view for lines. These are displayed in ContentView, and their position in the ZStack is altered based on the toolbar selection made by the user. Whichever view is on top of the stack is active and usable. That all works fine, but I want the user to be able to select any shape on screen, whether it is a circle or a line. My question is: what is the best practice for managing tool selection and drawing the shapes? I'm guessing I should draw all shapes onto one view, but I would like to avoid this because all my logic lives in each shape's respective view.
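One common approach is a single shape model that one canvas view both renders and hit-tests, while per-tool behavior hangs off the model rather than off separate views. A minimal sketch, with every type and property name invented for illustration (none of them come from the post):

import SwiftUI

// Hypothetical unified model: one array holds every drawn shape.
enum Tool { case circle, line }

struct DrawnShape: Identifiable {
    let id = UUID()
    var kind: Tool
    var start: CGPoint
    var end: CGPoint
}

struct CanvasView: View {
    @State private var shapes: [DrawnShape] = []
    @State private var selectedID: UUID?

    var body: some View {
        Canvas { context, _ in
            for shape in shapes {
                var path = Path()
                switch shape.kind {
                case .line:
                    path.move(to: shape.start)
                    path.addLine(to: shape.end)
                case .circle:
                    let r = hypot(shape.end.x - shape.start.x,
                                  shape.end.y - shape.start.y)
                    path.addEllipse(in: CGRect(x: shape.start.x - r,
                                               y: shape.start.y - r,
                                               width: 2 * r, height: 2 * r))
                }
                let isSelected = shape.id == selectedID
                context.stroke(path, with: .color(isSelected ? .yellow : .red),
                               lineWidth: 2)
            }
        }
        .gesture(SpatialTapGesture().onEnded { tap in
            // Hit-test every shape regardless of type, topmost first.
            selectedID = shapes.last(where: { $0.contains(tap.location) })?.id
        })
    }
}

extension DrawnShape {
    // Crude proximity test for the sketch; a real app would use
    // per-kind geometry (distance to the stroke, not to the endpoints).
    func contains(_ p: CGPoint) -> Bool {
        hypot(p.x - start.x, p.y - start.y) < 44 ||
        hypot(p.x - end.x, p.y - end.y) < 44
    }
}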
0 replies · 0 boosts · 93 views · 4d
Time Limit
Hi Team, when we select the Screen Time option and a daily screen limit has been reached, we have the option to tap Ignore Limit → Ignore Limit for Today. After selecting this option, if we watch videos via a third-party app (YouTube) at that moment, it hangs. After tapping the play button multiple times, the video starts to play but there is no sound; only after tapping the previous or forward button does the audio come back. Kindly look into this issue and address it. Thanks, Pemkumar S
0 replies · 0 boosts · 100 views · 1w
AVAssetExportSession is not working on iPhone 16 Pro Max
My app is live on the App Store. Users on iPhone 16 Pro Max are getting "Operation Stopped" while combining videos and audio, specifically on the iPhone 16 Pro Max only; on every other device it works fine. And when I add AVAssetExportPresetPassthrough, it is able to combine videos and audio, but it does not respect the encoding settings and the result has no audio.

NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:composition];
if ([compatiblePresets containsObject:AVAssetExportPresetHighestQuality]) {
    presetName = AVAssetExportPresetHighestQuality;
} else if ([compatiblePresets containsObject:AVAssetExportPreset1920x1080]) {
    presetName = AVAssetExportPreset1920x1080;
} else if ([compatiblePresets containsObject:AVAssetExportPreset1280x720]) {
    presetName = AVAssetExportPreset1280x720;
} else {
    presetName = AVAssetExportPresetPassthrough;
}
} else {
    presetName = AVAssetExportPreset1280x720;
}
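For context, here is a hedged Swift sketch of how the chosen preset would typically drive the export; the output URL, file type, and error handling are illustrative assumptions, not taken from the post:

import AVFoundation

// Sketch of running the export with the preset selected above.
// "Operation Stopped" would surface as session.error on failure.
func export(_ composition: AVAsset, presetName: String, to outputURL: URL) {
    guard let session = AVAssetExportSession(asset: composition,
                                             presetName: presetName) else { return }
    session.outputURL = outputURL
    session.outputFileType = .mp4
    session.exportAsynchronously {
        if session.status == .failed {
            print("Export failed: \(String(describing: session.error))")
        }
    }
}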
3 replies · 1 boost · 140 views · 1w
AVUnknown error using Camera Extensions in AVCaptureSession
I have a Mac Catalyst video conferencing app that streams video using AVCaptureMultiCamSession. Everything has been working well for me across a variety of scenarios and hardware, but recently I got a report that virtual cameras / camera extensions do not seem to work, which I can reproduce 100% of the time by using something like OBS's virtual camera. FaceTime and Photo Booth work fine with these virtual cameras.

Although my app can see and add the external AVCaptureDevice, I get an AVCaptureSessionRuntimeError posted when I start the session with a connection between the virtual camera and an AVCaptureVideoDataOutput (I don't get the error if I don't connect or add an output). The posted error is AVUnknown:

AVCaptureSessionRuntimeErrorNotification with Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12780), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x600001dcd680 {Error Domain=NSOSStatusErrorDomain Code=-12780 "(null)"}}

which doesn't tell me too much. I do see some fig assertions just above it in Console, though:

<<<< BWMultiStreamCameraSourceNode >>>> Fig assert: "err == 0 " at bail (BWMultiStreamCameraSourceNode.m:3964) - (err=-12780)
<<<< BWMultiStreamCameraSourceNode >>>> Fig assert: "err == 0 " at bail (BWMultiStreamCameraSourceNode.m:1591) - (err=-12780)
<<<< BWMultiStreamCameraSourceNode >>>> Fig assert: "err == 0 " at bail (BWMultiStreamCameraSourceNode.m:1418) - (err=-12780)
<<<< FigCaptureCameraSourcePipeline >>>> Fig assert: "err == 0 " at bail (FigCaptureCameraSourcePipeline.m:3572) - (err=-12780)
<<<< FigCaptureCameraSourcePipeline >>>> Fig assert: "err == 0 " at bail (FigCaptureCameraSourcePipeline.m:4518) - (err=-12780)
<<<< FigCaptureCameraSourcePipeline >>>> Fig assert: "err == 0 " at bail (FigCaptureCameraSourcePipeline.m:483) - (err=-12780)

I've verified the formats are sane (the usual 420v 1080p 30fps I have everywhere else) and that the data output functions, but I'm a bit stuck as to where to go from here. One thing that did stand out: in the AVCamBarcode example I can see the virtual camera in that app's preview layer, but if I create an AVCaptureVideoDataOutput and add it to the session in that example, it fails in what looks like exactly the same way my app does, with the same assertions. Does anyone have any advice? Thanks!
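For reference, a minimal repro sketch of the configuration described above; the device discovery and output wiring are assumptions based on the description, not the poster's actual code (.external requires macOS 14 / Mac Catalyst 17):

import AVFoundation

// Hedged repro sketch: virtual camera into a multi-cam session.
let session = AVCaptureMultiCamSession()
session.beginConfiguration()

let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.external], mediaType: .video, position: .unspecified)
guard let virtualCamera = discovery.devices.first,
      let input = try? AVCaptureDeviceInput(device: virtualCamera),
      session.canAddInput(input) else {
    fatalError("No usable virtual camera")
}
session.addInput(input)

// Reportedly, adding a data output is what triggers the -12780 runtime
// error; a preview layer alone works.
let output = AVCaptureVideoDataOutput()
if session.canAddOutput(output) {
    session.addOutput(output)
}

session.commitConfiguration()
session.startRunning()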
4 replies · 0 boosts · 151 views · 1w
AVFoundation error when making a window full screen
I am working on a macOS app that uses AVFoundation to record the screen. During a recording, if I make a window full screen, AVFoundation stops capturing screen frames (or captures them at a very slow rate). In my logs I get the following error:

Error Domain=AVFoundationErrorDomain Code=-11844

Note that I have had instances where I could not reproduce the error, but they were rare. The screen recording sometimes resumes normally if I switch desktops or minimize the full-screen window. Did anyone ever run across a similar issue, or know how to fix it?
1 reply · 0 boosts · 98 views · 1w
Recording A/V .mov file with SMPTE timecode
Hello, I used the following technical note to develop an app that records a .mov file with SMPTE timecode: https://developer.apple.com/library/archive/technotes/tn2310/_index.html

As a result, a timecode track is present within the .mov file (the other tracks are audio and video). Unfortunately, QuickTime Player doesn't display the timecode information. Analyzer tools like mediainfo, or an online service such as https://media-analyzer.pro/app, show that the timecode track has a null duration (and so no "time code of last frame").

Example of the TC track:

Other ID : 3
Type : Time code
Format : QuickTime TC
Frame rate : 60.000 FPS
Time code of first frame : 17:39:59:00
Time code, stripped : Yes
Title : Core Media Time Code
Encoded date : 2024-09-10 15:39:46 UTC
Tagged date : 2024-09-10 15:39:59 UTC

Example 2, the timecode track's header atom:

0000569562 Quicktime Timecode #0
00007f6b8a 'trak' Track atom #1
00007f6b92 'tkhd' Track header atom #2
size 92 (0x5C)
type 'tkhd' (hex 74 6B 68 64)
version 0
flags 15 (0xF)
creation_time 0xE30618C2, '2024-09-10 15:39:46'
modification_time 0xE30618CF, '2024-09-10 15:39:59'
track_ID 3
reserved 0
duration 0
reserved [0, 0]

In each case, the duration is read as null even though the recording's duration is more than 20 s.

STEPS TO REPRODUCE
1. Use AVAssetWriter for video and audio.
2. Create an AVAssetWriterInput for timecode and associate it with the video track.
3. Just before stopping the record, generate and append a sample buffer containing the SMPTE timecode.
4. Mark all tracks as finished before stopping the record with finishWritingWithCompletionHandler.
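For orientation, a hedged sketch of the timecode-input setup that TN2310 describes, assuming 30 fps non-drop timecode and existing writer / videoInput objects (those names are invented here, not from the post):

import AVFoundation
import CoreMedia

// Format description for a 'tmcd' (TimeCode32) track.
var timecodeDesc: CMTimeCodeFormatDescription?
CMTimeCodeFormatDescriptionCreate(
    allocator: kCFAllocatorDefault,
    timeCodeFormatType: kCMTimeCodeFormatType_TimeCode32,
    frameDuration: CMTime(value: 1, timescale: 30),
    frameQuanta: 30,
    flags: kCMTimeCodeFlag_24HourMax,
    extensions: nil,
    formatDescriptionOut: &timecodeDesc)

let timecodeInput = AVAssetWriterInput(mediaType: .timecode,
                                       outputSettings: nil,
                                       sourceFormatHint: timecodeDesc)
timecodeInput.expectsMediaDataInRealTime = true
if writer.canAdd(timecodeInput) {
    writer.add(timecodeInput)
    // The video track must carry a 'tmcd' reference to the timecode track.
    videoInput.addTrackAssociation(withTrackOf: timecodeInput,
                                   type: AVAssetTrack.AssociationType.timecode.rawValue)
}
// Per TN2310, the single timecode sample's timing should span the whole
// movie; a sample with zero duration would match the null track duration
// reported above.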
1 reply · 0 boosts · 115 views · 1w
RPScreenRecorder startCapture issues on generated file
Hello all, this is my first post on the developer forums. I am developing an app that records the screen of my app, using AVAssetWriter and RPScreenRecorder's startCapture. Everything is working as it should in most cases, but there are some seemingly random times when the generated file is only a few KB and is corrupted. There seems to be no pattern in the device or iOS version; it can happen on various phones and iOS versions. The steps I have followed to create the file are:

1. Configuring the AVAssetWriter:

videoAssetWriter = try? AVAssetWriter(outputURL: url!, fileType: AVFileType.mp4)
let size = UIScreen.main.bounds.size
let width = (Int(size.width / 4)) * 4
let height = (Int(size.height / 4)) * 4
let videoOutputSettings: Dictionary<String, Any> = [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: width,
    AVVideoHeightKey: height
]
videoInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: videoOutputSettings)
videoInput?.expectsMediaDataInRealTime = true
guard let videoInput = videoInput else { return }
if videoAssetWriter?.canAdd(videoInput) ?? false {
    videoAssetWriter?.add(videoInput)
}

let audioInputSettings = [
    AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
    AVSampleRateKey: 12000,
    AVNumberOfChannelsKey: 1,
    AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
]
audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioInputSettings)
audioInput?.expectsMediaDataInRealTime = true
guard let audioInput = audioInput else { return }
if videoAssetWriter?.canAdd(audioInput) ?? false {
    videoAssetWriter?.add(audioInput)
}

The urlForVideo() function returns the URL in the documents directory, after appending and creating the folders needed. This part seems to be working as it should, as the directories are created and the video file exists in them.

2. Starting the recording:

if RPScreenRecorder.shared().isRecording { return }
RPScreenRecorder.shared().startCapture(handler: { [weak self] sample, bufferType, error in
    if let error = error {
        onError?(error.localizedDescription)
    } else {
        if !RPScreenRecorder.shared().isMicrophoneEnabled {
            RPScreenRecorder.shared().stopCapture { error in
                if let error = error { return }
            }
            onError?("Microphone was not enabled")
        } else {
            succesCompletion?()
            succesCompletion = nil
            self?.processSampleBuffer(sample, with: bufferType)
        }
    }
}) { error in
    if let error = error {
        onError?(error.localizedDescription)
    }
}

3. Processing the sample buffers:

guard CMSampleBufferDataIsReady(sampleBuffer) else { return }
DispatchQueue.main.async { [weak self] in
    switch sampleBufferType {
    case .video:
        self?.handleVideoBuffer(sampleBuffer)
    case .audioMic:
        self?.add(sample: sampleBuffer, to: self?.audioInput)
    default:
        break
    }
}

// The add function from above
fileprivate func add(sample: CMSampleBuffer, to writerInput: AVAssetWriterInput?) {
    if writerInput?.isReadyForMoreMediaData ?? false {
        writerInput?.append(sample)
    }
}

// The handleVideoBuffer function from above
fileprivate func handleVideoBuffer(_ sampleBuffer: CMSampleBuffer) {
    if self.videoAssetWriter?.status == AVAssetWriter.Status.unknown {
        self.videoAssetWriter?.startWriting()
        self.videoAssetWriter?.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
    } else {
        if (self.videoInput?.isReadyForMoreMediaData) ?? false {
            if self.videoAssetWriter?.status == AVAssetWriter.Status.writing {
                self.videoInput?.append(sampleBuffer)
            }
        }
    }
}

4. Finally, stopping the recording:

func stopRecording(completion: @escaping (URL?, URL?, Error?) -> Void) {
    RPScreenRecorder.shared().stopCapture { error in
        if let error = error {
            completion(nil, nil, error)
            return
        }
        self.finish { videoURL, _ in
            completion(videoURL, nil, nil)
        }
    }
}

// The finish function mentioned above
fileprivate func finish(completion: @escaping (URL?, URL?) -> Void) {
    let dispatchGroup = DispatchGroup()
    dispatchGroup.enter()
    finishRecordVideo {
        dispatchGroup.leave()
    }
    dispatchGroup.notify(queue: .main) {
        print("Finish with url: \(String(describing: self.urlForVideo()))")
        completion(self.urlForVideo(), nil)
    }
}

// The finishRecordVideo mentioned above
fileprivate func finishRecordVideo(completion: @escaping () -> Void) {
    videoInput?.markAsFinished()
    audioInput?.markAsFinished()
    videoAssetWriter?.finishWriting {
        if let writer = self.videoAssetWriter {
            if writer.status == .completed {
                completion()
            } else if writer.status == .failed {
                // Print the error to find out what went wrong
                if let error = writer.error {
                    print("Video asset writing failed with error: \(error.localizedDescription). Url: \(writer.outputURL.path)")
                } else {
                    print("Video asset writing failed, but no error description available.")
                }
                completion()
            } else {
                completion()
            }
        }
    }
}

What could be the reason for the corrupted files? This issue has never happened on my own devices, so there is no way to debug it using Xcode. Also, there are no errors popping up in the logs. Can you spot anything in the code that could cause this kind of issue? Do you have any suggestions about the problem at hand? Thanks
0 replies · 0 boosts · 101 views · 1w
AVPlayer.replaceCurrentItem(with:) "Incorrect actor executor assumption" runtime crash when building for iOS 18
Hi there, I have some code that's been working fine for the last few versions of iOS and macOS and all the others, and now causes a runtime crash in iOS 18/macOS 15 etc. I have an actor called Player which is basically a big wrapper around an AVPlayer. It all gets compiled down to a framework, and my clients use it by dropping it into their video player app code. It handles everything needed for them to be able to talk to our media infrastructure, and it handles telemetry. It has its own property called avplayer, which is an AVPlayer that gets created at init(). It has a function called load(_ avPlayerItem: AVPlayerItem) which the clients use to load a new video into the player. The offending code (which used to work!) looks like this:

Task { @MainActor in
    avplayer.replaceCurrentItem(with: avPlayerItem)
}

There are no warnings in Xcode. When you run it, it crashes on iOS 18 and macOS 15 with this error in the debugger:

Incorrect actor executor assumption

I thought, "Okay, well maybe replaceCurrentItem has changed and doesn't need to be on the main actor anymore," but even if you call this outside of a Main Actor-scoped task:

avplayer.replaceCurrentItem(with: avPlayerItem)

...it still crashes the exact same way. Does anyone have any ideas? I'm under some heavy pressure here to get this working and I don't even know where to start with this. Big thanks in advance.
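For comparison, a minimal sketch of one alternative structure that keeps every AVPlayer access on the main actor; this is a judgment call on structure, not a confirmed fix for the iOS 18 behavior:

import AVFoundation

// Hedged sketch: isolate the whole wrapper to the main actor instead of
// a custom actor, so replaceCurrentItem(with:) never crosses executors.
@MainActor final class Player {
    private let avplayer = AVPlayer()

    func load(_ avPlayerItem: AVPlayerItem) {
        avplayer.replaceCurrentItem(with: avPlayerItem)
    }
}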
4 replies · 0 boosts · 219 views · 1w
Progress tracking does not work after the download has been paused and resumed
Hello, I recently started integrating HLS downloads into my application by using AVAssetDownloadTask and AVAssetDownloadConfiguration. I took an example from the documentation as a basis, with only one small difference: the minimum target for my application is iOS 16, so I replaced urlSession(_:assetDownloadTask:willDownloadTo:) with urlSession(_:assetDownloadTask:didFinishDownloadingTo:). And I encountered the following issue: after pausing a download and resuming it later, the progress no longer functions as expected. Could you please help me with this? What are the right approaches to implementing pause and progress tracking?

Some details: I used devices with iOS 16.0.2 and 17.6.1 for testing. There was no code in the example that pauses the download and resumes it, so I used the suspend and resume methods to do this. I have also tried to track download progress using two different approaches:

1. Using task.progress.observe(\.fractionCompleted) { ... }, which was presented in the example. In this scenario, after a pause, the observation callback will only be called once, when the download has completed, despite the fact that data is being successfully downloaded over the network.

2. Using urlSession(_:assetDownloadTask:didLoad:totalTimeRangesLoaded:timeRangeExpectedToLoad:) and calculating progress as totalTimeRangesLoaded.reduce(0.0) { $0 + CMTimeGetSeconds($1.timeRangeValue.duration) / CMTimeGetSeconds(timeRangeExpectedToLoad.duration) }. In this scenario, I have noticed that the result of the calculation does not always increase; sometimes there are outliers. Example of logs: 68%, 69%, 70%, 72%, 63%, 65%, 66%, 69%, 70%, 71%, 72%. Such fluctuations are most easily reproduced when I try to resume the download after a pause, though sometimes they occur spontaneously. It's important to mention that this method is marked as deprecated, perhaps for this reason.

In both cases the download succeeds; the problem is with progress reporting only. Full version of the code can be found here.
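For reference, a minimal sketch of the pause/resume flow with the KVO observation from the first approach; the download task setup is assumed to follow the documentation example:

import AVFoundation

// Hedged sketch: observing fractionCompleted across a suspend/resume cycle.
func trackProgress(of task: AVAssetDownloadTask) -> NSKeyValueObservation {
    let observation = task.progress.observe(\.fractionCompleted) { progress, _ in
        print("Downloaded \(Int(progress.fractionCompleted * 100))%")
    }
    task.resume()

    // Pause, then resume later; per the report above, the callback then
    // fires only once more, at completion.
    task.suspend()
    DispatchQueue.main.asyncAfter(deadline: .now() + 10) {
        task.resume()
    }
    return observation  // keep a strong reference or the observation stops
}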
0 replies · 0 boosts · 85 views · 1w
How to generate thumbnails for protected content using AVAssetImageGenerator?
I have a FairPlay-encrypted HLS stream and play the video in an AVPlayer. I want to generate scrubbing thumbnails using AVAssetImageGenerator. I am able to generate thumbnails for clear streams, but I get errors for protected content. How can I generate thumbnails for protected content?

func getImageThumbnail(forTime: CMTime) {
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    generator.cancelAllCGImageGeneration()
    generator.generateCGImagesAsynchronously(forTimes: [NSValue(time: forTime)]) { [weak self] requestedTime, image, actualTime, result, error in
        if let error = error {
            print("Error generate: \(error.localizedDescription)")
            return
        }
        if let image = image {
            DispatchQueue.main.async {
                let imageData = UIImage(cgImage: image).jpegData(compressionQuality: 1.0)
                self?.playerImg.image = UIImage(data: imageData!)
            }
        }
    }
}
1 reply · 0 boosts · 92 views · 1w
iOS 18: AVPlayerViewController causes a stuck UI
- (AVPlayerViewController *)avPlayerVC {
    if (!_avPlayerVC) {
        _avPlayerVC = [[AVPlayerViewController alloc] init];
        _avPlayerVC.videoGravity = AVLayerVideoGravityResizeAspectFill;
        _avPlayerVC.showsPlaybackControls = NO;
        [self addSubview:_avPlayerVC.view];
        [_avPlayerVC.view mas_makeConstraints:^(MASConstraintMaker *make) {
            make.edges.mas_equalTo(0);
        }];
        [self sendSubviewToBack:_avPlayerVC.view];
    }
    return _avPlayerVC;
}

I add this in a cell, and the UI becomes unresponsive. This only happens on iOS 18.
2 replies · 1 boost · 166 views · 1w
iOS 18 and Xcode 16: using AVPlayer prints lots of warning logs
<<<< FigPlayerInterstitial >>>> fpic_ServiceCurrentEvent signalled err=-15671 (kFigPlayerInterstitialError_ClientReleased) (no primary) at FigPlayerInterstitialCoordinator.m:7885
<<<< FigPlayerInterstitial >>>> fpic_ServiceCurrentEvent signalled err=-15671 (kFigPlayerInterstitialError_ClientReleased) (no primary) at FigPlayerInterstitialCoordinator.m:7885
<<<< FigPlayerInterstitial >>>> fpic_ServiceCurrentEvent signalled err=-15671 (kFigPlayerInterstitialError_ClientReleased) (no primary) at FigPlayerInterstitialCoordinator.m:7885
<<<< FigPlayerInterstitial >>>> fpic_ServiceCurrentEvent signalled err=-15671 (kFigPlayerInterstitialError_ClientReleased) (no primary) at FigPlayerInterstitialCoordinator.m:7885
<<<< FigPlayerInterstitial >>>> fpic_ServiceCurrentEvent signalled err=-15671 (kFigPlayerInterstitialError_ClientReleased) (no primary) at FigPlayerInterstitialCoordinator.m:7885

My project uses AVPlayer (AVPlayerViewController) to play video. There are continuous warning logs like the above while playing, and when the player is deallocated it prints the information below:

<<<< PlayerRemoteXPC >>>> remoteXPCItem_handleSetProperty signalled err=-12860 (kFigPlayerError_ParamErr) (propertyValue should be MTAudioProcessingTap) at FigPlayer_RemoteXPC.m:2760

This only happens on iOS 18, and I have no idea why. I can't find any information on FigPlayerInterstitial or the rest.
1 reply · 2 boosts · 147 views · 1w
Main thread blocked when calling the pause method of AVPlayer
iPhone 11, iOS 17.5.1

Main thread:
libsystem_kernel.dylib___ulock_wait (in libsystem_kernel.dylib) +8
libdispatch.dylib__dlock_wait (in libdispatch.dylib) +52
libdispatch.dylib__dispatch_thread_event_wait_slow (in libdispatch.dylib) +52
libdispatch.dylib___DISPATCH_WAIT_FOR_QUEUE__ (in libdispatch.dylib) +364
libdispatch.dylib__dispatch_sync_f_slow (in libdispatch.dylib) +144
MediaToolbox_fpic_CopyCurrentEvent (in MediaToolbox) +132
AVFCore___104-[AVPlayer _setRate:withVolumeRampDuration:playImmediately:rateChangeReason:affectsCoordinatedPlayback:]_block_invoke_2 (in AVFCore) +244
AVFCore-[AVPlayer _setRate:withVolumeRampDuration:playImmediately:rateChangeReason:affectsCoordinatedPlayback:] (in AVFCore) +276
AVFCore-[AVPlayer setRate:] (in AVFCore) +56
(call to AVPlayer pause)

Thread 81 name: fpic-sync
libsystem_kernel.dylib___ulock_wait (in libsystem_kernel.dylib) +8
libdispatch.dylib__dlock_wait (in libdispatch.dylib) +52
libdispatch.dylib__dispatch_thread_event_wait_slow (in libdispatch.dylib) +52
libdispatch.dylib___DISPATCH_WAIT_FOR_QUEUE__ (in libdispatch.dylib) +364
libdispatch.dylib__dispatch_sync_f_slow (in libdispatch.dylib) +144
MediaToolbox_itemasync_CopyProperty (in MediaToolbox) +588
MediaToolbox_fpic_CurrentItemMoment (in MediaToolbox) +184
MediaToolbox___fpic_EstablishCurrentEventForCurrentItem_block_invoke (in MediaToolbox) +136
libdispatch.dylib__dispatch_client_callout (in libdispatch.dylib) +16
libdispatch.dylib__dispatch_lane_barrier_sync_invoke_and_complete (in libdispatch.dylib) +52
MediaToolbox_fpic_ServiceCurrentEvent (in MediaToolbox) +600
MediaToolbox___fpic_NotifyServiceCurrentEvent_block_invoke (in MediaToolbox) +912
libdispatch.dylib__dispatch_call_block_and_release (in libdispatch.dylib) +28
libdispatch.dylib__dispatch_client_callout (in libdispatch.dylib) +16
libdispatch.dylib__dispatch_lane_serial_drain (in libdispatch.dylib) +744
libdispatch.dylib__dispatch_lane_invoke (in libdispatch.dylib) +428
libdispatch.dylib__dispatch_root_queue_drain (in libdispatch.dylib) +388
libdispatch.dylib__dispatch_worker_thread (in libdispatch.dylib) +256
libsystem_pthread.dylib__pthread_start (in libsystem_pthread.dylib) +132
libsystem_pthread.dylib_thread_start (in libsystem_pthread.dylib) +4

Thread 93 name: com.apple.coremedia.player.async.0x303c60240.P/GR
libsystem_kernel.dylib_mach_msg2_trap (in libsystem_kernel.dylib) +8
libsystem_kernel.dylib_mach_msg2_internal (in libsystem_kernel.dylib) +76
libsystem_kernel.dylib_mach_msg_overwrite (in libsystem_kernel.dylib) +432
libsystem_kernel.dylib_mach_msg (in libsystem_kernel.dylib) +20
libdispatch.dylib__dispatch_mach_send_and_wait_for_reply (in libdispatch.dylib) +540
libdispatch.dylib_dispatch_mach_send_with_result_and_wait_for_reply (in libdispatch.dylib) +56
libxpc.dylib_xpc_connection_send_message_with_reply_sync (in libxpc.dylib) +260
CoreMedia_FigXPCConnectionSendSyncMessageCreatingReply (in CoreMedia) +288
CoreMedia_FigXPCRemoteClientSendSyncMessageCreatingReply (in CoreMedia) +44
MediaToolbox_remoteXPCPlayer_SetRateWithOptions (in MediaToolbox) +148
MediaToolbox_playerasync_runOneCommand (in MediaToolbox) +768
MediaToolbox_playerasync_runAsynchronousCommandOnQueue (in MediaToolbox) +180
libdispatch.dylib__dispatch_client_callout (in libdispatch.dylib) +16
libdispatch.dylib__dispatch_lane_serial_drain (in libdispatch.dylib) +744
libdispatch.dylib__dispatch_lane_invoke (in libdispatch.dylib) +428
libdispatch.dylib__dispatch_root_queue_drain (in libdispatch.dylib) +388
libdispatch.dylib__dispatch_worker_thread (in libdispatch.dylib) +256
libsystem_pthread.dylib__pthread_start (in libsystem_pthread.dylib) +132
libsystem_pthread.dylib_thread_start (in libsystem_pthread.dylib) +4
1 reply · 0 boosts · 144 views · 1w
fcpxml asset-clip "tcFormat" attribute question
I'm trying to create code to generate an fcpxml file so I can automate Final Cut Pro timeline (project) creation. Here's an XML element that FCP successfully imports (and successfully creates a project/timeline from):

<project name="2013-08-09 19_23_07 (id).mov">
    <sequence format="r1">
        <spine>
            <asset-clip ref="r2" offset="0s" name="2013-08-09 19_23_07 (id).mov" start="146173027/60000s" duration="871871/60000s" tcFormat="DF" audioRole="dialogue"></asset-clip>
        </spine>
    </sequence>
</project>

The XML element example above was generated by exporting a simple timeline with a single clip. The problem I'm having is that the media asset has timecode, which gives a start time in relation to that timecode. When I try to remove the timecode attributes and change the start time to "0s":

<asset-clip ref="r2" offset="0s" name="2013-08-09 19_23_07 (id).mov" start="0s" duration="871871/60000s" audioRole="dialogue"></asset-clip>

FCP complains with the import error:

2013-08-09 19_23_07 (id).fcpxml Invalid edit with no respective media. (/fcpxml[1]/project[1]/sequence[1]/spine[1]/asset-clip[1])

I guess the question is: does AVAsset provide a way to get the timecode information and the timecode-based start offset, or is there a way to tell FCP to use a default start time independent of timecode?
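On the AVAsset side, a timecode track can be inspected directly; a hedged sketch, assuming a local movie file (movieURL is an invented name) and using force-unwraps only for brevity:

import AVFoundation

// Hedged sketch: reading the timecode track that carries the start offset.
let asset = AVAsset(url: movieURL)
guard let tmcdTrack = asset.tracks(withMediaType: .timecode).first else {
    fatalError("No timecode track")
}

// The frame duration (and drop-frame flags) live in the format description.
let desc = tmcdTrack.formatDescriptions.first as! CMTimeCodeFormatDescription
let frameDuration = CMTimeCodeFormatDescriptionGetFrameDuration(desc)

// TimeCode32 stores the start as one big-endian Int32 frame number;
// reading it requires an AVAssetReader on this track.
let reader = try! AVAssetReader(asset: asset)
let output = AVAssetReaderTrackOutput(track: tmcdTrack, outputSettings: nil)
reader.add(output)
reader.startReading()
if let sample = output.copyNextSampleBuffer(),
   let block = CMSampleBufferGetDataBuffer(sample) {
    var raw: Int32 = 0
    CMBlockBufferCopyDataBytes(block, atOffset: 0,
                               dataLength: MemoryLayout<Int32>.size,
                               destination: &raw)
    let startFrame = Int32(bigEndian: raw)
    print("Start frame:", startFrame, "frame duration:", frameDuration)
}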
0 replies · 0 boosts · 120 views · 2w
Floating point value support in CMFormatDescriptionExtension
I updated macOS to 15.0 yesterday, and I found that some floating-point value support in CMFormatDescriptionExtensions and CVPixelBuffer attachments seems to be broken. When I call CMSampleBufferCreateReadyWithImageBuffer() from a CVPixelBuffer, macOS 15.0 always fails with floating-point values.

a. kCMFormatDescriptionExtension_GammaLevel

Previous macOS 14.x works with a double value like:

NSString* keyGamma = (__bridge NSString*)kCMFormatDescriptionExtension_GammaLevel;
extensions[keyGamma] = @(2.2);

b. kCMFormatDescriptionExtension_CleanAperture

I am not sure yet, but the non-integer value issue also seems to apply to CleanAperture:

kCMFormatDescriptionKey_CleanApertureWidth
kCMFormatDescriptionKey_CleanApertureHeight
kCMFormatDescriptionKey_CleanApertureHorizontalOffset
kCMFormatDescriptionKey_CleanApertureVerticalOffset

Also, when I add rational values to extensions, it cannot pass CMVideoFormatDescriptionMatchesImageBuffer() with:

kCMFormatDescriptionKey_CleanApertureWidthRational
kCMFormatDescriptionKey_CleanApertureHeightRational
kCMFormatDescriptionKey_CleanApertureHorizontalOffsetRational
kCMFormatDescriptionKey_CleanApertureVerticalOffsetRational

Is there any known workaround?
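A hedged Swift sketch of the failing check, assuming a 1920x1080 '2vuy' pixel buffer (pixelBuffer) created elsewhere; the dimensions and codec type are illustrative:

import CoreMedia
import CoreVideo

// Build a format description carrying a floating-point extension,
// then compare it against the pixel buffer.
let extensions: [CFString: Any] = [
    kCMFormatDescriptionExtension_GammaLevel: 2.2  // floating-point value
]
var desc: CMVideoFormatDescription?
CMVideoFormatDescriptionCreate(allocator: kCFAllocatorDefault,
                               codecType: kCMVideoCodecType_422YpCbCr8,
                               width: 1920, height: 1080,
                               extensions: extensions as CFDictionary,
                               formatDescriptionOut: &desc)
if let desc {
    // Per the report above, this reportedly returns false on macOS 15.0
    // when floating-point extensions are present, but true on macOS 14.x.
    let matches = CMVideoFormatDescriptionMatchesImageBuffer(desc, imageBuffer: pixelBuffer)
    print("matches:", matches)
}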
0 replies · 0 boosts · 143 views · 2w
Alternative for crashing API MPMediaItemArtwork
When setting the now-playing info for playing media in MPNowPlayingInfoCenter, we can set artwork. But it seems the Apple API for creating the artwork is crashing on iOS 18 (FB15145734). On iOS 17 this gave the warning that the completion handler was not run on the main thread. I've tried to seek help here: https://stackoverflow.com/questions/78989543/swift-data-race-with-appkit-mpmediaitemartwork-function/78990231?noredirect=1#comment139277425_78990231 but it seems that it's not possible to override the completion handler, and therefore it's up to Apple to fix this issue.

.task {
    await MainActor.run {
        let nowPlayingInfoCenter = MPNowPlayingInfoCenter.default()
        var nowPlayingInfo = [String: Any]()
        let image = NSImage(named: "image")!

        // warning: data race detected: @MainActor function at
        // MPMediaItemArtwork/ContentView.swift:22 was not called on the main thread
        nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: image.size, requestHandler: { _ in
            // Not on main thread here!
            return image
        })
        nowPlayingInfoCenter.nowPlayingInfo = nowPlayingInfo
    }
}

I'm wondering if there is an alternative method to set the now-playing artwork?
2 replies · 0 boosts · 179 views · 2w