Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.


AVFoundation - Access to each eye of spatial video (MV-HEVC)
Hi, I am looking to display some spatial video content captured on iPhone 15 Pros in a side-by-side format. I've read the HEVC Stereo Video Profile provided by Apple, but I am confused about how to access the left- and right-eye video. Looking at the AVAsset track information, there is one video track, one audio track, and three metadata tracks. Apple's document references the eyes as layers, but I am unsure how to access them. Could anyone provide some guidance on accessing them? Thanks, Will
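Editor's note, not from the original post: with the iOS 17 / macOS 14 SDKs, AVAssetReaderTrackOutput can be asked to decode multiple MV-HEVC layers by passing the requested layer IDs through its decompression properties; each decoded sample then carries one tagged buffer per eye. A minimal sketch follows; the layer IDs [0, 1] are an assumption (real code should read them from the track's format description), and the tagged-buffer accessors are written from memory of the SDK and should be checked against the headers.

```swift
import AVFoundation
import VideoToolbox

// Sketch: decode both eyes of an MV-HEVC (spatial) video.
// Layer IDs [0, 1] are assumed here; read them from the video track's
// format description in production code.
func readStereoFrames(from url: URL) async throws {
    let asset = AVURLAsset(url: url)
    guard let videoTrack = try await asset.loadTracks(withMediaType: .video).first else { return }

    let outputSettings: [String: Any] = [
        AVVideoDecompressionPropertiesKey: [
            kVTDecompressionPropertyKey_RequestedMVHEVCVideoLayerIDs: [0, 1]
        ]
    ]
    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: outputSettings)
    reader.add(output)
    reader.startReading()

    while let sample = output.copyNextSampleBuffer() {
        // With multiple layers requested, each decoded sample carries one
        // pixel buffer per eye as a tagged buffer (CoreMedia, iOS 17+).
        if let taggedBuffers = sample.taggedBuffers {
            for tagged in taggedBuffers {
                // tagged.tags identifies the eye (a stereoView tag for left
                // or right); route each frame to its side of the display.
                print(tagged.tags)
            }
        }
    }
}
```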
0 replies · 0 boosts · 614 views · Jan ’24
ReplayKit broadcast finishing unexpectedly: "Attempted to start an invalid broadcast session"
I'm currently working on a live screen broadcasting app which allows users to record their screen and save an MP4 video. I write the video file with AVAssetWriter, and it works fine. But when there is 1-2 GB of storage space remaining on the device, errors such as "Attempted to start an invalid broadcast session" frequently occur, and the video file cannot be played because assetWriter.finishWriting() is never called. This occurs on these devices: iPhone SE 3, iPhone 12 Pro Max, iPhone 13, iPad 19, iPad Air 5. I have tried setting AVAssetWriter's movieFragmentInterval to write movie fragments, and setting shouldOptimizeForNetworkUse to true/false, but it doesn't help; the video still cannot be played. I want to know how to observe or catch this error. Thanks!
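Editor's note, not from the original post: broadcast upload extensions run under tight resource limits and are terminated when the disk is nearly full, so one defensive approach is to check the volume's remaining capacity before appending samples and finalize the file early, while finishWriting() can still succeed. A minimal sketch under that assumption; the 500 MB threshold is arbitrary:

```swift
import AVFoundation

// Sketch: stop recording gracefully when free space runs low, so that
// assetWriter.finishWriting() is still called and the MP4 stays playable.
func availableCapacityBytes() -> Int64? {
    let home = URL(fileURLWithPath: NSHomeDirectory())
    let values = try? home.resourceValues(forKeys: [.volumeAvailableCapacityForImportantUsageKey])
    return values?.volumeAvailableCapacityForImportantUsage
}

func append(_ sample: CMSampleBuffer, to input: AVAssetWriterInput, writer: AVAssetWriter) {
    if writer.status == .failed {
        // Disk-full and other I/O failures surface here rather than as thrown errors.
        print("Writer failed: \(String(describing: writer.error))")
        return
    }
    if let free = availableCapacityBytes(), free < 500_000_000 {
        input.markAsFinished()
        writer.finishWriting {
            print("Stopped early; finalized: \(writer.status == .completed)")
        }
        return
    }
    if input.isReadyForMoreMediaData {
        input.append(sample)
    }
}
```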
0 replies · 0 boosts · 543 views · Jan ’24
PHPhotoLibrary.cloudIdentifierMappings(forLocalIdentifiers:) crashes
Hello, I am experiencing an unexpected issue related to PhotosKit, where an occasional crash is occurring in an area where it shouldn't. I've provided the crash log below for reference:

```
Last Exception Backtrace:
0  CoreFoundation   0x1aa050860 __exceptionPreprocess + 164
1  libobjc.A.dylib  0x1a2363c80 objc_exception_throw + 60
2  CoreFoundation   0x1a9f68218 -[__NSDictionaryM setObject:forKeyedSubscript:] + 1184
3  Photos           0x1c11cb178 -[PHCloudIdentifierLookup _lookupCodeSpecificCloudIdentifierStrings:forIdentifierCode:] + 572
4  Photos           0x1c11cb278 -[PHCloudIdentifierLookup _lookupLocalIdentifiersForIdentifierCode:codeSpecificCloudIdentifierStrings:] + 92
5  Photos           0x1c11cbbec __69-[PHCloudIdentifierLookup lookupLocalIdentifiersForCloudIdentifiers:]_block_invoke + 88
6  CoreFoundation   0x1a9f67d70 NSDICTIONARY_IS_CALLING_OUT_TO_A_BLOCK + 24
7  CoreFoundation   0x1a9f67bec -[__NSDictionaryM enumerateKeysAndObjectsWithOptions:usingBlock:] + 288
8  Photos           0x1c11cbb40 -[PHCloudIdentifierLookup lookupLocalIdentifiersForCloudIdentifiers:] + 488
9  Photos           0x1c125e6e4 -[PHPhotoLibrary(CloudIdentifiers) localIdentifierMappingsForCloudIdentifiers:] + 96
10 Photos           0x1c1104f18 0x1c1101000 + 16152

"exceptionReason" : {"arguments":["-[__NSDictionaryM setObject:forKeyedSubscript:]"],"format_string":"*** %s: key cannot be nil","name":"NSInvalidArgumentException","type":"objc-exception","composed_message":"*** -[__NSDictionaryM setObject:forKeyedSubscript:]: key cannot be nil","class":"NSException"}
```

The crash can be recreated by calling PHPhotoLibrary.cloudIdentifierMappings(forLocalIdentifiers:) with random identifiers, as demonstrated below:

```swift
let cloudIdentifiers = (0...10).map { _ in PHCloudIdentifier(stringValue: UUID().uuidString) }
let localIdsMap = PHPhotoLibrary.shared().localIdentifierMappings(for: cloudIdentifiers)
```

I've noticed that the method seems to crash consistently when an incorrect cloud identifier is passed in. This is surprising to me, since the return type is [PHCloudIdentifier : Result<String, Error>], so I was anticipating an explicit error. The issue can be reproduced with both Xcode 15.0 & iOS 17.2 and Xcode 14.3.1 & iOS 16.0. While I'm fairly certain that only valid identifiers are used in my app, I would still like to know: is there a way to validate a cloud identifier before calling this method?
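Editor's note: the documented contract for this API is that unknown identifiers come back as .failure entries (PHPhotosError.Code.identifierNotFound) rather than crashing, so the sketch below shows the intended handling. It will not prevent the NSInvalidArgumentException above for malformed identifier strings; that looks like a framework bug worth a Feedback report.

```swift
import Photos

// Sketch of the documented error handling for cloud-identifier lookups.
// Malformed identifier strings reportedly still crash (see the post above);
// this only covers the intended .failure path for unknown identifiers.
func localIdentifiers(for cloudIdentifiers: [PHCloudIdentifier]) -> [String] {
    let mappings = PHPhotoLibrary.shared().localIdentifierMappings(for: cloudIdentifiers)
    var found: [String] = []
    for (cloudID, result) in mappings {
        switch result {
        case .success(let localID):
            found.append(localID)
        case .failure(let error):
            if (error as? PHPhotosError)?.code == .identifierNotFound {
                print("No asset for \(cloudID.stringValue)")
            } else {
                print("Lookup failed: \(error)")
            }
        }
    }
    return found
}
```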
0 replies · 0 boosts · 464 views · Jan ’24
Apple Music preview question
I have a question about the Apple Music Preview app for Windows 11. It has a setting called Sound Check. Is that feature available on the Apple Music web player and the Apple Music Android app? If not, is it a planned feature for either?
0 replies · 0 boosts · 450 views · Jan ’24
Code=-11803 "Cannot Record" error while capturing a photo from AVCaptureSession
Hi everyone, I need your help. I am working on an application where I capture a photo from the back camera using AVCaptureSession. It works fine on devices running iOS 17+, but I am facing an error on an iPhone X running iOS 16.7.4:

```
error: Optional(Error Domain=AVFoundationErrorDomain Code=-11803 "Cannot Record"
UserInfo={NSUnderlyingError=0x283f0b780 {Error Domain=NSOSStatusErrorDomain Code=-16409 "(null)"},
NSLocalizedRecoverySuggestion=Try recording again., AVErrorRecordingFailureDomainKey=3,
NSLocalizedDescription=Cannot Record})
```

Here is my code:

```swift
final class CedulaScanningVC: UIViewController {
    var captureSession: AVCaptureSession!
    var stillImageOutput: AVCapturePhotoOutput!
    var videoPreviewLayer: AVCaptureVideoPreviewLayer!
    var delegate: ScanCedulaDelegate?

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        self.captureSession.stopRunning()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        setupCamera()
    }

    // MARK: - Configure Camera
    func setupCamera() {
        captureSession = AVCaptureSession()
        captureSession.sessionPreset = .medium
        guard let backCamera = AVCaptureDevice.default(for: AVMediaType.video) else {
            print("Unable to access back camera!")
            return
        }
        let input: AVCaptureDeviceInput
        do {
            input = try AVCaptureDeviceInput(device: backCamera)
            //Step 9
            stillImageOutput = AVCapturePhotoOutput()
            if captureSession.canAddInput(input) && captureSession.canAddOutput(stillImageOutput) {
                captureSession.addInput(input)
                captureSession.addOutput(stillImageOutput)
                setupLivePreview()
            }
        } catch let error {
            print("Error Unable to initialize back camera: \(error.localizedDescription)")
        }
    }

    func setupLivePreview() {
        videoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        videoPreviewLayer.videoGravity = .resizeAspectFill
        videoPreviewLayer.connection?.videoOrientation = .portrait
        self.view.layer.addSublayer(videoPreviewLayer)
        //Step12
        DispatchQueue.global(qos: .userInitiated).async { [weak self] in
            self?.captureSession.startRunning()
            //Step 13
            DispatchQueue.main.async {
                self?.videoPreviewLayer.frame = self?.view.bounds ?? .zero
            }
        }
    }

    func failed() {
        let ac = UIAlertController(title: "Scanning not supported",
                                   message: "Your device does not support scanning a code from an item. Please use a device with a camera.",
                                   preferredStyle: .alert)
        ac.addAction(UIAlertAction(title: "OK", style: .default))
        present(ac, animated: true)
        captureSession = nil
    }

    // MARK: - Actions
    func cameraButtonPressed() {
        let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
        stillImageOutput.capturePhoto(with: settings, delegate: self)
    }
}

extension CedulaScanningVC: AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        print("error: \(error)")
        captureSession.stopRunning()
        DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) { [weak self] in
            guard let self = self else { return }
            guard let imageData = photo.fileDataRepresentation() else {
                print("NO image captured")
                return
            }
            let image = UIImage(data: imageData)
            self.delegate?.capturedImage(image: image)
        }
    }
}
```

I don't know what I am doing wrong.
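Editor's note, not from the original post: Code=-11803 "Cannot Record" is generally AVFoundation reporting that the session stopped or was interrupted mid-capture. Two checks worth adding, sketched below with assumed integration points: only trigger capture once the session reports it is running, and observe the runtime-error and interruption notifications so the underlying cause gets logged.

```swift
import AVFoundation

// Sketch: surface the cause of "Cannot Record" (-11803) failures by
// observing session lifecycle notifications before capturing.
func observeSession(_ session: AVCaptureSession) -> [NSObjectProtocol] {
    let center = NotificationCenter.default
    let runtimeError = center.addObserver(forName: .AVCaptureSessionRuntimeError,
                                          object: session, queue: .main) { note in
        let error = note.userInfo?[AVCaptureSessionErrorKey] as? AVError
        print("Session runtime error: \(String(describing: error))")
    }
    let interrupted = center.addObserver(forName: .AVCaptureSessionWasInterrupted,
                                         object: session, queue: .main) { note in
        // e.g. the camera was claimed by another client, or the app left the foreground
        print("Session interrupted: \(String(describing: note.userInfo))")
    }
    return [runtimeError, interrupted] // retain these tokens for the session's lifetime
}

// Only capture once the session is actually running.
func captureIfReady(session: AVCaptureSession,
                    output: AVCapturePhotoOutput,
                    delegate: AVCapturePhotoCaptureDelegate) {
    guard session.isRunning else {
        print("Session not running yet; ignoring capture request")
        return
    }
    let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
    output.capturePhoto(with: settings, delegate: delegate)
}
```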
0 replies · 0 boosts · 777 views · Jan ’24
iPhone 15 Pro Front Camera quality issues and poor face photos
This isn't just my observation; it's shared by lots of people around me, and you can find tonnes of feedback on the interwebs. The processing of images taken with the front-facing camera on the 15 (and I think the 14 before it) is so heavy that I'm aware of people jumping to other phones. And they're right: the 15 exacerbates it even more. You can turn off HDR (a viewing thing), and you can prioritise speed over processing, but you really cannot turn this off. You can take a Live Photo and then choose a different frame, and the processing is less. As a developer I look at that and think it's bonkers; it's just software, so why hasn't anyone produced a camera app that makes faces look good (without AI processing) from the front camera? I can be all enthusiastic and say I will develop one, but it seems like a simple, obvious fix for Apple. To have the settings so bad that I have friends returning their phones seems pretty bad. And as a photographer I would agree. There's a lot to love with Apple on the 15, including Log and ProRes, but a simple selfie produces such ugly results. That's an actual problem. So I'm throwing it out there: what does everyone think? cheers Paul
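Editor's note: the "prioritise speed over processing" option mentioned above has a developer-facing counterpart, so a third-party camera app can at least request minimal processing per shot. A minimal sketch; whether this fully avoids the look described above is untested:

```swift
import AVFoundation

// Sketch: request the least processed capture AVFoundation offers.
func minimalProcessingSettings(for output: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    // Raise the output's ceiling first; per-photo settings may not exceed it.
    // Configure this before the session starts running.
    output.maxPhotoQualityPrioritization = .speed
    let settings = AVCapturePhotoSettings()
    settings.photoQualityPrioritization = .speed // skip fusion-style multi-frame processing where possible
    return settings
}
```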
0 replies · 1 boost · 1.6k views · Jan ’24
Apple News API
Is it necessary to enroll in the Apple Developer Program to get into Apple News Publisher and obtain my Apple News API credentials? Also, I need guidance on how to publish articles on Apple News using the News API. A detailed explanation of the steps up to getting the Apple News API credentials (key, secret key, channel ID) would be much appreciated!
0 replies · 0 boosts · 670 views · Jan ’24
Debugging CAAnimation that will not start under very strange circumstances.
I have a navigation controller with two VCs. One VC is pushed onto the nav controller; the other is presented on top of the nav controller. The presented VC has a relatively complex animation involving a CAEmitterLayer: animate the birth rate down -> fade out -> remove. The pushed VC has an inputAccessoryView and can become first responder. The expected behavior is: open the presented VC -> the emitter emits pretty pictures -> the emitter stops gracefully. The animation works perfectly, UNLESS I open the pushed VC -> leave -> go to the presented VC. In that case, when I open the presented VC, the emitter emits pretty pictures -> they never stop. (Please do not ask me how long it took to figure this much out 🤬😔)

The animation code in question is:

```swift
let animation = CAKeyframeAnimation(keyPath: #keyPath(CAEmitterLayer.birthRate))
animation.duration = 1
animation.timingFunction = CAMediaTimingFunction(name: .easeIn)
animation.values = [1, 0, 0]
animation.keyTimes = [0, 0.5, 1]
animation.fillMode = .forwards
animation.isRemovedOnCompletion = false

emitter.beginTime = CACurrentMediaTime()
let now = Date()
CATransaction.begin()
CATransaction.setCompletionBlock { [weak self] in
    print("fade beginning -- delta: \(Date().timeIntervalSince(now))")
    let transition = CATransition()
    transition.delegate = self
    transition.type = .fade
    transition.duration = 1
    transition.timingFunction = CAMediaTimingFunction(name: .easeOut)
    transition.setValue(emitter, forKey: kKey)
    transition.isRemovedOnCompletion = false
    emitter.add(transition, forKey: nil)
    emitter.opacity = 0
}
emitter.add(animation, forKey: nil)
CATransaction.commit()
```

The delegate method is:

```swift
extension PresentedVC: CAAnimationDelegate {
    func animationDidStop(_ anim: CAAnimation, finished flag: Bool) {
        if let emitter = anim.value(forKey: kKey) as? CALayer {
            emitter.removeAllAnimations()
            emitter.removeFromSuperlayer()
        } else {
        }
    }
}
```

Here is the pushed VC:

```swift
class PushedVC: UIViewController {
    override var canBecomeFirstResponder: Bool { return true }
    override var canResignFirstResponder: Bool { return true }
    override var inputAccessoryView: UIView? { return UIView() }
}
```

So to reiterate: if I push PushedVC onto the nav controller, pop it, then present PresentedVC, the emitters emit, but the call to emitter.add(animation, forKey: nil) is essentially ignored. The emitter just keeps emitting.

Here are some sample happy print statements from the completion block:

```
fade beginning -- delta: 1.016232967376709
fade beginning -- delta: 1.0033869743347168
fade beginning -- delta: 1.0054619312286377
fade beginning -- delta: 1.0080779790878296
fade beginning -- delta: 1.0088880062103271
fade beginning -- delta: 0.9923020601272583
fade beginning -- delta: 0.99943196773529
```

Here are my findings:

- The issue presents only when the pushed VC has an inputAccessoryView AND canBecomeFirstResponder is true.
- It does not matter if the inputAccessoryView is UIKit or custom, has size, is visible, or anything else.
- When I dismiss PresentedVC the animation is completed and the print statements show. Here are some unhappy print examples:

```
fade beginning -- delta: 5.003802061080933
fade beginning -- delta: 5.219511032104492
fade beginning -- delta: 5.73025906085968
fade beginning -- delta: 4.330522060394287
fade beginning -- delta: 4.786169052124023
```

- CATransaction.flush() does not fix anything.
- Removing the entire CATransaction block and just calling emitter.add(animation, forKey: nil) similarly does nothing; the birth rate decrease animation does not happen.

I am having trouble creating a simple demo project where the issue is reproducible (it is 100% reproducible in my code, the entirety of which I'm not going to link here), so I think getting a "solution" is unrealistic. What I would love is if anyone had any suggestions on where else to look. Any ways to debug CAAnimation? I think if I can solve the last bullet (emitter.add(animation, forKey: nil) called without a CATransaction) I can break this whole thing. Why would a CAAnimation added directly to a layer which is visible and doing stuff refuse to run?
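Editor's suggestion, not from the thread: when an added CAAnimation silently never runs, the usual suspects are the timing fields (speed, timeOffset, beginTime) of the layer or one of its ancestors being skewed, and first-responder/keyboard machinery is known to fiddle with window layers. Dumping those fields right before the add, and comparing model versus presentation values a beat later, usually narrows it down. A sketch of that instrumentation, with the emitter standing in for the layer from the post:

```swift
import QuartzCore

// Sketch: instrument a layer to see why an added CAAnimation never runs.
func dumpAnimationState(of layer: CALayer, label: String) {
    print("[\(label)] speed=\(layer.speed) timeOffset=\(layer.timeOffset) beginTime=\(layer.beginTime)")
    print("[\(label)] layer clock now: \(layer.convertTime(CACurrentMediaTime(), from: nil))")
    print("[\(label)] animationKeys: \(layer.animationKeys() ?? [])")

    // A paused or slowed ancestor freezes every descendant's animations.
    var ancestor = layer.superlayer
    while let current = ancestor {
        if current.speed != 1 {
            print("[\(label)] ancestor \(current) has speed \(current.speed)")
        }
        ancestor = current.superlayer
    }

    // If model and presentation values still match later, nothing is animating.
    if let model = layer as? CAEmitterLayer, let presentation = model.presentation() {
        print("[\(label)] birthRate model=\(model.birthRate) presentation=\(presentation.birthRate)")
    }
}
```

Calling this immediately before emitter.add(animation, forKey: nil) in both the working and broken flows, then diffing the output, should show whether the layer's clock, its beginTime, or an ancestor's speed is the difference.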
1 reply · 0 boosts · 671 views · Jan ’24
CoreMediaErrorDomain error -12865
Hello, can anybody help me with this? I am downloading a video to the file system, and when I give that URL to the player it gives me the error below. It only happens with m3u8; other formats like MP4 play fine locally. Please help!

```
{"error": {"code": -12865, "domain": "CoreMediaErrorDomain",
 "localizedDescription": "The operation couldn’t be completed. (CoreMediaErrorDomain error -12865.)",
 "localizedFailureReason": "", "localizedRecoverySuggestion": ""},
 "target": 13367}
```
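Editor's note: a plain .m3u8 file saved to disk usually won't play, because the playlist references segments the player can't resolve locally; the supported route for offline HLS is AVAssetDownloadTask, which persists the stream as a single playable asset. A minimal sketch, with the URL and title as placeholders:

```swift
import AVFoundation

// Sketch: download an HLS stream for offline playback the supported way,
// instead of saving the .m3u8 file manually.
final class HLSDownloader: NSObject, AVAssetDownloadDelegate {
    private var session: AVAssetDownloadURLSession!

    func start(streamURL: URL) {
        let config = URLSessionConfiguration.background(withIdentifier: "hls-downloads")
        session = AVAssetDownloadURLSession(configuration: config,
                                            assetDownloadDelegate: self,
                                            delegateQueue: .main)
        let asset = AVURLAsset(url: streamURL)
        let task = session.makeAssetDownloadTask(asset: asset,
                                                 assetTitle: "My Video",
                                                 assetArtworkData: nil,
                                                 options: nil)
        task?.resume()
    }

    // The delegate hands back a location that plays offline.
    func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask,
                    didFinishDownloadingTo location: URL) {
        // Persist `location` (a bookmark is recommended), then play it with
        // AVPlayer(playerItem: AVPlayerItem(asset: AVURLAsset(url: location))).
        print("Downloaded to \(location)")
    }
}
```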
2 replies · 1 boost · 816 views · Jan ’24
Is it possible to get an artistId from MusicKit just from the albumID?
```swift
struct AlbumDetails: Hashable {
    let artistId: String?
}

func fetchAlbumDetails(upc: String) async throws -> AlbumDetails {
    let request = MusicCatalogResourceRequest<Album>(matching: \.upc, equalTo: upc)
    let response = try await request.response()
    guard let album = response.items.first else {
        throw NSError(domain: "AlbumNotFound", code: 0, userInfo: nil)
    }
    do {
        let artistID = try await fetchAlbumDetails(upc: upc)
        print("Artist ID: \(artistID)")
    } catch {
        print("Error fetching artist ID: \(error)")
    }
    return AlbumDetails(artistId: album.artists?.first?.id)
}
```

With this function I can return nearly everything except the artist ID, so I know it's not a problem with the request. But I know there has to be a way to get the artistID, there has to. If anyone has a solution to this I would really appreciate it.
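Editor's note: MusicKit catalog responses omit relationships such as artists unless they are requested, which would explain a nil artists collection here; also note the snippet above calls fetchAlbumDetails recursively inside itself, which looks like a paste error. A hedged sketch of loading the relationship explicitly:

```swift
import MusicKit

// Sketch: populate the album's artists relationship before reading the ID.
// Catalog responses don't include relationships unless asked for them.
func artistID(forUPC upc: String) async throws -> MusicItemID? {
    let request = MusicCatalogResourceRequest<Album>(matching: \.upc, equalTo: upc)
    let response = try await request.response()
    guard let album = response.items.first else { return nil }
    let detailed = try await album.with([.artists]) // second fetch, relationship included
    return detailed.artists?.first?.id
}
```

Alternatively, setting request.properties = [.artists] before calling response() should populate the relationship in a single round trip.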
1 reply · 0 boosts · 523 views · Jan ’24
How to use Swift and AVFoundation to stream/record USB microphone input?
I have a custom USB device that includes a microphone. I can see the microphone on macOS when I plug in the device, so I know it is working with the kernel and AV subsystems. I can enumerate and reference the microphone using AVCaptureDevice, but I have not been able to figure out how to use this device reference with AVAudioEngine. I'm trying to accomplish two things with this microphone:

- Stream audio from the microphone and have it rendered to the speakers on my MacBook Pro.
- Capture sound data from the microphone and forward it to a live streaming API.

To my mind, from what I've read, I need AVAudioEngine to do this, but I'm having trouble determining from the documentation just how to go about it on macOS. There seems to be a lot more information for iOS and iPadOS, but since USB-C support is sparsely documented on those operating systems, I'm focusing on the desktop (macOS) for now. Can I convert an AVCaptureDevice into an audio input for AVAudioEngine? If not, how can I accomplish what I'm trying to do using whatever is available in AVFoundation?
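Editor's sketch, not from the post: on macOS you don't convert the AVCaptureDevice itself; instead you match it to a Core Audio AudioDeviceID (enumerate devices via kAudioHardwarePropertyDevices and compare kAudioDevicePropertyDeviceUID against the capture device's uniqueID, an equivalence that is an assumption worth verifying) and point AVAudioEngine's input node at that device through its underlying audio unit:

```swift
import AVFoundation
import AudioToolbox

// Sketch: select a specific input device for AVAudioEngine on macOS by
// setting kAudioOutputUnitProperty_CurrentDevice on the input node's unit.
func useInputDevice(_ deviceID: AudioDeviceID, with engine: AVAudioEngine) throws {
    guard let unit = engine.inputNode.audioUnit else {
        throw NSError(domain: "NoInputUnit", code: -1)
    }
    var device = deviceID
    let status = AudioUnitSetProperty(unit,
                                      kAudioOutputUnitProperty_CurrentDevice,
                                      kAudioUnitScope_Global,
                                      0,
                                      &device,
                                      UInt32(MemoryLayout<AudioDeviceID>.size))
    guard status == noErr else {
        throw NSError(domain: NSOSStatusErrorDomain, code: Int(status))
    }

    // Tap the input to forward samples to a streaming API, and also route
    // it to the output. (Routing mic to speakers can feed back; a USB mic
    // pointed away from the speakers is usually fine.)
    let format = engine.inputNode.outputFormat(forBus: 0)
    engine.inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        // forward `buffer` to the live-streaming API here
    }
    engine.connect(engine.inputNode, to: engine.mainMixerNode, format: format)
    try engine.start()
}
```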
1 reply · 0 boosts · 1.2k views · Jan ’24
Unable to change photo permission in the Settings app
My app uses the camera and the photo library. I found that if a user follows certain steps, they will no longer be able to change the photo permissions for my app in the Settings app. The steps are as follows:

1. Press the camera button in the app to launch the camera.
2. Take a picture with camera permissions granted.
3. Grant ".addOnly" permission to the photo library.
4. Press the photo library button in the app to read the photo library.
5. Deny ".readWrite" permission to the photo library.

After step 5, the Settings app only shows a switch for the ".addOnly" permission, not the ".readWrite" permission. I am aware that in iOS 14 and later, the permission required after a photo is taken with the camera should be ".addOnly". Therefore, I suspect this problem occurs in other apps too. So far I have worked around this problem in my app, but is this the expected behavior of the Settings app? If so, how can I avoid this problem?
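Editor's note: one workaround consistent with the behavior described above is to request the access level the app will ultimately need up front, so the ".readWrite" entry exists in Settings before the ".addOnly" state is ever recorded. A sketch; whether it sidesteps the Settings quirk is an assumption:

```swift
import Photos

// Sketch: request the broader access level explicitly, rather than letting
// the camera save path register the app as .addOnly first.
func ensureReadWriteAccess(completion: @escaping (Bool) -> Void) {
    let current = PHPhotoLibrary.authorizationStatus(for: .readWrite)
    switch current {
    case .authorized, .limited:
        completion(true)
    case .notDetermined:
        PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
            completion(status == .authorized || status == .limited)
        }
    default:
        completion(false)
    }
}
```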
0 replies · 1 boost · 390 views · Jan ’24
How to retrieve the playback state
Per the FairPlay Streaming programming guide, the SPC includes a specific TLLV that provides the state of the media content playback, and the total value length of this TLLV is 16 (decimal). Here I'm trying to retrieve the playback state, which is documented at byte range 20-23:

```java
byte[] mediaPlaybackStateBlock = getBlock(MEDIA_PLAYBACK_STATE).getValueData();
playbackState = Arrays.copyOfRange(mediaPlaybackStateBlock, 20, 24);
```

I end up with an error: "arraycopy: length -4 is negative". I'm a bit confused about how to retrieve the playback state from byte range 20-23 when the value's length is just 16. Kindly clarify.
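Editor's note on the arithmetic: the documented 20-23 range appears to be an offset within the whole TLLV block, which begins with a 16-byte header (8-byte tag, 4-byte total length, 4-byte value length) before the 16-byte value. If getValueData() returns only the value, the playback state would sit at value offsets 4-7, i.e. Arrays.copyOfRange(mediaPlaybackStateBlock, 4, 8); that is a reading of the spec worth verifying, not a confirmed fix.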
0 replies · 0 boosts · 494 views · Jan ’24