Streaming


Deep dive into the technical specifications that influence seamless playback for streaming services, including bitrates, codecs, and caching mechanisms.

Streaming Documentation

Post · Replies · Boosts · Views · Activity

CoreMediaErrorDomain: code -16012
Hi, we are getting big spikes of errors on a very small number of programs in some live channels: [-16012:CoreMediaErrorDomain] [Error Domain=CoreMediaErrorDomain Code=-16012 "(null)"]. When this error occurs, AVPlayer stops and users have to restart playback. This happens only in some live programs, and we have the same setup and use the same transcoders etc. in all programs. We normally have a very low player error rate with live programs, but when this error appears the error rate can climb to 80% of users, affecting pretty much everyone on Apple devices. Does anyone know what this error actually means, what its context is, and what causes it? It seems to be related to subtitles and occurs only when subtitles are enabled (the subtitles are not embedded in the stream; it is teletext). We tried to find information in Apple documentation and online, but unfortunately could not find anything.
Replies: 1 · Boosts: 0 · Views: 1.6k · May ’23
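A minimal sketch of one way to capture more detail when this happens, assuming an AVPlayerItem is already playing: the item's error log often carries the underlying domain, status code, and a comment for CoreMediaErrorDomain failures. The notification and error-log APIs below are standard AVFoundation; the helper name is illustrative.

import AVFoundation

// Illustrative observer: log every new error-log entry for the current item,
// including the -16012 events, so the failing URI and comment can be inspected.
func observeErrorLog(for item: AVPlayerItem) -> NSObjectProtocol {
    return NotificationCenter.default.addObserver(
        forName: .AVPlayerItemNewErrorLogEntry,
        object: item,
        queue: .main
    ) { _ in
        guard let event = item.errorLog()?.events.last else { return }
        print("domain:", event.errorDomain,
              "code:", event.errorStatusCode,
              "comment:", event.errorComment ?? "-",
              "uri:", event.uri ?? "-")
    }
}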
Issues Detecting Apple TV's HDR Playback Capability
I've been working with the eligibleForHDRPlayback property to determine whether HDR playback is supported. However, I've noticed an inconsistency: when the video format is switched from HDR to SDR in the Settings menu on Apple TV, the property still returns true, indicating HDR is playable even when it's not (this seems to contradict what was mentioned around the [20:40] mark of this WWDC video). I've tried using eligibleForHDRPlaybackDidChangeNotification and even restarted the app, but I still encounter the same issue. Are there alternative approaches to accurately determine whether the app can play HDR content on Apple TV?
Replies: 0 · Boosts: 0 · Views: 506 · Oct ’23
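For reference, a minimal sketch of the approach the question describes, re-reading the class property whenever the eligibility notification fires; whether tvOS actually updates the value after an HDR-to-SDR change in Settings is exactly what is in question here. The wrapper class is illustrative.

import AVFoundation

// Re-check HDR eligibility whenever AVFoundation reports a change.
final class HDREligibilityWatcher {
    private var observer: NSObjectProtocol?

    init(onChange: @escaping (Bool) -> Void) {
        observer = NotificationCenter.default.addObserver(
            forName: AVPlayer.eligibleForHDRPlaybackDidChangeNotification,
            object: nil,
            queue: .main
        ) { _ in
            onChange(AVPlayer.eligibleForHDRPlayback)
        }
        onChange(AVPlayer.eligibleForHDRPlayback) // report the initial value
    }

    deinit {
        if let observer = observer { NotificationCenter.default.removeObserver(observer) }
    }
}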
Playing streamed audio data immediately after the first chunk arrives
Hi there, I want to play a stream of audio immediately after the first chunk of the audio arrives. The data stream looks like this: Audio_data, Delimiter, Json_data. Currently I handle all chunks before the delimiter and add them to the queue of the AVQueuePlayer. However, when playing this audio during the stream there are many glitches and it does not work well. Waiting until all chunks have arrived and then playing the audio works well, so I assume there is no problem with the audio data itself, but with handling the chunks as they come in and playing them immediately. Happy about any advice you have! I am pretty lost right now. Thank you so much.

import SwiftUI
import AVFoundation

struct AudioStreamView: View {
    @State private var players: [AVAudioPlayer] = []
    @State private var jsonString: String = ""
    @State private var queuePlayer = AVQueuePlayer()
    var streamDelegate = AudioStreamDelegate()

    var body: some View {
        VStack(spacing: 20) {
            Button("Fetch Stream") {
                fetchDataFromServer()
            }
            .padding()

            TextEditor(text: $jsonString)
                .disabled(true)
                .border(Color.gray)
                .padding()
                .frame(minHeight: 200, maxHeight: .infinity)
        }
    }

    func fetchDataFromServer() {
        guard let url = URL(string: "https://dev-sonia.riks0trv4c6ns.us-east-1.cs.amazonlightsail.com/voiceMessage") else { return }
        var request = URLRequest(url: url)
        request.httpMethod = "POST" // Specify the request type as POST
        let parameters: [String: Any] = [
            "message_uuid": "value1",
            "user_uuid": "68953DFC-B9EA-4391-9F32-0B36A34ECF56",
            "session_uuid": "value3",
            "timestamp": "value4",
            "voice_message": "Whats up?"
        ]
        request.httpBody = try? JSONSerialization.data(withJSONObject: parameters, options: .fragmentsAllowed)
        request.addValue("application/json", forHTTPHeaderField: "Content-Type")

        let task = URLSession.shared.dataTask(with: request) { (data, response, error) in
            if let error = error {
                print("Error occurred: \(error)")
                return
            }
            // You might want to handle the server's response more effectively based on the API's design.
            // For now, I'll assume that the server returns the audio URL in the response JSON.
            if let data = data {
                do {
                    if let jsonResponse = try JSONSerialization.jsonObject(with: data, options: []) as? [String: Any],
                       let audioURLString = jsonResponse["audioURL"] as? String,
                       let audioURL = URL(string: audioURLString) {
                        DispatchQueue.main.async {
                            self.playAudioFrom(url: audioURL)
                            self.jsonString = String(data: data, encoding: .utf8) ?? "Invalid JSON"
                        }
                    } else {
                        print("Invalid JSON structure.")
                    }
                } catch {
                    print("JSON decoding error: \(error)")
                }
            }
        }
        task.resume()
    }

    func playAudioFrom(url: URL) {
        let playerItem = AVPlayerItem(url: url)
        queuePlayer.replaceCurrentItem(with: playerItem)
        queuePlayer.play()
    }
}
Replies: 0 · Boosts: 1 · Views: 850 · Oct ’23
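One commonly suggested alternative to enqueuing many short AVPlayerItems is to schedule buffers on a single AVAudioPlayerNode, which plays them back to back without gaps. The sketch below assumes each incoming chunk can be decoded into an AVAudioPCMBuffer (for example linear PCM); if the stream is compressed it would need to be decoded first. The class and parameter names are illustrative.

import AVFoundation

// Sketch: gapless playback of sequential PCM chunks with AVAudioEngine.
// Assumes `format` matches the decoded audio and each chunk arrives as an AVAudioPCMBuffer.
final class ChunkedAudioPlayer {
    private let engine = AVAudioEngine()
    private let playerNode = AVAudioPlayerNode()

    init(format: AVAudioFormat) throws {
        engine.attach(playerNode)
        engine.connect(playerNode, to: engine.mainMixerNode, format: format)
        try engine.start()
    }

    // Call this as soon as each chunk has been decoded; buffers are queued in order.
    func enqueue(_ buffer: AVAudioPCMBuffer) {
        playerNode.scheduleBuffer(buffer, completionHandler: nil)
        if !playerNode.isPlaying {
            playerNode.play()
        }
    }
}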
"outputObscuredDueToInsufficientExternalProtection" state of this flag always remains true even though output is not obscured.
We have logic in the SDK that stops playback when the outputObscuredDueToInsufficientExternalProtection event is fired by the player. Our initial understanding was that this event is fired only when DRM blocks the video playback. However, in the present case we see that it is fired even when playback is successful (playback with an external screen connected). To determine whether playback still functions when the 'outputObscuredDueToInsufficientExternalProtection' event is triggered, we temporarily disabled the playback-stop implementation that runs after the event is triggered. Code snippet -

Observations:
- After this event was triggered during mirrored playback using a Lightning to HDMI connector, our expectation was that the playback would result in a black screen. However, to our surprise, the playback worked perfectly, indicating that this event is triggered even when there are no DRM restrictions for that asset's playback.
- Another scenario we tested involved using a VGA connector. In this case, we observed that the 'outputObscuredDueToInsufficientExternalProtection' event was triggered. Initially, playback started as expected once we commented out the playback-stop implementation, but after a few seconds of playback the screen went black.

In the first scenario it was unexpected for the 'outputObscuredDueToInsufficientExternalProtection' event to trigger, as playback worked without issues even after the event fired. In the second scenario, the event was triggered as expected. The issue we identified is that this event is triggered irrespective of whether the asset has DRM restrictions. In another scenario, we attempted to differentiate between the VGA and HDMI connectors to determine whether such a distinction was possible; however, we found that the VGA cable was also recognized as an HDMI port on iOS. We also tested on an older iOS version (iOS 14.6.1) to see if the problem persisted. Surprisingly, the 'outputObscuredDueToInsufficientExternalProtection' event was triggered on the older OS version as well.

Conclusion: in our analysis, we have identified that the 'outputObscuredDueToInsufficientExternalProtection' flag always remains true even though output is not obscured.

Working case log:
default 13:23:19.096682+0530 AMC ||| observeValueForKeyPath = "outputObscuredDueToInsufficientExternalProtection" object = <AVPlayer: 0x281284930> change kind = { kind = 1; new = 1; old = 0; }

Non-working case log:
default 13:45:21.356857+0530 AMC ||| observeValueForKeyPath = "outputObscuredDueToInsufficientExternalProtection" object = <AVPlayer: 0x281c071e0> change kind = { kind = 1; new = 1; old = 0; }

We searched related documents and did a Google search, but could not find any information or references related to this behaviour of the 'outputObscuredDueToInsufficientExternalProtection' event. It would be really appreciated if anyone can help us with this!
Replies: 0 · Boosts: 1 · Views: 550 · Oct ’23
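For context, a bare-bones version of the observation described above, using the same key path that appears in the logs; it only records transitions and does not by itself stop playback. Deciding when a true value actually warrants stopping is the open question in this thread. The class name is illustrative.

import AVFoundation

// Observe the flag the way the logs above do, keeping the old/new values
// so a false -> true transition can be distinguished from repeated callbacks.
final class ObscuredOutputObserver: NSObject {
    private let player: AVPlayer
    private let keyPath = "outputObscuredDueToInsufficientExternalProtection"

    init(player: AVPlayer) {
        self.player = player
        super.init()
        player.addObserver(self, forKeyPath: keyPath, options: [.old, .new], context: nil)
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?,
                               change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
        guard keyPath == self.keyPath else { return }
        let oldValue = change?[.oldKey] as? Bool ?? false
        let newValue = change?[.newKey] as? Bool ?? false
        print("outputObscured changed: old=\(oldValue) new=\(newValue)")
        // Decide here whether stopping playback is warranted (e.g. only for DRM-protected items).
    }

    deinit {
        player.removeObserver(self, forKeyPath: keyPath)
    }
}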
"outputObscuredDueToInsufficientExternalProtection" state of this flag always remains one (not zero) even when the playback is successful
Our initial understanding was that this event is fired only when DRM blocks the video playback. However, in the present case we see that it is fired even when playback is successful (playback with an external screen connected). To assess whether playback remains functional when the 'outputObscuredDueToInsufficientExternalProtection' event is triggered, we conducted two specific scenario tests: 1) playing an asset without any DRM restrictions, and 2) playing an asset with DRM restrictions.

Result: in our analysis, we have identified that the 'outputObscuredDueToInsufficientExternalProtection' flag always remains set to 1, even when playback is successful. We expected it to be 0 when playback is successful.

Working case log (playback is successful):
default 13:23:19.096682+0530 AMC ||| observeValueForKeyPath = "outputObscuredDueToInsufficientExternalProtection" object = <AVPlayer: 0x281284930> change kind = { kind = 1; new = 1; old = 0; }

Non-working case log (playback came up as a black screen):
default 13:45:21.356857+0530 AMC ||| observeValueForKeyPath = "outputObscuredDueToInsufficientExternalProtection" object = <AVPlayer: 0x281c071e0> change kind = { kind = 1; new = 1; old = 0; }

We searched related documents and did a Google search, but could not find any information or references related to this behaviour of the 'outputObscuredDueToInsufficientExternalProtection' event. It would be really appreciated if anyone can help us with this!
Replies: 0 · Boosts: 0 · Views: 586 · Oct ’23
HLS/FairPlay - Terminated due to signal 9 - only when running "Mac (Designed for iPad)" from Xcode
My project is a TV player app for HLS streams with FairPlay encryption. It is built with SwiftUI for iPhone and iPad and is in production. I have enabled the target "Mac (Designed for iPad)" in the project settings, and it works perfectly on Macs with M1 chips when running the app from the Mac App Store. The Mac version has never been the main focus, but it is nice to have it working so easily. However, when I run the app from Xcode by selecting "My Mac (Designed for iPad)", every time AVPlayer wants to start playback I am ejected from the app, and the only thing I get in the console is:

Message from debugger: Terminated due to signal 9

Why? And why does it work when running the app published on the App Store? I was able to debug a bit and identify which line of code triggers the issue, but I am still stuck. I am using an AVAssetResourceLoaderDelegate to load the FairPlay keys instead of the default one (because I need some authentication parameters in the HTTP headers to communicate with the DRM proxy). In the process I am able to request SPC data and CKC (I have verified the data), and then when loadingRequest.finishLoading() is called... BOOM, the app is terminated and it triggers the log "Message from debugger: Terminated due to signal 9". I am sharing the delegate method from the AVAssetResourceLoaderDelegate where it happens. This was written a while ago and runs fine on all devices. If you are not used to this delegate, it is used by AVPlayer whenever a new media item is set with AVPlayer.replaceCurrentItem(with: mediaItem).

func resourceLoader(_ resourceLoader: AVAssetResourceLoader, shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
    guard let dataRequest = loadingRequest.dataRequest else { return false }
    getFairplaycertificate { data, _ in
        // Request Server Playback Context (SPC) data
        guard let certificate = data,
              let contentIdData = (loadingRequest.request.url?.host ?? "").data(using: String.Encoding.utf8),
              let spcData = try? loadingRequest.streamingContentKeyRequestData(
                  forApp: certificate,
                  contentIdentifier: contentIdData,
                  options: [AVContentKeyRequestProtocolVersionsKey: [1]]
              ) else {
            loadingRequest.finishLoading(with: NSError(domain: "tvplayer", code: -1, userInfo: nil))
            print("⚠️", #function, "Unable to get SPC data.")
            return false
        }
        // Now the CKC can be requested
        let networkId = loadingRequest.request.url?.host ?? ""
        self.requestCKC(spcData: spcData, contentId: networkId) { ckc, error in
            if error == nil && ckc != nil {
                // The CKC is correctly returned and is sent to AVPlayer. Stream is decrypted.
                dataRequest.respond(with: ckc!)
                loadingRequest.contentInformationRequest?.contentType = AVStreamingKeyDeliveryContentKeyType
                loadingRequest.finishLoading() // <--- THIS LINE IS GUILTY!!!
            } else {
                print("⚠️", #function, "Unable to get CKC.")
                loadingRequest.finishLoading(with: error)
            }
        }
        return true
    }
    return true
}

If I comment out loadingRequest.finishLoading(), or if I replace it with loadingRequest.finishLoading(with: error), the app is not terminated, but my decryption keys are not loaded. That's the only clue I have so far. Any help would be appreciated. What should I look for? Is there a way to get a detailed error stack trace? Thanks.
Replies: 1 · Boosts: 2 · Views: 742 · Nov ’23
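Not a confirmed fix for the signal 9, but one ordering difference worth checking against Apple's FairPlay sample code: content information is typically filled in before the data response. A minimal sketch of that ordering, reusing the request objects from the snippet above; the helper name is illustrative.

import AVFoundation

// Sketch: complete the loading request with the content type set before the data response.
// `ckc` is the CKC returned by the DRM proxy.
func finishKeyRequest(_ loadingRequest: AVAssetResourceLoadingRequest, ckc: Data) {
    loadingRequest.contentInformationRequest?.contentType = AVStreamingKeyDeliveryContentKeyType
    loadingRequest.dataRequest?.respond(with: ckc)
    loadingRequest.finishLoading()
}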
Issues with Playing MOVPKG Files (Domain=CoreMediaErrorDomain Code=-16845)
I've been encountering a substantial increase in the following error log and am eager to find its root cause. The pattern of these logs emerges predominantly when attempting to play downloaded FPS DRM files (MOVPKG files). Except for a few rare instances, most occurrences are associated with content downloaded in previous OS versions, leading to playback issues after recent OS updates. The error log I've been encountering is:

Error Domain=CoreMediaErrorDomain Code=-16845 "HTTP 400: (unhandled)"

Even after searching, there are hardly any cases available; the only thing I found is these issues: https://github.com/jhomlala/betterplayer/issues?q=is%3Aissue+16845+is%3Aclosed. I've been advising users to delete and re-download the affected content, which in all cases results in successful playback. I'm seeking advice from anyone who might have experienced similar issues. If you've encountered a comparable situation or have any suggestions, I would greatly appreciate your input.
Replies: 0 · Boosts: 2 · Views: 684 · Nov ’23
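For the workaround described above (delete and re-download affected content), a rough sketch of kicking off a fresh HLS download with AVAssetDownloadURLSession; the identifier, titles, and delegate wiring are placeholders, and this is not a claim about the root cause of the -16845 error.

import AVFoundation

// Hypothetical helper: re-download an HLS asset after its broken MOVPKG has been removed.
func redownload(assetURL: URL, title: String,
                delegate: AVAssetDownloadDelegate, queue: OperationQueue) {
    let configuration = URLSessionConfiguration.background(withIdentifier: "redownload.\(title)")
    let session = AVAssetDownloadURLSession(configuration: configuration,
                                            assetDownloadDelegate: delegate,
                                            delegateQueue: queue)
    let asset = AVURLAsset(url: assetURL)
    let task = session.makeAssetDownloadTask(asset: asset,
                                             assetTitle: title,
                                             assetArtworkData: nil,
                                             options: nil)
    task?.resume()
}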
Safari native player thumbnails.
I have an m3u8 like this:

#EXTM3U
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=190000,BANDWIDTH=240000,RESOLUTION=240x160,FRAME-RATE=24.000,CODECS="avc1.42c01e,mp4a.40.2",CLOSED-CAPTIONS=NONE
tracks-v1a1/mono.m3u8?thumbnails=10
#EXT-X-IMAGE-STREAM-INF:BANDWIDTH=10000,RESOLUTION=240x160,CODECS="jpeg",URI="images-240x160/tpl-0-60-10.m3u8?thumbnails=10"

and I get no thumbnails in the Safari native player. Could you please tell me why?
Replies: 0 · Boosts: 0 · Views: 446 · Nov ’23
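One thing to check, hedged since only part of the playlist is shown above: EXT-X-IMAGE-STREAM-INF is an industry extension that does not appear in Apple's HLS specification, while trick-play imagery for Apple players is normally declared through I-frame-only renditions with EXT-X-I-FRAME-STREAM-INF. An illustrative line (bandwidth and URI are placeholders) would look roughly like:

#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=45000,RESOLUTION=240x160,CODECS="avc1.42c01e",URI="tracks-v1a1/iframes.m3u8"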
ContentKeyDelegate functions not being called in iOS 17
Hello, I'm having an issue where my app is in TestFlight and some of my testers are reporting that FairPlay-protected videos are not playing back on iOS 17. It has been working fine on iOS 16 (my app's initial target). I can see from the debug logs that, for an online stream request, contentKeySession(_ session: AVContentKeySession, didProvide keyRequest: AVContentKeyRequest) is never called, whereas for a download-for-offline-playback request the function is called. I've used much of the sample code in "HLS Catalog With FPS" from the FPS developer package. All of my m3u8 files are version 5 and contain encryption instructions like this:

#EXT-X-KEY:METHOD=SAMPLE-AES,URI="skd://some-uuid",KEYFORMAT="com.apple.streamingkeydelivery",KEYFORMATVERSIONS="1"

Here's a short excerpt of the code being run:

let values = HTTPCookie.requestHeaderFields(with: cookies)
let cookieOptions = ["AVURLAssetHTTPHeaderFieldsKey": values]
assetUrl = "del\(assetUrl)"
clip!.assetUrl = AVURLAsset(url: URL(string: assetUrl)!, options: cookieOptions)
clip!.assetUrl!.resourceLoader.setDelegate(self, queue: DispatchQueue.global(qos: .default))
ContentKeyManager.shared.contentKeySession.addContentKeyRecipient(clip!.assetUrl!)
urlAssetObserver = self.observe(\.isPlayable, options: [.new, .initial]) { [weak self] (assetUrl, _) in
    guard let strongSelf = self else { return }
    strongSelf.playerItem = AVPlayerItem(asset: (self!.clip!.assetUrl)!)
    strongSelf.player.replaceCurrentItem(with: strongSelf.playerItem)
}

The error thrown is:

Task .<8> finished with error [18,446,744,073,709,550,614] Error Domain=NSURLErrorDomain Code=-1002 "unsupported URL" UserInfo={NSLocalizedDescription=unsupported URL, NSErrorFailingURLStringKey=skd://some-uuid, NSErrorFailingURLKey=skd://some-uuid, _NSURLErrorRelatedURLSessionTaskErrorKey=( "LocalDataTask .<8>" ), _NSURLErrorFailingURLSessionTaskErrorKey=LocalDataTask .<8>, NSUnderlyingError=0x2839a7450 {Error Domain=kCFErrorDomainCFNetwork Code=-1002 "(null)"}}

which I believe is being thrown from AVPlayerItem. Without the delegate it appears to play back fine; however, I need the delegate (I think), since I'm appending some query params to each request for the segments. I have an observer on the playerItem, per the example project, which changes the status to .failed once the -1002 error is thrown. Please let me know if anything comes to mind to try, or if I can provide any additional info. Thanks in advance!
Replies: 1 · Boosts: 0 · Views: 750 · Nov ’23
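For comparison, a stripped-down version of the key-session wiring from the "HLS Catalog With FPS" sample that the post references. requestApplicationCertificate() and requestCKC(_:completion:) below are placeholders for the app's own certificate and licence calls, not real APIs; note that this sketch does not set a resource loader delegate alongside the content key session, which is one difference from the excerpt above.

import AVFoundation

// Sketch of the AVContentKeySession flow for FairPlay streaming keys.
final class SampleContentKeyDelegate: NSObject, AVContentKeySessionDelegate {

    func contentKeySession(_ session: AVContentKeySession, didProvide keyRequest: AVContentKeyRequest) {
        guard let skdURL = keyRequest.identifier as? String,
              let contentIdData = skdURL.data(using: .utf8) else { return }

        let certificate = requestApplicationCertificate() // placeholder

        keyRequest.makeStreamingContentKeyRequestData(
            forApp: certificate,
            contentIdentifier: contentIdData,
            options: [AVContentKeyRequestProtocolVersionsKey: [1]]
        ) { spcData, error in
            guard let spcData = spcData, error == nil else { return }
            self.requestCKC(spcData) { ckcData in // placeholder licence call
                let response = AVContentKeyResponse(fairPlayStreamingKeyResponseData: ckcData)
                keyRequest.processContentKeyResponse(response)
            }
        }
    }

    // Placeholders so the sketch compiles; replace with real certificate/licence requests.
    private func requestApplicationCertificate() -> Data { Data() }
    private func requestCKC(_ spc: Data, completion: @escaping (Data) -> Void) { completion(Data()) }
}

// Typical wiring, done before the asset starts loading:
// let session = AVContentKeySession(keySystem: .fairPlayStreaming)
// session.setDelegate(SampleContentKeyDelegate(), queue: DispatchQueue(label: "cks.queue"))
// session.addContentKeyRecipient(urlAsset)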
mediafilesegmenter error with hvc1
I'm using mediafilesegmenter with a fragmented MP4 hvc1 file as input and got this error:

Nov 23 2023 17:48:25.948: Fragmented MP4 is the only supported container format for the segmentation of HEVC content
Nov 23 2023 17:48:25.948: Unsupported media type 'hvc1' in track 0
Nov 23 2023 17:48:25.948: Unable to find any valid tracks to segment.

Segmenting failed (-12780).
Replies: 2 · Boosts: 1 · Views: 780 · Nov ’23
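The first log line suggests HEVC can only be segmented into fragmented MP4 output, so one thing worth trying, hedged because it depends on the mediafilesegmenter version installed, is explicitly requesting fMP4 output with the -iso-fragmented option. The file and directory names below are placeholders:

mediafilesegmenter -iso-fragmented -t 6 -f output_dir input_hvc1_fmp4.mp4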
Playing Offline HLS Content with AES-128 encryption on iOS
Hi Team,

Offline playback with AES-128 encryption: I'm downloading HLS content that is AES-128 encrypted, and I'm using the AVAssetResourceLoaderDelegate method shouldWaitForLoadingOfRequestedResource to parse the manifest and fetch the AES key URL. After fetching the key URL, I download and save the AES key locally, and I use the locally saved key to start offline playback. Since AVContentKeySession has been around for quite some time, is it okay to keep using the resource loader delegate method to parse and download the AES key? Is there any chance that Apple will deprecate downloading keys through the resource loader delegate?

Thanks, Deepak.N
Replies: 0 · Boosts: 0 · Views: 474 · Nov ’23
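On the AVContentKeySession question: the session API does include a clear-key system that can serve plain AES-128 keys, sketched below under the assumption that the key bytes have already been downloaded and persisted locally. loadLocalKey(for:) and the class name are placeholders for the app's own storage, and this is a sketch of one possible approach rather than a statement of Apple's recommendation.

import AVFoundation

// Sketch: serving a locally stored AES-128 key through AVContentKeySession's clear-key system.
final class ClearKeyDelegate: NSObject, AVContentKeySessionDelegate {

    func contentKeySession(_ session: AVContentKeySession, didProvide keyRequest: AVContentKeyRequest) {
        guard let keyIdentifier = keyRequest.identifier as? String else { return }
        let keyData = loadLocalKey(for: keyIdentifier) // placeholder: the saved 16-byte AES key
        let response = AVContentKeyResponse(clearKeyData: keyData, initializationVector: nil)
        keyRequest.processContentKeyResponse(response)
    }

    private func loadLocalKey(for identifier: String) -> Data { Data(count: 16) } // placeholder
}

// let session = AVContentKeySession(keySystem: .clearKey)
// session.setDelegate(ClearKeyDelegate(), queue: DispatchQueue(label: "aes.keys"))
// session.addContentKeyRecipient(asset)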
FairPlay and Extended Clear Lead
We're experimenting with a stream that has a large (10-minute) clear portion in front of the FairPlay-protected section. We're noticing that AVPlayer/Safari trigger calls to fetch the license key even while playing the clear part, and once we provide the key, playback fails with:

name = AVPlayerItemFailedToPlayToEndTimeNotification, object = Optional(<AVPlayerItem: 0x281ff2800> I/NMU [No ID]), userInfo = Optional([AnyHashable("AVPlayerItemFailedToPlayToEndTimeErrorKey"): Error Domain=CoreMediaErrorDomain Code=-12894 "(null)"])

It seems like AVPlayer is trying to decrypt the clear portion of the stream, and I'm wondering if it's because we've set up our manifest incorrectly. Here it is:

#EXTM3U
#EXT-X-VERSION:8
#EXT-X-TARGETDURATION:20
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-PLAYLIST-TYPE:VOD
#EXT-X-MAP:URI="clear-asset.mp4",BYTERANGE="885@0"
#EXT-X-DEFINE:NAME="path0",VALUE="clear-asset.mp4"
#EXTINF:9.98458,
#EXT-X-BYTERANGE:81088@885
{$path0}
#EXTINF:19.96916,
#EXT-X-BYTERANGE:159892@81973
{$path0}
#EXTINF:19.96916,
#EXT-X-BYTERANGE:160245@241865
{$path0}
#EXT-X-DISCONTINUITY
#EXT-X-MAP:URI="secure-asset.mp4",BYTERANGE="788@0"
#EXT-X-DEFINE:NAME="path1",VALUE="secure-asset.mp4"
#EXT-X-KEY:METHOD=SAMPLE-AES,URI="skd://guid",KEYFORMAT="com.apple.streamingkeydelivery",KEYFORMATVERSIONS="1"
#EXTINF:19.96916,
#EXT-X-BYTERANGE:159928@5196150
{$path1}
#EXT-X-ENDLIST
Replies: 0 · Boosts: 0 · Views: 468 · Dec ’23
IDR keyframes now required by Safari (as of iOS 16.3.1?)
Can we confirm that, as of iOS 16.3.1, key frames for MPEG-TS via HLS are now mandatory? I've been trying to figure out why https://chaney-field3.click2stream.com/ shows "Playback Error" across Safari, Chrome, Firefox, etc. I ran diagnostics against one of the m3u8 files generated via Developer Tools (e.g. mediastreamvalidator "https://e1-na7.angelcam.com/cameras/102610/streams/hls/playlist.m3u8?token=" and then hlsreport validation_data.json) and see this particular error:

Video segments MUST start with an IDR frame
Variant #1, IDR missing on 3 of 3

Do Safari and iOS devices explicitly block playback when they don't find one? From what I understand, AngelCam simply acts as a passthrough for the video/audio packets and does no transcoding, but converts the RTSP packets into HLS for web browsers. IP cameras are constantly streaming their data, though, and a user connecting to the site may be receiving the video between key frames, so this expectation would likely be violated. From my investigation it also seems like this problem started happening in iOS 16.3. I'm seeing similar reports for other IP cameras here:

https://ipcamtalk.com/threads/blue-iris-ui3.23528/page-194#post-754082
https://www.reddit.com/r/BlueIris/comments/1255d78/ios_164_breaks_ui3_video_decode/

For what it's worth, when I re-encoded the MPEG-TS files (e.g. ffmpeg -i /tmp/streaming-master-m4-na3.bad/segment-375.ts -c:v h264 /tmp/segment-375.ts), the non-key frames at the beginning are stripped, and playback then works properly if I host the same files on a static site and have the iOS device connect to it. It seems like Chrome, Firefox, VLC, and ffmpeg are much more forgiving about missing key frames. I'm wondering what the reason is for enforcing this requirement, and can I confirm it's a recent change?
Replies: 1 · Boosts: 0 · Views: 541 · Dec ’23
Safari Fairplay WebKitMediaKeyError (code: 6, systemCode: 4294955417)
Hi Apple Team, we are intermittently observing the following error when trying to play back FairPlay-protected HLS streams. The error happens immediately after loading the certificate. Playback with the same certificate on the same device (Mac, iPhone) works most of the time, but intermittently this error is observed with the following codes. code=6 means MEDIA_KEYERR_DOMAIN, but I did not find any information on what systemCode=4294955417 means. Is there a way to check what this system code means, and what could be causing this intermittent behaviour?

{ "code": 6, "systemCode": 4294955417 }
Replies: 1 · Boosts: 0 · Views: 705 · Dec ’23
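One small detail that may make the systemCode easier to search for: such codes are typically an OSStatus printed as an unsigned 32-bit value, so 4294955417 corresponds to the signed status -11879. A quick check of the conversion:

// Reinterpret the unsigned systemCode as a signed 32-bit status value.
let systemCode: UInt32 = 4294955417
let status = Int32(bitPattern: systemCode) // -11879
print(status)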
Stereo video HLS
I am trying to set up HLS with MV-HEVC. I have an MV-HEVC MP4, converted with AVAssetWriter, that plays as a "spatial video" in Photos in the simulator. I've used ffmpeg to fragment the video for HLS (sample m3u8 file below). The HLS of the MP4 plays on a VideoMaterial with an AVPlayer in the simulator, but it is hard to determine whether the streamed video is stereo. Is there any guidance on confirming that the streamed MP4 video is properly being read as stereo? Additionally, I see that REQ-VIDEO-LAYOUT is required for multivariant HLS; however, if there is ONLY stereo video in the playlist, is it needed? Are there any other configurations needed to make the device read the stream as stereo?

Sample m3u8 playlist:

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:13
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:12.512500,
sample_video0.ts
#EXTINF:8.341667,
sample_video1.ts
#EXTINF:12.512500,
sample_video2.ts
#EXTINF:8.341667,
sample_video3.ts
#EXTINF:8.341667,
sample_video4.ts
#EXTINF:12.433222,
sample_video5.ts
#EXT-X-ENDLIST
Replies: 5 · Boosts: 1 · Views: 1.7k · Dec ’23
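For comparison, and without claiming this resolves the question: the playlist above is a media playlist, whereas REQ-VIDEO-LAYOUT is an attribute of EXT-X-STREAM-INF in a multivariant playlist, so declaring the stereo layout would look roughly like the illustrative lines below (bandwidth, resolution, and URI are placeholders, and CODECS is omitted for brevity).

#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=8000000,RESOLUTION=1920x1080,REQ-VIDEO-LAYOUT="CH-STEREO"
sample_video.m3u8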
Issue playing FairPlay video
We have to play some encrypted videos from a server. In AVAssetResourceLoaderDelegate we get the CKC data correctly and respond with it. Then the video just starts playing and stops immediately. When we check the playerItem error description we get: Error Domain=AVFoundationErrorDomain Code=-11819 "Cannot Complete Action" UserInfo={NSLocalizedDescription=Cannot Complete Action, NSLocalizedRecoverySuggestion=Try again later.}. Has anyone encountered this?
Replies: 0 · Boosts: 0 · Views: 505 · Dec ’23
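A hedged guess worth checking: -11819 sits in the AVFoundation error range and commonly corresponds to media services being reset, whose localized text matches the "Cannot Complete Action / Try again later" wording above. If that is what is happening, the usual handling is to watch for the reset notification and rebuild the player; a minimal sketch, with illustrative helper names:

import AVFoundation

// If -11819 maps to AVError.mediaServicesWereReset, the player (and any content
// key session) generally has to be torn down and recreated.
func isMediaServicesReset(_ error: Error?) -> Bool {
    guard let avError = error as? AVError else { return false }
    return avError.code == .mediaServicesWereReset
}

func observeMediaServicesReset(rebuild: @escaping () -> Void) -> NSObjectProtocol {
    NotificationCenter.default.addObserver(
        forName: AVAudioSession.mediaServicesWereResetNotification,
        object: nil,
        queue: .main
    ) { _ in
        rebuild() // recreate AVPlayer / key session and reload the item here
    }
}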