Streaming


Deep dive into the technical specifications that influence seamless playback for streaming services, including bitrates, codecs, and caching mechanisms.


Samsung/LG/Sony TV AirPlay fails to seek when audio track is not default
Hello, we have a TV app based on react-native-video, tweaked to suit our requirements, and we have a problem with AirPlay streaming. An asset streams fine to an Apple TV, but when we stream it to any third-party TV with AirPlay and choose a language other than the default in the manifest, there is a problem: seeking freezes the picture and nothing happens. The funny thing is that if we seek back to the starting point (within roughly ±20 seconds), the video resumes. The one difference from Apple TV that we were able to identify is that when seeking on Apple TV an isPlaybackBufferEmpty event is observed, while with third-party TVs only isPlaybackLikelyToKeepUp events fire. Is there a solution to this issue? Or at least a way to forcefully empty the buffer when seek is called? Thank you.
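A first diagnostic step might be logging both buffer flags on each route so the Apple TV and third-party TV behavior can be compared side by side. A minimal sketch, assuming `item` is the AVPlayerItem being streamed:

```swift
import AVFoundation
import Combine

// Diagnostic sketch: log both KVO-observable buffer flags so the two
// AirPlay targets can be compared during a seek.
var cancellables = Set<AnyCancellable>()

func logBufferFlags(for item: AVPlayerItem) {
    item.publisher(for: \.isPlaybackBufferEmpty)
        .sink { print("isPlaybackBufferEmpty:", $0) }
        .store(in: &cancellables)
    item.publisher(for: \.isPlaybackLikelyToKeepUp)
        .sink { print("isPlaybackLikelyToKeepUp:", $0) }
        .store(in: &cancellables)
}
```

If the receiver's buffer really is stuck, a blunt workaround sometimes used is to tear down and rebuild the item around the seek with `player.replaceCurrentItem(with:)`, at the cost of a visible reload.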
1 reply · 0 boosts · 300 views · Aug ’24

AVPlayer and HLS streams timeout
I find the default timeout of 1 second to download a segment unreasonable when playing an HLS stream from a server that is transcoding. Does anyone know if it's possible to change this networking timeout?

```
Error status: -12889, Error domain: CoreMediaErrorDomain, Error comment: No response for map in 1s
Event: <AVPlayerItemErrorLogEvent: 0x301866250>
```

Also, there is a delegate for controlling HLS downloads for offline viewing, but no delegate for plain HLS streaming.
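For monitoring how often this happens, the error log shown above can also be read programmatically from the item. A small sketch, assuming `item` is the playing AVPlayerItem:

```swift
import AVFoundation

// Dump the player item's accumulated error log events after a stall,
// including the CoreMediaErrorDomain timeout entries.
func dumpErrorLog(for item: AVPlayerItem) {
    guard let log = item.errorLog() else { return }
    for event in log.events {
        print("status:", event.errorStatusCode,
              "domain:", event.errorDomain,
              "comment:", event.errorComment ?? "-")
    }
}
```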
0 replies · 0 boosts · 353 views · Jul ’24

Issues with launching a stream in AVPlayer using AppIntent on iOS
I am implementing Siri/Shortcuts for a radio app on iOS. I have implemented an AppIntent that sends a notification to the app, and the app should start playing the stream in AVPlayer. The AppIntent sometimes works and sometimes doesn't; so far I couldn't find a pattern for when/why it works and when/why it doesn't. Sometimes it works even if the app is killed or in the background; sometimes it doesn't work in either state. I have been observing logs in Console, and apparently it sometimes stops when AVPlayer tries to figure out the buffer size (I then get AVPlayerWaitingToMinimizeStallsReason in the console and the AVPlayerItem status is set to .unknown). Sometimes it figures it out quickly (for the same stream) and starts playing. Sometimes, when the app is killed, after the AppIntent call the app is launched in the background (at least I see it as a process in Console), receives the notification from the AppIntent, and starts playing. And sometimes the app is not launched at all, its process is not visible in Console, so it doesn't receive the notification and doesn't play. I have set up the session correctly (category .playback without any options, and activated), set AVPlayerItem's preferredForwardBufferDuration to 0 (the default), and AVPlayer's automaticallyWaitsToMinimizeStalling to true. Background processing, Audio, AirPlay, Picture in Picture, and Siri are added in the Signing & Capabilities section of the app project settings. Here are the code examples.

Play AppIntent (the Stop AppIntent is constructed the same way):

```swift
@available(iOS 16, *)
struct PlayStationIntent: AudioPlaybackIntent {
    static let title: LocalizedStringResource = "Start playing"
    static let description = IntentDescription("Plays currently selected radio")

    @MainActor
    func perform() async throws -> some IntentResult {
        NotificationCenter.default.post(name: IntentsNotifications.siriPlayCurrentStationNotificationName, object: nil)
        return .result()
    }
}
```

AppShortcutsProvider:

```swift
struct RadioTestShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: PlayStationIntent(),
            phrases: [
                "Start station in \(.applicationName)",
            ],
            shortTitle: LocalizedStringResource("Play station"),
            systemImageName: "radio"
        )
    }
}
```

Player object:

```swift
class Player: ObservableObject {
    private let session = AVAudioSession.sharedInstance()
    private let streamURL = URL(string: "http://radio.rockserwis.fm/live")!
    private var player: AVPlayer?
    private var item: AVPlayerItem?
    var cancellables = Set<AnyCancellable>()

    typealias UInfo = [AnyHashable: Any]

    @Published var status: Player.Status = .stopped
    @Published var isPlaying = false

    func setupSession() {
        do {
            try session.setCategory(.playback)
        } catch {
            print("*** Error setting up category audio session: \(error), \(error.localizedDescription)")
        }
        do {
            try session.setActive(true)
        } catch {
            print("*** Error setting audio session active: \(error), \(error.localizedDescription)")
        }
    }

    func setupPlayer() {
        item = AVPlayerItem(url: streamURL)
        item?.preferredForwardBufferDuration = TimeInterval(0)
        player = AVPlayer(playerItem: item)
        player?.automaticallyWaitsToMinimizeStalling = true
        player?.allowsExternalPlayback = false
        let metaDataOuptut = AVPlayerItemMetadataOutput(identifiers: nil)
    }

    func play() {
        setupPlayer()
        setupSession()
        handleInterruption()
        player?.play()
        isPlaying = true
        player?.currentItem?.publisher(for: \.status)
            .receive(on: DispatchQueue.main)
            .sink(receiveValue: { status in
                self.handle(status: status)
            })
            .store(in: &self.cancellables)
    }

    func stop() {
        player?.pause()
        player = nil
        isPlaying = false
        status = .stopped
    }

    func handle(status: AVPlayerItem.Status) { ... }
    func handleInterruption() { ... }
    func handle(interruptionType: AVAudioSession.InterruptionType?, userInfo: UInfo?) { ... }
}

extension Player {
    enum Status {
        case waiting, ready, failed, stopped
    }
}

extension Player {
    func setupRemoteTransportControls() { ... }
}
```

Content view:

```swift
struct ContentView: View {
    @EnvironmentObject var player: Player

    var body: some View {
        VStack(spacing: 20) {
            Text("AppIntents Radio Test App")
                .font(.title)
            Button {
                if player.isPlaying {
                    player.stop()
                } else {
                    player.play()
                }
            } label: {
                Image(systemName: player.isPlaying ? "pause.circle" : "play.circle")
                    .font(.system(size: 80))
            }
        }
        .padding()
    }
}

#Preview {
    ContentView()
}
```

Main struct:

```swift
import SwiftUI

@main
struct RadioTestApp: App {
    let player = Player()
    let siriPlayCurrentPub = NotificationCenter.default.publisher(for: IntentsNotifications.siriPlayCurrentStationNotificationName)
    let siriStop = NotificationCenter.default.publisher(for: IntentsNotifications.siriStopRadioNotificationName)

    var body: some Scene {
        WindowGroup {
            ContentView()
                .environmentObject(player)
                .onReceive(siriPlayCurrentPub, perform: { _ in
                    player.play()
                })
                .onReceive(siriStop, perform: { _ in
                    player.stop()
                })
        }
    }
}
```
1 reply · 0 boosts · 401 views · Jul ’24

AVPlayer Error -16170
Hi, I'm trying to play a 4K video on my Apple TV 4K, but I get an error in AVPlayer:

```
Error Domain=CoreMediaErrorDomain Code=-16170
```

I can't get any more information. Example HLS manifest with a 4K video track:

```
#EXT-X-STREAM-INF:AUDIO="aud_mp4a.40.2",AVERAGE-BANDWIDTH=11955537,BANDWIDTH=12256000,VIDEO-RANGE=SDR,CODECS="hvc1.1.6.L153.90,mp4a.40.2",RESOLUTION=3840x2160,FRAME-RATE=50,HDCP-LEVEL=TYPE-1
video_4/stream.m3u8
```

Maybe a problem with hvc1? But as far as I know, Apple TV supports HEVC.
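To help rule out device capability (as opposed to a manifest problem), VideoToolbox can report hardware HEVC decode support on OS releases where the call is available; a minimal sketch:

```swift
import VideoToolbox

// Hypothetical sanity check (recent OS releases only): confirm the device
// reports hardware HEVC decode before suspecting the manifest itself.
if VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC) {
    print("Hardware HEVC decode is supported")
} else {
    print("No hardware HEVC decode on this device")
}
```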
0 replies · 0 boosts · 368 views · Jul ’24

LL-HLS magic number of 1002 for --part-target-duration-ms with --iso-fragmented
Hi guys, I'm working on adding LL-HLS support to the Ant Media Server. I'm following the documentation in hlstools for streaming and testing with mediastreamsegmenter and tsrecompressor. What I wonder about is that the sample uses 1002 ms for --part-target-duration-ms (-w in short form):

```
mediastreamsegmenter -w 1002 -t 4 224.0.0.50:9123 -s 16 -D -T -f /Library/WebServer/Documents/2M/
```

It works in this way. It also works like this:

```
mediastreamsegmenter -w 1000 -t 4 224.0.0.50:9123 -s 16 -D -T -f /Library/WebServer/Documents/2M/
```

But it crashes when I add --iso-fragmented:

```
mediastreamsegmenter -w 1000 -t 4 224.0.0.50:9123 -s 16 -D -T --iso-fragmented -f /Library/WebServer/Documents/2M/
```

mediastreamsegmenter gives the following error:

```
encountered failure write segment failed (-17543) - exiting
```

It works if I use 1001 or 1003. I wonder if there is a reason for that, or is it a bug?
0 replies · 0 boosts · 272 views · Jul ’24

Multiview HLS with HDR
I have an HDR10+ encoded video that plays back on the Apple Vision Pro if loaded as a .mov, but when that video is encoded using the latest (1.23b) Apple HLS tools to generate an fMP4, the resulting m3u8 cannot be played back on the Apple Vision Pro; I only get a "Cannot Open" error. To generate the m3u8, I'm just calling mediafilesegmenter (with -iso-fragmented) and then variantplaylistcreator. This completes with no errors, and the m3u8 plays back on the Mac using VLC, but not on the Apple Vision Pro. The relevant part of the m3u8 is:

```
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=40022507,BANDWIDTH=48883974,VIDEO-RANGE=PQ,CODECS="ec-3,hvc1.1.60000000.L180.B0",RESOLUTION=4096x4096,FRAME-RATE=24.000,CLOSED-CAPTIONS=NONE,AUDIO="audio1",REQ-VIDEO-LAYOUT="CH-STEREO"
{{url}}
```

Has anyone been able to use the HLS tools to generate fMP4s of MV-HEVC videos with HDR10?
1 reply · 2 boosts · 463 views · Jul ’24

AVPlayer "Server Not Properly Configured" Error in Production
Issue found in Native App or Hybrid App: Native
OS Version: Any
Device: Any
Description: We are using AVPlayer for streaming videos in our iOS application. Streaming works fine in the lower sandbox environment, but we are encountering a "server not properly configured" error in the production environment.
Steps to Reproduce: Configure AVPlayer with a video URL from the production server, then attempt to play the video.
Expected Behavior: The video should stream successfully, as it does in the sandbox environment.
Actual Behavior: AVPlayer fails to stream the video and reports a "server not properly configured" error.
0 replies · 0 boosts · 333 views · Jul ’24

Is there a way to directly go from VideoToolbox to Metal for 10-bit/BT.2020 YCbCr HEVC?
tl;dr: how can I get raw YUV into a Metal fragment shader from a VideoToolbox 10-bit/BT.2020 HEVC stream without any extra/secret format conversions?

With VideoToolbox and 10-bit HEVC, I've found that it defaults to CVPixelBuffers with the formats kCVPixelFormatType_Lossless_420YpCbCr10PackedBiPlanarFullRange or kCVPixelFormatType_Lossy_420YpCbCr10PackedBiPlanarFullRange. To mitigate this, I added the following snippet to my application:

```swift
// We need our pixels unpacked for 10-bit so that the Metal textures actually work
var pixelFormat: OSType? = nil
let bpc = getBpcForVideoFormat(videoFormat!)
let isFullRange = getIsFullRangeForVideoFormat(videoFormat!)
// TODO: figure out how to check for 422/444, CVImageBufferChromaLocationBottomField?
if bpc == 10 {
    pixelFormat = isFullRange ? kCVPixelFormatType_420YpCbCr10BiPlanarFullRange
                              : kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange
}

let videoDecoderSpecification: [NSString: AnyObject] = [
    kVTVideoDecoderSpecification_EnableHardwareAcceleratedVideoDecoder: kCFBooleanTrue
]
var destinationImageBufferAttributes: [NSString: AnyObject] = [
    kCVPixelBufferMetalCompatibilityKey: true as NSNumber,
    kCVPixelBufferPoolMinimumBufferCountKey: 3 as NSNumber
]
if pixelFormat != nil {
    destinationImageBufferAttributes[kCVPixelBufferPixelFormatTypeKey] = pixelFormat! as NSNumber
}

var decompressionSession: VTDecompressionSession? = nil
err = VTDecompressionSessionCreate(allocator: nil,
                                   formatDescription: videoFormat!,
                                   decoderSpecification: videoDecoderSpecification as CFDictionary,
                                   imageBufferAttributes: destinationImageBufferAttributes as CFDictionary,
                                   outputCallback: nil,
                                   decompressionSessionOut: &decompressionSession)
```

In short, I need kCVPixelFormatType_420YpCbCr10BiPlanar so that I have a straightforward MTLPixelFormat.r16Unorm/MTLPixelFormat.rg16Unorm texture binding for Y/CbCr. Metal, seemingly, has no direct pixel format for 420YpCbCr10PackedBiPlanar. I'd also rather not use any color conversion in VideoToolbox, in order to save on processing (and to ensure that the color transforms/transfer characteristics match between streamer and client, since I also have a custom transfer characteristic to mitigate blocking in dark scenes). However, I noticed that in visionOS 2, the CVPixelBuffer I receive is no longer a compressed render target (likely a bug), which caused GPU texture read bandwidth to skyrocket from 2 GiB/s to 30 GiB/s. More importantly, this implies that VideoToolbox may in fact be doing an extra color conversion step, wasting memory bandwidth. Does Metal actually have no way to handle 420YpCbCr10PackedBiPlanar? Are there any examples of reading 10-bit HDR HEVC buffers directly with Metal?
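Not an answer to the packed-format question, but for the unpacked 420YpCbCr10BiPlanar path above, here is a minimal sketch of the Metal-side binding, assuming a CVMetalTextureCache created against your MTLDevice (names are illustrative):

```swift
import CoreVideo
import Metal

// Wrap each plane of a 420YpCbCr10BiPlanar CVPixelBuffer as a Metal texture:
// plane 0 as r16Unorm (Y), plane 1 as rg16Unorm (CbCr). Note the 10-bit
// samples sit in the high bits of each 16-bit word, so a unorm read tops out
// at 65472/65535; rescale in the shader if exact levels matter.
func makeYCbCrTextures(from pixelBuffer: CVPixelBuffer,
                       cache: CVMetalTextureCache) -> (y: MTLTexture, cbcr: MTLTexture)? {
    func planeTexture(_ plane: Int, _ format: MTLPixelFormat) -> MTLTexture? {
        var cvTexture: CVMetalTexture?
        let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, plane)
        let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, plane)
        guard CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, cache,
                                                        pixelBuffer, nil, format,
                                                        width, height, plane,
                                                        &cvTexture) == kCVReturnSuccess,
              let texture = cvTexture.flatMap(CVMetalTextureGetTexture) else { return nil }
        return texture
    }
    guard let y = planeTexture(0, .r16Unorm),
          let cbcr = planeTexture(1, .rg16Unorm) else { return nil }
    return (y, cbcr)
}
```

The YCbCr-to-RGB matrix then lives in the fragment shader, using the BT.2020 coefficients (Kr = 0.2627, Kb = 0.0593).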
2 replies · 0 boosts · 579 views · Jul ’24

How to create an equalizer for HLS with AVPlayer?
I've seen equalizers in apps like Musi and Spotify. I think (but am not sure) they use HLS streaming. If so, how do you implement such an equalizer for HLS? I searched and tried several approaches, but so far none works:

- AVAudioEngine seems to only support local files;
- downloading the .ts segments and merging them into an .mp3 to make playback local cannot guarantee a real-time effect;
- MTAudioProcessingTap needs the audio track; for a remote .mp3 I can extract the audio track, but not for HLS.

Any suggestions?
0 replies · 1 boost · 397 views · Jun ’24

FairPlay+AVC+AC-3 on iOS
We found that when FairPlay is enabled on streams that use the AVC video codec and AC-3 + MP4 audio, playback fails on some devices with CoreMedia error 1718449215 about a second after playback starts. (1718449215 is the FourCC 'fmt?', i.e. kAudioFormatUnsupportedDataFormatError, which points at the audio format.)

Successful playback devices: iPhone SE
Failed playback devices: iPhone 12 Pro, iPhone 14 Pro
0 replies · 0 boosts · 326 views · Jun ’24

HLS Playback Issue with Discontinuity Tag and Low Bitrate Streams and Seek on iOS 17
I am writing to report an issue encountered with the playback of HLS (HTTP Live Streaming) streams that I believe is specific to iOS 17. The problem manifests when certain conditions are met during the playback of concatenated HLS segments, particularly those with a low video bitrate. Below I detail the background, symptoms, and steps required to reproduce the issue.

Background: Our business scenario requires concatenating two HLS playlists, referred to as 1.m3u8 and 2.m3u8, into a single playlist 12.m3u8. An example of such a playlist:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-ALLOW-CACHE:YES
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:2.0,
1.1.ts
#EXTINF:2.0,
1.2.ts
#EXTINF:2.0,
1.3.ts
#EXT-X-DISCONTINUITY
#EXTINF:2.0,
2.1.ts
#EXTINF:2.0,
2.2.ts
#EXT-X-ENDLIST
```

Problem symptoms: On PC web browsers, Android devices, and iOS 13 and 15, natural playback completion occurs without any issues, and seeking to different points within the stream (e.g., from 3 seconds to 9 seconds) works as expected. On iOS 17, however, while natural playback completion is unaffected, seeking to various points within the first playlist (1.m3u8) after playing for 1, 2, or 3 seconds loses the audio for the last 3 seconds of 1.m3u8.

Conditions for replication: The issue only arises when all of the following conditions are satisfied; disrupting any one of them results in normal playback behavior.

- The video content is generated from a single image and an audio track, ensuring sound is present in the final 3 seconds.
- The video stream bitrate is below 500 Kbps. (Tested with a 1393 Kbps bitrate, which did not trigger the issue.)
- The HLS streams are concatenated using the #EXT-X-DISCONTINUITY tag to form a virtual 12.m3u8 playlist. (No issues occur when streams are not concatenated.)
- Seek operations are performed during playback. (No issues occur without seek operations.)
- The device runs iOS 17. (No issues reported on iOS 13 and 15.)

Steps to reproduce (sample FFmpeg commands are sketched after this list):

1. Using FFmpeg, generate a video from a single image and an audio track, with a suggested duration of 10 to 20 seconds for testing convenience. If the video's bitrate exceeds 1000 Kbps, consider transcoding it to 500 Kbps or lower to avoid potential edge-case issues.
2. Convert the 1.mp4 file into 1.m3u8 using FFmpeg. The segment duration can be set to between 1 and 5 seconds (tested with both 2-second and 5-second durations).
3. Duplicate 1.m3u8 as 2.m3u8, then concatenate 1.m3u8 and 2.m3u8 into 12.m3u8 as shown below:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-ALLOW-CACHE:YES
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:2.0,
1.1.ts
#EXTINF:2.0,
1.2.ts
#EXT-X-DISCONTINUITY
#EXTINF:2.0,
1.1.ts
#EXTINF:2.0,
1.2.ts
#EXT-X-ENDLIST
```

4. On an iOS 17 device, play 12.m3u8 for 1, 2, or 3 seconds, then seek to any point between 7 and 9 seconds (within the duration of 1.m3u8). This action results in the loss of audio for the last 3 seconds of 1.m3u8.
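A hypothetical pair of FFmpeg commands matching steps 1 and 2 above (filenames, duration, and exact bitrate are placeholders, not from the original report):

```
# 1) ~15 s video from one still image plus an audio track,
#    with the video bitrate pinned below 500 Kbps
ffmpeg -loop 1 -i cover.jpg -i audio.mp3 -t 15 \
       -c:v libx264 -b:v 400k -pix_fmt yuv420p -c:a aac -shortest 1.mp4

# 2) segment into HLS with 2-second segments
ffmpeg -i 1.mp4 -c copy -hls_time 2 -hls_list_size 0 -f hls 1.m3u8
```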
0 replies · 0 boosts · 403 views · Jun ’24

Error 15517 when playing HLS
Playing an fMP4 HLS stream on the visionOS beta. This is the stream, HEVC Main 10 and E-AC-3 6 channel:

```
#EXT-X-STREAM-INF:BANDWIDTH=6760793,AVERAGE-BANDWIDTH=6760793,VIDEO-RANGE=PQ,CODECS="hvc1.2.4.L150.B0,mp4a.a6",RESOLUTION=3840x2160,FRAME-RATE=23.976,SUBTITLES="subs"
```

This is what AVPlayer says:

```
Error Domain=AVFoundationErrorDomain Code=-11848 "Cannot Open"
UserInfo={NSLocalizedFailureReason=The media cannot be used on this device.,
NSLocalizedDescription=Cannot Open,
NSUnderlyingError=0x3009e37b0 {Error Domain=CoreMediaErrorDomain Code=-15517 "(null)"}}
```

I can't find any documentation for the underlying error -15517. Is it because "mp4a.a6" is declared in the codec list instead of "ec-3"? hlsreport lists these MUST FIX issues:

1. Measured peak bitrate compared to multivariant playlist declared value exceeds error tolerance (Multivariant Playlist, Stream Definition for All Variants)
2. Stereo audio in AAC-LC, HE-AAC v1, or HE-AAC v2 format MUST be provided (Multivariant Playlist)
3. If Dolby Digital Plus is provided then Dolby Digital MUST be provided also (Multivariant Playlist)
4. I-frame playlists (EXT-X-I-FRAME-STREAM-INF) MUST be provided to support scrubbing and scanning UI (Multivariant Playlist)
5. The server MUST deliver playlists using gzip content-encoding (All Variants, All Renditions, Multivariant Playlist)
6. You MUST provide multiple bit rates of video (Multivariant Playlist)
7. Playlist codec type doesn't match content codec type (All Variants)
8. (Segment) The operation couldn't be completed. (HTTPPumpErrorDomain error -16845 - HTTP 400: (unhandled)) (list of subtitle renditions)
9. (Segment) HTTP 400 - HTTP/2.0 400 Bad Request (list of subtitle renditions)
10. Multichannel audio MUST be separate audio stream (All Variants)
11. If EXT-X-INDEPENDENT-SEGMENTS is not in the multivariant playlist, then you MUST use the EXT-X-INDEPENDENT-SEGMENTS tag in all video media playlists (All Variants)
12. The CODECS attribute MUST include every media format present (All Variants; does not declare EC-3)
1 reply · 1 boost · 576 views · Jun ’24

Low Latency streaming via CDN
Hello gents, we have a problem serving LL-HLS streams via a CDN. I'm rather desperate about what is wrong, as the stream plays correctly without the CDN. The error thrown is:

```
.invalidStream: Error Domain=CoreMediaErrorDomain Code=-15416 "Blocking Playlist Reload failed"
(See -[AVPlayerItem errorLog] for 6 events)
UserInfo={NSDescription=Blocking Playlist Reload failed, NSDebugDescription=See -[AVPlayerItem errorLog] for 6 events}
errorLog: <AVPlayerItemErrorLog: 0x30174d5a0>
```

When I check the manifest, we have this enabled:

```
#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,HOLD-BACK=6.000,PART-HOLD-BACK=4.000
```

What I do not understand is what this feature does, and whether a CDN can somehow impact it. Any help is appreciated. Thanks!
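For context: CAN-BLOCK-RELOAD=YES advertises blocking playlist reload, where the client appends _HLS_msn/_HLS_part query parameters and the server holds the request open until the named segment or part exists. A CDN can break this by stripping those query parameters, collapsing them in its cache key, or timing out long-held requests. A quick hypothetical check against the CDN edge (placeholder URL and sequence numbers):

```
# Ask the edge for a future media sequence number / part; a compliant edge
# should hold the connection open rather than answer immediately or 404.
curl -v "https://cdn.example.com/live/stream.m3u8?_HLS_msn=1234&_HLS_part=2"
```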
0 replies · 0 boosts · 407 views · Jun ’24

HLS+FairPlay stopped playing, but played yesterday
Device: iPhone 13 Pro Max. AVPlayer receives this error:

```
#Version: 1.0
#Software: AppleCoreMedia/1.0.0.21E236 (iPhone; U; CPU OS 17_4_1 like Mac OS X; en_us)
#Date: 2024/06/11 14:06:18.018
CoreMediaErrorDomain error -42716
```

In the system log we see propertyKey:IsLeaseExpired:

```
default 14:38:28.008912+0300 mediaplaybackd <<<< FigCPECryptorPKD >>>> FigPKDCPECryptorSetKeyRequestResponse: 0xcc4160140 612C5D12-F92F-4897-B55D-854F842B41B4 keyResponseOptions:[] keyRequestResponse:0xcc5759d70 err:-42716
default 14:38:28.009052+0300 mediaplaybackd keyboss ckb_transitionRequestToTerminalState: 0xcc5440380 <private> reqFin err <private> (-42716) dokeyCallbacksExist 0
default 14:38:28.009138+0300 mediaplaybackd keyboss ckb_customURLReadCallback: 0xcc5440380 <private> customURLReqID 8 isComplete 1 err 0 error <private> (0) dokeyCallbacksExist 0
default 14:38:28.009169+0300 mediaplaybackd <<<< FigPKDKeyManager >>>> keyManager_copyPropertyForEntryInternal: 0xcc546b280 6143FF61-5E6F-4506-A17D-4DB6B6BA0C5B propertyKey:IsLeaseExpired propertyValue: 0x16e5b17a0
default 14:38:28.009196+0300 mediaplaybackd <<<< FigPKDKeyManager >>>> PKDKeyManagerSetKeyRequestError: keyManager:0xcc546b280 keyID:6143FF61-5E6F-4506-A17D-4DB6B6BA0C5B error:<private> (-19160) err:0
```

On the server, leaseDuration is always 0. But another device, an iPhone 15 Pro Max on iOS 17.5.1, plays correctly with the same content key from the server.
0 replies · 0 boosts · 395 views · Jun ’24

AVPlayer CoreMediaErrorDomain -12642
Hi everyone, I am having a problem with AVPlayer when I try to play some videos. The video starts for a few seconds, but immediately after I see a black screen, and the console shows the following error:

```
<__NSArrayM 0x14dbf9f30>(
  {
    StreamPlaylistError = "-12314";
    comment = "have audio audio-aacl-54 in STREAMINF without EXT-X-MEDIA audio group";
    date = "2024-05-13 20:46:19 +0000";
    domain = CoreMediaErrorDomain;
    status = "-12642";
    uri = "http://127.0.0.1:8080/master.m3u8";
  },
  {
    "c-conn-type" = 1;
    "c-severity" = 2;
    comment = "Playlist parse error";
    "cs-guid" = "871C1871-D566-4A3A-8465-2C58FDC18A19";
    date = "2024-05-13 20:46:19 +0000";
    domain = CoreMediaErrorDomain;
    status = "-12642";
    uri = "http://127.0.0.1:8080/master.m3u8";
  }
)
```
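The first event's comment is the actionable part: the variant declares an audio rendition without a matching EXT-X-MEDIA audio group. A minimal sketch of a compliant multivariant playlist (the group name, URIs, codecs, and bandwidth are illustrative, not taken from the failing stream):

```
#EXTM3U
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="aud",NAME="Main",DEFAULT=YES,AUTOSELECT=YES,URI="audio/index.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=2000000,CODECS="avc1.64001f,mp4a.40.2",AUDIO="aud"
video/index.m3u8
```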
1 reply · 0 boosts · 811 views · May ’24

FairPlay on M2/M3 MacBooks not honouring lease duration in license
Doing some testing around player behaviour when a license expires in Safari on macOS, I had the following outcomes:

- On Safari 17.4.1 on an Intel-based Mac running Ventura, playback stopped when the license expired.
- On Safari 17.4.1 on an M3 running Sonoma, playback stalled briefly, then continued to play without limit.
- On Safari 16.5.2 on an M2 running Ventura, playback stalled briefly, then continued to play without limit.

Where playback stalled briefly, the stall occurred at the time the license expired. I parsed the license, and everything is set correctly for a lease license type:

```
{
  "version" : 1,
  "payloadLength" : 1072,
  "iv" : "0H9NCXLQeh1ziYpmJXsnwQ==",
  "assetId" : "䙁㉌䉃\u0000\u0000\u0000",
  "hdcp" : "TYPE_0_REQUIRED",
  "contentKeyDuration" : {
    "leaseDurationSeconds" : 300,
    "rentalDurationSeconds" : 0,
    "persistenceAllowed" : false
  },
  "keyType" : "Lease"
}
```

I cannot find any information relating to this behaviour. Per the FPS docs for a lease license type: "If the content key is not renewed, the Apple device stops the playback when the lease expires." That is what is observed on the Intel-based MacBook.
0 replies · 0 boosts · 466 views · May ’24

AVPlayer and TLS 1.3 compliance for low latency HLS live stream
Hi guys, I'm investigating a failure to play a low-latency HLS live stream, and I'm getting the following error:

```
<AVPlayerItemErrorLog: 0x30367da10>
#Version: 1.0
#Software: AppleCoreMedia/1.0.0.21L227 (Apple TV; U; CPU OS 17_4 like Mac OS X; en_us)
#Date: 2024/05/17 13:11:46.046
#Fields: date time uri cs-guid s-ip status domain comment cs-iftype
2024/05/17 13:11:16.016 https://s2-h21-nlivell01.cdn.xxxxxx.***/..../xxxx.m3u8 -15410 "CoreMediaErrorDomain" "Low Latency: Server must support http2 ECN and SACK" -
2024/05/17 13:11:17.017 -15410 "CoreMediaErrorDomain" "Invalid server blocking reload behavior for low latency" -
2024/05/17 13:11:17.017
```

The stream works when loading from a dev server with TLS 1.3, but fails on CDN servers with TLS 1.2. Regular live streams and VOD streams work normally on those CDN servers. I tried to configure TLSv1.2 in Info.plist, but that didn't help. When running nscurl --ats-diagnostics --verbose, it passes for the server with TLS 1.3 but fails for the CDN servers with TLS 1.2, with error Code=-1005 "The network connection was lost."

Is TLS 1.3 required or just recommended? Referring to https://developer.apple.com/documentation/http-live-streaming/enabling-low-latency-http-live-streaming-hls and https://datatracker.ietf.org/doc/html/draft-pantos-hls-rfc8216bis

Is it possible to configure AVPlayer to skip ECN and SACK validation? Thanks.
1 reply · 1 boost · 579 views · May ’24
