We are seeing a significant number of crash reports from iOS 18 users involving the AVKit framework. The stack traces start from [AVPlayerController _observeValueForKeyPath:oldValue:newValue:], which appears to come from Apple's internal SDK. We found two kinds of crashes:
UI modification on background thread
From the stack trace, it looks like the crash happens while an AVPictureInPictureController is being deallocated and its view is being removed from its superview, but the work is somehow executing on a background thread: _AssertAutoLayoutOnAllowedThreadsOnly is highlighted right before the crash.
But I have checked our code that works with AVPictureInPictureController, and the places where we deallocate the object, inside viewDidLoad and deinit of a UIViewController subclass, are always called on the main thread. From the logs, the crash seems to happen when the user opens another piece of content while the PiP player is active, so the current PiP instance is replaced with a new one. My suspicion is that the observation logic inside AVPlayerController is the hint to this issue; something is probably broken there, since the crash occurs across all of our app versions but only for iOS 18 users.
Unfortunately, I have not been able to reproduce this issue yet. One of my colleagues reproduced it once but has not managed to do so again. The number of reports keeps rising every day, up to 1.3k events in the last 30 days.
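For context, a simplified sketch of the replacement flow described above (the class, property, and method names are illustrative, not our production code); the old controller is stopped and the new one is created on the main thread:

import UIKit
import AVKit

final class PlayerContainerViewController: UIViewController, AVPictureInPictureControllerDelegate {
    private var pipController: AVPictureInPictureController?

    // Called when the user opens another piece of content while PiP is active.
    func replacePictureInPicture(with newLayer: AVPlayerLayer) {
        assert(Thread.isMainThread)
        if pipController?.isPictureInPictureActive == true {
            pipController?.stopPictureInPicture()
        }
        pipController?.delegate = nil
        // The old controller is released here and replaced with a new instance.
        pipController = AVPictureInPictureController(playerLayer: newLayer)
        pipController?.delegate = self
    }
}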
Over-released object
This one has fewer reports than the first, but I am including it because it may carry relevant information: the start of the stack trace is similar. The timing also appears similar, when we deallocate the existing AVPictureInPictureController and later replace it with a new one; it is likewise found only on iOS 18 and also points to [AVPlayerController _observeValueForKeyPath:oldValue:newValue:]. I have not been able to reproduce this one so far either.
Oh, and both issues happen on both iPhone and iPad.
We would appreciate any advice on what we can do to avoid this in the future, and any hints on why it could be happening.
I have reported this issue with bug number: FB15620734
I also attached one sample crash report for each of the crashes here.
non ui thread access.crash
over release.crash
Observing 4K playback issues on tvOS 18: encountering HTTP 416 (Range Not Satisfiable) errors when the player requests byte ranges that lie outside the data available on the server. This leads to a fatal playback error:
CoreMediaErrorDomain error -12939 - HTTP 416: Requested Range Not Satisfiable
Notably, there are no customizations or modifications to the standard AVPlayerViewController on tvOS player.
AVPlayer is requesting the resource, which is 739 bytes long, with an invalid byte-range request (739-). Since the request is not satisfiable, the server returns 416. Note that this is limited to tvOS 18, and we are trying to understand why AVPlayer makes this invalid request on tvOS 18, resulting in a playback error.
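For anyone investigating the same thing, a small diagnostic sketch (not a fix) that logs the player item's error log entries, which is where the -12939 / HTTP 416 responses and the offending byte-range URIs show up:

import AVFoundation

func observeErrorLog(for item: AVPlayerItem) {
    // Keep the returned token if you later need to remove the observer.
    _ = NotificationCenter.default.addObserver(forName: .AVPlayerItemNewErrorLogEntry,
                                               object: item,
                                               queue: .main) { _ in
        guard let events = item.errorLog()?.events else { return }
        for event in events {
            print("status:", event.errorStatusCode,
                  "domain:", event.errorDomain,
                  "uri:", event.uri ?? "-")
        }
    }
}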
In iOS, when I use AVPlayerViewController to play back a slow motion video, it has a "ramp-up" stage at the start and a "ramp-down" stage at the end, and the video plays at the normal speed (i.e. not slow motion) during these stages.
My question is: are these non-slow-motion stages defined in the video file itself (e.g. in some kind of metadata), or is it just a playback approach used by AVPlayerViewController?
Thanks!
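In case it helps anyone looking into the same question: when a slow-motion clip comes from the Photos library, it is typically delivered as an AVComposition whose track segments carry a CMTimeMapping, and that mapping (rather than the movie file's media data) is what produces the ramp-up / slow / ramp-down behaviour. A small diagnostic sketch under that assumption:

import AVFoundation

func dumpTimeMappings(for asset: AVAsset) {
    guard let composition = asset as? AVComposition else {
        print("Not an AVComposition - no edit-based retiming carried by the asset object itself")
        return
    }
    for track in composition.tracks where track.mediaType == .video {
        for segment in track.segments {
            let mapping = segment.timeMapping
            // A slowed section maps a short source range onto a longer target range.
            print("source \(mapping.source.duration.seconds)s -> target \(mapping.target.duration.seconds)s")
        }
    }
}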
When the native info panel (which displays the title, subtitle, description, and custom buttons) opens, the focus immediately shifts to the first button. As a result, VoiceOver skips the description, which is crucial for users relying on accessibility features.
I haven’t found a way to detect when it opens. Knowing this would allow me to trigger custom VoiceOver announcements or adjust the focus order dynamically.
Are any other people experiencing this issue, and how do we solve it?
I am building my app with Xcode 16 for iOS 18. When the app goes to the background, PiP starts as expected, but whenever the app comes back to the foreground by tapping the PiP window, a black screen appears (the video keeps playing in the background). This only happens with builds created by Xcode 16 running on iOS 18.
It works fine with builds created by Xcode 15.4.
My first thought is that somehow PiP is not stopping.
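For anyone comparing setups: if the PiP session is driven by your own AVPictureInPictureController, the delegate callback below is where the inline UI is expected to be restored before PiP dismisses. A minimal sketch, assuming a custom player screen (PlayerViewController and restorePlayerUI() are illustrative names):

extension PlayerViewController: AVPictureInPictureControllerDelegate {
    func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController,
                                    restoreUserInterfaceForPictureInPictureStopWithCompletionHandler completionHandler: @escaping (Bool) -> Void) {
        restorePlayerUI()        // illustrative: bring the inline player view back on screen
        completionHandler(true)  // tell AVKit restoration finished so PiP can dismiss
    }
}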
I think I have the simplest possible Mac app, just trying to see if I can get VideoPlayer to work in an Xcode Preview. It works in an iOS app project. In a Mac app project it builds and runs, but if I preview it in Xcode it crashes.
The diagnostic says:
[Remote] Unknown Error: The operation couldn’t be completed. XPC error received on message reply handler

BSServiceConnectionErrorDomain (3):
NSLocalizedFailureReason: XPC error received on message reply handler
BSErrorCodeDescription: OperationFailed
The code I'm using is the exact code from the VideoPlayer documentation page. See this link.
Any ideas about this XPC error, and how to work around it?
I'm using Xcode 16.0 on macOS 14.6.1
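For reference, a minimal reproduction sketch along the lines of the documentation example (the URL is a placeholder, not the one from the docs):

import SwiftUI
import AVKit

struct ContentView: View {
    var body: some View {
        // Plays a remote video; the crash reported above happens only when this is
        // rendered in an Xcode Preview of the Mac app target.
        VideoPlayer(player: AVPlayer(url: URL(string: "https://example.com/video.mp4")!))
            .frame(width: 640, height: 360)
    }
}

#Preview {
    ContentView()
}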
Hi everyone,
I’ve encountered an issue with the showsPlaybackControls property in AVPlayerViewController after updating to iOS 18. Even though it’s set to true, the native playback controls (play, pause, etc.) are no longer appearing as they used to in previous iOS versions. This behavior was consistent and worked perfectly prior to iOS 18.
Additionally, I’m seeing the same problem when using the VideoPlayer in SwiftUI. The native controls that should appear by default seem to have vanished after the update. Has anyone else experienced this? Is there any workaround or additional configuration required to restore the native controls?
Any help or insights would be appreciated. Thanks!
import SwiftUI
import AVKit

struct CustomPlayerView: UIViewControllerRepresentable {
    let player: AVPlayer

    func updateUIViewController(_ playerController: AVPlayerViewController, context: Context) {
        // Runs on every SwiftUI update: attach the player, enable the native
        // controls, and start playback.
        playerController.player = player
        playerController.showsPlaybackControls = true
        player.play()
    }

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        return AVPlayerViewController()
    }
}
Case-ID: 9391388
Our application uses timed metadata as part of a rating control system.
We noticed a problem in production, and diagnosis shows that we stop receiving timed metadata on iOS 18 only.
Our live streams are primed with metadata at least once per second, but we are seeing extended gaps in receiving this content, in excess of 10 minutes.
We have also observed that this happens more as the player climbs the bitrate ladder, and it does not happen if we cap to a low resolution,
i.e. a preferredMaximumResolution of 768x432.
Furthermore, if we throttle network conditions after we stop receiving metadata, then we start receiving it again.
The following is a simple example that demonstrates the behaviour above. Unfortunately, I cannot share the metadata-primed live stream endpoint publicly, but I can provide it privately to Apple to reproduce the problem.
import UIKit
import AVKit

class ViewController: UIViewController, AVPlayerItemMetadataOutputPushDelegate {
    var player: AVPlayer?
    var itemMetadataOutput: AVPlayerItemMetadataOutput?

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        guard let url = URL(string: "endpoint redacted") else {
            return
        }
        let player = AVPlayer(url: url)
        let controller = AVPlayerViewController()
        controller.player = player
        self.player = player
        present(controller, animated: true) {
            player.play()
            // Attach a metadata output to the current item and receive timed
            // metadata groups on the main queue.
            let currentItem = player.currentItem
            let itemMetadataOutput = AVPlayerItemMetadataOutput(identifiers: nil)
            self.itemMetadataOutput = itemMetadataOutput
            self.itemMetadataOutput?.setDelegate(self, queue: .main)
            currentItem?.add(itemMetadataOutput)
        }
    }

    public func metadataOutput(_ output: AVPlayerItemMetadataOutput,
                               didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup],
                               from track: AVPlayerItemTrack?) {
        print("received metadata \(Date())")
    }
}
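For completeness, the resolution cap mentioned above as a temporary mitigation is applied on the player item; a one-line sketch using the currentItem from the example:

// Keeps the player on a lower rung of the bitrate ladder (mitigation, not a fix).
currentItem?.preferredMaximumResolution = CGSize(width: 768, height: 432)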
Dear all, how do I add extra buttons to the AVPlayer / VideoPlayer menu? I would like to add a button, e.g. one linking to a particular website.
Thank you!
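One option that avoids the system menu entirely is placing a custom control in VideoPlayer's overlay; note the overlay sits above the video but below the system playback controls. A sketch (the URLs are placeholders):

import SwiftUI
import AVKit

struct PlayerWithLink: View {
    private let player = AVPlayer(url: URL(string: "https://example.com/stream.m3u8")!)

    var body: some View {
        VideoPlayer(player: player) {
            // Custom overlay content drawn over the video frame.
            VStack {
                Spacer()
                Link("Visit website", destination: URL(string: "https://example.com")!)
                    .padding(8)
            }
        }
    }
}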
As you can see, the value shown in the AVCaptureSystemZoomSlider is not the same as the raw camera zoom factor.
I tried to calculate this value, and it seems to be 0.8: (5 - 1) * 0.8 = 4.2 - 1 in this image.
It seems this factor only applies to the default wide-angle camera. And I can't get this value from anywhere. (It's not displayVideoZoomFactorMultiplier btw, I checked that.)
What is it?
I have a FairPlay-encrypted HLS stream that I play in an AVPlayer, and I want to generate scrubbing thumbnails using AVAssetImageGenerator.
I am able to generate thumbnails for clear streams, but I get errors for protected content.
How can I generate thumbnails for protected content?
func getImageThumbnail(forTime: CMTime) {
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    generator.cancelAllCGImageGeneration()
    generator.generateCGImagesAsynchronously(forTimes: [NSValue(time: forTime)]) { [weak self] requestedTime, image, actualTime, result, error in
        if let error = error {
            print("Error generating thumbnail: \(error.localizedDescription)")
            return
        }
        if let cgImage = image {
            DispatchQueue.main.async {
                // Re-encode the CGImage as JPEG data and display it in the scrubbing image view.
                guard let data = UIImage(cgImage: cgImage).jpegData(compressionQuality: 1.0) else { return }
                self?.playerImg.image = UIImage(data: data)
            }
        }
    }
}
Here is a code snippet using AVPlayer:

avPlayer.addPeriodicTimeObserver(forInterval: CMTime(value: 1, timescale: 60), queue: .main) { [weak self] _ in
    // Call main actor-isolated instance methods
}

Xcode shows the warning "Call to main actor-isolated instance method '***' in a synchronous nonisolated context; this is an error in the Swift 6 language mode". How can I fix this?

avPlayer.addPeriodicTimeObserver(forInterval: CMTime(value: 1, timescale: 60), queue: .main) { [weak self] _ in
    Task { @MainActor in
        // Call main actor-isolated instance methods
    }
}

Can I use the solution above? It seems that hopping actors this frequently could slow down performance.
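If that overhead turns out to matter, one possible alternative (a judgment call, not an official recommendation) is MainActor.assumeIsolated: because the observer was registered with queue: .main, the closure already runs on the main thread, and assumeIsolated lets the compiler see that without creating a new Task on every tick. It traps if the closure is ever invoked off the main thread. updatePlaybackUI() below is an illustrative placeholder for a main actor-isolated method.

avPlayer.addPeriodicTimeObserver(forInterval: CMTime(value: 1, timescale: 60), queue: .main) { [weak self] _ in
    MainActor.assumeIsolated {
        // Safe only because the observer's queue is .main; traps otherwise.
        self?.updatePlaybackUI()   // illustrative main actor-isolated method
    }
}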
AVKit provides the SwiftUI view VideoPlayer, and allows you to add an interactive overlay. But that overlay is normally placed behind the system-provided playback controls.
Is there any way to suppress those controls, without resorting to wrapping AVPlayerView?
When setting the now playing info for playing media in MPNowPlayingInfoCenter we can set artwork. But it seems the Apple API for creating the artwork is crashing on iOS 18 (FB15145734).
On iOS 17 this gave the warning that the completion handler was not run on the main thread.
I've tried to seek help here: https://stackoverflow.com/questions/78989543/swift-data-race-with-appkit-mpmediaitemartwork-function/78990231?noredirect=1#comment139277425_78990231
but it seems that it's not possible to override the completion handler, and therefore it's up to Apple to fix this issue.
.task {
    await MainActor.run {
        let nowPlayingInfoCenter = MPNowPlayingInfoCenter.default()
        var nowPlayingInfo = [String: Any]()
        let image = NSImage(named: "image")!
        // warning: data race detected: @MainActor function at MPMediaItemArtwork/ContentView.swift:22 was not called on the main thread
        nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: image.size, requestHandler: { _ in
            // Not on main thread here!
            return image
        })
        nowPlayingInfoCenter.nowPlayingInfo = nowPlayingInfo
    }
}
I'm wondering if there is an alternative method to set the now playing artwork?
let debugString = "<speak><emphasis level=\"reduced\">Hello</emphasis></speak>"
let utterance = AVSpeechUtterance(ssmlRepresentation: debugString)! // <--- Freezes
I encountered this bug in the iOS 18 beta.
I sent feedback through the Feedback app.
I'm trying to secure my m3u8 streaming link with a token. To achieve this, I'm using AVAssetResourceLoaderDelegate in my SwiftUI app. However, the video doesn't play in AVPlayer when I'm using the AVAssetResourceLoaderDelegate. I can see that data is being received in the resourceLoader, but the player does not start playback.
Here's the code I'm using:
import SwiftUI
import AVKit
import UIPilot  // third-party navigation library used by this view

// Illustrative view name; the original declaration was not shown in the snippet.
struct LiveStreamPlayerView: View {
    @State private var player: AVPlayer?
    @EnvironmentObject var pilot: UIPilot<AppRoute>

    var body: some View {
        VStack {
            VerticalSpacer(height: 50)
            HStack {
                Image(systemName: "arrow.left")
                    .onTapGesture {
                        pilot.pop()
                    }
                Spacer()
                Text("liveStreamData.titleShort")
                    .font(.poppins(.semibold, size: 18))
                    .lineLimit(1)
                HorizontalSpacer(width: 16)
                Spacer()
            }
            .padding(.horizontal)

            if let player = player {
                VideoPlayer(player: player)
                    .onAppear {
                        player.play()
                    }
                    .onDisappear {
                        player.pause()
                    }
            } else {
                Text("Loading video...")
            }
        }
        .onAppear {
            setupPlayer()
        }
    }

    private func setupPlayer() {
        guard let url = URL(string: "https://assets.afcdn.com/video49/20210722/v_645516.m3u8") else {
            print("Invalid URL")
            return
        }
        // Replace the scheme with a custom scheme
        var components = URLComponents(url: url, resolvingAgainstBaseURL: false)
        components?.scheme = "customscheme" // Change the scheme to a custom one
        guard let customURL = components?.url else {
            print("Failed to create custom URL")
            return
        }
        let asset = AVURLAsset(url: customURL)
        // Set the resource loader delegate
        let resourceLoaderDelegate = VideoResourceLoaderDelegate()
        asset.resourceLoader.setDelegate(resourceLoaderDelegate, queue: DispatchQueue.main)
        let playerItem = AVPlayerItem(asset: asset)
        player = AVPlayer(playerItem: playerItem)
    }
}
class VideoResourceLoaderDelegate: NSObject, AVAssetResourceLoaderDelegate {
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader, shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        guard let url = loadingRequest.request.url else {
            print("Invalid request URL")
            return false
        }
        // Replace the custom scheme with the original HTTP/HTTPS scheme
        var components = URLComponents(url: url, resolvingAgainstBaseURL: false)
        components?.scheme = "https" // Change the scheme back to HTTP/HTTPS
        guard let originalURL = components?.url else {
            print("Failed to convert URL back to HTTPS")
            return false
        }
        // Fetch the data from the original URL
        let urlSession = URLSession.shared
        let task = urlSession.dataTask(with: originalURL) { data, response, error in
            if let error = error {
                print("Error loading resource: \(error)")
                loadingRequest.finishLoading(with: error)
                return
            }
            if let data = data, let dataRequest = loadingRequest.dataRequest {
                print("Data loaded: \(data.count) bytes")
                dataRequest.respond(with: data)
                loadingRequest.finishLoading()
            } else {
                print("No data received")
                loadingRequest.finishLoading(with: NSError(domain: "VideoResourceLoader", code: -1, userInfo: nil))
            }
        }
        task.resume()
        return true
    }

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader, didCancel loadingRequest: AVAssetResourceLoadingRequest) {
        print("Loading request was canceled")
    }
}
Problem:
The video does not play when using AVAssetResourceLoaderDelegate. The data is being loaded correctly as confirmed by the logs, but AVPlayer fails to start playback.
Without the resource loader, the video plays without any issues.
Question:
What could be causing the player to not play the video when using AVAssetResourceLoaderDelegate?
Are there any additional steps or configurations I need to ensure smooth playback while using a resource loader?
Any help would be greatly appreciated!
We appear to be experiencing a bug with the latest visionOS beta while attempting to play back a video with a transparent background in our app. In the previous beta, playback worked as expected and the transparent parts of the video were transparent. In the latest beta, the background appears black. The view we are using is a SwiftUI-wrapped AVPlayerViewController, and we have narrowed the bug down to occurring only when playback is presented in the embedded experience mode; when playback is done in the expanded experience, it behaves as expected.
This has only been visible on an actual device; we have been unable to replicate the behaviour in the simulator using the latest Xcode 16.0 beta (beta 5, 16A5221g).
Here is a sample project that shows off the bug.
I recently created a project, originally in Xcode 13.3 if I am not mistaken. Then I updated it to 13.4 and updated to macOS 15.6 as well.
Previously the project worked fine, but then I noticed that if I activate breakpoints, the project stops when I run it in Xcode; without breakpoints activated it works fine as before.
Also, when I build a .app of the project, the .app crashes exactly where the breakpoints previously stopped.
I am very confused about how to continue and would appreciate help from anyone regarding this issue. From what I can gather from the breakpoints and the crash report, it has something to do with UUID registration? AVFoundation?
I am so confused. Please help!
Thanks.
We have developed a custom tvOS player application using AVFoundation's AVPlayer. When we use the Siri command "What did they say?", playback skips backwards but subtitles temporarily stop working. Could anyone please suggest a solution for this issue? :)
Hi, I love the VideoMaterial API that gives so much power to play video on any mesh. But I am trying to play a side-by-side 3D video using VideoMaterial:
RealityView { content in
    let mesh = MeshResource.generatePlane(width: 300.0, height: 300.0, cornerRadius: 0) // generate mesh
    let vidMaterial = VideoMaterial(avPlayer: AVPlayer(url: URL(string: "https://someurl/test/master.m3u8")!)) // VideoMaterial
    vidMaterial.controller.preferredViewingMode = .stereo // <-- no idea why it doesn't work for SBS video in simulator
    vidMaterial.avPlayer?.play()
    let planeEntity = Entity() // new entity
    planeEntity.components.set(ModelComponent(mesh: mesh, materials: [vidMaterial])) // set a new ModelComponent to the entity
    content.add(planeEntity)
}
This code works well for plain 2D video playback, but how do I display a side-by-side or top-bottom 3D video?
I found GeometrySwitchCameraIndex in a custom ShaderGraphMaterial, but if I use the input node as an image texture, how do I pass the video frame as a texture into my custom shader to achieve the 3D effect? Or maybe there is an even better way to deal with this?
There seems to be an additional API, .preferredViewingMode on the VideoMaterial's controller, that can be set to .stereo, but it doesn't give any stereo effect. Perhaps it's only for MV-HEVC media playback?