Integrate video and other forms of moving visual media into your apps.

Posts under the Video tag

83 Posts
Each post below is followed by its reply count, boosts, views, and most recent activity.

How to change the videoGravity of VideoPlayer in SwiftUI
I have the new iOS 14 VideoPlayer:

    private let player = AVPlayer(url: Bundle.main.url(forResource: "TL20_06_Shoba3_4k", withExtension: "mp4")!)

    var body: some View {
        VideoPlayer(player: player)
            .aspectRatio(contentMode: .fill)
    ...

This setup cannot display 4:3 video on the 16:9 screen of a TV without black bars, and the aspectRatio modifier has no effect on VideoPlayer. How can I set the videoGravity of the underlying AVPlayerLayer to resizeAspectFill via the SwiftUI API? (One possible workaround is sketched after this entry.)
6
0
5.2k
Aug ’23
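SwiftUI's VideoPlayer exposes no videoGravity control, so a common workaround is to drop down to an AVPlayerLayer-backed UIView. A minimal sketch, with hypothetical type names (PlayerUIView, FillingVideoPlayer):

    import SwiftUI
    import UIKit
    import AVFoundation

    // Sketch, not a SwiftUI API: a UIView whose backing layer is an
    // AVPlayerLayer, so videoGravity can be set directly.
    final class PlayerUIView: UIView {
        override class var layerClass: AnyClass { AVPlayerLayer.self }
        var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }
    }

    struct FillingVideoPlayer: UIViewRepresentable {
        let player: AVPlayer

        func makeUIView(context: Context) -> PlayerUIView {
            let view = PlayerUIView()
            view.playerLayer.player = player
            view.playerLayer.videoGravity = .resizeAspectFill // crop 4:3 to fill 16:9
            return view
        }

        func updateUIView(_ uiView: PlayerUIView, context: Context) {}
    }

FillingVideoPlayer(player: player) can then stand in for VideoPlayer wherever cropping to fill is acceptable, at the cost of the built-in playback controls.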
Video Quality selection in HLS streams
Hello there, our team was asked to add the ability to manually select the video quality. I know that HLS is an adaptive stream and that, depending on network conditions, it chooses the best quality that fits the current situation. I tried some settings with preferredMaximumResolution and preferredPeakBitRate, but neither worked once the user was already watching the stream. I also tried replacing the current AVPlayerItem with a newly configured one, but that only allowed me to downgrade the quality; when I wanted to force 4K, the player did not switch to that track even with very high values for both parameters mentioned above. My question is whether there is any method that would allow me to force a certain quality from the manifest file. I already have an extraction step that parses the manifest and provides all the available information, but I still couldn't figure out how to make the player play a specific stream of my desired quality from the available playlist. (A sketch of the two usual approaches follows this entry.)
4
0
6.6k
Oct ’23
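AVFoundation only exposes caps, not variant pinning, so two approaches are commonly tried. A hedged sketch of both; the bandwidth and URL values are assumed to come from the poster's own manifest parser:

    import AVFoundation

    // 1) Cap adaptive selection at a given rung by setting preferredPeakBitRate
    //    slightly above that variant's BANDWIDTH from the master playlist.
    func capQuality(of player: AVPlayer, atBitrate bandwidth: Double) {
        player.currentItem?.preferredPeakBitRate = bandwidth
    }

    // 2) Unofficial workaround for *forcing* a rung: play the variant's own
    //    media-playlist URL directly instead of the master playlist.
    func forceVariant(on player: AVPlayer, mediaPlaylistURL: URL) {
        player.replaceCurrentItem(with: AVPlayerItem(url: mediaPlaylistURL))
    }

The second approach gives up automatic adaptation entirely, which is usually the accepted trade-off for a manual quality menu.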
tvOS: How to avoid fast-forwarding in AVPlayerViewController
Due to legal restrictions I need to prevent my app's users from skipping and fast-forwarding the content played by AVPlayerViewController. I use the playerViewController(_:willResumePlaybackAfterUserNavigatedFrom:to:) and playerViewController(_:timeToSeekAfterUserNavigatedFrom:to:) delegate methods to control the skipping behaviour. However, those delegate methods are only triggered for skip +/- 10, not for fast-forwarding/rewinding. Is there a way to prevent fast-forwarding in addition to skipping in AVPlayerViewController? (A possible alternative is sketched after this entry.) Here is an example of the code I use:

    class ViewController: UIViewController {
        override func viewDidAppear(_ animated: Bool) {
            super.viewDidAppear(animated)
            setUpPlayerViewController()
        }

        private func setUpPlayerViewController() {
            let playerViewController = AVPlayerViewController()
            playerViewController.delegate = self
            guard let url = URL(string: "https://devstreaming-cdn.apple.com/videos/streaming/examples/img_bipbop_adv_example_ts/master.m3u8") else {
                debugPrint("URL is not found")
                return
            }
            let playerItem = AVPlayerItem(url: url)
            let player = AVPlayer(playerItem: playerItem)
            playerViewController.player = player
            present(playerViewController, animated: true) {
                playerViewController.player?.play()
            }
        }
    }

    extension ViewController: AVPlayerViewControllerDelegate {
        public func playerViewController(_ playerViewController: AVPlayerViewController, willResumePlaybackAfterUserNavigatedFrom oldTime: CMTime, to targetTime: CMTime) {
            // Triggered on skip +/- 10, but not on fast-forwarding/rewinding
            print("playerViewController(_:willResumePlaybackAfterUserNavigatedFrom:to:)")
        }

        public func playerViewController(_ playerViewController: AVPlayerViewController, timeToSeekAfterUserNavigatedFrom oldTime: CMTime, to targetTime: CMTime) -> CMTime {
            // Triggered on skip +/- 10, but not on fast-forwarding/rewinding
            print("playerViewController(_:timeToSeekAfterUserNavigatedFrom:to:)")
            return targetTime
        }
    }
2
1
1.1k
Sep ’23
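If intercepting individual seeks proves impossible, AVPlayerViewController offers a blunter switch: requiresLinearPlayback disables skipping and scrubbing in the transport UI altogether. A minimal sketch:

    import AVKit

    let playerViewController = AVPlayerViewController()
    // Removes seek affordances entirely (skip and scrub), which may satisfy
    // the legal requirement at the cost of all user navigation.
    playerViewController.requiresLinearPlayback = true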
AVPlayer, AVAssetReader, AVAssetWriter - Using MXF files
Hi, I would like to read in .mxf files using AVPlayer and also AVAssetReader, and write out .mxf files using AVAssetWriter. Should this be possible? Are there any examples of how to do this? I found the VTRegisterProfessionalVideoWorkflowVideoDecoders() call, but it did not seem to help (see the sketch after this entry). I would be grateful for any suggestions. Regards, Tom
1
0
1.6k
Aug ’23
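For reference, a sketch of how that registration call is typically used, with a hypothetical file path. Note that it registers additional video decoders for pro workflows, which is a separate question from whether AVFoundation can demux the MXF container at all:

    import AVFoundation
    import VideoToolbox

    // Register the professional-workflow decoders before touching the asset.
    VTRegisterProfessionalVideoWorkflowVideoDecoders()

    let asset = AVURLAsset(url: URL(fileURLWithPath: "/path/to/clip.mxf")) // hypothetical path
    asset.loadValuesAsynchronously(forKeys: ["playable"]) {
        var error: NSError?
        let status = asset.statusOfValue(forKey: "playable", error: &error)
        // If this still reports not playable, the container itself is likely unsupported.
        print("playable:", status == .loaded && asset.isPlayable)
    }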
Unexpected errors when using AVPlayer in iOS 16
Hi, I'm trying to add a video to my first iOS app. From the tutorials I've read online, this seemed to be a simple process: create an AVPlayer, provide a URL to the video file, and use onAppear to start the video playing when the view is shown. Below is a simplified version of the code I'm using in my app:

    struct ContentView: View {
        let avPlayer = AVPlayer(url: Bundle.main.url(forResource: "Intro", withExtension: "mp4")!)

        var body: some View {
            VStack {
                VideoPlayer(player: avPlayer)
                    .onAppear {
                        avPlayer.play()
                    }
            }
        }
    }

When I run this code, the video plays, but when it finishes playing I receive the following errors in the Xcode output window:

    2023-01-27 11:56:39.530526+1100 TestVideo[29859:2475750] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
    2023-01-27 11:56:39.676462+1100 TestVideo[29859:2475835] [TextIdentificationService] Text LID dominant failure: lidInconclusive
    2023-01-27 11:56:39.676822+1100 TestVideo[29859:2475835] [VisualTranslationService] Visual isTranslatable: NO; not offering translation: lidInconclusive
    2023-01-27 11:56:40.569337+1100 TestVideo[29859:2476091] Metal API Validation Enabled

I have googled each of these error messages but have not been able to find any information explaining exactly what they mean or how to eliminate them. I am using Xcode 14.2 and testing on iOS 16.2. If anyone could point me in the right direction of how to understand and eliminate these errors, I'd really appreciate it. Thanks!
4
3
2.1k
Jul ’23
QuickTime Interactivity?
Is it still possible to author a QuickTime movie with hyperlinks? I'm building a website, and I know at one point you could author a QuickTime movie that supported links inside the video, either to other timestamps in the video or to other web pages. I don't want to use a custom player; I'd prefer to use the system-level one. I've seen a really amazing example of this on the mobile version of the Memory Alpha (Star Trek nerds!) website: a movie that plays at the top of pages and is fully interactive. Is that still supported? Is it possible to author that way? I'm not making anything insanely complicated, I just thought it would be a nice way to build a website with tools I'm more comfortable working in.
1
0
1k
Aug ’23
AVPlayer queries redirecting URL twice
Hello, consider this very simple example:

    guard let url = URL(string: "some_url") else { return }
    let player = AVPlayer(url: url)
    let controller = AVPlayerViewController()
    controller.player = player
    present(controller, animated: true) {
        player.play()
    }

When the video URL redirects (the server answers with a 302), AVPlayer's internal implementation queries it twice, which proxying confirms. I'm not sure I can share the actual links, hence the blurred screenshot, but it shows that the redirecting URL, which receives a 302 response, is queried twice, and only after the second attempt does the actual redirection take place. This behavior is problematic for our backend services and we need to remediate it somehow. Do you have any ideas on how to address this problem?
1
5
661
Jul ’23
MPNowPlayingInfoCenter not fully available if player playback controls disabled
If I disable playback controls for an AVPlayer (showsPlaybackControls = false), some features of MPNowPlayingInfoCenter stop working: play/pause, and skip forward and backward. I need custom video and audio controls on my AVPlayer in my app, which is why I disabled the built-in iOS playback controls, but I also need the features of MPNowPlayingInfoCenter. Is there another solution to achieve this? (A sketch of the usual approach follows this entry.)
1
0
786
Jul ’23
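A minimal sketch of the usual approach, assuming the standard MediaPlayer APIs: when the system controls are hidden, register handlers with MPRemoteCommandCenter and publish metadata to MPNowPlayingInfoCenter yourself. The title string here is hypothetical:

    import MediaPlayer
    import AVFoundation

    // Sketch: wire up remote commands manually when showsPlaybackControls is false.
    func configureRemoteCommands(for player: AVPlayer) {
        let center = MPRemoteCommandCenter.shared()
        _ = center.playCommand.addTarget { _ in player.play(); return .success }
        _ = center.pauseCommand.addTarget { _ in player.pause(); return .success }
        center.skipForwardCommand.preferredIntervals = [15]
        _ = center.skipForwardCommand.addTarget { _ in
            player.seek(to: player.currentTime() + CMTime(seconds: 15, preferredTimescale: 600))
            return .success
        }
        MPNowPlayingInfoCenter.default().nowPlayingInfo = [
            MPMediaItemPropertyTitle: "My Video" // hypothetical metadata
        ]
    }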
Video's sound becomes very low, then returns to normal, when recording with AVCaptureSession
Hi everyone, in my app I use AVCaptureSession to record video. I add a videoDeviceInput and an audioDeviceInput, and as output I use AVCaptureMovieFileOutput. On some iPhones, particularly the models after the iPhone X (iPhone 11, 12, 13, 14), the result has bad audio quality: the sound is very low, then after a few seconds (around 7) it becomes normal. I have tried setting the audio settings on the movie file output, but it still happens. Does anyone know how to solve this issue? (A configuration sketch follows this entry.)
0
0
364
Jul ’23
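A hedged sketch, not a confirmed fix: ramp-up effects like this are often tied to the capture session reconfiguring the audio session mid-recording, so one thing worth trying is taking over audio-session setup manually:

    import AVFoundation

    // Sketch: stop AVCaptureSession from managing the audio session itself,
    // then configure a stable category/mode before recording starts.
    func configureAudio(for captureSession: AVCaptureSession) throws {
        captureSession.automaticallyConfiguresApplicationAudioSession = false
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord, mode: .videoRecording, options: [.allowBluetooth])
        try session.setActive(true)
    }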
Seeking to specific frame index with AVAssetReader
We're using code based on AVAssetReader to get decoded video frames through AVFoundation. The decoding part per se works great, but the seeking just doesn't work reliably. For a given H.264 file (in a MOV container) the decoded frames sometimes have presentation time stamps that don't correspond to the actual decoded frames. For example: the decoded frame's PTS is 2002/24000, but the frame's actual PTS is 6006/24000. The frames have burnt-in timecode, so we can clearly tell. Here is our code:

    - (BOOL)setupAssetReaderForFrameIndex:(int32_t)frameIndex
    {
        NSError* theError = nil;
        NSDictionary* assetOptions = @{ AVURLAssetPreferPreciseDurationAndTimingKey: @YES };
        self.movieAsset = [[AVURLAsset alloc] initWithURL:self.filePath options:assetOptions];
        if (self.assetReader)
            [self.assetReader cancelReading];
        self.assetReader = [AVAssetReader assetReaderWithAsset:self.movieAsset error:&theError];
        NSArray<AVAssetTrack*>* videoTracks = [self.movieAsset tracksWithMediaType:AVMediaTypeVideo];
        if ([videoTracks count] == 0)
            return NO;
        self.videoTrack = [videoTracks objectAtIndex:0];
        [self retrieveMetadata];
        NSDictionary* outputSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey: @(self.cvPixelFormat) };
        self.videoTrackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:self.videoTrack outputSettings:outputSettings];
        self.videoTrackOutput.alwaysCopiesSampleData = NO;
        [self.assetReader addOutput:self.videoTrackOutput];
        CMTimeScale timeScale = self.videoTrack.naturalTimeScale;
        CMTimeValue frameDuration = (CMTimeValue)round((float)timeScale / self.videoTrack.nominalFrameRate);
        CMTimeValue startTimeValue = (CMTimeValue)frameIndex * frameDuration;
        CMTimeRange timeRange = CMTimeRangeMake(CMTimeMake(startTimeValue, timeScale), kCMTimePositiveInfinity);
        self.assetReader.timeRange = timeRange;
        [self.assetReader startReading];
        return YES;
    }

This is then followed by this code to actually decode the frame:

    CMSampleBufferRef sampleBuffer = [self.videoTrackOutput copyNextSampleBuffer];
    CVPixelBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (!imageBuffer) {
        CMSampleBufferInvalidate(sampleBuffer);
        AVAssetReaderStatus theStatus = self.assetReader.status;
        NSError* theError = self.assetReader.error;
        NSLog(@"[AVAssetVideoTrackOutput copyNextSampleBuffer] didn't deliver a frame - %@", theError);
        return false;
    }

Is this method by itself the correct way of seeking, and if not: what is the correct way? Thanks! (A decode-and-discard sketch follows this entry.)
0
0
556
Jul ’23
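One likely missing step, sketched below in Swift: with compressed video, a reader's timeRange can only begin decoding at a sync (key) frame, so the first buffers may carry PTSs earlier than the requested start, and index-times-nominal-duration math drifts when frame durations vary. The usual remedy is to decode and discard until the target PTS is reached (targetTime is assumed to be computed as in the post):

    import AVFoundation
    import CoreMedia

    // Sketch: skip decoded frames until the requested presentation time.
    func copyFrame(at targetTime: CMTime, from output: AVAssetReaderTrackOutput) -> CMSampleBuffer? {
        while let buffer = output.copyNextSampleBuffer() {
            let pts = CMSampleBufferGetPresentationTimeStamp(buffer)
            if pts >= targetTime {
                return buffer // first frame at/after the seek target
            }
            // Earlier frame decoded only because the reader started at a sync sample; drop it.
        }
        return nil
    }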
Progressively supply media data
I'm trying to use the resourceLoader of an AVAsset to progressively supply media data, but I'm unable to because the loading request asks for the full content (requestsAllDataToEndOfResource is true).

    class ResourceLoader: NSObject, AVAssetResourceLoaderDelegate {
        func resourceLoader(_ resourceLoader: AVAssetResourceLoader, shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
            if let ci = loadingRequest.contentInformationRequest {
                ci.contentType = // public.mpeg-4
                ci.contentLength = // GBs
                ci.isEntireLengthAvailableOnDemand = false
                ci.isByteRangeAccessSupported = true
            }
            if let dr = loadingRequest.dataRequest {
                if dr.requestedLength > 200_000_000 {
                    // memory pressure
                    // dr.requestsAllDataToEndOfResource is true
                }
            }
            return true
        }
    }

I also tried using a fragmented MP4 created with AVAssetWriter, but that didn't work either. Please let me know if it's possible for the AVAssetResourceLoader not to ask for the full content. (An incremental-response sketch follows this entry.)
1
0
612
Aug ’23
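One point worth sketching: even when requestsAllDataToEndOfResource is true, the delegate is not obliged to buffer everything at once; respond(with:) may be called repeatedly with partial chunks, and currentOffset tracks how far the request has been fed. A minimal sketch, assuming the media is readable from a local FileHandle:

    import AVFoundation

    // Sketch: feed a loading request incrementally instead of all at once.
    func supply(_ dataRequest: AVAssetResourceLoadingDataRequest,
                from file: FileHandle,
                loadingRequest: AVAssetResourceLoadingRequest) throws {
        let chunkSize = 2_000_000 // ~2 MB per response; an arbitrary choice
        try file.seek(toOffset: UInt64(dataRequest.currentOffset))
        if let chunk = try file.read(upToCount: chunkSize), !chunk.isEmpty {
            dataRequest.respond(with: chunk) // partial data is accepted
        } else {
            loadingRequest.finishLoading() // reached end of file
        }
    }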
Vision Pro Safari WebXR video playback not working
The Safari version for visionOS (or spatial computing) supports WebXR, as reported here. I am developing a web app that intends to leverage WebXR, so I've tested several code samples in the Safari browser of the Vision Pro simulator to understand the level of support for immersive web content. I am currently facing what looks like a bug: video playback stops working when entering an XR session (i.e. going into VR mode) in a 3D web environment (using Three.js or similar). The Immersive Web Community Group's Stereo Video example (https://immersive-web.github.io/webxr-samples/stereo-video.html) lets you easily replicate the issue; the code is available here. It's worth mentioning that video playback has been successfully tested on other VR platforms such as the Meta Quest 2. The issue has been reported in the following forums: https://discourse.threejs.org/t/videotexture-playback-html5-videoelement-apple-vision-pro-simulator-in-vr-mode-not-playing/53374 https://bugs.webkit.org/show_bug.cgi?id=260259
5
1
2k
Mar ’24
Black frames in resulting AVComposition.
I have multiple AVAssets that I am trying to merge into a single video track using AVMutableComposition. I iterate over my assets and insert them into a single AVMutableCompositionTrack like so:

    - (AVAsset *)combineAssets
    {
        // Create a mutable composition
        AVMutableComposition *composition = [AVMutableComposition composition];
        AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

        // Keep track of time offset
        CMTime currentOffset = kCMTimeZero;
        for (AVAsset *audioAsset in _audioSegments) {
            AVAssetTrack *audioTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
            // Add the audio track to the composition audio track
            NSError *audioError;
            [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
                                           ofTrack:audioTrack
                                            atTime:currentOffset
                                             error:&audioError];
            if (audioError) {
                NSLog(@"Error combining audio track: %@", audioError.localizedDescription);
                return nil;
            }
            currentOffset = CMTimeAdd(currentOffset, audioAsset.duration);
        }

        // Reset offset to do the same with videos.
        currentOffset = kCMTimeZero;
        for (AVAsset *videoAsset in _videoSegments) {
            // Get the video track
            AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
            // Add the video track to the composition video track
            NSError *videoError;
            [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoTrack.timeRange.duration)
                                           ofTrack:videoTrack
                                            atTime:currentOffset
                                             error:&videoError];
            if (videoError) {
                NSLog(@"Error combining video track: %@", videoError.localizedDescription);
                return nil;
            }
            // Increment current offset by track duration
            currentOffset = CMTimeAdd(currentOffset, videoTrack.timeRange.duration);
        }
        return composition;
    }

The issue is that when I export the composition using an AVAssetExportSession, I notice a black frame between the merged segments in the track. In other words, if two 30-second AVAssets were merged into the composition track to create a 60-second video, you would see a black frame for a split second at the 30-second mark where the two assets join. I don't really want to re-encode the assets, I just want to stitch them together. How can I fix the black frame issue? (A gap-diagnostic sketch follows this entry.)
1
0
449
Aug ’23
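A diagnostic sketch (in Swift, not a fix): black flashes at joins usually mean the composition track contains tiny empty segments, for example because a source asset's audio and video durations differ slightly, so the asset-duration offsets and the track-duration offsets disagree. Listing the track's segments shows whether gaps exist and where:

    import AVFoundation

    // Sketch: print any empty (gap) segments in a composition track.
    func logGaps(in track: AVCompositionTrack) {
        for segment in track.segments where segment.isEmpty {
            let range = segment.timeMapping.target
            print("empty segment at \(CMTimeGetSeconds(range.start))s, duration \(CMTimeGetSeconds(range.duration))s")
        }
    }

If gaps show up, computing offsets from the inserted tracks' own timeRange durations for both audio and video (rather than asset.duration for one and track duration for the other) is the first thing to try.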
Timestamps in AVPlayer
I want to show the user the actual start and end times of the video on the AVPlayer time slider, instead of the duration-based values. I would like to show something like 09:00:00 ... 12:00:00 (indicating that the video started at 09:00:00 CET and ended at 12:00:00 CET) instead of 00:00:00 ... 02:59:59. I would appreciate any pointers in this direction. (A sketch follows this entry.)
1
1
493
Sep ’23
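The system transport bar's labels can't be relabeled directly, so the usual route, sketched below, is custom controls: observe playback time and render wall-clock labels from a known start date (startDate here is an assumption, e.g. parsed from stream metadata):

    import AVFoundation
    import Foundation

    // Sketch: map the player's elapsed time onto real wall-clock times.
    let startDate = ISO8601DateFormatter().date(from: "2023-09-01T09:00:00+02:00")! // hypothetical start

    let formatter = DateFormatter()
    // formatter renders wall-clock labels such as "09:00:01"
    formatter.dateFormat = "HH:mm:ss"

    func observeClock(on player: AVPlayer) -> Any {
        let interval = CMTime(seconds: 1, preferredTimescale: 600)
        return player.addPeriodicTimeObserver(forInterval: interval, queue: .main) { time in
            let current = startDate.addingTimeInterval(time.seconds)
            print("slider label:", formatter.string(from: current)) // e.g. 09:00:01 ... 12:00:00
        }
    }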
WebRTC zoom/resize issue in Safari 17
I am learning to develop WebRTC apps and have noticed that, starting with Safari and Safari Mobile 17, there is a noticeable zoom distortion when resizing some WebRTC players. This seems to be Safari-specific and only on version 17. What feature change could cause this? Here is an example of Catalina vs. Sonoma; sorry, I don't have access to any versions in between at the moment, but I have only seen this issue since updating to Safari 17.
2
4
571
Oct ’23