Hi guys, I'm implementing FairPlay support for a video streaming application. I've managed to get as far as generating the SPC and acquiring a license from the license server. However, when it comes to parsing the license (CKC) returned from the server, the FPS module returns error code -42671. Has anyone else faced this before and/or knows what the fix is? I thought passing it the license should be enough, unless additional data is required?
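Not an authoritative fix, but a commonly reported cause of -42671 is handing the FPS module something other than the raw binary CKC, for example a base64 string or a JSON wrapper returned by the license server. A minimal sketch, assuming an AVContentKeySession flow and a server that may base64-encode the CKC (handleLicenseResponse is a hypothetical helper):

```swift
import AVFoundation

// Hypothetical license-response handling: processContentKeyResponse(_:)
// expects the raw binary CKC bytes, but many FPS license servers return
// the CKC base64-encoded (or wrapped in JSON, which you'd unwrap first).
func handleLicenseResponse(_ serverData: Data, for keyRequest: AVContentKeyRequest) {
    let ckcData: Data
    if let base64String = String(data: serverData, encoding: .utf8),
       let decoded = Data(base64Encoded: base64String) {
        ckcData = decoded          // server sent base64: unwrap before use
    } else {
        ckcData = serverData       // already raw CKC bytes
    }
    let response = AVContentKeyResponse(fairPlayStreamingKeyResponseData: ckcData)
    keyRequest.processContentKeyResponse(response)
}
```

If the server wraps the CKC in JSON, you would pull the relevant field out before the base64 decode; the exact envelope is server-specific.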
Video
Integrate video and other forms of moving visual media into your apps.
Posts under Video tag: 83 posts
How can I update the cookies of a previously set m3u8 video in AVPlayer without creating a new AVURLAsset and replacing the AVPlayer's current item with it?
Does the new MV-HEVC Vision Pro spatial video format support having an alpha channel? I've tried converting a side-by-side video with the alpha channel enabled using this Apple example project, but the alpha channel is being removed.
https://developer.apple.com/documentation/avfoundation/media_reading_and_writing/converting_side-by-side_3d_video_to_multiview_hevc
The Safari version for VisionOS (or spatial computing) supports WebXR, as reported here.
I am developing a web app that intends to leverage WebXR, so I've tested several code samples in the Safari browser of the Vision Pro Simulator to understand the level of support for immersive web content.
I am currently facing an issue that seems like a bug where video playback stops working when entering an XR session (i.e. going into VR mode) on a 3D web environment (using ThreeJS or similar).
There's an example from the Immersive Web Community Group called Stereo Video (https://immersive-web.github.io/webxr-samples/stereo-video.html) that lets you easily replicate the issue, the code is available here.
It's worth mentioning that video playback has been successfully tested on other VR platforms such as the Meta Quest 2.
The issue has been reported in the following forums:
https://discourse.threejs.org/t/videotexture-playback-html5-videoelement-apple-vision-pro-simulator-in-vr-mode-not-playing/53374
https://bugs.webkit.org/show_bug.cgi?id=260259
Hi everyone, I need to add a spatial video maker to my app, which was written in Objective-C. I found some reference code in Swift; can you help me convert the code to Objective-C?
let left = CMTaggedBuffer(
    tags: [.stereoView(.leftEye), .videoLayerID(leftEyeLayerIndex)],
    pixelBuffer: leftEyeBuffer)
let right = CMTaggedBuffer(
    tags: [.stereoView(.rightEye), .videoLayerID(rightEyeLayerIndex)],
    pixelBuffer: rightEyeBuffer)
let result = adaptor.appendTaggedBuffers(
    [left, right], withPresentationTime: leftPresentationTs)
Is there any way to play panoramic or 360 videos in an immersive space, without using VideoMaterial on a sphere?
I've tried using local videos with 4k and 8k quality and all of them look pixelated using this approach.
I tried both simulator as well as the real device, and I can't ever get a high-quality playback.
If the video is played on a regular 2D player, on the other hand, it shows the expected quality.
I want to get spatial videos in HEVC format, but after sharing to the share extension, I found that the video was automatically transcoded to AVC format.
Safari 14.3 can autoplay the video, but version 15 and above requires user interaction before playback. I don't want the user to have to interact; what should I do?
This is my h5 code:
<video id="myVideo" src="xxxapp://***.***.xx/***/***.mp4" style="object-fit:cover;opacity:1;width:100%;height:100%;display:block;position:absolute;" type="video/mp4"></video>
I want to load local large video, so, I use WKURLSchemeHandler.
- (void)webView:(WKWebView *)webView startURLSchemeTask:(id<WKURLSchemeTask>)urlSchemeTask {
    NSURLRequest *request = [urlSchemeTask request];
    NSURL *url = request.URL;
    NSString *videoPath = [[NSBundle mainBundle] pathForResource:@"***" ofType:@"mp4"];
    // NSDataReadingOptions is an integer bitmask; pass 0, not nil
    NSData *videoData = [NSData dataWithContentsOfFile:videoPath options:0 error:nil];
    NSURLResponse *response = [[NSURLResponse alloc] initWithURL:url MIMEType:@"video/mp4" expectedContentLength:videoData.length textEncodingName:nil];
    [urlSchemeTask didReceiveResponse:response];
    [urlSchemeTask didReceiveData:videoData];
    [urlSchemeTask didFinish];
}
But it does not work: the data is not nil, but the video does not play.
I would greatly appreciate it if someone could help me find a solution!
ps: can make it, but we cannot use it due to some reasons.
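One likely culprit (an assumption, not a confirmed diagnosis): WebKit's media loader issues HTTP Range requests, and answering them with a plain NSURLResponse plus the whole file often leaves the video element stalled even though data arrived. A sketch of a range-aware handler, written here in Swift for brevity (VideoSchemeHandler and the movie.mp4 resource name are hypothetical):

```swift
import WebKit
import Foundation

// Sketch of a WKURLSchemeHandler that honors HTTP Range requests with a
// 206 Partial Content response, which WebKit's media loader expects.
final class VideoSchemeHandler: NSObject, WKURLSchemeHandler {
    func webView(_ webView: WKWebView, start task: WKURLSchemeTask) {
        guard let url = task.request.url,
              let path = Bundle.main.path(forResource: "movie", ofType: "mp4"), // hypothetical file
              let data = try? Data(contentsOf: URL(fileURLWithPath: path)) else {
            task.didFailWithError(URLError(.fileDoesNotExist))
            return
        }
        var status = 200
        var body = data
        var headers = ["Content-Type": "video/mp4",
                       "Accept-Ranges": "bytes",
                       "Content-Length": "\(data.count)"]
        // Parse a simple "bytes=start-end" Range header if present.
        if let range = task.request.value(forHTTPHeaderField: "Range"),
           range.hasPrefix("bytes=") {
            let parts = range.dropFirst(6).split(separator: "-", omittingEmptySubsequences: false)
            let start = Int(parts[0]) ?? 0
            let end = parts.count > 1 ? Int(parts[1]) ?? (data.count - 1) : data.count - 1
            body = data.subdata(in: start..<(end + 1))
            status = 206
            headers["Content-Range"] = "bytes \(start)-\(end)/\(data.count)"
            headers["Content-Length"] = "\(body.count)"
        }
        let response = HTTPURLResponse(url: url, statusCode: status,
                                       httpVersion: "HTTP/1.1", headerFields: headers)!
        task.didReceive(response)
        task.didReceive(body)
        task.didFinish()
    }
    func webView(_ webView: WKWebView, stop task: WKURLSchemeTask) {}
}
```

Separately, Safari 15 and later only autoplays media that is both muted and playsinline, so the autoplay change is expected behavior rather than a bug.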
Hey devs!
I recently started a project, a macOS app, which is like a remote-desktop app but on the local network only.
For this I wanted to use the MultipeerConnectivity framework, and it's my first time using it. I have already done the device-discovery side, which is working well, as it is the easier part. Now I need someone who knows how OutputStream and InputStream work in MultipeerConnectivity (I couldn't find any documentation about this) to explain whether it's a good choice for my needs: it has to be low latency and high resolution. I have also seen other frameworks such as WebRTC, which I could combine with a local WebSocket server, but as I'm new to live video streaming and don't know anyone experienced with it, I wanted to ask here for your advice.
Thank you in advance, TR-MZ (just an unknown indie dev).
Hey, all!
I've been trying to upload a video preview to the Apple Vision Pro storefront for our app, but some of the export requirements seem to contradict one another.
For the AVP, a resolution of 4K is needed, which would require H.264 level 5.2.
Yet the H.264 level can't be any higher than 4, which is 1080p.
It seems like a catch-22 where either the H.264 level will be too high or the resolution will be too low.
Does anyone have a fix or a way around this issue?
Does Video Toolbox's compression session yield data I can decompress on a different device that doesn't have Apple's decompressor, i.e. so I can send the data over the network to devices that aren't necessarily Apple's?
Or is the format proprietary rather than just regular H.264 (for example)?
If I can decompress without Video Toolbox, may I have a reference to some examples of how to do this using cross-platform APIs? Maybe FFmpeg has something?
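For what it's worth, a VTCompressionSession configured for H.264 produces standard H.264, not a proprietary format; the catch is that the samples come out AVCC-style (length-prefixed NAL units) with the SPS/PPS kept out-of-band in the format description, while most cross-platform decoders (FFmpeg, for example) expect an Annex-B stream with start codes. A sketch, assuming you have a CMSampleBuffer from the compression callback (annexBParameterSets is a hypothetical helper):

```swift
import CoreMedia

// Sketch: pull the H.264 SPS/PPS out of a sample buffer's format description
// so they can be sent ahead of an Annex-B stream to a non-Apple decoder.
func annexBParameterSets(from sampleBuffer: CMSampleBuffer) -> Data? {
    guard let desc = CMSampleBufferGetFormatDescription(sampleBuffer) else { return nil }
    var annexB = Data()
    let startCode: [UInt8] = [0, 0, 0, 1]
    var count = 0
    // First call only asks how many parameter sets exist.
    CMVideoFormatDescriptionGetH264ParameterSetAtIndex(
        desc, parameterSetIndex: 0, parameterSetPointerOut: nil,
        parameterSetSizeOut: nil, parameterSetCountOut: &count,
        nalUnitHeaderLengthOut: nil)
    for index in 0..<count {
        var pointer: UnsafePointer<UInt8>?
        var size = 0
        CMVideoFormatDescriptionGetH264ParameterSetAtIndex(
            desc, parameterSetIndex: index, parameterSetPointerOut: &pointer,
            parameterSetSizeOut: &size, parameterSetCountOut: nil,
            nalUnitHeaderLengthOut: nil)
        if let pointer {
            annexB.append(contentsOf: startCode)   // Annex-B start code
            annexB.append(pointer, count: size)    // SPS or PPS bytes
        }
    }
    return annexB
}
```

You would prepend this data, and convert each sample's 4-byte length prefixes to 00 00 00 01 start codes, before feeding the stream to a non-Apple decoder.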
When I try to play video on my Apple Vision Pro simulator using a custom view with an AVPlayerLayer (as seen in my below VideoPlayerView), nothing displays but a black screen while the audio for the video I'm trying to play plays in the background. I've tried everything I can think of to resolve this issue, but to no avail.
import SwiftUI
import AVFoundation
import AVKit

struct VideoPlayerView: UIViewRepresentable {
    var player: AVPlayer

    func makeUIView(context: Context) -> UIView {
        let view = UIView(frame: .zero)
        let playerLayer = AVPlayerLayer(player: player)
        playerLayer.videoGravity = .resizeAspect
        view.layer.addSublayer(playerLayer)
        return view
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        if let layer = uiView.layer.sublayers?.first as? AVPlayerLayer {
            layer.frame = uiView.bounds
        }
    }
}
I have noticed, however, that if I use the default VideoPlayer (as demonstrated below) instead of my custom VideoPlayerView, the video displays just fine; but any modifiers I apply to that VideoPlayer (like the ones in my custom struct above) cause the video to display black while the audio plays in the background.
import SwiftUI
import AVKit

struct MyView: View {
    var player: AVPlayer

    var body: some View {
        ZStack {
            VideoPlayer(player: player)
        }
    }
}
Does anyone know a solution to this problem to make it so that video is able to display properly and not just appear as a black screen with audio playing in the background?
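Not a confirmed fix, but one thing worth trying: an AVPlayerLayer added as a sublayer starts with a .zero frame and is not resized automatically, so making it the view's backing layer via layerClass keeps it permanently sized to the view. A sketch under that assumption (PlayerUIView and LayerBackedPlayerView are hypothetical names):

```swift
import SwiftUI
import AVKit

// Sketch of a workaround: make AVPlayerLayer the view's backing layer so it
// always tracks the view's bounds instead of staying at a stale frame.
final class PlayerUIView: UIView {
    override static var layerClass: AnyClass { AVPlayerLayer.self }
    var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }
}

struct LayerBackedPlayerView: UIViewRepresentable {
    var player: AVPlayer

    func makeUIView(context: Context) -> PlayerUIView {
        let view = PlayerUIView(frame: .zero)
        view.playerLayer.player = player
        view.playerLayer.videoGravity = .resizeAspect
        return view
    }

    func updateUIView(_ uiView: PlayerUIView, context: Context) {}
}
```

With the backing-layer approach there is no need to reset the layer frame in updateUIView, since the layer resizes with the view.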
If an iPhone user is recording vertical video, it would be a great feature if the iPhone also recorded what is left and right of the visual, in the black-bar area. This eliminates the problem of having to use poor-looking ghost pictures in the left and right bars when cutting a movie in landscape format.
It would be nice if this feature were switchable as an option, using flags for how the video was recorded and played back. Think about it: most people all over the world record vertically not because it's cool, but because it's the best way to hold the device.
Any suggestions?
Thomas N.- Hamburg/Germany
Hello! I'm trying to save videos asynchronously. I've already used performChanges without the completionHandler, but it didn't work. Can you give me an example? Consider that the variable with the file URL is named fileURL. What would this look like asynchronously?
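A minimal sketch of the async route, assuming iOS 15 or later (where performChanges has an async/await variant) and that the app already has photo-library add permission; saveVideoAsync is a hypothetical wrapper:

```swift
import Photos

// Sketch: save a movie file to the photo library using the async/await
// variant of performChanges. `fileURL` is the URL of the movie on disk.
func saveVideoAsync(fileURL: URL) async throws {
    try await PHPhotoLibrary.shared().performChanges {
        PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: fileURL)
    }
}
```

Call it with `try await saveVideoAsync(fileURL: fileURL)` from an async context; the call throws if the library change fails, so there is no completion handler to manage.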
I'm using the AVFoundation Swift APIs to record video (CMSampleBuffers) and audio (CMSampleBuffers) to a file using AVAssetWriter.
Initializing the AVAssetWriter happens quite quickly, but calling assetWriter.startWriting() fully blocks the entire application AND ALL THREADS for 3 seconds. This only happens in Debug builds, not in Release.
Since it blocks all threads and only happens in Debug, I'm led to believe that this is an Xcode/Debugger/LLDB hang issue that I'm seeing.
Does anyone experience something similar?
Here’s how I set all of that up: startRecording(...)
And here’s the line that makes it hang for 3+ seconds: assetWriter.startWriting(...)
Since iOS 17.2, the video player in Safari becomes black if I jump forward in an HLS video stream; I only hear the sound of the video. If I close the full screen and reopen it, the video continues normally.
I checked if the source meets all the requirements mentioned here and it does.
Does anybody have the same issue or maybe a solution for this problem?
Hello. I have three questions about the Sensitive Content Analysis (SCA) framework:
SCA seems to be asynchronous. Is there a limit to how much a single app can send through it at a time?
For video analysis, can the video be broken into smaller chunks, and then all chunks be hit concurrently?
Can a video stream be sampled as it's being streamed? e.g. Maybe it samples one frame every 3 seconds and scans those?
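On the third question, nothing here is authoritative, but you can approximate stream sampling yourself: grab a frame every few seconds with AVAssetImageGenerator and hand each CGImage to SCSensitivityAnalyzer. A sketch under that assumption (videoLooksSensitive is a hypothetical helper, and it assumes the video is reachable as an AVURLAsset rather than a raw byte stream):

```swift
import AVFoundation
import SensitiveContentAnalysis

// Sketch: sample one frame every 3 seconds and analyze each with SCA,
// bailing out as soon as any sampled frame is flagged sensitive.
func videoLooksSensitive(url: URL) async throws -> Bool {
    let asset = AVURLAsset(url: url)
    let duration = try await asset.load(.duration).seconds
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    let analyzer = SCSensitivityAnalyzer()
    for seconds in stride(from: 0.0, to: duration, by: 3.0) {
        let time = CMTime(seconds: seconds, preferredTimescale: 600)
        let frame = try generator.copyCGImage(at: time, actualTime: nil)
        let analysis = try await analyzer.analyzeImage(frame)
        if analysis.isSensitive { return true }
    }
    return false
}
```

Each analyzeImage call is awaited here, but you could also run the sampled frames concurrently in a task group if throughput matters; whether the framework throttles concurrent requests is not documented in this thread.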
Thanks.