Integrate video and other forms of moving visual media into your apps.

Posts under the Video tag

86 Posts
Sort by: Post · Replies · Boosts · Views · Activity

How can I get camera access in an iOS 12 app with embedded HTML?
I'm sure the app already has camera access, but in the embedded HTML I still cannot open the camera. The HTML page works in Safari, but not when it is embedded in the app. This is the error message: DOMException: undefined is not an object (evaluating 'navigator.mediaDevices.getUserMedia'). I also tried 'navigator.getUserMedia' and 'navigator.mediaDevices.enumerateDevices()'; none of these work.
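A likely explanation: getUserMedia is only exposed to web content inside WKWebView on iOS 14.3 and later, and was never available in UIWebView, so on iOS 12 navigator.mediaDevices is undefined in embedded HTML no matter what camera permission the app holds, which matches the error above. A minimal sketch of the configuration needed on supported OS versions:

```swift
import WebKit

// On iOS 14.3+, getUserMedia works inside WKWebView when the page is
// served over HTTPS and the app declares NSCameraUsageDescription.
let config = WKWebViewConfiguration()
config.allowsInlineMediaPlayback = true                // inline camera preview
config.mediaTypesRequiringUserActionForPlayback = []   // allow streams to start without a tap
let webView = WKWebView(frame: .zero, configuration: config)
```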
0
0
27
5h
Disable iOS Screen Mirroring for Apps
Hello Apple, I am concerned about the new iOS Screen Mirroring feature. I have an app that is only meant to be viewed on iPhones (not Macs or other computers) for security reasons. I am assuming that Screen Mirroring uses AirPlay underneath. Is there an API planned or coming that can disable this functionality, or is there a way for my app to opt out of iOS Screen Mirroring? Thanks.
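There is no public opt-out API that I'm aware of, but a common workaround is to detect capture and blank sensitive content. A minimal sketch using UIScreen.isCaptured, which is true while the screen is being recorded, mirrored, or sent over AirPlay:

```swift
import UIKit

final class CaptureGuard {
    private var observer: NSObjectProtocol?

    func start(onChange: @escaping (Bool) -> Void) {
        // isCaptured covers recording, AirPlay, and mirroring.
        onChange(UIScreen.main.isCaptured)
        observer = NotificationCenter.default.addObserver(
            forName: UIScreen.capturedDidChangeNotification,
            object: nil, queue: .main) { _ in
            onChange(UIScreen.main.isCaptured)   // hide/show sensitive views here
        }
    }
}
```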
1
0
164
1w
DestinationVideo -- MV-HEVC Files
In the code example provided, there is a Bool in the Video object to set a video as 3D:

```swift
/// A Boolean value that indicates whether the video contains 3D content.
let is3D: Bool
```

I have a hosted spatial video that I know works correctly in the AVP player. When I point the Videos.json file to this URL and set is3D=true, my 3D video doesn't show up and I get the following error:

```
iPVC/1-0 Received playback error: [Error Domain=AVFoundationErrorDomain Code=-11850 "Operation Stopped" UserInfo={NSLocalizedFailureReason=The server is not correctly configured., NSLocalizedDescription=Operation Stopped, NSUnderlyingError=0x30227c510 {Error Domain=CoreMediaErrorDomain Code=-12939 "byte range length mismatch - should be length 2 is length 2434" UserInfo={NSDescription=byte range length mismatch - should be length 2 is length 2434, NSURL=https: <omitted for post> }}}]
```

Can anyone tell me what might be going on? The error is telling me my server is not configured correctly. For context, I'm using Google Drive to deliver dynamic images/videos using:

https://drive.google.com/uc?export=download&id=<file ID>

The above works great for my images and 2D videos. Is there something I need to do specifically when delivering MV-HEVC videos?
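The underlying -12939 error says the player asked for a 2-byte range and got 2,434 bytes back, i.e. the host appears to be ignoring the HTTP Range header, which Google Drive's download endpoint does not reliably honor. A hedged sketch (the URL is a placeholder) to verify whether a host honors byte ranges:

```swift
import Foundation

// Request the first two bytes. A range-capable server answers
// 206 Partial Content with a 2-byte body; one that ignores Range
// (producing the "should be length 2 is length 2434" mismatch)
// returns 200 with the whole file.
var request = URLRequest(url: URL(string: "https://example.com/spatial.mov")!)
request.setValue("bytes=0-1", forHTTPHeaderField: "Range")
URLSession.shared.dataTask(with: request) { data, response, _ in
    if let http = response as? HTTPURLResponse {
        print("status:", http.statusCode)   // expect 206
        print("bytes:", data?.count ?? 0)   // expect 2
    }
}.resume()
```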
1
0
303
4w
Controlling spatial video from a floating window
I've created a fully immersive visionOS project and added a spatial video player in the ImmersiveView Swift file. I have a few buttons in a separate VideosView Swift file on a floating window, and I'd like to switch the video playing in ImmersiveView when I click a button in the VideosView file. The video player works great in ImmersiveView:

```swift
RealityView { content in
    if let videoEntity = try? await Entity(named: "Video", in: realityKitContentBundle) {
        guard let url = Bundle.main.url(forResource: "video1", withExtension: "mov") else {
            fatalError("Video was not found!")
        }
        let asset = AVURLAsset(url: url)
        let playerItem = AVPlayerItem(asset: asset)
        let player = AVPlayer()
        videoEntity.components[VideoPlayerComponent.self] = .init(avPlayer: player)
        content.add(videoEntity)
        player.replaceCurrentItem(with: playerItem)
        player.play()
    } else {
        print("file not found!")
    }
}
```

Buttons in the floating window from VideosView:

```swift
struct VideosView: View {
    var body: some View {
        VStack {
            Button(action: {}) { Text("video 1").font(.title) }
            Button(action: {}) { Text("video 2").font(.title) }
            Button(action: {}) { Text("video 3").font(.title) }
        }
    }
}
```

In general, how do I control the video player across views, and how do I replace the video when each button is selected? Any help/code/links would be greatly appreciated.
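One common pattern: hold the AVPlayer in a shared observable model injected into both scenes, so the buttons and the RealityView talk to the same player instance. A minimal sketch under that assumption (VideoModel and its members are hypothetical names):

```swift
import SwiftUI
import AVFoundation

@MainActor
final class VideoModel: ObservableObject {
    let player = AVPlayer()   // ImmersiveView attaches this to VideoPlayerComponent

    func play(resource: String) {
        guard let url = Bundle.main.url(forResource: resource, withExtension: "mov") else { return }
        player.replaceCurrentItem(with: AVPlayerItem(url: url))
        player.play()
    }
}

// In VideosView: @EnvironmentObject var model: VideoModel
//   Button(action: { model.play(resource: "video2") }) { Text("video 2") }
// Inject one instance at the App level with .environmentObject(model)
// on both the window group and the immersive space.
```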
1
0
320
May ’24
iOS to Android H.264 encoding issue
I'm trying to cast the screen from an iOS device to an Android device. I'm leveraging ReplayKit on iOS to capture the screen and VideoToolbox to compress the captured video data into H.264 format using CMSampleBuffers. Both iOS and Android are configured for H.264 compression and decompression. While screen casting works flawlessly within the same platform (iOS to iOS or Android to Android), I'm encountering an error ("not in avi mode") on the Android receiver when casting from iOS. My research suggests that the underlying container formats for H.264 might differ between iOS and Android. Data transmission over the TCP socket seems to be functioning correctly. My question is: is there a way to ensure a common container format for H.264 compression and decompression across iOS and Android platforms?

Here's a breakdown of the iOS sender details:

- Device: iPhone 13 mini running iOS 17
- Development environment: Xcode 15 with a minimum deployment target of iOS 16
- Screen capture: ReplayKit for capturing the screen and obtaining CMSampleBuffers
- Video compression: VideoToolbox for H.264 compression
- Compression properties:
  - kVTCompressionPropertyKey_ConstantBitRate: 6144000 (bitrate)
  - kVTCompressionPropertyKey_ProfileLevel: kVTProfileLevel_H264_Main_AutoLevel (profile and level)
  - kVTCompressionPropertyKey_MaxKeyFrameInterval: 60 (maximum keyframe interval)
  - kVTCompressionPropertyKey_RealTime: true (real-time encoding)
  - kVTCompressionPropertyKey_Quality: 1 (quality scale, where 1.0 is the maximum)
- NAL unit handling: custom header is added to NAL units

Android receiver details:

- Device: Redmi 7A running Android 10
- Video decoding: MediaCodec API for receiving and decoding the H.264 stream
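The usual cross-platform mismatch here is the H.264 bitstream format rather than a container: VideoToolbox emits AVCC samples (4-byte length-prefixed NAL units, with SPS/PPS kept in the format description), while MediaCodec decoders generally expect an Annex B elementary stream (0x00000001 start codes, SPS/PPS sent in-band). A hedged sketch of the sender-side conversion, assuming 4-byte length prefixes:

```swift
import CoreMedia
import Foundation

// Convert one encoded CMSampleBuffer from AVCC to Annex B.
// SPS/PPS must additionally be fetched from the format description
// (CMVideoFormatDescriptionGetH264ParameterSetAtIndex) and sent with
// start codes ahead of the first keyframe.
func annexBData(from sampleBuffer: CMSampleBuffer) -> Data? {
    guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return nil }
    var totalLength = 0
    var dataPointer: UnsafeMutablePointer<CChar>?
    guard CMBlockBufferGetDataPointer(blockBuffer, atOffset: 0,
                                      lengthAtOffsetOut: nil,
                                      totalLengthOut: &totalLength,
                                      dataPointerOut: &dataPointer) == noErr,
          let base = dataPointer else { return nil }

    let startCode: [UInt8] = [0x00, 0x00, 0x00, 0x01]
    var output = Data()
    var offset = 0
    while offset + 4 <= totalLength {
        // Read the 4-byte big-endian NAL length, then swap it for a start code.
        var nalLength: UInt32 = 0
        memcpy(&nalLength, base + offset, 4)
        nalLength = CFSwapInt32BigToHost(nalLength)
        output.append(contentsOf: startCode)
        output.append(Data(bytes: base + offset + 4, count: Int(nalLength)))
        offset += 4 + Int(nalLength)
    }
    return output
}
```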
0
0
284
May ’24
Question regarding the kVTVideoEncoderList_IsHardwareAccelerated flag
I am a bit confused about whether certain Video Toolbox (VT) encoders support hardware acceleration. When I query the list of VT encoders (VTCopyVideoEncoderList(nil, &encoderList)) on an iPhone 14 Pro, for the avc1 (AVC / H.264) and hvc1 (HEVC / H.265) encoders, the kVTVideoEncoderList_IsHardwareAccelerated flag is not there, which, based on the documentation in VTVideoEncoderList.h, means that the encoders do not support hardware acceleration: "optional. CFBoolean. If present and set to kCFBooleanTrue, indicates that the encoder is hardware accelerated." In fact, no encoders in this list return this flag as true, and most of them do not include the flag at all in their dictionaries. On the other hand, when I create a compression session using VTCompressionSessionCreate() and pass kVTVideoEncoderSpecification_EnableHardwareAcceleratedVideoEncoder as true in the encoder specification, then query kVTCompressionPropertyKey_UsingHardwareAcceleratedVideoEncoder, I get a CFBoolean value of true for both the H.264 and H.265 encoders. In fact, I get a true value (for both of the aforementioned encoders) even if I don't specify kVTVideoEncoderSpecification_EnableHardwareAcceleratedVideoEncoder when creating the compression session (note that this flag was introduced in iOS 17.4 ^1). So the question is: are those encoders actually hardware accelerated on my device, and if so, why isn't that reflected in the VTCopyVideoEncoderList() call?
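The query snippet referenced in the post didn't survive formatting; for reference, a sketch of what such a property check typically looks like (the session is assumed to come from an earlier VTCompressionSessionCreate call):

```swift
import VideoToolbox

// Ask an existing compression session whether it ended up on a
// hardware encoder.
func isUsingHardwareEncoder(_ session: VTCompressionSession) -> Bool {
    var value: CFTypeRef?
    let status = VTSessionCopyProperty(
        session,
        key: kVTCompressionPropertyKey_UsingHardwareAcceleratedVideoEncoder,
        allocator: kCFAllocatorDefault,
        valueOut: &value)
    return status == noErr && (value as? Bool ?? false)
}
```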
3
1
348
May ’24
System API read error with 12K panoramic images: visionOS does not support viewing 12K panoramic photos
```swift
extension Entity {
    func addPanoramicImage(for media: WRMedia) {
        let subscription = TextureResource.loadAsync(named: "image_20240425_201630").sink(
            receiveCompletion: {
                switch $0 {
                case .finished: break
                case .failure(let error): assertionFailure("\(error)")
                }
            },
            receiveValue: { [weak self] texture in
                guard let self = self else { return }
                var material = UnlitMaterial()
                material.color = .init(texture: .init(texture))
                self.components.set(ModelComponent(
                    mesh: .generateSphere(radius: 1E3),
                    materials: [material]
                ))
                self.scale *= .init(x: -1, y: 1, z: 1)
                self.transform.translation += SIMD3(0.0, -1, 0.0)
            }
        )
        components.set(Entity.WRSubscribeComponent(subscription: subscription))
    }

    func updateRotation(for media: WRMedia) {
        let angle = Angle.degrees(0.0)
        let rotation = simd_quatf(angle: Float(angle.radians), axis: SIMD3<Float>(0, 0.0, 0))
        self.transform.rotation = rotation
    }

    struct WRSubscribeComponent: Component {
        var subscription: AnyCancellable
    }
}
```

The failure case fires:

```
case .failure(let error): assertionFailure("\(error)")
Thread 1: Fatal error: Error Domain=MTKTextureLoaderErrorDomain Code=0 "Image decoding failed" UserInfo={NSLocalizedDescription=Image decoding failed, MTKTextureLoaderErrorKey=Image decoding failed}
```
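A possible cause, offered as an assumption: texture loading goes through MTKTextureLoader, and very large panoramas can fail to decode due to per-texture GPU limits (16384 pixels per side on recent Apple GPUs) or memory pressure. Checking the source dimensions without decoding the full bitmap, and downscaling if needed, is a cheap first step:

```swift
import ImageIO
import Foundation

// Read pixel dimensions from the file header only.
func imageSize(at url: URL) -> (width: Int, height: Int)? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
          let width = props[kCGImagePropertyPixelWidth] as? Int,
          let height = props[kCGImagePropertyPixelHeight] as? Int else { return nil }
    return (width, height)
}
// If the width exceeds the device's texture limit, downscale (e.g. to 8192)
// before handing the image to TextureResource.
```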
1
0
361
Apr ’24
WebRTC - user media - black video screen when URL changed
Sandbox link: https://codesandbox.io/p/sandbox/webrtc-ios-lasted-issue-jzx9h5

Issue: black video screen when the URL changes.

Steps to reproduce:

1. Get the source code from the sandbox repo above
2. Install packages with "npm install"
3. Start the local web app under HTTPS with "HTTPS=true npm start"
4. Update the URL by clicking the button "Update URL search param"

OS: iOS 17.4.1
Browser: Safari
Device: iPhone 11 Pro

Can anyone help?

Note: it works on an iPhone X running iOS 16.

Video of the issue: https://streamable.com/rj07u8
0
0
231
Apr ’24
WebRTC - user media - black video when URL changed on iOS 17.4.1
Issue: WebRTC user media shows a black video screen when the URL changes.

Steps to reproduce:

1. Get the source code from the sandbox repo above
2. Install packages with "npm install"
3. Start the local web app under HTTPS with "HTTPS=true npm start"
4. Update the URL by clicking the button "Update URL search param"

OS: iOS 17.4.1
Browser: Safari
Device: iPhone 11 Pro

Can anyone help?

Note: it works on an iPhone X running iOS 16.

Video of the issue: https://streamable.com/rj07u8
2
1
277
Apr ’24
Video player controls aren't showing for the video in immersive view
I have an AVPlayer() that loads the video and places it on the screen ModelEntity in the immersive view using VideoMaterial. This also makes the video untappable, as it is a VideoMaterial. Here's the code:

```swift
let screenModelEntity = model.garageScreenEntity as! ModelEntity
let modelEntityMesh = screenModelEntity.model!.mesh
let url = Bundle.main.url(forResource: "<URL>", withExtension: "mp4")!
let asset = AVURLAsset(url: url)
let playerItem = AVPlayerItem(asset: asset)
let player = AVPlayer()
let material = VideoMaterial(avPlayer: player)
screenModelEntity.components[ModelComponent.self] = .init(mesh: modelEntityMesh, materials: [material])
player.replaceCurrentItem(with: playerItem)
return player
```

I was able to load and play the video. However, I cannot figure out how to show the player controls (AVPlayerViewController) to the user, similar to the DestinationVideo sample app. How can I add the video player controls in this case?
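VideoMaterial renders frames only and has no built-in transport UI. One approach, sketched under the assumption that a separate window drives the same AVPlayer: present AVKit's SwiftUI VideoPlayer bound to that player, so scrubbing in the window controls the material's playback.

```swift
import SwiftUI
import AVKit

// A floating controls window that shares the AVPlayer used by the
// VideoMaterial; the system view supplies play/pause/scrub controls.
struct PlayerControlsView: View {
    let player: AVPlayer   // same instance passed to VideoMaterial(avPlayer:)

    var body: some View {
        VideoPlayer(player: player)
            .frame(minWidth: 400, minHeight: 225)
    }
}
```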
0
0
421
Apr ’24
iOS 17+ Safari video size rendering compatibility issue
```html
<div class="container" style="background-size: contain; user-select: none; pointer-events: none; height: 787.5px; width: 1400px;">
  <div class="container__header">header</div>
  <span>
    <div class="video-container" style="inset: 17.853% 68% 11.747% 1%; z-index: 2; opacity: 1;">
      <div class="video-container__placeholder-image">image</div>
      <div class="video-container__content">
        <div class="some-info"></div>
        <div class="video-canvas"></div>
        <div class="other-info"></div>
      </div>
    </div>
    <div class="video-container" style="inset: 17.853% 1% 11.747% 33%; z-index: 1; opacity: 1;">
      <div class="video-container__placeholder-image">image</div>
      <div class="video-container__content">
        <div class="video-canvas">
          <div class="player" style="width: 100%; height: 100%; position: relative; overflow: hidden; background-color: black;">
            <video playsinline="" muted="" style="object-fit: cover; width: 100%; height: 100%; position: absolute; left: 0px; top: 0px;"></video>
          </div>
        </div>
      </div>
    </div>
  </span>
</div>
```

(Screenshot of the page omitted.) Then the HTML changes as follows:

```html
<div class="container" style="background-size: contain; user-select: none; pointer-events: none; height: 787.5px; width: 1400px;">
  <div class="container__header">header</div>
  <span>
    <div class="video-container" style="inset: 100% 100% 0% 0%; z-index: 2; opacity: 0;">
      <div class="video-container__placeholder-image">image</div>
      <div class="video-container__content">
        <div class="some-info"></div>
        <div class="video-canvas"></div>
        <div class="other-info"></div>
      </div>
    </div>
    <div class="video-container" style="inset: 6.106% 5.98719% 0%; z-index: 3; opacity: 1;">
      <div class="video-container__placeholder-image">image</div>
      <div class="video-container__content">
        <div class="video-canvas">
          <div class="player" style="width: 100%; height: 100%; position: relative; overflow: hidden; background-color: black;">
            <video playsinline="" muted="" style="object-fit: cover; width: 100%; height: 100%; position: absolute; left: 0px; top: 0px;"></video>
          </div>
        </div>
      </div>
    </div>
  </span>
</div>
```

According to the Mac developer tools, the width of the video is 1400px, but on iOS 17+ (tested on iOS 17.1 and 17.3.1) it renders at the same size as before. (Screenshots of the expected and actual results omitted.) I tried the same steps on iOS 14.6 and 16.4 and they worked as expected; this problem seems to exist only on iOS 17+. Please help me resolve this problem. Thanks.
0
0
442
Mar ’24
AVAssetDurationDidChange notification not received on iOS 17
We use AVFragmentedAssetMinder to refresh the player data, and notifications for AVAssetDurationDidChange were consistently received whenever the asset duration changed. However, since the release of iOS 17, AVAssetDurationDidChange notifications are no longer received. Could anyone advise why this notification is not being triggered, and what we have to change?

```swift
NotificationCenter.default.addObserver(self, selector: #selector(self.onVideoUpdate), name: .AVAssetDurationDidChange, object: nil)
```

#AVPlayer #AVMutableMovie
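For comparison, the documented setup that should produce this notification: the asset must be an AVFragmentedAsset (or AVFragmentedMovie) that is actively minded. A sketch with placeholder names; on iOS 17 it may also be worth passing the specific asset as the observer's object rather than nil:

```swift
import AVFoundation

// Minding a fragmented asset so duration changes are detected and posted.
let streamURL = URL(fileURLWithPath: "/path/to/fragmented.mp4")   // placeholder
let asset = AVFragmentedAsset(url: streamURL)
let minder = AVFragmentedAssetMinder(asset: asset, mindingInterval: 1.0)

NotificationCenter.default.addObserver(
    forName: .AVAssetDurationDidChange,
    object: asset,                        // observe this asset specifically
    queue: .main) { _ in
    print("duration is now:", asset.duration.seconds)
}
```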
0
0
355
Mar ’24
Implementing FairPlayStreaming and getting error code -42671
Hi guys, I'm implementing FairPlay support for a video streaming application. I've managed to get as far as generating the SPC and acquiring a license from the license server. However, when it comes to parsing the license (CKC) returned from the server, the FPS module returns error code -42671. Has anyone else faced this before and/or knows what the fix is? I thought passing it the license should be enough, unless additional data is required?
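For reference, a sketch of the hand-off as it looks with AVContentKeySession, in case the CKC is being delivered differently; the license-server call is hypothetical. A common pitfall is returning the server's full response (e.g. JSON- or base64-wrapped) instead of the raw CKC bytes:

```swift
import AVFoundation

final class KeyDelegate: NSObject, AVContentKeySessionDelegate {
    let certificateData: Data   // FPS application certificate (assumed loaded)
    init(certificate: Data) { self.certificateData = certificate }

    func contentKeySession(_ session: AVContentKeySession,
                           didProvide keyRequest: AVContentKeyRequest) {
        let assetID = Data("skd://asset-id".utf8)   // placeholder identifier
        keyRequest.makeStreamingContentKeyRequestData(
            forApp: certificateData,
            contentIdentifier: assetID,
            options: nil) { spcData, _ in
            guard let spc = spcData else { return }
            fetchCKC(spc) { ckc in
                // Must be the bare CKC payload, not a wrapped server response.
                let response = AVContentKeyResponse(fairPlayStreamingKeyResponseData: ckc)
                keyRequest.processContentKeyResponse(response)
            }
        }
    }
}

// Hypothetical helper: POST the SPC to the license server, return the CKC.
func fetchCKC(_ spc: Data, completion: @escaping (Data) -> Void) {
    // Networking omitted in this sketch.
}
```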
0
0
488
Mar ’24
Objective-C implementation of a Spatial Video (MV-HEVC) maker
Hi everyone, I need to add a spatial video maker to my app, which was written in Objective-C. I found some reference code in Swift; can you help me convert the code to Objective-C?

```swift
let left = CMTaggedBuffer(
    tags: [.stereoView(.leftEye), .videoLayerID(leftEyeLayerIndex)],
    pixelBuffer: leftEyeBuffer)
let right = CMTaggedBuffer(
    tags: [.stereoView(.rightEye), .videoLayerID(rightEyeLayerIndex)],
    pixelBuffer: rightEyeBuffer)
let result = adaptor.appendTaggedBuffers(
    [left, right],
    withPresentationTime: leftPresentationTs)
```
2
0
521
Mar ’24
MV-HEVC with alpha channel
Does the new MV-HEVC Vision Pro spatial video format support having an alpha channel? I've tried converting a side-by-side video with an alpha channel using this Apple example project, but the alpha channel is removed: https://developer.apple.com/documentation/avfoundation/media_reading_and_writing/converting_side-by-side_3d_video_to_multiview_hevc
2
0
568
Mar ’24
Videos play in low quality when projected on a sphere using VideoMaterial
Is there any way to play panoramic or 360° videos in an immersive space, without using VideoMaterial on a sphere? I've tried local videos in 4K and 8K quality, and all of them look pixelated with this approach. I tried both the simulator and the real device, and I can never get high-quality playback. If the video is played in a regular 2D player, on the other hand, it shows the expected quality.
0
0
406
Feb ’24