I recently bought an Insta360 Flow gimbal. When I record video with the Insta360 Flow app, I cannot see the location in the Apple Photos app or any other Apple app. However, I can see the location in the Windows Photos app once I download the videos to my Windows PC. The location is also visible in an Android app once I share the video through my Google account.
With an EXIF app, I can see the location metadata in the EXIF table as well, but again it is not shown as a location.
exiftool on my PC can also see the metadata, including the location, as in the attached screenshot.
Compared to video shot with the built-in Camera app, I cannot find any difference in terms of location metadata.
What could be wrong? I contacted Insta360 app support, but they do not seem to understand what's going on and just keep asking very basic questions like "Did you enable GPS location access?" and "Are you shooting video?"
I also contacted Apple support; they just say it's a third-party issue and refuse to help further. If it's really a third-party issue, how come the location data is actually embedded as metadata, and a Windows PC and an Android device can see the location? BTW, I AirDropped this video to all my Apple devices (iPhone 15 Ultra, iPad Air, and a very old iPhone), and none of them can see the location.
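One thing worth checking: Apple's apps generally read a video's location from the QuickTime metadata key com.apple.quicktime.location.ISO6709 in the movie file itself, while Windows and Android readers also accept EXIF/XMP-style tags. A minimal sketch (the file path "gimbal.mov" is a placeholder) that checks whether that specific key is present:

```swift
import AVFoundation

// Hedged sketch: Apple apps typically look for the QuickTime metadata key
// "com.apple.quicktime.location.ISO6709". "gimbal.mov" is a placeholder path.
let asset = AVURLAsset(url: URL(fileURLWithPath: "gimbal.mov"))
let metadata = try await asset.load(.metadata)
for item in metadata where item.identifier == .quickTimeMetadataLocationISO6709 {
    // An ISO 6709 string looks like "+37.7749-122.4194+000.000/"
    print("QuickTime location:", try await item.load(.stringValue) ?? "nil")
}
```

If this prints nothing for the gimbal footage but does print a value for Camera-app footage, that would suggest the Insta360 app writes GPS data only as EXIF/XMP, which would explain why Windows and Android show the location while Apple apps do not.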
Video
Integrate video and other forms of moving visual media into your apps.
Posts under Video tag
83 Posts
The same H.265 FairPlay-encrypted content can be played on all Apple devices except the A1625.
Clear (unencrypted) H.265 content does play on the A1625.
The question is: will this model (A1625) support H.265 FairPlay-encrypted content?
A ticket was created here:
https://discussions.apple.com/thread/255658006?sortBy=best
I can confirm that the app already has camera access, but in the embedded HTML I still cannot open the camera. The HTML page works in Safari, but not when it is embedded in the app.
Here is the error message:
DOMException: undefined is not an object (evaluating 'navigator.mediaDevices.getUserMedia')
I also tried 'navigator.getUserMedia' and 'navigator.mediaDevices.enumerateDevices()'; none of these work.
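A few things that commonly cause navigator.mediaDevices to be undefined inside an app: getUserMedia is only exposed to WKWebView on iOS 14.3 and later, only for pages loaded over HTTPS (a secure context), and only when the app's Info.plist declares NSCameraUsageDescription (plus NSMicrophoneUsageDescription if audio is requested). A minimal configuration sketch, with the URL as a placeholder:

```swift
import WebKit

// Hedged sketch: baseline WKWebView setup for getUserMedia (iOS 14.3+).
// The page URL is a placeholder; it must be served over HTTPS.
let configuration = WKWebViewConfiguration()
configuration.allowsInlineMediaPlayback = true           // avoid forced fullscreen playback
configuration.mediaTypesRequiringUserActionForPlayback = []
let webView = WKWebView(frame: .zero, configuration: configuration)
webView.load(URLRequest(url: URL(string: "https://example.com/camera.html")!))
```

On iOS 15+, also consider implementing the WKUIDelegate method webView(_:requestMediaCapturePermissionFor:initiatedByFrame:type:decisionHandler:) so the capture permission prompt is handled the way you want.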
Hello Apple,
I am concerned about the new iOS Screen Mirroring that is available on iOS.
I have an app that is only meant to be viewed on iPhones (not Macs or other computers) due to security reasons.
I am assuming that Screen Mirroring uses AirPlay underneath. If not, is there an API planned or coming that can disable this functionality, or is there a way for my app to opt out of iOS Screen Mirroring?
Thanks.
Simple question: can I shoot in log using the new MV-HEVC format for spatial video for viewing on the Vision Pro?
HELP! How can I play a spatial video in my own Vision Pro app the way the official Photos app does? Following the official developer documentation, I used the AVKit API to play a spatial video in the Xcode visionOS simulator. The video plays, but it looks different from what Photos shows: in Photos the edges of the video appear soft and fuzzy, while in my own app the edges are sharp.
How can I play the spatial video in my own app with the same effect as in Photos?
Hi Team,
I'm using AVPlayer on Apple TV with the second-generation Siri Remote, the one with the clickpad and four buttons around it, and I need to detect where the user clicked on the remote for fast forward/backward. I have tested many different approaches, but nothing works for me.
Does anyone have any idea how to resolve it in Swift?
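One approach, sketched under the assumption that the question is about the clickpad edges of the second-generation Siri Remote: on tvOS, clicks on the clickpad edges arrive in the responder chain as arrow press types.

```swift
import UIKit

// Hedged sketch: detect Siri Remote clickpad edge presses on tvOS.
// The class name is illustrative; any view controller in the responder
// chain above the player can override pressesBegan this way.
class PlayerContainerViewController: UIViewController {
    override func pressesBegan(_ presses: Set<UIPress>, with event: UIPressesEvent?) {
        for press in presses {
            switch press.type {
            case .leftArrow:  print("clickpad left – skip backward")
            case .rightArrow: print("clickpad right – skip forward")
            case .select:     print("clickpad center")
            default: break
            }
        }
        super.pressesBegan(presses, with: event)
    }
}
```

Note that AVPlayerViewController handles many remote presses itself, so if these never reach your controller, attaching UITapGestureRecognizers with allowedPressTypes set to the arrow press types may work better.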
I've encountered an issue with the seek bar time display in the video player on iOS 17, specifically affecting live-stream videos that use HLS manifests with the time displayed in AM/PM format.
As the video progresses, the displayed start time appears to shift backwards in time with a peculiar pattern:
Displayed Start Time = Normal Start Time - Viewed Duration
For instance, if a program begins at 9:00 AM, at 9:30 AM, the start time shown will erroneously be 8:30 AM. Similarly, at 9:40 AM, the displayed start time will be 8:20 AM.
This issue is observed with both VideoPlayer and AVPlayerViewController on iOS 17. The same implementation of the video player on iOS 16 displays the duration of the viewed program and doesn’t have any issues.
Please advise on any known workarounds or solutions to address this issue on iOS 17.
In the code example provided, there is a Bool in the Video object to mark a video as 3D:
/// A Boolean value that indicates whether the video contains 3D content.
let is3D: Bool
I have a hosted spatial video that I know works correctly in the AVP player. When I point the Videos.json file to this URL and set is3D=true, my 3D video doesn't show up and I get the following error:
iPVC/1-0 Received playback error: [Error Domain=AVFoundationErrorDomain Code=-11850 "Operation Stopped" UserInfo={NSLocalizedFailureReason=The server is not correctly configured., NSLocalizedDescription=Operation Stopped, NSUnderlyingError=0x30227c510 {Error Domain=CoreMediaErrorDomain Code=-12939 "byte range length mismatch - should be length 2 is length 2434" UserInfo={NSDescription=byte range length mismatch - should be length 2 is length 2434, NSURL=https: <omitted for post> }}}]
Can anyone tell me what might be going on? The error is telling me my server is not configured correctly. For context, I'm using Google Drive to deliver dynamic images/videos using:
https://drive.google.com/uc?export=download&id= <file ID>
The above works great for my images and 2D videos. Is there something I need to do specifically when delivering MV-HEVC videos?
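The CoreMedia error -12939 ("byte range length mismatch - should be length 2 is length 2434") usually means the server mishandled an HTTP Range request: AVFoundation fetches media in byte ranges, and here the server apparently returned more bytes than the 2 that were requested. A quick sketch to test whether a host honors range requests (the URL is a placeholder for your hosted video):

```swift
import Foundation

// Hedged sketch: a byte-range probe. A server suitable for AVPlayer
// streaming should answer "Range: bytes=0-1" with HTTP 206 Partial
// Content and a body of exactly 2 bytes.
var request = URLRequest(url: URL(string: "https://example.com/video.mov")!)
request.setValue("bytes=0-1", forHTTPHeaderField: "Range")
let (data, response) = try await URLSession.shared.data(for: request)
if let http = response as? HTTPURLResponse {
    print("status:", http.statusCode, "bytes:", data.count) // expect 206 and 2
}
```

If the Google Drive download endpoint responds with 200 and the whole file instead, that would match this error; hosting the MV-HEVC file on a server (or CDN) with proper byte-range support is worth trying.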
I've created a full immersive visionOS project and added a spatial video player in the ImmersiveView Swift file. I have a few buttons in a separate VideosView Swift file on a floating window, and I'd like to switch the video playing in ImmersiveView when I click a button in the VideosView file.
Video player working great in ImmersiveView:
RealityView { content in
    if let videoEntity = try? await Entity(named: "Video", in: realityKitContentBundle) {
        guard let url = Bundle.main.url(forResource: "video1", withExtension: "mov") else {
            fatalError("Video was not found!")
        }
        let asset = AVURLAsset(url: url)
        let playerItem = AVPlayerItem(asset: asset)
        let player = AVPlayer()
        videoEntity.components[VideoPlayerComponent.self] = .init(avPlayer: player)
        content.add(videoEntity)
        player.replaceCurrentItem(with: playerItem)
        player.play()
    } else {
        print("file not found!")
    }
}
Buttons in floating window from VideosView:
struct VideosView: View {
    var body: some View {
        VStack {
            Button(action: {}) {
                Text("video 1").font(.title)
            }
            Button(action: {}) {
                Text("video 2").font(.title)
            }
            Button(action: {}) {
                Text("video 3").font(.title)
            }
        }
    }
}
In general, how do I control the video player across views, and how do I replace the video when each button is selected? Any help/code/links would be greatly appreciated.
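One common pattern, sketched here with illustrative names: keep a single AVPlayer in a shared observable model, inject the model into both the window and the immersive space, and have the buttons swap the player's current item. The VideoPlayerComponent in ImmersiveView would then be created from model.player instead of a locally constructed AVPlayer.

```swift
import AVKit
import SwiftUI

// Hedged sketch: a shared model owning the one AVPlayer used by the
// immersive view's VideoPlayerComponent. Type and resource names are
// assumptions, not from the original code.
@Observable
final class PlayerModel {
    let player = AVPlayer()

    func play(resource: String) {
        guard let url = Bundle.main.url(forResource: resource, withExtension: "mov") else {
            print("\(resource).mov not found")
            return
        }
        player.replaceCurrentItem(with: AVPlayerItem(url: url))
        player.play()
    }
}

struct VideosView: View {
    @Environment(PlayerModel.self) private var model

    var body: some View {
        VStack {
            ForEach(["video1", "video2", "video3"], id: \.self) { name in
                Button(name) { model.play(resource: name) }
                    .font(.title)
            }
        }
    }
}
```

Attach the same model instance with .environment(model) on both the WindowGroup and the ImmersiveSpace in the App body, so both views talk to the same player.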
I'm trying to cast the screen from an iOS device to an Android device.
I'm leveraging ReplayKit on iOS to capture the screen and VideoToolbox for compressing the captured video data into H.264 format using CMSampleBuffers. Both iOS and Android are configured for H.264 compression and decompression.
While screen casting works flawlessly within the same platform (iOS to iOS or Android to Android), I'm encountering an error ("not in avi mode") on the Android receiver when casting from iOS. My research suggests that the underlying container formats for H.264 might differ between iOS and Android.
Data transmission over the TCP socket seems to be functioning correctly.
My question is:
Is there a way to ensure a common container format for H.264 compression and decompression across iOS and Android platforms?
Here's a breakdown of the iOS sender details:
Device: iPhone 13 mini running iOS 17
Development Environment: Xcode 15 with a minimum deployment target of iOS 16
Screen Capture: ReplayKit for capturing the screen and obtaining CMSampleBuffers
Video Compression: VideoToolbox for H.264 compression
Compression Properties:
kVTCompressionPropertyKey_ConstantBitRate: 6144000 (bitrate)
kVTCompressionPropertyKey_ProfileLevel: kVTProfileLevel_H264_Main_AutoLevel (profile and level)
kVTCompressionPropertyKey_MaxKeyFrameInterval: 60 (maximum keyframe interval)
kVTCompressionPropertyKey_RealTime: true (real-time encoding)
kVTCompressionPropertyKey_Quality: 1 (maximum quality)
NAL Unit Handling: Custom header is added to NAL units
Android Receiver Details:
Device: RedMi 7A running Android 10
Video Decoding: MediaCodec API for receiving and decoding the H.264 stream
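One likely culprit, offered as a guess based on the error: VideoToolbox emits AVCC-formatted samples (each NAL unit prefixed with a 4-byte big-endian length, with the SPS/PPS parameter sets kept in the CMVideoFormatDescription), while Android MediaCodec decoders typically expect an Annex B byte stream (NAL units separated by 0x00000001 start codes). A minimal conversion sketch for one sample's data:

```swift
import Foundation

// Hedged sketch: convert AVCC length-prefixed NAL units (as produced by
// VideoToolbox) into an Annex B stream (as MediaCodec usually expects).
func avccToAnnexB(_ avcc: Data) -> Data {
    var annexB = Data()
    var index = avcc.startIndex
    while index + 4 <= avcc.endIndex {
        // Read the 4-byte big-endian NAL unit length.
        let length = avcc[index..<index + 4].reduce(0) { ($0 << 8) | Int($1) }
        index += 4
        guard index + length <= avcc.endIndex else { break }
        annexB.append(contentsOf: [0x00, 0x00, 0x00, 0x01]) // start code
        annexB.append(avcc[index..<index + length])
        index += length
    }
    return annexB
}
```

You would also need to transmit the SPS and PPS (extracted from the sample's CMVideoFormatDescription) in Annex B form before the first keyframe, since AVCC keeps them out of band.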
I am a bit confused on whether certain Video Toolbox (VT) encoders support hardware acceleration or not.
When I query the list of VT encoders (VTCopyVideoEncoderList(nil,&encoderList)) on an iPhone 14 Pro device, for avc1 (AVC / H.264) and hevc1 (HEVC / H.265) encoders, the kVTVideoEncoderList_IsHardwareAccelerated flag is not there, which -based on the documentation found on the VTVideoEncoderList.h- means that the encoders do not support hardware acceleration:
optional. CFBoolean. If present and set to kCFBooleanTrue, indicates that the encoder is hardware accelerated.
In fact, no encoders from this list return this flag as true and most of them do not include the flag at all on their dictionaries.
On the other hand, when I create a compression session using the VTCompressionSessionCreate() and pass the kVTVideoEncoderSpecification_EnableHardwareAcceleratedVideoEncoder as true in the encoder specifications, after querying the kVTCompressionPropertyKey_UsingHardwareAcceleratedVideoEncoder using the following code, I get a CFBoolean value of true for both H.264 and H.265 encoder.
In fact, I get a true value (for both of the aforementioned encoders) even if I don't specify the kVTVideoEncoderSpecification_EnableHardwareAcceleratedVideoEncoder during the creation of the compression session (note here that this flag was introduced in iOS 17.4 ^1).
So the question is: Are those encoders actually hardware accelerated on my device, and if so, why isn't that reflected on the VTCopyVideoEncoderList() call?
We're experiencing an issue on an iPhone 15 (iOS 17.1) where some video files can't be loaded from the results of a PHPickerViewController.
results[index].itemProvider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier)
Gives error:
Cannot load representation of type public.movie
Video info (taken from Mac Finder):
H.264
MPEG-4 AAC
HD (1-1-1)
480x848px
Filetype .MP4
Origin: Recorded on iPhone 14, sent over WhatsApp, & auto saved from WhatsApp to an iPhone 15
The iPhone 15 has iCloud enabled and the videos failing are frequently viewed and used in testing, so are likely to be downloaded/cached locally.
I've tried changing the PHPickerConfiguration preferredAssetRepresentationMode to .current with no difference in the error.
I've also tried the open-in-place alternative, but the debug output complains that it's not supported.
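Two things worth checking, sketched below under the assumption that result is a single PHPickerResult: the temporary URL handed to loadFileRepresentation is deleted as soon as the completion handler returns, so it must be copied synchronously inside the handler, and the underlying NSError usually carries more detail than the short "Cannot load representation" message.

```swift
import PhotosUI
import UniformTypeIdentifiers

// Hedged sketch: copy the picker's temporary file before it is deleted,
// and log the full error for diagnosis. `result` is one PHPickerResult.
result.itemProvider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, error in
    if let error {
        print("load failed:", error) // the full NSError often names the real cause
        return
    }
    guard let url else { return }
    let destination = FileManager.default.temporaryDirectory
        .appendingPathComponent(url.lastPathComponent)
    try? FileManager.default.removeItem(at: destination)
    do {
        // Must happen before this closure returns; the source file is transient.
        try FileManager.default.copyItem(at: url, to: destination)
        print("copied to", destination)
    } catch {
        print("copy failed:", error)
    }
}
```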
extension Entity {
    func addPanoramicImage(for media: WRMedia) {
        let subscription = TextureResource.loadAsync(named: "image_20240425_201630").sink(
            receiveCompletion: {
                switch $0 {
                case .finished: break
                case .failure(let error): assertionFailure("\(error)")
                }
            },
            receiveValue: { [weak self] texture in
                guard let self = self else { return }
                var material = UnlitMaterial()
                material.color = .init(texture: .init(texture))
                self.components.set(ModelComponent(
                    mesh: .generateSphere(radius: 1E3),
                    materials: [material]
                ))
                self.scale *= .init(x: -1, y: 1, z: 1)
                self.transform.translation += SIMD3(0.0, -1, 0.0)
            }
        )
        components.set(Entity.WRSubscribeComponent(subscription: subscription))
    }

    func updateRotation(for media: WRMedia) {
        let angle = Angle.degrees(0.0)
        let rotation = simd_quatf(angle: Float(angle.radians), axis: SIMD3<Float>(0, 0.0, 0))
        self.transform.rotation = rotation
    }

    struct WRSubscribeComponent: Component {
        var subscription: AnyCancellable
    }
}
The failure hits this line:
case .failure(let error): assertionFailure("\(error)")
Thread 1: Fatal error: Error Domain=MTKTextureLoaderErrorDomain Code=0 "Image decoding failed" UserInfo={NSLocalizedDescription=Image decoding failed, MTKTextureLoaderErrorKey=Image decoding failed}
Sandbox link: https://codesandbox.io/p/sandbox/webrtc-ios-lasted-issue-jzx9h5
Issue: black video screen when the URL changes.
Steps to reproduce:
Get the source code from the sandbox repo above
Install packages with the command "npm install"
Start the local web app under HTTPS with the command "HTTPS=true npm start"
Update the URL by clicking the button "Update URL search param"
OS: iOS 17.4.1
Browser: Safari
Device: iPhone 11 Pro
Can anyone help?
Note: it works on an iPhone X running iOS 16
Video of the issue: https://streamable.com/rj07u8
I have an AVPlayer that loads a video and places it on a screen ModelEntity in the immersive view using a VideoMaterial. This also makes the video untappable, since it is a VideoMaterial.
Here's the code for the same:
let screenModelEntity = model.garageScreenEntity as! ModelEntity
let modelEntityMesh = screenModelEntity.model!.mesh
let url = Bundle.main.url(forResource: "<URL>", withExtension: "mp4")!
let asset = AVURLAsset(url: url)
let playerItem = AVPlayerItem(asset: asset)
let player = AVPlayer()
let material = VideoMaterial(avPlayer: player)
screenModelEntity.components[ModelComponent.self] = .init(mesh: modelEntityMesh, materials: [material])
player.replaceCurrentItem(with: playerItem)
return player
I was able to load and play the video. However, I cannot figure out how to show the player controls (AVPlayerViewController) to the user, similar to the DestinationVideo sample app.
How can I add the video player controls in this case?
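AVPlayerViewController's system controls can't be composited onto a VideoMaterial surface, so one option is a small SwiftUI control strip in its own window or ornament that drives the same AVPlayer instance backing the material. A hedged sketch (the view name is illustrative; `player` is the AVPlayer your code returns):

```swift
import AVKit
import SwiftUI

// Hedged sketch: custom play/pause + restart controls for the AVPlayer
// that backs the VideoMaterial. Present this view in a separate window
// or as an ornament near the screen entity.
struct PlayerControls: View {
    let player: AVPlayer
    @State private var isPlaying = false

    var body: some View {
        HStack(spacing: 24) {
            Button {
                isPlaying ? player.pause() : player.play()
                isPlaying.toggle()
            } label: {
                Image(systemName: isPlaying ? "pause.fill" : "play.fill")
            }
            Button {
                player.seek(to: .zero) // jump back to the start
            } label: {
                Image(systemName: "backward.end.fill")
            }
        }
        .padding()
        .glassBackgroundEffect()
    }
}
```

For a scrubber you would add a Slider bound to a periodic time observer on the same player; the key point is that every control talks to the one shared AVPlayer instance.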
<div class="container" style="background-size: contain; user-select: none; pointer-events: none; height: 787.5px; width: 1400px;">
<div class="container__header">header</div>
<span>
<div class="video-container" style="inset: 17.853% 68% 11.747% 1%; z-index: 2; opacity: 1;">
<div class="video-container__placeholder-image">image</div>
<div class="video-container__content">
<div class="some-info"></div>
<div class="video-canvas"></div>
<div class="other-info"></div>
</div>
</div>
<div class="video-container" style="inset: 17.853% 1% 11.747% 33%; z-index: 1; opacity: 1;">
<div class="video-container__placeholder-image">image</div>
<div class="video-container__content">
<div class="video-canvas">
<div class="player" style="width: 100%; height: 100%; position: relative; overflow: hidden; background-color: black;">
<video playsinline="" muted="" style="object-fit: cover; width: 100%; height: 100%; position: absolute; left: 0px; top: 0px;"></video>
</div>
</div>
</div>
</div>
</span>
</div>
The page looks like this:
Then, the HTML changed as follows:
<div class="container" style="background-size: contain; user-select: none; pointer-events: none; height: 787.5px; width: 1400px;">
<div class="container__header">header</div>
<span>
<div class="video-container" style="inset: 100% 100% 0% 0%; z-index: 2; opacity: 0;">
<div class="video-container__placeholder-image">image</div>
<div class="video-container__content">
<div class="some-info"></div>
<div class="video-canvas"></div>
<div class="other-info"></div>
</div>
</div>
<div class="video-container" style="style="inset: 6.106% 5.98719% 0%; z-index: 3; opacity: 1;"">
<div class="video-container__placeholder-image">image</div>
<div class="video-container__content">
<div class="video-canvas">
<div class="player" style="width: 100%; height: 100%; position: relative; overflow: hidden; background-color: black;">
<video playsinline="" muted="" style="object-fit: cover; width: 100%; height: 100%; position: absolute; left: 0px; top: 0px;"></video>
</div>
</div>
</div>
</div>
</span>
</div>
In the Mac developer tools, the width of the video is 1400px, but it renders at the same size as before on iOS 17+ (iOS 17.1 and iOS 17.3.1).
The expected result looks like this:
The actual result looks like this:
I tried the same operations on iOS 14.6 and 16.4 and they worked as expected; this problem seems to exist only on iOS 17+.
Please help me resolve this problem. Thanks.
We use AVFragmentedAssetMinder to refresh the player data, and notifications for AVAssetDurationDidChange were consistently received whenever the player duration changed.
However, since the release of iOS 17, AVAssetDurationDidChange notifications are no longer received.
Could anyone please advise why this notification is not being triggered, and what we have to change?
NotificationCenter.default.addObserver(self, selector: #selector(self.onVideoUpdate), name: .AVAssetDurationDidChange, object: nil)
#AVPlayer, #AVMutableMovie
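As a possible fallback while the notification is broken, the duration of the item being played can be observed directly with key-value observing on the player, which does not depend on AVAssetDurationDidChange firing. A hedged sketch, assuming `player` is the AVPlayer presenting the fragmented movie:

```swift
import AVFoundation
import Combine

// Hedged sketch: KVO-based duration observation as an alternative to the
// AVAssetDurationDidChange notification. `cancellables` must outlive the
// observation.
var cancellables = Set<AnyCancellable>()

func observeDuration(of player: AVPlayer) {
    player.publisher(for: \.currentItem?.duration)
        .removeDuplicates()
        .sink { duration in
            if let duration, duration.isNumeric {
                print("duration changed:", CMTimeGetSeconds(duration))
            }
        }
        .store(in: &cancellables)
}
```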
Hello, I've noticed that my server-hosted video doesn't play on iOS mobile devices when it is larger than 19 MB. Is there a maximum size limit (in MB) for the HTML video tag on iOS mobile devices?