Hi folks:
I've been creating .reality files out of Reality Composer for over a year. Some of the files are up to 500 MB and, until about a month ago, they opened fine as AR projected experiences on even basic iPhones and iPads. Now, I think since iOS 18, a 64 MB file will still open as an AR experience, but files from roughly 350 MB and up don't open. Files just opens a window displaying the name of the file, that it's a .reality file, and the file size, but it no longer opens into either the AR or Object view of the .reality scene. Has there been a new file-size limit put on the .reality files that Files will open, or is something else going on here? I have a client who was about to launch an experience based on a .reality file I can no longer open. Please help.
I add 3 controls to an AVCaptureSession and then remove them all. The number of controls in the session is indeed 0 afterwards, but the Camera Control button still shows the previous 3 controls. Going from 3 to 2 or from 3 to 1 controls updates correctly; going from 3 to 0 does not, while going from 0 to 3 works.
if (self.captureControl.zoom) {
    if (self.zoomScaleControl) {
        self.zoomScaleControl.enabled = false;
        [_session removeControl:self.zoomScaleControl];
    }
    AVCaptureSlider *zoomSlider = [self.captureControl.zoom fetchCaptureSlider];
    [zoomSlider setActionQueue:dispatch_get_main_queue() action:^(float zoomFactor) {
        @strongify(self);
        if ([self.dataOutputDelegate respondsToSelector:@selector(videoCaptureSession:tryChangeZoomScale:)]) {
            [self.dataOutputDelegate videoCaptureSession:self tryChangeZoomScale:zoomFactor];
        }
    }];
    self.zoomScaleControl = zoomSlider;
} else {
    if (self.zoomScaleControl) {
        self.zoomScaleControl.enabled = false;
        [_session removeControl:self.zoomScaleControl];
    }
    self.zoomScaleControl = nil;
}

if (self.captureControl.exposure) {
    if (self.exposureBiasControl) {
        self.exposureBiasControl.enabled = false;
        [_session removeControl:self.exposureBiasControl];
    }
    AVCaptureSlider *exposureSlider = [self.captureControl.exposure fetchCaptureSlider];
    [exposureSlider setActionQueue:dispatch_get_main_queue() action:^(float bias) {
        @strongify(self);
        if ([self.dataOutputDelegate respondsToSelector:@selector(videoCaptureSession:tryChangeExposureBias:)]) {
            [self.dataOutputDelegate videoCaptureSession:self tryChangeExposureBias:bias];
        }
    }];
    self.exposureBiasControl = exposureSlider;
} else {
    if (self.exposureBiasControl) {
        self.exposureBiasControl.enabled = false;
        [_session removeControl:self.exposureBiasControl];
    }
    self.exposureBiasControl = nil;
}

if (self.captureControl.len) {
    if (self.lenControl) {
        self.lenControl.enabled = false;
        [_session removeControl:self.lenControl];
    }
    ORLenCaptureControlCustomModel *len = self.captureControl.len;
    AVCaptureIndexPicker *picker = [len fetchCaptureSlider];
    [picker setActionQueue:dispatch_get_main_queue() action:^(NSInteger selectedIndex) {
        @strongify(self);
        if ([self.dataOutputDelegate respondsToSelector:@selector(videoCaptureSession:didChangeLenIndex:datas:)]) {
            [self.dataOutputDelegate videoCaptureSession:self didChangeLenIndex:selectedIndex datas:self.captureControl.len.indexDatas];
        }
    }];
    self.lenControl = picker;
} else {
    if (self.lenControl) {
        self.lenControl.enabled = false;
        [_session removeControl:self.lenControl];
    }
    self.lenControl = nil;
}

// Re-add whichever controls the session will accept.
if ([_session canAddControl:self.zoomScaleControl]) {
    [_session addControl:self.zoomScaleControl];
} else {
    self.zoomScaleControl = nil;
}
if ([_session canAddControl:self.lenControl]) {
    [_session addControl:self.lenControl];
} else {
    self.lenControl = nil;
}
if ([_session canAddControl:self.exposureBiasControl]) {
    [_session addControl:self.exposureBiasControl];
} else {
    self.exposureBiasControl = nil;
}
if (_session.controlsDelegate == nil) {
    [_session setControlsDelegate:self queue:GetCaptureControlQueue()];
}
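In case it clarifies what I'm after, here is a minimal Swift sketch of the behavior I expected, assuming the add/remove calls are supposed to be wrapped in a single beginConfiguration()/commitConfiguration() pair (ControlsManager and replaceControls are placeholder names, not API):

import AVFoundation

final class ControlsManager: NSObject, AVCaptureSessionControlsDelegate {
    private let session: AVCaptureSession
    private let controlsQueue = DispatchQueue(label: "camera.controls")

    init(session: AVCaptureSession) {
        self.session = session
        super.init()
        session.setControlsDelegate(self, queue: controlsQueue)
    }

    // Replace the session's controls atomically so the Camera Control
    // overlay should be rebuilt from the new set, including an empty set.
    func replaceControls(with newControls: [AVCaptureControl]) {
        session.beginConfiguration()
        for control in session.controls {
            session.removeControl(control)
        }
        for control in newControls where session.canAddControl(control) {
            session.addControl(control)
        }
        session.commitConfiguration()
    }

    // MARK: AVCaptureSessionControlsDelegate
    func sessionControlsDidBecomeActive(_ session: AVCaptureSession) {}
    func sessionControlsWillEnterFullscreenAppearance(_ session: AVCaptureSession) {}
    func sessionControlsWillExitFullscreenAppearance(_ session: AVCaptureSession) {}
    func sessionControlsDidBecomeInactive(_ session: AVCaptureSession) {}
}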
We captured a spatial video with an iPhone 15 Pro.
When we try to export the video with AVAssetExportSession and AVAssetExportPresetMVHEVC960x960, it always goes to the failed state, and
exportSession.error?.localizedDescription yields an "Operation Stopped" error.
The implementation is straightforward, and other HEVC files export fine; the problem occurs only with this MV-HEVC file.
func exportSpatialVideo(videoFilePath: String, outputUrl: URL) {
    let url: URL? = URL(fileURLWithPath: videoFilePath)
    let asset: AVAsset = AVAsset(url: url!)
    print(asset.description)
    print(asset.tracks.first?.mediaType.rawValue ?? "nil")
    let preset = "AVAssetExportPresetMVHEVC960x960"
    let exportSession: AVAssetExportSession = AVAssetExportSession(asset: asset, presetName: preset)!
    exportSession.outputURL = outputUrl
    exportSession.shouldOptimizeForNetworkUse = true
    exportSession.outputFileType = AVFileType.mov
    exportSession.exportAsynchronously(completionHandler: {
        switch exportSession.status {
        case .unknown:
            print("Unknown Error")
        case .waiting:
            print("waiting ...")
        case .exporting:
            print("exporting ...")
        case .completed:
            print("completed.")
        case .failed:
            print("failed. \(String(describing: exportSession.error?.localizedDescription))")
        case .cancelled:
            print("cancelled.")
        @unknown default:
            print("unhandled status")
        }
    })
}
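One thing I plan to check is whether the system even reports this preset as compatible with the source asset and output type; a quick sketch of that check (assuming the file URL is valid):

import AVFoundation

func checkSpatialExportSupport(for url: URL) {
    let asset = AVAsset(url: url)

    // List every preset the system reports as usable with this asset.
    let presets = AVAssetExportSession.exportPresets(compatibleWith: asset)
    print("Compatible presets: \(presets)")

    // Ask specifically about the MV-HEVC preset with a QuickTime output.
    AVAssetExportSession.determineCompatibility(ofExportPreset: AVAssetExportPresetMVHEVC960x960,
                                                with: asset,
                                                outputFileType: .mov) { isCompatible in
        print("MVHEVC960x960 -> .mov compatible: \(isCompatible)")
    }
}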
Is there any solution for this?
We are currently in the process of migrating our application from using ALAssetsLibrary to PHPhotoLibrary to ensure compatibility with the latest versions of iOS. However, we have noticed a discrepancy in the file sizes of images obtained using PHPhotoLibrary compared to those obtained using ALAssetsLibrary.
Specifically, we would like to understand the following points:
1. Reason for File Size Differences:
What are the reasons for the difference in file sizes between images obtained using ALAssetsLibrary and those obtained using PHPhotoLibrary?
Could you provide detailed information on the settings and options in PHPhotoLibrary that affect the size and quality of the images?
2. Optimal Settings:
What are the optimal settings in PHPhotoLibrary to obtain images with the same quality and file size as those obtained using ALAssetsLibrary?
If possible, could you provide code examples or recommended option settings?
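For reference, this is roughly how we currently fetch the original file bytes through the Photos framework (a sketch; whether this is the right way to match what ALAssetsLibrary returned is exactly what we are asking):

import Photos

// Fetch the original, unmodified file data for an asset; the closest
// analogue we found to ALAssetsLibrary's defaultRepresentation data.
func fetchOriginalData(for asset: PHAsset, completion: @escaping (Data?) -> Void) {
    let options = PHImageRequestOptions()
    options.version = .original              // no adjustments applied
    options.deliveryMode = .highQualityFormat
    options.isNetworkAccessAllowed = true    // allow iCloud originals to download

    PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, uti, _, _ in
        print("UTI: \(uti ?? "nil"), bytes: \(data?.count ?? 0)")
        completion(data)
    }
}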
In iOS, when I use AVPlayerViewController to play back a slow-motion video, it has a "ramp-up" stage at the start and a "ramp-down" stage at the end, and the video plays at normal speed (i.e. not slow motion) during these stages.
My question is: are these non-slow-motion stages defined in the video file itself (e.g. in some kind of metadata), or is this just a playback approach used by AVPlayerViewController?
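For what it's worth, if the video comes from the Photos library, my (unconfirmed) understanding is that the ramps live in the playback composition rather than in the movie file itself; this sketch is how I would inspect a given asset to check:

import Photos
import AVFoundation

// Request the playback asset for a slo-mo PHAsset and see whether it is
// an AVComposition whose segments are retimed (ramped), or a plain file.
func inspectSloMo(asset: PHAsset) {
    let options = PHVideoRequestOptions()
    options.version = .current   // .current includes the slo-mo rendering

    PHImageManager.default().requestAVAsset(forVideo: asset, options: options) { avAsset, _, _ in
        if let composition = avAsset as? AVComposition {
            for track in composition.tracks(withMediaType: .video) {
                for segment in track.segments {
                    // Differing source/target durations mean that segment is retimed.
                    print(segment.timeMapping.source.duration.seconds,
                          "->",
                          segment.timeMapping.target.duration.seconds)
                }
            }
        } else {
            print("Plain asset: no composition-based retiming, so any ramps would be in the file")
        }
    }
}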
Thanks!
I am using PHImageRequestOptions and PHImageManager to load images to my app.
I use version .original with resizeMode .none, and version .original with resizeMode .exact.
Both used to work well, but since iOS 18 the combination of version .original and resizeMode .exact doesn't work anymore.
The images are loaded, but they are not shown (only the frames?).
Does anyone know why?
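For anyone trying to reproduce this, the request I mean looks roughly like this (a sketch; the helper name is just illustrative):

import Photos
import UIKit

// Request an exact-size crop of the original image; this is the
// combination that stopped rendering for me on iOS 18.
func requestExactOriginal(asset: PHAsset, targetSize: CGSize, completion: @escaping (UIImage?) -> Void) {
    let options = PHImageRequestOptions()
    options.version = .original
    options.resizeMode = .exact
    options.deliveryMode = .highQualityFormat
    options.isNetworkAccessAllowed = true

    PHImageManager.default().requestImage(for: asset,
                                          targetSize: targetSize,
                                          contentMode: .aspectFill,
                                          options: options) { image, _ in
        completion(image)
    }
}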
Thank you for reading.
I'm trying to get the item assigned to the queue's currentEntry while a song is playing, but it currently comes up nil even though a song is playing. Note that currentEntry returns:
MusicPlayer.Queue.Entry(id: "evn7UntpH::ncK1NN3HS", title: "SONG TITLE")
I'm a bit stumped by the API usage: if the song is playing, how can the queue entry's item be nil?
    if queueObserver == nil {
        queueObserver = ApplicationMusicPlayer.shared.queue.objectWillChange
            .sink { [weak self] in
                self?.handleNowPlayingChange()
            }
    }
}

private func handleNowPlayingChange() {
    if let currentItem = ApplicationMusicPlayer.shared.queue.currentEntry {
        if let song = currentItem.item {
            self.currentlyPlayingID = song.id
            self.objectWillChange.send()
            print("Song ID: \(song.id)")
        } else {
            print("NO ITEM: \(currentItem)")
        }
    } else {
        print("No Entries: \(String(describing: ApplicationMusicPlayer.shared.queue.currentEntry))")
    }
}
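When the item is non-nil I can pattern match on the entry's item to get at the underlying Song, roughly like this sketch; it's the nil case above that I can't explain:

import MusicKit

func logCurrentEntry() {
    guard let entry = ApplicationMusicPlayer.shared.queue.currentEntry else {
        print("No current entry")
        return
    }
    switch entry.item {
    case .song(let song):
        print("Song ID: \(song.id), title: \(song.title)")
    case .musicVideo(let video):
        print("Music video ID: \(video.id)")
    default:
        // The state in question: the entry exists (it has a title)
        // but its item has not been resolved.
        print("Entry without an item: \(entry.title)")
    }
}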
The new iPhone 16 supports spatial audio recordings in the camera app when recording videos. Is it possible to also record spatial audio without video, and is it possible for 3rd party developers to do so? If so, how do I need to configure AVAudioSession and/or AVAudioEngine to record spatial audio in my audio recording app on iPhone 16?
I am making an app that takes two videos and then stitches them together on screen (one video on the top half and the other on the bottom half).
This is achieved with AVMutableComposition, and then I am using AVAssetExportSession to export a mp4 file out:
guard let export = AVAssetExportSession(asset: composition, presetName: AVAssetExportPreset1920x1080) else {
    return
}
export.exportAsynchronously {
    ....
When the two input videos are around 1GB each, starting the export session immediately increases memory usage by ~2GB, as if it moves the input files into memory immediately (my guess), and at some point my app is killed for using too much memory.
Is there a way to avoid this upfront memory usage and/or avoid getting killed?
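For context, the composition itself is built along these lines (a simplified sketch with sizes hard-coded and error handling trimmed; not my exact code):

import AVFoundation
import CoreGraphics

// Build a composition that squeezes `topAsset` into the top half and
// `bottomAsset` into the bottom half of a 1920x1080 frame.
func makeStackedComposition(topAsset: AVAsset,
                            bottomAsset: AVAsset) throws -> (AVMutableComposition, AVMutableVideoComposition) {
    let composition = AVMutableComposition()
    let renderSize = CGSize(width: 1920, height: 1080)

    func addTrack(for asset: AVAsset) throws -> AVMutableCompositionTrack {
        guard let srcTrack = asset.tracks(withMediaType: .video).first,
              let dstTrack = composition.addMutableTrack(withMediaType: .video,
                                                         preferredTrackID: kCMPersistentTrackID_Invalid) else {
            throw NSError(domain: "Stacking", code: -1)
        }
        try dstTrack.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration),
                                     of: srcTrack,
                                     at: .zero)
        return dstTrack
    }

    let topTrack = try addTrack(for: topAsset)
    let bottomTrack = try addTrack(for: bottomAsset)

    // Scale each source to half height; move the second one to the bottom half.
    let topInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: topTrack)
    topInstruction.setTransform(CGAffineTransform(scaleX: 1, y: 0.5), at: .zero)

    let bottomInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: bottomTrack)
    bottomInstruction.setTransform(CGAffineTransform(scaleX: 1, y: 0.5)
                                       .concatenating(CGAffineTransform(translationX: 0, y: renderSize.height / 2)),
                                   at: .zero)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: composition.duration)
    instruction.layerInstructions = [topInstruction, bottomInstruction]

    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = renderSize
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
    videoComposition.instructions = [instruction]

    return (composition, videoComposition)
}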
AVAudioRecorder leaves a completely useless chunk of a file if a crash happens while recording.
I need to be able to recover. I'm thinking of streaming the recording to disk. I know that's possible with AVAudioEngine, but I also know that API is a headache that will lead to unexpected crashes unless you're lucky or you're the person who built it.
Does Apple have a recommended strategy for failsafe audio recordings? I'm thinking of chunking recordings using many instances of AVAudioRecorder and then stitching those chunks together.
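For comparison, the AVAudioEngine version of streaming to disk that I'm considering looks roughly like this (a sketch; audio session setup and error handling omitted):

import AVFoundation

final class StreamingRecorder {
    private let engine = AVAudioEngine()
    private var file: AVAudioFile?

    // Write every input buffer straight to disk so a crash only loses the
    // buffers that were in flight, not the whole recording.
    func start(writingTo url: URL) throws {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)
        file = try AVAudioFile(forWriting: url, settings: format.settings)

        input.installTap(onBus: 0, bufferSize: 4096, format: format) { [weak self] buffer, _ in
            do {
                try self?.file?.write(from: buffer)
            } catch {
                print("Write failed: \(error)")
            }
        }
        try engine.start()
    }

    func stop() {
        engine.inputNode.removeTap(onBus: 0)
        engine.stop()
        file = nil   // releasing the file closes it
    }
}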
I'm working on a little light and sound controller in Swift, driving DMX lights and audio.
For the audio portion, I need to play a bunch of looping sounds (long-duration MP3s), and occasionally play sound effects (short-duration sounds, varying formats). I want all of this mixed into selected channels on specific devices. That is, I might have one audio stream going to the left channel, and a completely different one going to the right channel.
What's the right API to do this from Swift? Core Audio? AVPlayer stuff?
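The direction I'd prototype first, unless someone suggests a better API, is AVAudioEngine with one AVAudioPlayerNode per sound, looped buffers, and per-node panning for simple left/right routing; a sketch (full multi-device, multichannel routing would need more than this):

import AVFoundation

final class LoopMixer {
    private let engine = AVAudioEngine()
    private var players: [AVAudioPlayerNode] = []

    // Schedule a looping file panned hard left (-1) or hard right (+1).
    func addLoop(url: URL, pan: Float) throws {
        let file = try AVAudioFile(forReading: url)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                            frameCapacity: AVAudioFrameCount(file.length)) else { return }
        try file.read(into: buffer)

        let player = AVAudioPlayerNode()
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: buffer.format)
        player.pan = pan

        player.scheduleBuffer(buffer, at: nil, options: .loops)
        players.append(player)
    }

    func start() throws {
        try engine.start()
        players.forEach { $0.play() }
    }
}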
I'm running into an issue where in some cases, when the AUHostingServiceXPC_arrow process is shut down by Logic, the process is terminated abruptly without calling AP_Close on all of the plugins hosted in the process. In our case, we have filesystem resources we need to clean up, and having stale files around from the last run can cause issues in new sessions, so this leak is having some pretty gnarly effects.
I can reproduce the issue using only Apple sample plugins, and it seems to be triggered by a timeout. If I have two different AU plugins in the session, and I add a 1 second sleep to the destructor of one of the sample plugins, Logic will force terminate the process and the remaining destructors are not called (even for the plugins without the 1 second sleep).
Is there a way to avoid this behavior? Or to safely clean up our plugin even if other plugins in the session take a second to tear down?
I have an app that gets data from Music.app with both the iTunesLibrary and MusicKit.
iTunesLibrary has ITLibArtist.sortName and ITLibAlbum.sortTitle and ITLibAlbum.sortAlbumArtist.
I can’t seem to find an equivalent in MusicKit. How are those properties obtained using MusicKit? Thanks.
FYI I have filed FB15554956 on this. You also may see my code at https://github.com/bolsinga/itunes_json
iOS 18 added support for the WebRTC HEVC RFC 7789 RTP Payload Format (112001659). How can I determine whether iOS 18's WebRTC uses hardware or software decoding for HEVC?
I have noticed a problem when a PHAsset creation request is made with the resource type PHAssetResourceType.photoProxy.
let creationRequest = PHAssetCreationRequest.forAsset()
creationRequest.addResource(with: .photoProxy, data: photoData, options: nil)
creationRequest.location = location
creationRequest.isFavorite = true
After successfully saving the resulting asset through PHPhotoLibrary.shared().performChanges, I could verify it in the Photos app.
I noticed that the created photo was initially marked as a Favorite and that the location was added to its info, as expected. The title of the image also changed from "Today" to "".
Then the photo was refreshed and the location data was purged. However, the title remained unchanged and still displays the .
This refresh was also observable in code: the PHPhotoLibraryChangeObserver protocol's func photoLibraryDidChange(_ changeInstance: PHChange) receives a change notification. The same asset has been changed, and there is no location information anymore. The isFavorite information persists correctly.
After debugging for a few hours, I discovered that changing the resource type to .photo fixes this issue. Location data is not removed in the Photos app, and no refresh callback is seen in func photoLibraryDidChange(_ changeInstance: PHChange).
I initially used .photoProxy because in the AVCapturePhotoCaptureDelegate implementation class, I always get the call in func photoOutput(_ output: AVCapturePhotoOutput, didFinishCapturingDeferredPhotoProxy deferredPhotoProxy: AVCaptureDeferredPhotoProxy?, error: Error?). So here is where I am capturing the photo data as photoData = deferredPhotoProxy?.fileDataRepresentation().
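For completeness, the save is wrapped roughly like this (a sketch with photoData and location coming from the capture delegate); switching the resource type on the marked line to .photo is the change that avoids the behavior described above:

import Photos
import CoreLocation

func save(photoData: Data, location: CLLocation?) {
    PHPhotoLibrary.shared().performChanges({
        let creationRequest = PHAssetCreationRequest.forAsset()
        // .photoProxy is what I use today; changing this one argument to
        // .photo is what stops the location from being purged.
        creationRequest.addResource(with: .photoProxy, data: photoData, options: nil)
        creationRequest.location = location
        creationRequest.isFavorite = true
    }) { success, error in
        print("Saved: \(success), error: \(String(describing: error))")
    }
}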
Good afternoon
Since I installed iOS 18 on my iPhone 15 Pro, I have had problems using Apple CarPlay with my Ford Puma with SYNC 3. More specifically, problems with voice commands, selecting audio tracks, Bluetooth, etc.
Are you aware of this issue?
Thanks a lot
Regards
Alberto
I think I have the simplest possible Mac app trying to see if I can have VideoPlayer work in an Xcode Preview. It works in an iOS app project. In a Mac app project it builds and runs. But if I preview in Xcode it crashes.
The diagnostic says:
| [Remote] Unknown Error: The operation couldn’t be completed. XPC error received on message reply handler
|
| BSServiceConnectionErrorDomain (3):
| NSLocalizedFailureReason: XPC error received on message reply handler
| BSErrorCodeDescription: OperationFailed
The code I'm using is the exact code from the VideoPlayer documentation page. See this link.
Any ideas about this XPC error, and how to work around it?
I'm using Xcode 16.0 on macOS 14.6.1
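For anyone who wants to try reproducing it, the view is essentially this minimal wrapper plus a preview (a sketch equivalent to the documentation sample; the URL is a placeholder):

import SwiftUI
import AVKit

struct PlayerView: View {
    // Placeholder URL; the documentation sample streams a remote video.
    private let player = AVPlayer(url: URL(string: "https://example.com/sample.mp4")!)

    var body: some View {
        VideoPlayer(player: player)
            .frame(width: 640, height: 360)
    }
}

#Preview {
    PlayerView()   // Crashes the canvas in my Mac app target, works in an iOS target.
}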
After investing more than a week into getting a bunch of audio unit projects converted into app + appex + framework, they all are now correctly loaded in-process in the demo host app that is part of Xcode's template.
However, Logic Pro adamantly refuses to load them in-process.
Does Logic Pro simply not do that ever, or is there some hint or configuration my plugins need to provide to enable that? If it is unsupported, will it be supported in some future version of Logic?
The entire point of investing that week was performance, which is moot if it is impossible to test the impact of loading in-process in a real-world usage scenario.
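For context, the way a host can request in-process loading is (as far as I understand it) the standard instantiation call with the .loadInProcess option; whether Logic ever does this for third-party plugins is exactly what I'm asking. A sketch with a hypothetical component description:

import AVFoundation
import AudioToolbox

// Hypothetical component description; the four-char codes are placeholders.
let description = AudioComponentDescription(componentType: 0x61756678,        // 'aufx' (effect)
                                            componentSubType: 0x64656d6f,     // 'demo'
                                            componentManufacturer: 0x64656d6f,
                                            componentFlags: 0,
                                            componentFlagsMask: 0)

AVAudioUnit.instantiate(with: description, options: .loadInProcess) { audioUnit, error in
    if let audioUnit {
        print("Loaded \(audioUnit.name) in-process")
    } else {
        print("Failed: \(String(describing: error))")
    }
}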
Hi Apple Engineer,
My app uses the ImageCapture framework to connect to DSLR cameras. Before iOS 18 this worked, but after I upgraded my iPhone and iPad I found that my app can no longer connect to the DSLR camera. If I open Settings -> Privacy & Security -> Files and Folders, my app isn't listed there, and I'm sure this worked before iOS 18.
I've found that other developers have the same problem:
https://forums.developer.apple.com/forums/thread/756960 .
https://developer.apple.com/forums/thread/765768.
I also found a way to reproduce this problem on iOS 18: perform "Reset All Settings".
Can you help me with this problem, or tell me how to use the API properly? I look forward to your reply. Thank you very much.
We have a new photo sharing app (https://photodare.ca).
We've had no issues with photos loading in North America and the Caribbean, but so far two users (Germany, Netherlands) say they can't load photos even though they've confirmed that photo permissions are enabled.
I can't reproduce this in Canada.
Does anyone know of other permissions we need to set up for European countries, or is anyone in a GDPR country willing to try this for us?
They were on 17.6.1.
Thanks either way
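One thing we're going to rule out is limited library access, since "permissions enabled" in Settings can still mean a limited selection rather than full access; a sketch of the check (not yet confirmed to be the cause):

import Photos

func checkPhotoAccess() {
    PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
        switch status {
        case .authorized:
            print("Full library access")
        case .limited:
            // The user granted access to only a subset of their library;
            // fetches outside that selection come back empty.
            print("Limited library access")
        case .denied, .restricted, .notDetermined:
            print("No access: \(status.rawValue)")
        @unknown default:
            print("Unknown status: \(status.rawValue)")
        }
    }
}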