Dear Experts,
When I use PHImageManager's requestImageDataAndOrientationForAsset method I always seem to get JPEG data, even when the original items in the photo library are PNGs (such as screenshots) or HEICs.
Have I missed a setting somewhere that determines whether or not a "most compatible" format is used in this API?
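For reference, a minimal sketch of the request I'd expect to return the original bytes (`asset` stands in for whichever PHAsset is being requested):

import Photos

// Sketch: request the .original version, which should deliver the
// unmodified PNG/HEIC bytes rather than a rendered "most compatible" JPEG.
let options = PHImageRequestOptions()
options.version = .original
options.isNetworkAccessAllowed = true
PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, uti, orientation, info in
    // `uti` reports the actual uniform type identifier of the returned data.
    print("UTI:", uti ?? "nil")
}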
Thanks.
Photos & Camera
Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.
I am developing a camera extension as described in Creating a camera extension with Core Media I/O.
To ensure this wasn't some issue with my code, I reverted to the sample code added when you choose File > New > Target > Camera Extension in Xcode, in other words, I am using the example code provided by Apple.
I am able to install the camera extension and see it in QuickTime Player, where I choose it as the video input. In QuickTime Player I see the white line generated by the sample code moving up and down. For some period of time, it works.
But eventually, the video in QuickTime Player freezes. The really weird thing is that, if I add some NSLog() statements at the point in the code where it sends the newly created sample:
[self->_streamSource.stream sendSampleBuffer:sbuf discontinuity:CMIOExtensionStreamDiscontinuityFlagNone hostTimeInNanoseconds:(uint64_t)(CMTimeGetSeconds(timingInfo.presentationTimeStamp) * NSEC_PER_SEC)];
the samples are still being generated and sent to the stream. But apparently QuickTime Player has decided to stop consuming them.
I thought maybe setting the discontinuity parameter to CMIOExtensionStreamDiscontinuityFlagTime or CMIOExtensionStreamDiscontinuityFlagSampleDropped whenever the delta since the last generated sample was off by a tiny bit would help, but this did not improve the situation.
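Roughly what I tried, as a Swift sketch (the helper and the stored lastPTS/nominalFrameDuration are my own names, not from the template):

import CoreMediaIO
import CoreMedia

// Sketch: flag a time discontinuity when the gap since the previous frame
// drifts well past the nominal frame duration, then send as usual.
func send(_ sbuf: CMSampleBuffer, pts: CMTime, to stream: CMIOExtensionStream,
          lastPTS: inout CMTime, nominalFrameDuration: CMTime) {
    let delta = CMTimeSubtract(pts, lastPTS)
    let threshold = CMTimeMultiplyByFloat64(nominalFrameDuration, multiplier: 1.5)
    let flags: CMIOExtensionStream.DiscontinuityFlags =
        CMTimeCompare(delta, threshold) > 0 ? .time : []
    stream.send(sbuf,
                discontinuity: flags,
                hostTimeInNanoseconds: UInt64(CMTimeGetSeconds(pts) * Double(NSEC_PER_SEC)))
    lastPTS = pts
}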
Finally, could this have something to do with frequently installing and uninstalling my camera extension as part of the debugging and testing process?
Thanks in advance for any advice you might have!
Hello!
We want to try the new features of DockKit on an iOS 18 device. However, I am unable to pair the DockKit-compatible device with my iOS 18 device. Is there a way to successfully pair them?
Fellow developers, I'd like to ask whether this will gradually be restored going forward.
I am using PHPickerViewController to get assets from the photo library. It works fine, but it presents the images so that the newest is first. I would like to choose the order. In some cases I prefer the oldest image to be first.
I have checked in PHPickerConfiguration with no success. I am just wondering if it is possible.
Hi,
After installing iPadOS 18 Beta 3 on my iPad (7th gen), the default Camera app no longer detects QR codes.
I tried updating to Beta 7, but the issue remained.
Third-party apps that use AVCaptureMetadataOutput in the AVFoundation framework to detect QR codes also no longer work.
You can reproduce the issue by running the default Camera app, or the AVFoundation sample code from the Apple Developer site, on an iPad 7th gen with the iPadOS 18 beta installed.
https://developer.apple.com/documentation/avfoundation/capture_setup/avcambarcode_detecting_barcodes_and_faces
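For reference, this is the basic detection setup that stopped working (a minimal sketch, assuming an already-configured AVCaptureSession with a video input):

import AVFoundation

// Sketch: attach a metadata output and ask for QR codes only.
func configureQRDetection(on session: AVCaptureSession,
                          delegate: AVCaptureMetadataOutputObjectsDelegate) {
    let output = AVCaptureMetadataOutput()
    guard session.canAddOutput(output) else { return }
    session.addOutput(output)
    output.setMetadataObjectsDelegate(delegate, queue: .main)
    // metadataObjectTypes must be set after the output joins the session.
    output.metadataObjectTypes = [.qr]
}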
Has anyone else experienced this issue?
I would like to know if this issue occurs on other iPad models as well.
This is similar to the following issue that previously occurred with iPadOS 17.4.
https://support.apple.com/en-lamr/118614
https://developer.apple.com/forums/thread/748092
Hi all, I use an iPhone 15 Pro Max with iOS 18 beta 5. When I use the X app I can't see any images! All images show as not available.
I have an Electron app built for macOS that was distributed via Developer ID for years; it worked well, and I was able to access the photos in the system Photos library. I already have the 'NSPhotoLibraryUsageDescription' key in Info.plist.
Recently we have been trying to publish this app to the Mac App Store, so I had to turn on the sandbox. After that, the app starts giving XPC errors while accessing the Photos library. The errors look like:
PHAuthorizationStatus: Authorized
CoreData: XPC: sendMessage: failed #0
CoreData: XPC: Unable to sendMessage: to server
...
CoreData: XPC: sendMessage: failed #7
CoreData: XPC: Unable to connect to server with options {
NSPersistentHistoryTrackingKey = 1;
NSXPCStoreServerEndpointFactory = "<PLXPCPhotoLibraryStoreEndpointFactory: 0x7fc67e8af370>";
skipModelCheck = 1;
}
CoreData: XPC: Unable to load metadata: Error Domain=NSCocoaErrorDomain Code=134060 "A Core Data error occurred." UserInfo={Problem=Unable to send to server; failed after 8 attempts.}
CoreData: fault: Unable to create token NSXPCConnection. NSXPCStoreServerEndpointFactory 0x7fc67e8af370 -newEndpoint returned nil
CoreData: error: Failed to create NSXPCConnection
It seems the app could detect the current PHAuthorizationStatus which is Authorized, but it can't fetch the photos from the Photos library (using PhotoKit).
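For illustration, this is the kind of PhotoKit call that fails (a Swift sketch of the equivalent of what our .node module does):

import Photos

// Sketch: authorization reads as .authorized, yet the fetch never
// reaches the photo library daemon in the sandboxed build.
let status = PHPhotoLibrary.authorizationStatus(for: .readWrite)
print("PHAuthorizationStatus:", status.rawValue)
let assets = PHAsset.fetchAssets(with: .image, options: nil)
print("Fetched \(assets.count) assets")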
I learned from here that I could look for errors from the sandboxd daemon, so I did that. Here is what I saw:
Sandbox: Picture Keeper(32625) deny(1) mach-lookup com.apple.photos.service
Violation: deny(1) mach-lookup com.apple.photos.service
Process: Picture Keeper [32625]
Path: /Applications/Picture Keeper.app/Contents/MacOS/Picture Keeper
Load Address: 0x103bd3000
Identifier: com.simplifieditproducts.picturekeepermas
Version: 4575 (4.5.75)
Code Type: x86_64 (Native)
Parent Process: Picture Keeper [1]
Responsible: /Applications/Picture Keeper.app/Contents/MacOS/Picture Keeper
User ID: 501
Date/Time: 2024-08-26 16:16:14.645 EDT
OS Version: macOS 14.5 (23F79)
Release Type: User
Report Version: 8
MetaData: {"process_path":["Users","Kevin","Projects","Electron","picturekeeper-electron","dist","picturekeeper","mas-dev","Picture Keeper.app","Contents","MacOS","Picture Keeper"],"apple-internal":false,"primary-filter":"global-name","policy-description":"Sandbox","flags":5,"platform-policy":false,"build":"macOS 14.5 (23F79)","process-path":"\/Applications\/Picture Keeper.app\/Contents\/MacOS\/Picture Keeper","responsible-process-path":"\/Applications\/Picture Keeper.app\/Contents\/MacOS\/Picture Keeper","primary-filter-value":"com.apple.photos.service","platform_binary":"no","responsible-process-signing-id":"com.simplifieditproducts.picturekeepermas","hardware":"Mac","target":"com.apple.photos.service","action":"deny","mach_namespace":1,"checker-pid":1,"container":"\/Users\/Kevin\/Library\/Containers\/com.simplifieditproducts.picturekeepermas\/Data","binary-in-trust-cache":false,"team-id":"LU744924UY","process":"Picture Keeper","global-name":"com.apple.photos.service","platform-binary":false,"pid":32625,"summary":"deny(1) mach-lookup com.apple.photos.service","checker":"launchd","responsible-process-team-id":"xxxxx","operation":"mach-lookup","normalized_target":["com.apple.photos.service"],"errno":1,"uid":501,"profile-flags":0,"profile-in-collection":false,"sandbox_checker":"launchd","signing-id":"com.simplifieditproducts.picturekeepermas","release-type":"User"}
I believe I already have the necessary entitlements for the Photos library, see:
codesign -d --entitlements - /Applications/Picture\ Keeper.app/Contents/MacOS/Picture\ Keeper
[Dict]
[Key] com.apple.application-identifier
[Value]
[String] xxxx.com.simplifieditproducts.picturekeepermas
[Key] com.apple.developer.team-identifier
[Value]
[String] xxxx
[Key] com.apple.security.app-sandbox
[Value]
[Bool] true
[Key] com.apple.security.application-groups
[Value]
[Array]
[String] xxxx.com.simplifieditproducts.picturekeepermas
[Key] com.apple.security.assets.movies.read-only
[Value]
[Bool] true
[Key] com.apple.security.assets.music.read-only
[Value]
[Bool] true
[Key] com.apple.security.assets.pictures.read-write
[Value]
[Bool] true
[Key] com.apple.security.cs.allow-dyld-environment-variables
[Value]
[Bool] true
[Key] com.apple.security.cs.allow-jit
[Value]
[Bool] true
[Key] com.apple.security.cs.allow-unsigned-executable-memory
[Value]
[Bool] true
[Key] com.apple.security.cs.disable-executable-page-protection
[Value]
[Bool] true
[Key] com.apple.security.cs.disable-library-validation
[Value]
[Bool] true
[Key] com.apple.security.device.usb
[Value]
[Bool] true
[Key] com.apple.security.files.bookmarks.app-scope
[Value]
[Bool] true
[Key] com.apple.security.files.bookmarks.document-scope
[Value]
[Bool] true
[Key] com.apple.security.files.downloads.read-only
[Value]
[Bool] true
[Key] com.apple.security.files.user-selected.read-write
[Value]
[Bool] true
[Key] com.apple.security.network.client
[Value]
[Bool] true
[Key] com.apple.security.network.server
[Value]
[Bool] true
[Key] com.apple.security.personal-information.location
[Value]
[Bool] true
[Key] com.apple.security.personal-information.photos-library
[Value]
[Bool] true
By the way, the Photos library related code is built into a .node file (which is a dylib) that is loaded by the main executable at runtime.
Anything I missed? Thank you!
I am trying to retrieve all PHAssets from the Photos Library that have been edited. PHAsset contains the "hasAdjustments" boolean variable but I cannot use that in an NSPredicate to filter for edited photos.
What is the proper mechanism to filter out non-edited photos?
Here is the code I have written that crashes:
let formatString = "mediaType = %d && hasAdjustments == YES"
let allPhotosOptions = PHFetchOptions()
allPhotosOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
allPhotosOptions.predicate = NSPredicate(format: formatString, PHAssetMediaType.image.rawValue)
allPhotosOptions.includeAssetSourceTypes = [.typeUserLibrary, .typeiTunesSynced, .typeCloudShared]
return PHAsset.fetchAssets(with: allPhotosOptions)
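One workaround I'm considering (a sketch; it assumes filtering in memory is acceptable, since hasAdjustments is a PHAsset property but not a fetchable predicate key):

import Photos

// Sketch: fetch by media type only, then filter on hasAdjustments in memory.
let options = PHFetchOptions()
options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
options.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
let fetchResult = PHAsset.fetchAssets(with: options)
var edited: [PHAsset] = []
fetchResult.enumerateObjects { asset, _, _ in
    if asset.hasAdjustments { edited.append(asset) }
}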
Thanks,
Rick
I’ve built an iOS camera app that applies many CIFilters to an image captured by the camera. Some of my users have reported that, on occasion, the images have large parts that are blank, see below:
Frustratingly, I can’t reproduce this myself! Does anyone know what could be causing it? Is it a memory issue? I haven’t posted the code, as there’s a lot to look over and I’m not sure it would help diagnose it.
Thanks for any pointers.
How can I use a share extension in my app to get a Live Photo the user shares from the Photos app, including both the video and the picture, instead of just the URL of the picture? I can see the "com.apple.live-photo" and "public.jpeg" type identifiers, but I can't get any data through "com.apple.live-photo".
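What I have been attempting, roughly (a sketch; `itemProvider` comes from the extension's input items, and PHLivePhoto conforms to NSItemProviderReading):

import PhotosUI

// Sketch: load the Live Photo as an object, then reach its paired
// photo and video resources through PHAssetResource.
if itemProvider.canLoadObject(ofClass: PHLivePhoto.self) {
    itemProvider.loadObject(ofClass: PHLivePhoto.self) { object, error in
        guard let livePhoto = object as? PHLivePhoto else { return }
        let resources = PHAssetResource.assetResources(for: livePhoto)
        print(resources.map { $0.type.rawValue })
    }
}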
We build a video conference app and want to turn off the gesture-reaction feature by default by specifying NSCameraReactionEffectGesturesEnabledDefault. The menu bar FaceTime panel shows the feature as turned off, but it still works!
Our app is based on Electron, the key was added to every plist in the application, and the camera is used by a spawned process (not the Electron process).
Toggling the gesture reactions off in the menu bar does truly turn them off. So does the manual value affect the whole application, while the default value only affects the application's own process?
I have a third-party app for controlling Sony mirrorless cameras over WiFi. I’m really excited to integrate the new camera controls on the iPhone 16 Pro with the app. I’ve found the documentation around this, and it seems I need an AVCaptureSession set up in order to utilise them.
func configureControls(_ controls: [AVCaptureControl]) {
    // Verify the host system supports controls; otherwise, return early.
    guard captureSession.supportsControls else { return }

    // Begin configuring the capture session.
    captureSession.beginConfiguration()

    // Remove previously configured controls, if any.
    for control in captureSession.controls {
        captureSession.removeControl(control)
    }

    // Iterate over the passed in controls.
    for control in controls {
        // Add the control to the capture session if possible.
        if captureSession.canAddControl(control) {
            captureSession.addControl(control)
        } else {
            print("Unable to add control \(control).")
        }
    }

    // Commit the capture session configuration.
    captureSession.commitConfiguration()
}
Can I just use a freshly initialised capture session for this, or does it need to be configured in any other way? Are there any downsides to creating a session (CPU usage etc.) that I may experience from this?
Also, the scope of the controls is quite narrow. Something like shutter speed or aperture has quite a number of possible values, requires custom labels, and uses a non-linear scale, so AVCaptureIndexPicker seems to be the way to go. Will that picker support enough values to represent something like shutter speed or aperture? Is there any chance we may get non-linear float-based controls in the future, which may feel more natural from a UX perspective than index-based ones?
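For context, this is how I imagine wiring shutter speed to the index picker (a sketch; the titles, sessionQueue, and sendShutterSpeedToCamera are my own placeholders):

import AVFoundation

// Sketch: an AVCaptureIndexPicker maps a small set of labeled indices
// to discrete camera values.
let speeds = ["1/1000", "1/500", "1/250", "1/125", "1/60"]
let picker = AVCaptureIndexPicker("Shutter", symbolName: "camera.aperture",
                                  localizedIndexTitles: speeds)
picker.setActionQueue(sessionQueue) { index in
    // Forward the selected value to the Sony camera over WiFi (app-specific).
    sendShutterSpeedToCamera(speeds[index])
}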
Apologies, lots of edits going on here as I think about this more.
Is there any way, or would one be considered, of putting these controls in a disabled state, as with other UI elements in iOS? There are times (during capture, for example) when many of these settings are unavailable to change (as communicated by the Sony camera), and managing a queue of changes while a function cannot be set is going to be a challenge. If there won’t be, how will the controls behave if they are removed while being interacted with? Presumably they will disappear entirely from the UI?
Thanks!
I found these two documents for the Camera Control.
Human Interface Guidelines:
https://developer.apple.com/design/human-interface-guidelines/camera-control
Developer documentation:
https://developer.apple.com/documentation/avfoundation/capture_setup/enhancing_your_app_experience_with_the_camera_control
Any chance Apple has an updated or new code sample for the AVFoundation that would be integrating Camera Control? Can't find it yet.
We are seeing that most of the images loaded from the PHPickerViewController with:
result.itemProvider.loadFileRepresentation(forTypeIdentifier: imageType.identifier)
take a fraction of a second to load. However, for the last few weeks we have been seeing more and more images that take minutes, 10 minutes in some cases.
These images shouldn't be in iCloud: they were recently taken, the devices have enough free space, and we are testing under good network conditions.
Are others out there experiencing the same? Any tips to prevent these long load times?
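In case it helps, this is roughly how we measure the load (a sketch; imageType in our code is a UTType such as UTType.image):

import UniformTypeIdentifiers

// Sketch: time the load and keep the returned Progress, so a stalled
// iCloud transfer can be distinguished from a slow local copy.
let start = Date()
let progress = result.itemProvider.loadFileRepresentation(
    forTypeIdentifier: UTType.image.identifier) { url, error in
    print("Loaded in \(Date().timeIntervalSince(start))s, error: \(String(describing: error))")
}
print("fractionCompleted:", progress.fractionCompleted)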
After upgrading to Xcode 16 RC, in an old project based on Objective-C, I directly used the following controller code from the AppDelegate:
- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    UIButton *b = [[UIButton alloc] initWithFrame:CGRectMake(100, 100, 44, 44)];
    [b setTitle:@"title" forState:UIControlStateNormal];
    [self.view addSubview:b];
    [b addTarget:self action:@selector(onB:) forControlEvents:UIControlEventTouchUpInside];
}

- (IBAction)onB:(id)sender {
    PHPickerConfiguration *config = [[PHPickerConfiguration alloc] initWithPhotoLibrary:PHPhotoLibrary.sharedPhotoLibrary];
    config.preferredAssetRepresentationMode = PHPickerConfigurationAssetRepresentationModeCurrent;
    config.selectionLimit = 1;
    config.filter = nil;
    PHPickerViewController *picker = [[PHPickerViewController alloc] initWithConfiguration:config];
    picker.modalPresentationStyle = UIModalPresentationFullScreen;
    picker.delegate = self;
    [self presentViewController:picker animated:true completion:nil];
}

- (void)picker:(PHPickerViewController *)picker didFinishPicking:(NSArray<PHPickerResult *> *)results {
}
Environment: Simulator iPhone 15 Pro (iOS18)
On earlier versions (iOS 17.4), tapping the button presented the system photo picker normally (its top edge stayed within the safe area guide), but now the top edge of the picker aligns with the very top of the window, and tapping a photo cell is unresponsive.
If I create a new target using the same code, the photo picker page does not have this problem.
Therefore, I suspect the old project's .xcodeproj, Info.plist, build settings, or build phases may lack some default configuration key required by the new version (my project was created years ago, perhaps on iOS 13 or earlier), but I cannot confirm the root cause.
iOS 18.0 also shows these additional messages:
objc[79039]: Class UIAccessibilityLoaderWebShared is implemented in both /Library/Developer/CoreSimulator/Volumes/iOS_22A3351/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS 18.0.simruntime/Contents/Resources/RuntimeRoot/System/Library/AccessibilityBundles/WebCore.axbundle/WebCore (0x198028328) and /Library/Developer/CoreSimulator/Volumes/iOS_22A3351/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS 18.0.simruntime/Contents/Resources/RuntimeRoot/System/Library/AccessibilityBundles/WebKit.axbundle/WebKit (0x1980fc398). One of the two will be used. Which one is undefined.
AX Safe category class 'SLHighlightDisambiguationPillViewAccessibility' was not found!
Has anyone encountered the same issue as me?
func deviceDidBecomeReady(withCompleteContentCatalog device: ICCameraDevice) {
    print("deviceDidBecomeReady", device.mediaFiles?.count)
}
On a Mac Studio (M1) macOS app, and likewise on an iPhone 15 Pro iOS app, device.mediaFiles is empty.
I have added:
Photos Library entitlement
com.apple.security.device.usb
NSCameraUsageDescription
per: https://developer.apple.com/documentation/imagecapturecore
The app is signed for development.
The camera is a Sony ILCE-7RM4A.
Other third-party apps can connect to the camera and download its files correctly.
Hello,
I am using the code below to request a video to be downloaded from iCloud, but the downloaded video's size does not match the original size of the video.
PHVideoRequestOptions *options = [[PHVideoRequestOptions alloc] init];
options.version = PHVideoRequestOptionsVersionOriginal;
options.deliveryMode = PHVideoRequestOptionsDeliveryModeHighQualityFormat;
[options setNetworkAccessAllowed:YES];

[[PHImageManager defaultManager] requestAVAssetForVideo:asset options:options resultHandler:
    ^(AVAsset *avAsset, AVAudioMix *audioMix, NSDictionary *info) {
}];
I get the original size of the video from the code below:
NSArray *resources = [PHAssetResource assetResourcesForAsset:asset];
for (PHAssetResource *resource in resources) {
    if ((resource.type == PHAssetResourceTypeVideo) || (resource.type == PHAssetResourceTypePhoto)) {
        return resource;
    }
}

[resource valueForKey:@"fileSize"]
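For what it's worth, this is how I compare the delivered size (a Swift sketch; avAsset is the asset from the result handler above):

// Sketch: inside the result handler, compare the delivered asset's
// on-disk size with the PHAssetResource fileSize value.
if let urlAsset = avAsset as? AVURLAsset,
   let attrs = try? FileManager.default.attributesOfItem(atPath: urlAsset.url.path),
   let size = attrs[.size] as? Int64 {
    print("Delivered file size:", size)
}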
The original size and the downloaded size of the video do not match.
Can anyone help me debug what the issue is here?
We are trying to build a simple image capture app using AVFoundation and AVCaptureDevice.
Custom settings are used for exposure point and bias.
But when an image is captured using the front camera, the image captured from the app and the one from the native Camera app do not match.
The image captured from the app covers a wider area than the native one.
There is also a difference in tilt angle between the two images.
So, is there any way to capture an image exactly like the native camera using AVFoundation and AVCaptureDevice?
(Screenshots attached: Native vs. Custom.)
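For reference, the kind of configuration we apply (a minimal sketch; device is the front AVCaptureDevice already attached to the session):

import AVFoundation
import CoreGraphics

// Sketch: set the exposure point of interest and target bias.
func applyCustomExposure(to device: AVCaptureDevice, point: CGPoint, bias: Float) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    if device.isExposurePointOfInterestSupported {
        device.exposurePointOfInterest = point   // normalized (0,0)-(1,1)
        device.exposureMode = .continuousAutoExposure
    }
    // Clamp the bias to the device's supported range before applying it.
    let clamped = max(device.minExposureTargetBias,
                      min(bias, device.maxExposureTargetBias))
    device.setExposureTargetBias(clamped, completionHandler: nil)
}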
When I set a custom exposure duration, like 1/8, and then switch back to continuous auto exposure, the exposure duration in areas that previously metered at 1/17 changes to something like 1/5 or 1/10. As a result, the preview becomes laggy and overexposed. I'm not sure why this is happening.
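For reference, the sequence I use (a sketch; device is the active AVCaptureDevice):

import AVFoundation

// Sketch: set a 1/8s custom duration, then later hand control back to auto.
try device.lockForConfiguration()
device.setExposureModeCustom(duration: CMTime(value: 1, timescale: 8),
                             iso: AVCaptureDevice.currentISO,
                             completionHandler: nil)
device.unlockForConfiguration()

// ...later, to return to auto exposure:
try device.lockForConfiguration()
device.exposureMode = .continuousAutoExposure
device.unlockForConfiguration()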