I have noticed a problem when a PHAsset creation request is made with the resource type PHAssetResourceType.photoProxy.
let creationRequest = PHAssetCreationRequest.forAsset()
creationRequest.addResource(with: .photoProxy, data: photoData, options: nil)
creationRequest.location = location
creationRequest.isFavorite = true
After successfully saving the resulting asset through PHPhotoLibrary.shared().performChanges, I could verify it in the Photos app.
I noticed that the created photo was initially marked as a Favorite and that the location appeared in its info as expected. The title of the image also changed from "Today" to "".
Shortly afterward, the photo was refreshed and the location data was purged. The title, however, remained unchanged and still displayed the .
This refresh is also observable in code: the PHPhotoLibraryChangeObserver protocol's func photoLibraryDidChange(_ changeInstance: PHChange) receives a change notification for the same asset, and the location information is gone. The isFavorite information persists correctly.
After debugging for a few hours, I discovered that changing the resource type to .photo fixes the issue: location data is no longer removed in the Photos app, and no refresh callback arrives in func photoLibraryDidChange(_ changeInstance: PHChange).
I initially used .photoProxy because in my AVCapturePhotoCaptureDelegate implementation I always get the call in func photoOutput(_ output: AVCapturePhotoOutput, didFinishCapturingDeferredPhotoProxy deferredPhotoProxy: AVCaptureDeferredPhotoProxy?, error: Error?), so that is where I capture the photo data as photoData = deferredPhotoProxy?.fileDataRepresentation().
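For completeness, here is the save flow, reduced to a minimal sketch (photoData and location come from my capture pipeline; error handling omitted):

import Photos

PHPhotoLibrary.shared().performChanges({
    let creationRequest = PHAssetCreationRequest.forAsset()
    // Switching .photoProxy to .photo here avoids the location purge described above.
    creationRequest.addResource(with: .photoProxy, data: photoData, options: nil)
    creationRequest.location = location
    creationRequest.isFavorite = true
}) { success, error in
    print("Saved:", success, error ?? "none")
}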
I am using PHImageRequestOptions and PHImageManager to load images in my app.
I use two configurations: version = .original with resizeMode = .none, and version = .original with resizeMode = .exact.
Both used to work well, but since iOS 18 the combination of version = .original and resizeMode = .exact no longer works.
The images are loaded, but they are not shown (only the frames?).
Does anyone know why?
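A reduced example of how I configure the failing request (the target size is illustrative; my real code differs only in how the result is displayed):

import Photos
import UIKit

func loadImage(for asset: PHAsset, into imageView: UIImageView) {
    let options = PHImageRequestOptions()
    options.version = .original
    options.resizeMode = .exact // works fine with .none; broken for me since iOS 18
    options.isNetworkAccessAllowed = true
    PHImageManager.default().requestImage(for: asset,
                                          targetSize: CGSize(width: 300, height: 300),
                                          contentMode: .aspectFit,
                                          options: options) { image, _ in
        imageView.image = image
    }
}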
Thank you for reading.
We are currently in the process of migrating our application from using ALAssetsLibrary to PHPhotoLibrary to ensure compatibility with the latest versions of iOS. However, we have noticed a discrepancy in the file sizes of images obtained using PHPhotoLibrary compared to those obtained using ALAssetsLibrary.
Specifically, we would like to understand the following points:
1. Reason for File Size Differences:
What are the reasons for the difference in file sizes between images obtained using ALAssetsLibrary and those obtained using PHPhotoLibrary?
Could you provide detailed information on the settings and options in PHPhotoLibrary that affect the size and quality of the images?
2. Optimal Settings:
What are the optimal settings in PHPhotoLibrary to obtain images with the same quality and file size as those obtained using ALAssetsLibrary?
If possible, could you provide code examples or recommended option settings?
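For context, here is a minimal sketch of what we are currently using on the PHPhotoLibrary side. Our assumption is that version = .original with network access enabled is the closest equivalent of ALAssetRepresentation's full-resolution data, and this is exactly what we would like to have confirmed:

import Photos

func fetchOriginalImageData(for asset: PHAsset, completion: @escaping (Data?) -> Void) {
    let options = PHImageRequestOptions()
    options.version = .original              // unedited original, as ALAssetsLibrary returned
    options.deliveryMode = .highQualityFormat
    options.isNetworkAccessAllowed = true    // fetch from iCloud if the original is offloaded
    PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, _, _, _ in
        completion(data)                     // this is where we observe sizes that differ from ALAssetsLibrary
    }
}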
I add 3 controls to the AVCaptureSession and then remove them all. The number of controls in the session is indeed 0, but the Camera Control button still shows the previous 3 controls. Going from 3 to 2 or 3 to 1 updates correctly; 3 to 0 does not, while 0 to 3 works fine.
if (self.captureControl.zoom) {
    if (self.zoomScaleControl) {
        self.zoomScaleControl.enabled = false;
        [_session removeControl:self.zoomScaleControl];
    }
    AVCaptureSlider *zoomSlider = [self.captureControl.zoom fetchCaptureSlider];
    [zoomSlider setActionQueue:dispatch_get_main_queue() action:^(float zoomFactor) {
        @strongify(self);
        if ([self.dataOutputDelegate respondsToSelector:@selector(videoCaptureSession:tryChangeZoomScale:)]) {
            [self.dataOutputDelegate videoCaptureSession:self tryChangeZoomScale:zoomFactor];
        }
    }];
    self.zoomScaleControl = zoomSlider;
} else {
    if (self.zoomScaleControl) {
        self.zoomScaleControl.enabled = false;
        [_session removeControl:self.zoomScaleControl];
    }
    self.zoomScaleControl = nil;
}
if (self.captureControl.exposure) {
    if (self.exposureBiasControl) {
        self.exposureBiasControl.enabled = false;
        [_session removeControl:self.exposureBiasControl];
    }
    AVCaptureSlider *exposureSlider = [self.captureControl.exposure fetchCaptureSlider];
    [exposureSlider setActionQueue:dispatch_get_main_queue() action:^(float bias) {
        @strongify(self);
        if ([self.dataOutputDelegate respondsToSelector:@selector(videoCaptureSession:tryChangeExposureBias:)]) {
            [self.dataOutputDelegate videoCaptureSession:self tryChangeExposureBias:bias];
        }
    }];
    self.exposureBiasControl = exposureSlider;
} else {
    if (self.exposureBiasControl) {
        self.exposureBiasControl.enabled = false;
        [_session removeControl:self.exposureBiasControl];
    }
    self.exposureBiasControl = nil;
}
if (self.captureControl.len) {
    if (self.lenControl) {
        self.lenControl.enabled = false;
        [_session removeControl:self.lenControl];
    }
    ORLenCaptureControlCustomModel *len = self.captureControl.len;
    AVCaptureIndexPicker *picker = [len fetchCaptureSlider];
    [picker setActionQueue:dispatch_get_main_queue() action:^(NSInteger selectedIndex) {
        @strongify(self);
        if ([self.dataOutputDelegate respondsToSelector:@selector(videoCaptureSession:didChangeLenIndex:datas:)]) {
            [self.dataOutputDelegate videoCaptureSession:self didChangeLenIndex:selectedIndex datas:self.captureControl.len.indexDatas];
        }
    }];
    self.lenControl = picker;
} else {
    if (self.lenControl) {
        self.lenControl.enabled = false;
        [_session removeControl:self.lenControl];
    }
    self.lenControl = nil;
}
if ([_session canAddControl:self.zoomScaleControl]) {
    [_session addControl:self.zoomScaleControl];
} else {
    self.zoomScaleControl = nil;
}
if ([_session canAddControl:self.lenControl]) {
    [_session addControl:self.lenControl];
} else {
    self.lenControl = nil;
}
if ([_session canAddControl:self.exposureBiasControl]) {
    [_session addControl:self.exposureBiasControl];
} else {
    self.exposureBiasControl = nil;
}
if (_session.controlsDelegate == nil) {
    [_session setControlsDelegate:self queue:GetCaptureControlQueue()];
}
See Configuration Details at the end of this message.
Despite numerous attempts, I have been unable to determine the correct syntax to fetch photo albums from my iPad Pro 13-inch using Xcode and Swift.
All the photo albums were synced to the iPad Pro 13-inch using the latest version of Apple iTunes for Windows from an external Western Digital G-Drive hard drive (no iCloud). All synced albums appear under "From My Mac" on the iPad. I only want to access each album's photo and video count.
See the sample code snippet below. I have tried multiple subtype options and album types without success. Zero albums are always returned, despite there being around 3900 albums in the iPad's photo library. Authorization to the photo library does not appear to be the problem.
PHPhotoLibrary.requestAuthorization { status in
    if status == .authorized {
        let result = PHAssetCollection.fetchAssetCollections(with: .album, subtype: .any, options: nil)
        if result.count == 0 {
            print("No albums found.")
            return
        }
    }
}
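One variant I have also tried, on the (unconfirmed) assumption that albums synced from iTunes surface under the synced-album subtype rather than .any:

let synced = PHAssetCollection.fetchAssetCollections(with: .album, subtype: .albumSyncedAlbum, options: nil)
print("Synced albums:", synced.count) // still 0 for me
synced.enumerateObjects { collection, _, _ in
    // estimatedAssetCount can be NSNotFound; fetching the assets gives exact counts
    let assets = PHAsset.fetchAssets(in: collection, options: nil)
    print(collection.localizedTitle ?? "Untitled", assets.count)
}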
Any help or suggestions would greatly be appreciated.
Configuration Details
iPad Pro 13-inch (M4) (iPad16,6)
iPadOS = 17.7
iCloud = Turned Off
iPad Pro Photo Library Albums = 3900
iPad Pro Photo Library Photos = 118000
iPad Pro Photo Library Videos = 4800
macOS = Sonoma 14.6.1
Xcode Version = 16.0
Swift Version = 5.0 (Xcode Default)
Microsoft Windows 10 Pro Version = 22H2
Apple iTunes for Windows = 12.13.4.4
I have an app that edits photos in your library. When I call
try CIContext().writeHEIFRepresentation(of: editedImage, to: fileURL, format: .RGBA8, colorSpace: originalImage.colorSpace!)
The following is logged to the console:
writeImageAtIndex:1012: ⭕️ ERROR: 'App' is trying to save an opaque image (5712x4284) with 'AlphaLast'. This would unnecessarily increase the file size and will double (!!!) the required memory when decoding the image --> ignoring alpha.
What does that mean and how can I resolve it?
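In case it helps narrow things down, the only workaround I have thought of so far (an assumption on my part, not something I have confirmed silences the warning) is to flatten the alpha channel before writing:

// Untested assumption: force alpha to 1 so the encoder sees an explicitly opaque image
let opaqueImage = editedImage.settingAlphaOne(in: editedImage.extent)
try CIContext().writeHEIFRepresentation(of: opaqueImage, to: fileURL, format: .RGBA8, colorSpace: originalImage.colorSpace!)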
Xcode Version 16.0 (16A242d)
iOS 18.1 (22B82)
I've requested photo library authorization in my main app:
PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in }
and added the privacy usage description to both the main app and the extension.
But no matter whether the device is locked or unlocked, when I call
let fetchResult = PHAsset.fetchAssets(with: .image, options: nil)
let count = fetchResult.count
the count is always zero, even after a new photo is saved to the album in the same session.
I have an app that allows you to edit your photos. To preserve HDR, I edit both the SDR image and gain map image, like so:
let sdrImage = CIImage(data: data, options: [.applyOrientationProperty: true])
let gainMapImage = CIImage(data: data, options: [.applyOrientationProperty: true, .auxiliaryHDRGainMap: true])
// edit them...
try CIContext().writeHEIFRepresentation(of: sdrImage, to: url, format: .RGBA8, colorSpace: colorSpace, options: [.hdrGainMapImage: gainMapImage])
I also support editing the still photo in Live Photos. To do this you create a PHLivePhotoEditingContext, set the frameProcessor block which gives you a CIImage that I edit when the frame.type is .photo, then you create a PHContentEditingOutput and call saveLivePhoto. I’m not seeing any way to preserve HDR here. Interestingly the frame processor is called twice with .photo frame.type, but I don’t see any difference between these images. How can I edit a gain map image to preserve HDR in the still photo of a Live Photo?
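To make the flow concrete, here is a reduced sketch of my editing code (input is the PHContentEditingInput; applyEdit stands in for my filter chain):

let context = PHLivePhotoEditingContext(livePhotoEditingInput: input)!
context.frameProcessor = { frame, _ in
    guard frame.type == .photo else { return frame.image }
    // frame.image appears to be SDR only; I see no gain map to edit alongside it
    return applyEdit(frame.image)
}
let output = PHContentEditingOutput(contentEditingInput: input)
context.saveLivePhoto(to: output) { success, error in
    // commit `output` via PHAssetChangeRequest on success
}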
I extracted the gain map info from an image using
let url = Bundle.main.url(forResource: "IMG_1181", withExtension: "HEIC")
let source = CGImageSourceCreateWithURL(url! as CFURL, nil)
let portraitData = CGImageSourceCopyAuxiliaryDataInfoAtIndex(source!, 0, kCGImageAuxiliaryDataTypeHDRGainMap) as! [AnyHashable : Any]
let metaData = portraitData[kCGImageAuxiliaryDataInfoMetadata] as! CGImageMetadata
Then I printed all the metadata tags
func printMetadataProperties(from metadata: CGImageMetadata) {
    guard let tags = CGImageMetadataCopyTags(metadata) as? [CGImageMetadataTag] else {
        return
    }
    for tag in tags {
        if let prefix = CGImageMetadataTagCopyPrefix(tag) as String?,
           let namespace = CGImageMetadataTagCopyNamespace(tag) as String?,
           let key = CGImageMetadataTagCopyName(tag) as String?,
           let value = CGImageMetadataTagCopyValue(tag) {
            print("Namespace: \(namespace), Key: \(key), Prefix: \(prefix), value: \(value)")
        }
    }
}
//Namespace: http://ns.apple.com/ImageIO/1.0/, Key: hasXMP, Prefix: iio, value: True
//Namespace: http://ns.apple.com/HDRGainMap/1.0/, Key: HDRGainMapVersion, Prefix: HDRGainMap, value: 131072
//Namespace: http://ns.apple.com/HDRGainMap/1.0/, Key: HDRGainMapHeadroom, Prefix: HDRGainMap, value: 3.586325
I want to create a new CGImageMetadata with the same tags.
But when it comes to the HDR tags, adding them to the metadata always fails.
let tag = CGImageMetadataTagCreate(
    "http://ns.apple.com/HDRGainMap/1.0/" as CFString,
    "HDRGainMap" as CFString,
    "HDRGainMapHeadroom" as CFString,
    .default,
    3.56 as CFNumber
)
let path = "HDRGainMap:HDRGainMapHeadroom" as CFString
let success = CGImageMetadataSetTagWithPath(metadata, nil, path, tag!) // always false
Setting the hasXMP tag works fine.
Is the HDRGainMap namespace a private dictionary for Apple?
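One thing I have not been able to rule out (an assumption on my part, not something the documentation states for this namespace) is that a custom namespace must be registered on the mutable metadata before tags under it can be set:

let mutableMetadata = CGImageMetadataCreateMutable()
var registrationError: Unmanaged<CFError>?
let registered = CGImageMetadataRegisterNamespaceForPrefix(mutableMetadata,
                                                           "http://ns.apple.com/HDRGainMap/1.0/" as CFString,
                                                           "HDRGainMap" as CFString,
                                                           &registrationError)
// If this registration is required, would CGImageMetadataSetTagWithPath then succeed?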
To get calibration data during video recording, I use AVCaptureDepthDataOutput together with AVCaptureVideoDataOutput, synchronized via AVCaptureDataOutputSynchronizer (using the dataOutputSynchronizer method of CameraController in the example).
The issue is that AVCaptureDepthDataOutput can only be used with .builtInLiDARDepthCamera, meaning it isn’t available for devices without LiDAR.
Is it possible to obtain calibration data during video recording on devices without LiDAR, such as with .builtInWideAngleCamera?
Can lensDistortionLookupTable and lensDistortionCenter be used to undistort the sampleBuffer I receive from AVCaptureVideoDataOutput?
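For context, the closest alternative I have found so far (assuming per-frame intrinsics alone would be enough, which I am not sure of) is intrinsic-matrix delivery on the video connection:

if let connection = videoDataOutput.connection(with: .video),
   connection.isCameraIntrinsicMatrixDeliverySupported {
    connection.isCameraIntrinsicMatrixDeliveryEnabled = true
}
// In captureOutput(_:didOutput:from:), each sample buffer then carries a matrix_float3x3:
// CMGetAttachment(sampleBuffer, key: kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix, attachmentModeOut: nil)
// but this does not include lensDistortionLookupTable or lensDistortionCenter.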
My Camera app is repeatedly opening even though I am not taking any action to open it when I use iPhone. Today during a FaceTime call, the Camera app opened while the phone was unlocked without me touching anything. It didn’t end the FaceTime call, but just put the video on pause for the person I was speaking with. I force closed the Camera app, then it happened again a few minutes later.
This has happened while using Google Maps and other apps as well, while the phone is unlocked.
This is also happening while the phone is locked, just sitting on a table. All of a sudden I look over and the screen is active, showing the camera view.
Today this has happened at least 20 times. I need to know how to stop it.
I am on iOS 18.1 and enrolled in iOS 18 Public Beta. There are no pending software updates.
Hello,
I have a problem reading a 2D Data Matrix code with the camera. In my app I use AVFoundation to drive the camera and work with 2D codes, and in the vast majority of cases there is no problem reading them. Nothing special.
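For reference, here is my (reduced) capture setup, which I believe is unremarkable (self implements AVCaptureMetadataOutputObjectsDelegate):

let metadataOutput = AVCaptureMetadataOutput()
if session.canAddOutput(metadataOutput) {
    session.addOutput(metadataOutput)
    metadataOutput.setMetadataObjectsDelegate(self, queue: .main)
    metadataOutput.metadataObjectTypes = [.dataMatrix]
}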
I originally thought it might be a problem in my code, but I got the same result when I tried the Camera app built into iOS; only the Live Text API for text recognition worked.
I am attaching the code that the camera has a problem with, even though it looks perfectly fine at first glance. A classic handheld 2D code reader reads it just fine.
Can someone please explain to me why the camera, which normally reads these codes at the speed of light, sometimes has a problem with a particular code?
Thank you
[Personal Information Edited by Moderator]
I created a locked camera capture extension as explained in Apple's documentation.
I'm trying to explore the possibilities of using a bluetooth peripheral from that extension - anybody knows if this is possible?
The CBCentralManagerDelegate reports .unsupported in func centralManagerDidUpdateState, even though I have provided all the permissions in Info.plist.
I am experiencing a bug when using an AVCapturePhotoBracketSettings object to capture a bracketed photo sequence on iPhone 16 Pro.
Specifically, when I pass in an array of exposure values [-x, 0, +x] where x >= 3, the high-exposure capture returns a black image.
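The bracket settings are constructed roughly like this (simplified from the sample project; the HEVC codec choice is incidental):

let exposureValues: [Float] = [-3, 0, 3]
let bracketedSettings = exposureValues.map {
    AVCaptureAutoExposureBracketedStillImageSettings.autoExposureSettings(exposureTargetBias: $0)
}
let settings = AVCapturePhotoBracketSettings(rawPixelFormatType: 0, // no RAW
                                             processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc],
                                             bracketedSettings: bracketedSettings)
photoOutput.capturePhoto(with: settings, delegate: self)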
STEPS TO REPRODUCE
Run the sample app I have provided on an iPhone 16 Pro.
Notice that bracketed captures where the EV values are set to [-3, 0, +3], [-4, 0, +4], or [-5, 0, +5] return a black image for the high-exposure photo.
Notice that on other iOS devices (like iPhone 13 Pro), the high-exposure photo is returned with high brightness, as expected.
I have also added two folders in the sample project that show screenshots of the bug: iPhone13Pro & iPhone16Pro
Sample Project:
https://www.icloud.com/iclouddrive/090O_68Z0Nh2UOxmPRwu56Tmw#Focused16ProBracketedCaptureBug
iOS (Official) Photos app can display some EXIF-related metadata (e.g. camera and lens info, ISO, shutter speed, F-number) even when photos are offloaded to iCloud and the device is not connected to internet (e.g. airplane mode).
However, with the Photos framework, we need to download the photos to retrieve that metadata (which means it does not work in airplane mode).
I tried the following methods, but none of those worked when photos were offloaded to iCloud and the device was in airplane mode:
Requesting data with PHImageManager.default().requestImageDataAndOrientation
Result: It does not return Data if the photo is not stored locally on the device, even with options.deliveryMode = .fastFormat
Converting PHAsset#localIdentifier to an AssetsLibrary.framework URL (assets-library://asset/...)
(I am aware that AssetsLibrary.framework is deprecated, but this was just a test.)
Result: If PHImageManager does not return Data, ALAsset#defaultRepresentation().metadata() returns an empty NSDictionary
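For the first method, the request looks like this (reduced; asset is the offloaded PHAsset):

let options = PHImageRequestOptions()
options.deliveryMode = .fastFormat
options.isNetworkAccessAllowed = false // moot in airplane mode anyway
PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, _, _, _ in
    // data is nil for assets that are not stored locally
}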
While customizing an image picker, we found that metadata is not reflected as expected, so we are reporting it.
The situation is as follows.
The time or time zone of an image is changed in the Photos app.
For example, the time zone of an image with an actual capture date of 2024:11:08 08:27:44 is changed so that it reads 2024:11:07 17:27:44.
Image data is extracted from the PHAsset using PHImageManager.
The metadata is obtained from this image data.
The time zone information exposed in the Exif tags does not reflect the time or time zone changed in the Photos app.
let asset: PHAsset = ...
....
let options = PHImageRequestOptions()
options.isSynchronous = true
options.version = .current
options.deliveryMode = .highQualityFormat
options.resizeMode = .none
options.normalizedCropRect = .zero
options.isNetworkAccessAllowed = true
options.progressHandler = { progress, error, _, _ in }
PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { imageData, uti, orientation, info in
    let cgImageSource = CGImageSourceCreateWithData(imageData! as CFData, nil)
    let properties = CGImageSourceCopyPropertiesAtIndex(cgImageSource!, 0, nil) as? Dictionary<String, Any>
    let exif = properties!["{Exif}"]
    let dictionary = exif as? Dictionary<String, Any>
}
Metadata Check
In this case, the change is reflected in the creationDate of the PHAsset, so it can be partially compensated for by forcibly replacing the metadata, as sketched below.
However, because PHAsset does not include time zone information, when the time zone is changed as well, it is impossible to calculate the correct time for that zone.
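A rough sketch of that compensation (exifDictionary is hypothetical, standing in for the mutable Exif dictionary from the code above; the formatter's time zone is a guess, which is exactly the problem):

let formatter = DateFormatter()
formatter.dateFormat = "yyyy:MM:dd HH:mm:ss"
formatter.timeZone = .current // wrong whenever the user picked a different zone in Photos
if let creationDate = asset.creationDate {
    exifDictionary[kCGImagePropertyExifDateTimeOriginal as String] = formatter.string(from: creationDate)
}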
PHPicker
This issue is resolved when using the PHPickerResult provided by PHPicker.
extension PhotosPickerViewController: PHPickerViewControllerDelegate {
    public func picker(_ picker: PHPickerViewController,
                       didFinishPicking results: [PHPickerResult]) {
        .....
        for result in results {
            let identifier = UTType.image.identifier
            if result.itemProvider.hasItemConformingToTypeIdentifier(identifier) {
                result.itemProvider.loadDataRepresentation(forTypeIdentifier: identifier) { data, error in
                    guard let data = data,
                          let cgImageSource = CGImageSourceCreateWithData(data as CFData, nil),
                          let properties = CGImageSourceCopyPropertiesAtIndex(cgImageSource, 0, nil) as? Dictionary<String, Any>,
                          let exif = properties["{Exif}"],
                          let dictionary = exif as? Dictionary<String, Any>
                    else {
                        return
                    }
                }
            }
        }
    }
}
Metadata Check
Question
I wonder why this happens, and whether this is normal behavior.
Also, when using a custom picker instead of the system picker that Apple provides, is there any way to compensate for this situation?
I'm trying to apply a Core Image filter to a UIImage. For that I want to get the CIImage representation of the UIImage.
I'm trying to obtain the CIImage of the UIImage as shown below.
if let inputImage = self.orginalImageView.image {
    if let ciImage = CIImage(image: inputImage) {
        print(ciImage)
        print(self.orginalImageView.image?.ciImage)
    }
}
This method works. But one thing I noticed is that UIImage already has a ciImage property, and it is always nil.
According to documentation
ciImage
The underlying Core Image data.
var ciImage: CIImage? { get }
Discussion
If the UIImage object was initialized using a CGImage, the value of the property is nil.
Is the image returned by the image view's image property backed by a CGImage, so that its ciImage property is nil?
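My current understanding, which I would like to have confirmed, is that ciImage is only non-nil when the UIImage was created directly from a CIImage:

let cgBacked = UIImage(named: "photo")!            // backed by a decoded CGImage
print(cgBacked.ciImage as Any)                     // nil
let ciBacked = UIImage(ciImage: CIImage(image: cgBacked)!)
print(ciBacked.ciImage as Any)                     // non-nil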
I want to apply a SCNTechnique pipeline to the camera feed. To achieve this, I want to bring the camera input into the SceneKit world.
The perfect API for this seems to be:
let captureDevice = …
scnScene.background.contents = captureDevice
This is demonstrated in "SceneKit: What's New" (WWDC17) (at 44m19s) and is mentioned in the documentation of SCNMaterialProperty's contents.
Instead of showing the camera feed, it crashes with these messages:
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVCaptureVideoDataOutput setVideoSettings:] Unsupported pixel format type - use -availableVideoCVPixelFormatTypes'
*** First throw call stack:
(0x18993c7cc <REDACTED> 0x211e18488)
libc++abi: terminating due to uncaught exception of type NSException
Please advise.
STEPS TO REPRODUCE
Create a new Xcode project, starting from the SceneKit game template.
Add Info.plist entry for NSCameraUsageDescription.
Add a capture device property to GameViewController:
class GameViewController: UIViewController {
    let captureDevice = AVCaptureDevice.default(for: .video)
Set the background contents:
scene.background.contents = captureDevice
Run the app on device.
PLATFORM AND VERSION
iOS
Development environment: Xcode 16.1, macOS 15.0.1. Run-time configuration: iOS 18.1
Hey,
There's this darkish line on my iPhone and iPad when I open the Photos app. It scared the ding dong out of me the first time I saw it, but then I realized it was a software issue when it disappeared as I swiped up to close the app. It's really weird because it's extremely faint, but I can't seem to catch it in screenshots. I know for a fact this is a software issue because it doesn't show up in any other apps. It also changes from horizontal to vertical depending on how I turn my iPhone. Can everyone please just check your own iPhone or iPad to make sure I'm not the only one? I'm on the 18.2 developer beta, by the way.
Thanks!
I have an iPad app, written in Objective-C and distributed through the Enterprise program, as it is not for public use but specific to some large companies.
The app has a local database and works offline.
For some functions of the app I need to display images (not edit or cut them, just display them).
Right now MWPhotoBrowser is integrated as the viewer, which has not been maintained for almost 10 years, so in addition to compilation warnings I have to fight some historical bugs, especially with high-resolution images. https://github.com/mwaterfall/MWPhotoBrowser
Do you know of a modern and maintained OFFLINE photo viewer? I am evaluating both free and paid options (maybe an SDK). My needs are very basic.
I have found https://github.com/TimOliver/TOCropViewController, but I would need to disable its photo-editing features, and in particular I would lose the useful ability to display multiple images (MWPhotoBrowser showed a gallery for multiple images).