Hello,
As of today, I no longer have the Camera app. It doesn't appear in searches, it's not in the App Library, and even the icon on the Lock Screen has disappeared.
Photos & Camera
Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.
Open a photo in the phone's photo album and swipe up to see more information. Some pictures have "Saved from ***". How can I get this information, or how can I get this content through PHAsset?
Like this: (https://i.sstatic.net/c8oVS.png)
Hi,
I'm a photographer, and recently, after upgrading to the iOS 18 beta, I noticed that when I AirDrop my photos (which I took with medium-format cameras at extremely high quality) from my MacBook Pro to my iPhone, they lose a lot of quality. The sharpness is completely gone, and much of the detail is lost. Does anyone know how to solve this?
Hello pals,
I investigated a strange bug with a video URL and found out that on iOS 18, the method PHCachingImageManager().requestAVAsset(forVideo:) returns a very weird asset.url with a strange suffix:
"someFileName.MOV#YnBsaXN0MDDRAQJfEBtSZWNvbW1lbmRlZEZvckltbWVyc2l2ZU1vZGUQAAgLKQAAAAAAAAEBAAAAAAAAAAMAAAAAAAAAAAAAAAAAAAAr"
Example:
PHCachingImageManager().requestAVAsset(forVideo: asset, options: options) { asset, _, _ in
    if let asset = asset as? AVURLAsset {
        print(asset.url)
        // prints: file:///.../data/Media/DCIM/100APPLE/IMG_0011.MOV#YnBsaXN0MDDRAQJfEBtSZWNvbW1lbmRlZEZvckltbWVyc2l2ZU1vZGUQAAgLKQAAAAAAAAEBAAAAAAAAAAMAAAAAAAAAAAAAAAAAAAAr
    }
}
On iOS versions below 18, it returns a regular URL ("...someFile.MOV").
How can I work around this bug for iOS 18 users? Please suggest something, or maybe I'm using this method incorrectly?
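Until this is resolved, one possible workaround (an assumption to verify, not an official fix) is to strip the URL's fragment, which appears to be a base64-encoded plist appended after "#", so downstream code sees a plain file URL as on earlier iOS versions. A minimal sketch:

```swift
import Foundation

// Sketch of a workaround, assuming the iOS 18 URL only differs by the
// appended "#..." fragment: remove the fragment component.
func strippedFileURL(_ url: URL) -> URL {
    guard var components = URLComponents(url: url, resolvingAgainstBaseURL: false) else {
        return url
    }
    components.fragment = nil
    return components.url ?? url
}

let weird = URL(string: "file:///data/Media/DCIM/100APPLE/IMG_0011.MOV#YnBsaXN0MDDR")!
print(strippedFileURL(weird)) // file:///data/Media/DCIM/100APPLE/IMG_0011.MOV
```

Note that the underlying file may still only be readable through the access PhotoKit grants, so it is safer to keep using the AVURLAsset itself for playback and only strip the fragment where a plain path is needed.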
Hi, Experts,
I am using PHPickerViewController to open photos on iPhone. It works well for most photos, such as JPEG and HEIF, but it fails for a photo with a RAW image on an iPhone 14 Plus. I found that the type identifier for it is com.adobe.raw-image.
I use result.itemProvider.loadFileRepresentation(forTypeIdentifier: "com.adobe.raw-image") to load the RAW photo, and it always fails with "Error loading file representation: Cannot load representation of type com.adobe.raw-image".
I have tried some other parameters, such as forTypeIdentifier: public.image and public.camera-raw-image; neither of them worked.
How can I load this type of RAW photo?
Below are my code details:
// MARK: - PHPickerViewControllerDelegate
func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    picker.dismiss(animated: true, completion: nil)
    var resultIndex = 0
    DDLogInfo("Pick \(results.count) photos")
    for result in results {
        resultIndex += 1
        DDLogInfo("Process \(resultIndex) photo")
        DDLogInfo("Registered type identifiers for itemProvider:")
        for typeIdentifier in result.itemProvider.registeredTypeIdentifiers {
            DDLogInfo("TypeIdentifier \(typeIdentifier)")
        }
        if result.itemProvider.hasItemConformingToTypeIdentifier(UTType.image.identifier) {
            DDLogInfo("Result \(resultIndex) is image")
        }
        if result.itemProvider.canLoadObject(ofClass: UIImage.self) {
            DDLogInfo("Can load \(resultIndex) image")
            // more code for photo
        } else {
            DDLogInfo("Load special image, such as raw")
            result.itemProvider.loadFileRepresentation(forTypeIdentifier: "com.adobe.raw-image") { url, error in
                if let error = error {
                    DDLogInfo("Error loading file representation: \(error.localizedDescription)")
                    return
                }
                // use `url` here; it is only valid inside this closure
            }
        }
    }
}
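A commonly suggested fallback when loadFileRepresentation refuses one specific identifier is to iterate the provider's registeredTypeIdentifiers and try loadDataRepresentation for each until one succeeds; RAW assets often also register a displayable type (e.g. an embedded JPEG preview). A sketch of the identifier-selection part, with the on-device provider calls only indicated in comments (the helper and its ordering are my assumption, not a documented recipe):

```swift
import Foundation

// Hypothetical helper: try the registered identifiers in order, but move
// common displayable fallbacks (HEIC/JPEG/PNG) to the end so the most
// specific (e.g. RAW) representation is attempted first.
func candidateIdentifiers(from registered: [String]) -> [String] {
    let preferredFallbacks = ["public.heic", "public.jpeg", "public.png"]
    let fallbacks = preferredFallbacks.filter(registered.contains)
    return registered.filter { !fallbacks.contains($0) } + fallbacks
}

// On device you would then try each candidate, e.g.:
//
// for identifier in candidateIdentifiers(from: result.itemProvider.registeredTypeIdentifiers) {
//     result.itemProvider.loadDataRepresentation(forTypeIdentifier: identifier) { data, error in
//         // first success wins; RAW data can then be decoded (e.g. via Core Image)
//     }
// }
```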
Hi,
Currently my app uses the ImageCaptureCore framework to work with DSLR cameras. But when I tested it on iOS 18, it turned out my camera cannot connect to the iPhone over a wired connection.
It seems some other developers have run into the same problem:
https://forums.developer.apple.com/forums/thread/756960
https://stackoverflow.com/questions/78618886/icdevicebrowser-fails-to-find-any-devices-after-ios-18-update
And it's reproducible in some apps that are expected to use the ImageCaptureCore framework.
I'd like to clarify:
Is this currently an iOS 18 bug?
Does Apple have any plan to remove wired-connection support from the ImageCaptureCore framework?
Thank you.
On the iOS 18 public beta, users are reporting that photo quality is degraded when photos are taken in the Camera app, along with an error message about failing to produce a high-resolution image.
Hi! I am having a bit of trouble with the Photos picker. In my app, users are able to select photos to appear in a grid, right in the app. I am using the new Photos picker with SwiftUI, and I want my users to be able to select images after they have been added to the view. So I want a Select button in the top toolbar on the leading side. Once the user taps Select, they can pick the photos they want to remove from the grid, just like in the Photos app; then, where the add-photos button originally is, a trash icon appears to remove the selected photos from the grid.
How would I do this? I have attached my code below for my view, as well as my PhotoPicker:
import SwiftUI
import PhotosUI

struct LifestyleImagePicker: View {
    @StateObject var imagePicker = ImagePicker()
    @State private var showingDetail = false
    @State private var selectedIndex = 0
    @State private var isSelecting = false
    @State private var isAddingPhoto = false

    let columns = [GridItem(.adaptive(minimum: 100))]

    var body: some View {
        NavigationSplitView {
            VStack {
                if !imagePicker.images.isEmpty {
                    ScrollView {
                        LazyVGrid(columns: columns, spacing: 3) {
                            ForEach(imagePicker.images.indices, id: \.self) { index in
                                imagePicker.images[index]
                                    .resizable()
                                    .scaledToFit()
                                    .onTapGesture {
                                        selectedIndex = index
                                        showingDetail = true
                                    }
                            }
                        }
                    }
                } else {
                    Text("Tap the plus icon to add photos to your own Inspo Board.")
                        .multilineTextAlignment(.center)
                }
            }
            .padding()
            .navigationTitle("Lifestyle")
            .toolbar {
                ToolbarItem(placement: .navigationBarTrailing) {
                    PhotosPicker(selection: $imagePicker.imageSelections,
                                 maxSelectionCount: 10,
                                 matching: .images,
                                 photoLibrary: .shared()) {
                        Image(systemName: "photo.badge.plus")
                            .imageScale(.large)
                    }
                }
            }
        } detail: {
            Text("Pick your lifestyle")
        }
        .sheet(isPresented: $showingDetail) {
            DetailImageView(images: $imagePicker.images, selectedIndex: selectedIndex)
        }
    }
}

#Preview {
    LifestyleImagePicker()
}
import SwiftUI
import UIKit
import PhotosUI
import Combine
import Foundation

@MainActor
class ImagePicker: ObservableObject {
    @Published var image: Image?
    @Published var images: [Image] = []

    @Published var imageSelection: PhotosPickerItem? {
        didSet {
            if let imageSelection {
                Task {
                    try await loadTransferable(from: imageSelection)
                }
            }
        }
    }

    @Published var imageSelections: [PhotosPickerItem] = [] {
        didSet {
            Task {
                if !imageSelections.isEmpty {
                    try await loadTransferable(from: imageSelections)
                    imageSelections = []
                }
            }
        }
    }

    func loadTransferable(from imageSelections: [PhotosPickerItem]) async throws {
        do {
            for imageSelection in imageSelections {
                if let data = try await imageSelection.loadTransferable(type: Data.self) {
                    if let uiImage = UIImage(data: data) {
                        self.images.append(Image(uiImage: uiImage))
                    }
                }
            }
        } catch {
            print(error.localizedDescription)
        }
    }

    func loadTransferable(from imageSelection: PhotosPickerItem?) async throws {
        do {
            if let data = try await imageSelection?.loadTransferable(type: Data.self) {
                if let uiImage = UIImage(data: data) {
                    self.image = Image(uiImage: uiImage)
                }
            }
        } catch {
            print(error.localizedDescription)
            image = nil
        }
    }
}
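One way to structure the select-and-remove flow is to keep a Set of selected indices in the view, toggle membership on tap while isSelecting is true, and drop all selected indices from imagePicker.images when the trash button is tapped. A sketch of that logic (the SwiftUI wiring is only indicated in comments; the helper names are mine, not from any API):

```swift
import Foundation

// Hypothetical selection model: toggle an index in the selection set.
func toggled(_ index: Int, in selection: Set<Int>) -> Set<Int> {
    var selection = selection
    if selection.contains(index) {
        selection.remove(index)
    } else {
        selection.insert(index)
    }
    return selection
}

// Remove every selected index from the backing array in one pass, so the
// remaining indices never shift mid-removal.
func removing<Element>(indices: Set<Int>, from items: [Element]) -> [Element] {
    items.enumerated()
        .filter { !indices.contains($0.offset) }
        .map(\.element)
}

// In the view you might hold:
//   @State private var isSelecting = false
//   @State private var selected: Set<Int> = []
// onTapGesture: if isSelecting { selected = toggled(index, in: selected) }
//               else { selectedIndex = index; showingDetail = true }
// trash button: imagePicker.images = removing(indices: selected,
//                                             from: imagePicker.images)
//               selected = []; isSelecting = false
```

Doing the removal as a single filter over the array avoids the classic bug of deleting by index one at a time while later indices shift.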
Hey, I'm building a camera app where I am applying real-time effects to the viewfinder. One of those effects is a variable blur, so to improve performance I am scaling down the input image using CIFilter.lanczosScaleTransform(). This works fine and runs at 30 FPS, but when running the Metal profiler I can see that the scaling transforms use a lot of GPU time, almost as much as the variable blur. Is there a more efficient way to do this?
The simplified chain is like this:
Scale down viewFinder CVPixelBuffer (CIFilter.lanczosScaleTransform)
Scale up depthMap CVPixelBuffer to match viewFinder size (CIFilter.lanczosScaleTransform)
Create CIImages from both CVPixelBuffers
Apply VariableDepthBlur (CIFilter.maskedVariableBlur)
Scale up the final image to the Metal view size (CIFilter.lanczosScaleTransform)
Render the CIImage to an MTKView using CIRenderDestination
From some research, I wonder whether scaling the CVPixelBuffer using the Accelerate framework would be faster? Also, instead of scaling the final image, perhaps I could offload that to the Metal view?
Any pointers greatly appreciated!
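One thing worth benchmarking (an assumption, not a guaranteed win): Lanczos is the most expensive Core Image resampler, and for a viewfinder-sized downscale a plain affine scale with linear sampling is often visually indistinguishable and cheaper. The Core Image calls are shown in comments since they need a device; the runnable part is just the scale-factor math:

```swift
import Foundation

// Helper: uniform downscale factor so the longest side fits maxDimension.
func downscaleFactor(width: Double, height: Double, maxDimension: Double) -> Double {
    let longest = max(width, height)
    return longest <= maxDimension ? 1.0 : maxDimension / longest
}

// With Core Image, instead of CIFilter.lanczosScaleTransform() you could try:
//
//   let s = downscaleFactor(width: image.extent.width,
//                           height: image.extent.height,
//                           maxDimension: 720)   // 720 is an arbitrary target
//   let small = image
//       .transformed(by: CGAffineTransform(scaleX: s, y: s))
//       .samplingLinear()   // bilinear sampling instead of Lanczos
//
// For the final step, rather than a third Lanczos pass, render the smaller
// image into the MTKView's CIRenderDestination at the view's size and let
// the render itself do the upscale.
```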
Hi,
For our application's use case, we need to remove autofocus completely; we don't need autofocus at all. Is there any way to remove or disable autofocus completely in an iOS application?
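You can't remove the autofocus hardware behavior from the system camera app, but in your own capture session AVCaptureDevice can lock focus so it never refocuses. The AVFoundation calls are shown in comments (they need a real device); the runnable helper just clamps the lens position to its documented 0.0...1.0 range:

```swift
import Foundation

// setFocusModeLocked(lensPosition:) accepts values in 0.0...1.0
// (0.0 = shortest focus distance, 1.0 = furthest); clamp anything else.
func clampedLensPosition(_ position: Float) -> Float {
    min(max(position, 0.0), 1.0)
}

// On device, assuming you already hold a configured AVCaptureDevice:
//
//   try device.lockForConfiguration()
//   if device.isFocusModeSupported(.locked) {
//       device.setFocusModeLocked(lensPosition: clampedLensPosition(0.5),
//                                 completionHandler: nil)
//   }
//   device.unlockForConfiguration()
//
// After this the lens stays fixed; autofocus will not run again until you
// set focusMode back to .autoFocus or .continuousAutoFocus.
```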
I am creating a Locked Camera Capture extension that allows you to take a video with an overlay image on top of it. I'm using AVMutableComposition in order to achieve that. It works perfectly in my main app, but when initializing AVMutableComposition in the Locked Camera Capture extension, it always returns nil.
Is this expected?
In my project I'm using AVCaptureMultiCamSession, where the main camera device type is .builtInWideAngleCamera and the second one is .builtInUltraWideCamera.
The backend team requires us to send them the following data:
• the intrinsic matrix of each camera (a 3x3 matrix);
• distortion coefficients, if accessible (not lensDistortionLookupTable).
So my questions are:
⁃ Is it possible to retrieve the intrinsicMatrix for both builtInWideAngleCamera and builtInUltraWideCamera during an AVCaptureMultiCamSession?
⁃ Is there a way to get distortion coefficients (not lensDistortionLookupTable) for both builtInWideAngleCamera and builtInUltraWideCamera during an AVCaptureMultiCamSession?
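I can't say whether multi-cam changes anything, but intrinsics are delivered per AVCaptureConnection: check isCameraIntrinsicMatrixDeliverySupported on each connection, enable delivery, then read the kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix attachment from each sample buffer. The attachment is a CFData holding a matrix_float3x3, i.e. three float3 columns each padded to 16 bytes (48 bytes total). A sketch of the decoding, with the capture-side calls in comments:

```swift
import Foundation

// Decode the 48-byte payload: columns[c][r]. For an intrinsic matrix that
// is [[fx, 0, 0], [0, fy, 0], [cx, cy, 1]] in column-major order.
func intrinsicColumns(from data: Data) -> [[Float]]? {
    guard data.count >= 48 else { return nil }
    return (0..<3).map { col in
        (0..<3).map { row in
            data.withUnsafeBytes {
                $0.load(fromByteOffset: col * 16 + row * 4, as: Float32.self)
            }
        }
    }
}

// Capture-side, per connection (including each connection of a multi-cam
// session — whether the ultra-wide connection supports it varies by device):
//
//   if connection.isCameraIntrinsicMatrixDeliverySupported {
//       connection.isCameraIntrinsicMatrixDeliveryEnabled = true
//   }
//   // in captureOutput(_:didOutput:from:):
//   if let d = CMGetAttachment(sampleBuffer,
//                              key: kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix,
//                              attachmentModeOut: nil) as? Data {
//       let K = intrinsicColumns(from: d)
//   }
//
// For distortion: as far as I know, AVCameraCalibrationData only exposes
// lensDistortionLookupTable publicly; I'm not aware of a public API that
// returns polynomial distortion coefficients directly.
```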
Dear Experts,
In "limited access" photos mode, I present a PHPickerViewController. It shows the entire photo library, with a note at the top saying that the app can only access the items that I select. I select a photo.
In the delegate method, I get a PHPickerResult containing a plausible-looking string for the assetIdentifier. It's the same string that I get for that photo in "full access" mode.
Should this photo now be accessible, or do I need to do something else at this point?
When I call fetchAssetsWithLocalIdentifiers using this assetIdentifier, I get no results. When I call cloudIdentifierMappingsForLocalIdentifiers, I get error PHPhotosErrorIdentifierNotFound. In "full access" mode, both work OK.
What am I missing?
Thanks.
Does anyone know which control is used to automatically recognize objects in photos and provide the cutout (subject lift) function when right-clicking the mouse?
The system photo album has a feature where long-pressing recognizes the subject of a photo. What is this feature called in Apple development, and which control should I use to get this feature?
I am building a LockedCameraCaptureExtension for my app. The example code works as expected. I am now in the process of porting my existing camera UI to the LockedCameraCaptureExtension.
Currently, my custom UI is shown for a split second when I tap the control to open the LockedCameraCaptureExtension, but it quickly exits back to the Lock Screen. I assume my ported UI is doing something wrong.
What is the recommended way to debug a LockedCameraCaptureExtension? I can't get my break points to break and os_log does not seem to log anything to Console.
I was taking some photos of the Milky Way while using a tripod, and the stars seemed not to be in focus. I tried plenty of times to get the stars in focus. I switched to other apps that have manual focus and the problem was resolved; however, the results weren't as clear as the standard app would have been. I then tried again with the standard app, this time focusing on an outdoor camera on our house, and yet the camera would still not focus no matter how much I tried. I'm not sure if it is a bug in the iOS 18 developer beta. Either way, I figured it was best to post what I had noticed. Apple, please add a manual focus option.
Has anyone implemented rich notifications recently? I tried to implement them, but it's not working. I have gone through the blogs and tutorials available on the internet, but they don't work. Please let me know if there is someone who can help me with that.