Hey, I’m building a camera app that applies real-time effects to the viewfinder. One of those effects is a variable blur, so to improve performance I’m scaling down the input image using CIFilter.lanczosScaleTransform(). This works fine and runs at 30 FPS, but when running the Metal profiler I can see that the scaling transforms use a lot of GPU time, almost as much as the variable blur itself. Is there a more efficient way to do this?
The simplified chain is like this:
1. Scale down the viewFinder CVPixelBuffer (CIFilter.lanczosScaleTransform)
2. Scale up the depthMap CVPixelBuffer to match the viewFinder size (CIFilter.lanczosScaleTransform)
3. Create CIImages from both CVPixelBuffers
4. Apply the VariableDepthBlur (CIFilter.maskedVariableBlur)
5. Scale up the final image to the Metal view size (CIFilter.lanczosScaleTransform)
6. Render the CIImage to an MTKView using CIRenderDestination
From some research, I wonder if scaling the CVPixelBuffer using the Accelerate framework would be faster? Also, instead of scaling the final image, perhaps I could offload that step to the Metal view?
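In case it's useful, here is a minimal sketch of what an Accelerate-based downscale could look like, assuming 32BGRA pixel buffers (vImageScale_ARGB8888 works on any 8-bit, four-channel interleaved format) and a pre-allocated destination buffer that defines the target size; whether it actually beats the GPU path is something only the profiler can answer:

import Accelerate
import CoreVideo

// Hypothetical helper: downscale a BGRA CVPixelBuffer on the CPU with
// vImage instead of CIFilter.lanczosScaleTransform(). The destination
// buffer must use the same pixel format as the source.
func downscale(_ source: CVPixelBuffer, into destination: CVPixelBuffer) {
    CVPixelBufferLockBaseAddress(source, .readOnly)
    CVPixelBufferLockBaseAddress(destination, [])
    defer {
        CVPixelBufferUnlockBaseAddress(source, .readOnly)
        CVPixelBufferUnlockBaseAddress(destination, [])
    }

    var src = vImage_Buffer(data: CVPixelBufferGetBaseAddress(source),
                            height: vImagePixelCount(CVPixelBufferGetHeight(source)),
                            width: vImagePixelCount(CVPixelBufferGetWidth(source)),
                            rowBytes: CVPixelBufferGetBytesPerRow(source))

    var dst = vImage_Buffer(data: CVPixelBufferGetBaseAddress(destination),
                            height: vImagePixelCount(CVPixelBufferGetHeight(destination)),
                            width: vImagePixelCount(CVPixelBufferGetWidth(destination)),
                            rowBytes: CVPixelBufferGetBytesPerRow(destination))

    // Resamples on the CPU, so the work moves off the GPU timeline entirely.
    let error = vImageScale_ARGB8888(&src, &dst, nil, vImage_Flags(kvImageNoFlags))
    assert(error == kvImageNoError)
}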
Any pointers greatly appreciated!
Hi! I am having a bit of trouble with the Photos Picker. In my app, users can select photos to appear in a grid, right in the app. I am using the new Photos Picker with SwiftUI. I want users to be able to select images after they have been added to the view: a Select button sits in the top toolbar on the leading side, and once the user taps it, they can select the photos they want to remove from the grid, just like in the Photos app. Then, in place of the button that originally adds photos, a trash icon appears to remove the selected photos from the grid.
How would I do this? I have attached my code below for my view, as well as my ImagePicker (a sketch of one possible approach appears after the code):
import SwiftUI
import PhotosUI

struct LifestyleImagePicker: View {
    @StateObject var imagePicker = ImagePicker()
    @State private var showingDetail = false
    @State private var selectedIndex = 0
    @State private var isSelecting = false
    @State private var isAddingPhoto = false

    let columns = [GridItem(.adaptive(minimum: 100))]

    var body: some View {
        NavigationSplitView {
            VStack {
                if !imagePicker.images.isEmpty {
                    ScrollView {
                        LazyVGrid(columns: columns, spacing: 3) {
                            ForEach(imagePicker.images.indices, id: \.self) { index in
                                imagePicker.images[index]
                                    .resizable()
                                    .scaledToFit()
                                    .onTapGesture {
                                        selectedIndex = index
                                        showingDetail = true
                                    }
                            }
                        }
                    }
                } else {
                    Text("Tap the plus icon to add photos to your own Inspo Board.")
                        .multilineTextAlignment(.center)
                }
            }
            .padding()
            .navigationTitle("Lifestyle")
            .toolbar {
                ToolbarItem(placement: .navigationBarTrailing) {
                    PhotosPicker(selection: $imagePicker.imageSelections,
                                 maxSelectionCount: 10,
                                 matching: .images,
                                 photoLibrary: .shared()) {
                        Image(systemName: "photo.badge.plus")
                            .imageScale(.large)
                    }
                }
            }
        } detail: {
            Text("Pick your lifestyle")
        }
        .sheet(isPresented: $showingDetail) {
            DetailImageView(images: $imagePicker.images, selectedIndex: selectedIndex)
        }
    }
}

#Preview {
    LifestyleImagePicker()
}
import SwiftUI
import PhotosUI
import Combine
import Foundation

@MainActor
class ImagePicker: ObservableObject {
    @Published var image: Image?
    @Published var images: [Image] = []

    @Published var imageSelection: PhotosPickerItem? {
        didSet {
            if let imageSelection {
                Task {
                    try await loadTransferable(from: imageSelection)
                }
            }
        }
    }

    @Published var imageSelections: [PhotosPickerItem] = [] {
        didSet {
            Task {
                if !imageSelections.isEmpty {
                    try await loadTransferable(from: imageSelections)
                    imageSelections = []
                }
            }
        }
    }

    func loadTransferable(from imageSelections: [PhotosPickerItem]) async throws {
        do {
            for imageSelection in imageSelections {
                if let data = try await imageSelection.loadTransferable(type: Data.self) {
                    if let uiImage = UIImage(data: data) {
                        self.images.append(Image(uiImage: uiImage))
                    }
                }
            }
        } catch {
            print(error.localizedDescription)
        }
    }

    func loadTransferable(from imageSelection: PhotosPickerItem?) async throws {
        do {
            if let data = try await imageSelection?.loadTransferable(type: Data.self) {
                if let uiImage = UIImage(data: data) {
                    self.image = Image(uiImage: uiImage)
                }
            }
        } catch {
            print(error.localizedDescription)
            image = nil
        }
    }
}
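Here is a minimal, self-contained sketch of one way to get the select-then-delete flow described above (one possible approach, not tested against the code in this post; selectedIndices is a hypothetical new piece of state):

import SwiftUI

struct SelectableGrid: View {
    @State private var images: [Image] = []
    @State private var isSelecting = false
    @State private var selectedIndices: Set<Int> = []

    let columns = [GridItem(.adaptive(minimum: 100))]

    var body: some View {
        ScrollView {
            LazyVGrid(columns: columns, spacing: 3) {
                ForEach(images.indices, id: \.self) { index in
                    images[index]
                        .resizable()
                        .scaledToFit()
                        .overlay(alignment: .bottomTrailing) {
                            // Show a selection indicator while in select mode.
                            if isSelecting {
                                Image(systemName: selectedIndices.contains(index)
                                      ? "checkmark.circle.fill" : "circle")
                                    .padding(4)
                            }
                        }
                        .onTapGesture {
                            guard isSelecting else { return }
                            if selectedIndices.contains(index) {
                                selectedIndices.remove(index)
                            } else {
                                selectedIndices.insert(index)
                            }
                        }
                }
            }
        }
        .toolbar {
            ToolbarItem(placement: .navigationBarLeading) {
                Button(isSelecting ? "Done" : "Select") {
                    isSelecting.toggle()
                    selectedIndices.removeAll()
                }
            }
            ToolbarItem(placement: .navigationBarTrailing) {
                // While selecting, the trailing slot shows a trash button
                // in place of the add button.
                if isSelecting {
                    Button(role: .destructive) {
                        // Remove highest indices first so earlier removals
                        // don't shift the indices still to be removed.
                        for index in selectedIndices.sorted(by: >) {
                            images.remove(at: index)
                        }
                        selectedIndices.removeAll()
                        isSelecting = false
                    } label: {
                        Image(systemName: "trash")
                    }
                }
            }
        }
    }
}

In the real view, the else branch of the trailing ToolbarItem would hold the existing PhotosPicker, and the grid would sit inside the NavigationSplitView as before.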
On the iOS 18 public beta, users report photo quality being degraded when photos are taken in the Camera app, along with an error message about failing to produce a high-resolution image.
Hi,
Currently my app uses the ImageCaptureCore framework to work with DSLR cameras. But when I tested it on iOS 18, it turned out my camera can no longer connect to the iPhone over a wired connection.
It seems some other developers have run into the same problem:
https://forums.developer.apple.com/forums/thread/756960
https://stackoverflow.com/questions/78618886/icdevicebrowser-fails-to-find-any-devices-after-ios-18-update
It also reproduces in other apps that are expected to use the ImageCaptureCore framework.
I’d like to clarify:
Is this currently an iOS 18 bug?
Does Apple have any plan to remove wired-connection support from the ImageCaptureCore framework?
Thank you.
Hi, experts,
I am using PHPickerViewController to open photos on iPhone. It works well for most photos, such as JPEG and HEIF, but it fails for a RAW photo on an iPhone 14 Plus. I found that the type identifier for it is com.adobe.raw-image.
I use result.itemProvider.loadFileRepresentation(forTypeIdentifier: "com.adobe.raw-image") to load the RAW photo, and it always fails with "Error loading file representation: Cannot load representation of type com.adobe.raw-image".
I tried some other parameters, such as forTypeIdentifier: public.image and public.camera-raw-image, but neither of them worked.
How can I load this type of RAW photo?
Below are the details of my code:
// MARK: - PHPickerViewControllerDelegate
func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    picker.dismiss(animated: true, completion: nil)
    var resultIndex = 0
    DDLogInfo("Pick \(results.count) photos")
    for result in results {
        resultIndex += 1
        DDLogInfo("Process \(resultIndex) photo")
        DDLogInfo("Registered type identifiers for itemProvider:")
        for typeIdentifier in result.itemProvider.registeredTypeIdentifiers {
            DDLogInfo("TypeIdentifier \(typeIdentifier)")
        }
        if result.itemProvider.hasItemConformingToTypeIdentifier(UTType.image.identifier) {
            DDLogInfo("Result \(resultIndex) is image")
        }
        if result.itemProvider.canLoadObject(ofClass: UIImage.self) {
            DDLogInfo("Can load \(resultIndex) image")
            // more code for photo
        } else {
            DDLogInfo("Load special image, such as raw")
            result.itemProvider.loadFileRepresentation(forTypeIdentifier: "com.adobe.raw-image") { url, error in
                if let error = error {
                    DDLogInfo("Error loading file representation: \(error.localizedDescription)")
                    return
                }
                // (rest of the snippet truncated in the original post)
            }
        }
    }
}
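If loadFileRepresentation keeps failing for this identifier, a hedged alternative worth trying (an assumption, not a confirmed fix) is to load the raw bytes for whatever identifier the provider actually registers and decode them with ImageIO, which can decode DNG/camera-raw data:

import UIKit
import ImageIO

// Hypothetical fallback: load data for the provider's first registered
// type identifier and decode it via CGImageSource. The completion handler
// is called on an arbitrary background queue.
func loadRawImage(from provider: NSItemProvider,
                  completion: @escaping (UIImage?) -> Void) {
    guard let identifier = provider.registeredTypeIdentifiers.first else {
        completion(nil)
        return
    }
    provider.loadDataRepresentation(forTypeIdentifier: identifier) { data, error in
        guard let data,
              let source = CGImageSourceCreateWithData(data as CFData, nil),
              let cgImage = CGImageSourceCreateImageAtIndex(source, 0, nil) else {
            completion(nil)
            return
        }
        completion(UIImage(cgImage: cgImage))
    }
}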
Hello pals,
I investigated a strange bug with a video URL and found out that on iOS 18 the method PHCachingImageManager().requestAVAsset(forVideo:) returns a very weird asset.url with a strange suffix:
"someFileName.MOV#YnBsaXN0MDDRAQJfEBtSZWNvbW1lbmRlZEZvckltbWVyc2l2ZU1vZGUQAAgLKQAAAAAAAAEBAAAAAAAAAAMAAAAAAAAAAAAAAAAAAAAr"
Example:
PHCachingImageManager().requestAVAsset(forVideo: asset, options: options) { asset, _, _ in
    if let asset = asset as? AVURLAsset {
        print(asset.url)
        // prints - file:///.../data/Media/DCIM/100APPLE/IMG_0011.MOV#YnBsaXN0MDDRAQJfEBtSZWNvbW1lbmRlZEZvckltbWVyc2l2ZU1vZGUQAAgLKQAAAAAAAAEBAAAAAAAAAAMAAAAAAAAAAAAAAAAAAAAr
    }
}
On iOS versions below 18 it returns a regular URL, "...someFile.MOV".
How can I correct this bug for iOS 18 users?
Please suggest something, or maybe I'm using this method incorrectly?
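One client-side workaround to try — a sketch based on the assumption that the extra data is only carried in the URL fragment (the suffix decodes as a binary plist mentioning "RecommendedForImmersiveMode") — is to strip the fragment before using the URL:

import Foundation

// Hypothetical helper: drop the "#..." fragment appended on iOS 18 and
// return the plain file URL. Falls back to the original URL if parsing fails.
func urlWithoutFragment(_ url: URL) -> URL {
    guard var components = URLComponents(url: url, resolvingAgainstBaseURL: false) else {
        return url
    }
    components.fragment = nil
    return components.url ?? url
}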
Hi,
I’m a photographer, and recently, after upgrading to the iOS 18 beta, I noticed that when I AirDrop my photos (which I took with medium-format cameras at extremely high quality) from my MacBook Pro to my iPhone, they lose a lot of quality! The sharpness is completely gone, and there is much less detail. Does anyone know how to solve this?
Open a photo in your phone's photo album and swipe up to see more information. Some pictures show "Saved from ***". How can I get this information, or how can I get this content through PHAsset?
Like this: https://i.sstatic.net/c8oVS.png
Hello,
Since today, I no longer have the Camera app. It doesn't appear in searches, it's not in the App Library, and even the icon on the Lock Screen has disappeared.
Can the extension code created by a Capture Extension call code in the main project?
I added the control via a Widget Extension, and I can see the perform method being called in my intent, but I am missing the part where this perform method opens the UI to capture the photo.
This is my intent:
struct MyAppCaptureIntent: CameraCaptureIntent {
    static var title: LocalizedStringResource = "MyAppCaptureIntent"
    typealias AppContext = MyAppContext
    static let description = IntentDescription("Capture photos with MyApp.")

    @MainActor
    func perform() async throws -> some IntentResult {
        let dialog = IntentDialog("Intent result")
        do {
            if let context = try await MyAppCaptureIntent.appContext {
                return .result()
            }
        } catch {
            // Handle error condition.
        }
        return .result()
    }
}

struct MyAppContext: Decodable, Encodable {
    var data = ContextData()
}

struct ContextData: IntentResult, Decodable, Encodable {
    var value: Never? {
        nil
    }
}
How can I connect this with my LockedCameraCaptureExtension?
Can you provide a complete demo?
On iOS 18, at this moment, copying from a photo is not working.
Hi all,
Just wondering whether anyone knows if there's any way to connect an iPhone to an external camera (e.g., a USB-C webcam), as is possible on the iPad?
Thank you!
I integrated VisionKit into my app to get the content of an image. That works really well. Now I've realized that the app crashes on launch on an iPhone 8 because of the following code:
@available(iOS 17, *)
extension Details: ImageAnalysisInteractionDelegate {}
Is there a workaround for older versions?
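One workaround pattern that may help — a sketch, assuming the crash comes from the protocol conformance being registered on systems whose VisionKit lacks the protocol — is to move the conformance into a separate type that is only ever referenced behind an availability check, so its metadata is never demanded on older systems. Details and the delegate body below are placeholders for the poster's own code.

import VisionKit

@available(iOS 17, *)
final class DetailsAnalysisDelegate: ImageAnalysisInteractionDelegate {
    // Implement the ImageAnalysisInteractionDelegate methods you need here.
}

final class Details {
    // Stored as AnyObject so this class never names the
    // availability-gated type directly.
    private var analysisDelegate: AnyObject?

    func enableImageAnalysisIfAvailable() {
        if #available(iOS 17, *) {
            analysisDelegate = DetailsAnalysisDelegate()
        }
    }
}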
I have a LockedCameraCapture extension working well; however, there is one situation I cannot find a solution to. If the user has not yet granted camera access permission, the main app will be launched rather than the LockedCameraCapture extension. I cannot find a mechanism by which my main app can detect that this was the reason for the launch and thereby request permission.
When the button is pressed from Control Center without permission, the app is run and the CameraCaptureIntent is called, so I can prompt the user from there. However, as best I can tell, the CameraCaptureIntent is not called when launched from a locked Lock Screen; the app is simply opened.
My app has a variety of functions, most of which do not involve the camera, so I cannot just always prompt the user for camera access on open. Is there any mechanism by which my main app can detect that it was launched for this reason, so it could ask for permission? Thank you!
Hello,
When using a UIImagePickerController with the .camera configuration, I'm currently facing an issue where the delegate function imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) is not firing; instead, UIKit internals dismiss the view presenting the UIImagePickerController. I'm showing the picker controller through a UIViewControllerRepresentable.
It does not always occur, however, and the behavior is very flaky: sometimes the delegate fires when pressing the button, sometimes it does not.
When setting a breakpoint on the dismiss function and pressing the "Use Photo" button, it is UIKit internals dismissing the view, not my own code.
I am working on an iPad app that uses the front camera. The camera logic is implemented with the AVFoundation framework. During a session I use Center Stage mode to center the face in the front camera feed. Center Stage works fine in all cases except when I set the session preset to photo. The other presets (high, medium, low, cif352x288, vga640x480, hd1280x720, hd1920x1080, iFrame960x540, iFrame1280x720, and inputPriority) work with Center Stage; only the photo session preset does not. Can you explain why this happens, or is it perhaps a non-obvious bug?
Code snippet:
final class CameraManager {
    // MARK: - Properties
    private let captureSession: AVCaptureSession
    private let photoOutput: AVCapturePhotoOutput
    private let previewLayer: AVCaptureVideoPreviewLayer

    // MARK: - Init
    init() {
        captureSession = AVCaptureSession()
        photoOutput = AVCapturePhotoOutput()
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    }

    // MARK: - Methods: set up camera and preview layer
    func setupPreviewLayerFrame(view: UIView) {
        previewLayer.frame = view.frame
        view.layer.insertSublayer(previewLayer, at: 0)
        setupCamera()
    }

    private func setupCamera() {
        guard let videoCaptureDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front) else { return }
        AVCaptureDevice.centerStageControlMode = .app
        AVCaptureDevice.isCenterStageEnabled = true
        do {
            let input = try AVCaptureDeviceInput(device: videoCaptureDevice)
            captureSession.addInput(input)
            captureSession.addOutput(photoOutput)
            /// high, medium, low, cif352x288, vga640x480, hd1280x720, hd1920x1080, iFrame960x540, iFrame1280x720 and inputPriority presets work with Center Stage; only the photo session preset does not
            captureSession.sessionPreset = .photo
            DispatchQueue.global(qos: .userInteractive).async {
                self.captureSession.startRunning()
            }
        } catch {
            print("Error setting up camera: \(error.localizedDescription)")
        }
    }
}
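One diagnostic worth running — a hedged sketch whose underlying assumption (worth verifying) is that Center Stage is only honored on capture formats that report support, and the format chosen by the .photo preset may not:

import AVFoundation

// Log whether the format selected by the current preset supports Center Stage.
func logCenterStageSupport(for device: AVCaptureDevice) {
    print("Active format: \(device.activeFormat)")
    print("Center Stage supported: \(device.activeFormat.isCenterStageSupported)")
    print("Center Stage active: \(device.isCenterStageActive)")
}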
Dear Experts,
When I use PHImageManager's requestImageDataAndOrientationForAsset method, I always seem to get JPEG data, even when the original items in the photo library are PNGs (such as screenshots) or HEICs.
Have I missed a setting somewhere that determines whether or not a "most compatible" format is used in this API?
Thanks.
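A hedged sketch of one thing to check: PHImageRequestOptions.version controls whether the rendered "current" image or the original asset data is delivered, and requesting .original should (as an assumption to verify) return the unmodified bytes in their original format, such as PNG or HEIC:

import Photos

func fetchOriginalImageData(for asset: PHAsset) {
    let options = PHImageRequestOptions()
    options.version = .original           // ask for the unmodified asset data
    options.isNetworkAccessAllowed = true // allow iCloud download if needed

    PHImageManager.default().requestImageDataAndOrientation(for: asset,
                                                            options: options) { data, uti, _, _ in
        // The UTI reveals the delivered format, e.g. public.png or public.heic.
        print("UTI: \(uti ?? "unknown"), bytes: \(data?.count ?? 0)")
    }
}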
I am developing a camera extension as described in Creating a camera extension with Core Media I/O.
To make sure this wasn't some issue with my own code, I reverted to the sample code that is added when you choose File > New > Target > Camera Extension in Xcode; in other words, I am using the example code provided by Apple.
I am able to install the camera extension and see it in QuickTime Player, where I choose it as the video input. In QuickTime Player I see the white line generated by the sample code moving up and down. For some period of time, it works.
But eventually, the video in QuickTime Player freezes. The really weird thing is that if I add some NSLog() statements at the point in the code where it returns the newly created sample:
[self->_streamSource.stream sendSampleBuffer:sbuf discontinuity:CMIOExtensionStreamDiscontinuityFlagNone hostTimeInNanoseconds:(uint64_t)(CMTimeGetSeconds(timingInfo.presentationTimeStamp) * NSEC_PER_SEC)];
the samples are still being generated and sent to the stream, but apparently QuickTime Player has decided to stop consuming them.
I thought that setting the discontinuity parameter to CMIOExtensionStreamDiscontinuityFlagTime or CMIOExtensionStreamDiscontinuityFlagSampleDropped when the delta time since the last sample was off by a tiny bit might help, but it did not improve the situation.
Finally, could this have something to do with frequently installing and uninstalling my camera extension as part of the debugging and testing process?
Thanks in advance for any advice you might have!
Hello!
We want to try the new features of DockKit on an iOS 18 device. However, I am unable to pair the DockKit-compatible device with my iOS 18 device. Is there a way to successfully pair them?
Fellow developers, I'd like to ask whether this will gradually be restored going forward.