Discuss using the camera on Apple devices.

Posts under Camera tag

182 Posts
How can I use the iPhone TrueDepth front camera to detect whether a captured face depth map comes from a real 3D face or a spoofed 2D image?
I'm trying to implement anti-spoofing in an iOS app using the iPhone TrueDepth front camera. I have checked the following questions but still can't find a proper working solution. I trained a Core ML model using 22,000 depth images of human faces and 22,000 depth images of non-faces (objects, food, etc.). The model's accuracy is very low. When testing with flat 2D images shown on a smartphone screen, I found that I get a depth map even for flat 2D images. Even though the image is flat, how does it produce a depth map for the person shown in the flat 2D picture, so that the model thinks it is a real face instead of a spoofed one? I implemented depth capture by following this documentation, and I made sure that I get a depth map instead of a disparity map: https://developer.apple.com/documentation/avfoundation/additional_data_capture/capturing_photos_with_depth My next approach was to use the NCNN framework to implement anti-spoofing using the model from the Mini-vision Android anti-spoofing sample. I rewrote their library for iOS using the Objective-C++ wrapper for C++, as the sample was only available as an Android app. When I tested it by feeding an 80x80 UIImage in an OpenCV matrix format, its accuracy was lower than the Android version's. How can I solve this problem?
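As an aside for readers: below is a minimal sketch (not the poster's code) of one naive liveness signal computed directly on the captured AVDepthData, assuming a DepthFloat32 map from the documented capture pipeline. A photo replayed on a flat screen should have almost no depth relief, while a real face has centimeters of it; the sampling stride and variance threshold are illustrative assumptions, not tuned values, and a plane fit would be more robust than raw variance.

    import AVFoundation
    import CoreVideo

    // Returns true when the depth map looks (nearly) planar, i.e. a likely
    // 2D replay. Naive variance check over sparsely sampled depth values.
    func looksPlanar(_ depthData: AVDepthData) -> Bool {
        // Work in depth (meters), not disparity.
        let depth = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
        let map = depth.depthDataMap
        CVPixelBufferLockBaseAddress(map, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(map, .readOnly) }

        guard let base = CVPixelBufferGetBaseAddress(map) else { return true }
        let width = CVPixelBufferGetWidth(map)
        let height = CVPixelBufferGetHeight(map)
        let rowBytes = CVPixelBufferGetBytesPerRow(map)

        var samples: [Float] = []
        for y in stride(from: 0, to: height, by: 4) { // sparse sampling
            let row = base.advanced(by: y * rowBytes).assumingMemoryBound(to: Float32.self)
            for x in stride(from: 0, to: width, by: 4) {
                let d = row[x]
                if d.isFinite && d > 0 { samples.append(d) }
            }
        }
        guard samples.count > 100 else { return true }

        let mean = samples.reduce(0, +) / Float(samples.count)
        let variance = samples.reduce(0) { $0 + ($1 - mean) * ($1 - mean) } / Float(samples.count)
        return variance < 1e-4 // assumed threshold: almost no depth relief
    }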
0 replies · 0 boosts · 120 views · 3d
LockedCameraCaptureManager sessionContentUpdates is sometimes not called
Within my app, I have:

    for try await update in LockedCameraCaptureManager.shared.sessionContentUpdates {

It seems that the first time my app opens from LockedCameraCapture (after enabling camera permissions etc.), this update is never delivered, and the user will not see their capture (.added or .initial). If I then try to take another picture/video through my LockedCameraCapture control, it takes the video and opens the app as before, but this time sessionContentUpdates is called twice: once for the first video and once for the second video! After that it doesn't seem to occur again, and everything works perfectly. My device: iPhone 16 Pro Max, iOS 18.2 developer beta. Has anyone experienced this?
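For comparison, a minimal consumption loop is sketched below, pattern-matching only on the .initial/.added cases mentioned above (their payloads are omitted here, since their shape isn't shown in the post). One thing worth ruling out is whether the loop starts early enough in the app's lifecycle; an update delivered before iteration begins would look exactly like a missed first capture.

    import LockedCameraCapture

    func observeLockedCaptureContent() {
        Task {
            do {
                for try await update in LockedCameraCaptureManager.shared.sessionContentUpdates {
                    // Match the case without assuming its associated values.
                    if case .initial = update {
                        print("initial content delivered")
                    } else if case .added = update {
                        print("new capture added")
                    }
                }
            } catch {
                print("sessionContentUpdates stream failed: \(error)")
            }
        }
    }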
0 replies · 0 boosts · 80 views · 3d
IOSurface with System Extensions
Hi All, I'm working on a camera system extension where the main app is supposed to transfer a video stream to the camera extension using IOSurface memory sharing. I have built a sample app that contains all the logic, but without a camera extension: I'm essentially using IOSurface to render a video in one SwiftUI view and show the result in another SwiftUI view, just for testing purposes, and everything works fine so far. Now, when moving the receiver code to the camera extension, I'm having problems accessing the IOSurface via its ID. I share the IOSurface ID via UserDefaults, and I know from the logs that the ID is transferred correctly. Here is the code that uses IOSurfaceLookup to get the IOSurface, but it fails with the given message. The error message prints the surface ID, which is the correct one; I know this from the main app, where I get the ID and print it as well.

    private var surfaceId: Int = -1 {
        didSet {
            logger.info("surfaceId has changed")
            if surfaceId == -1 {
                stopReceivingFrames()
                ioSurface = nil
            } else {
                guard let surface = IOSurfaceLookup(IOSurfaceID(surfaceId)) else {
                    logger.error("failed to lookup IOSurface with ID: \(self.surfaceId)")
                    return
                }
                self.ioSurface = surface
                logger.info("surface set, now starting receiving frames")
                startReceivingFrames()
            }
        }
    }

My gut feeling says this issue might be related to a missing entitlement or sandboxing. In general, I have a working camera extension; I'm just not able to render a video in the main app and send it over to the camera extension to overlay another webcam. Both the main app and the camera extension are in the same Xcode workspace and share the same App Group. In short, my actual questions are:

- Is there any entitlement required for using IOSurface between an app and a camera system extension?
- Is using IOSurface actually possible in system extensions?
- Is there any specific setting/requirement I need to handle to make this work?
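For context, here is a sketch of the producer side under the same assumptions described above: an IOSurface-backed frame buffer whose global ID is published through the shared App Group UserDefaults. The suite name is a placeholder. Note that looking a surface up by global ID across a sandbox boundary is exactly the step a missing entitlement or the extension sandbox could block, which is what the questions above are getting at.

    import Foundation
    import IOSurface
    import CoreVideo

    func makeSharedSurface(width: Int, height: Int) -> IOSurface? {
        let properties: [IOSurfacePropertyKey: Any] = [
            .width: width,
            .height: height,
            .bytesPerElement: 4,
            .pixelFormat: kCVPixelFormatType_32BGRA,
        ]
        guard let surface = IOSurface(properties: properties) else { return nil }
        // Publish the global ID for the extension to look up.
        let defaults = UserDefaults(suiteName: "group.example.camera") // placeholder
        defaults?.set(Int(IOSurfaceGetID(surface)), forKey: "surfaceId")
        return surface
    }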
0 replies · 0 boosts · 87 views · 3d
seriously flawed iOS 18 slow-mo capture & playback
I love to use my iPhone as a camera; it is what makes this phone the best, ever, period. But ever since the iOS 18 update, slow-mo capture & playback has been a painful experience. OK, I get it, it is not an action camera, but I was able to get instant slow-mo playback on prior iOS versions. Since iOS 18, I have to wait at least 5-10 minutes to play a slow-mo video. And I noticed something else: even though I didn't modify any settings on the video, iOS 18 creates a slow-mo video file in the phone's storage that takes up a LOT of space, often making me clean out all of my files to find what was wrong with the device. In all honesty, I think Apple must pull the iOS 18 update and work on it a bit more. Till then, I'll be using my DJI action camera for 240 fps; at least it works, instantly.
1 reply · 0 boosts · 141 views · 4d
App randomly terminated due to Capture Application Requirements Unmet
Hi. I encounter some random crashes of my camera app. After some investigation, I found that it's terminated by the system; a crash log is generated, but the information in it is not very useful. Here is the log found via the Console app.

Termination & crash log: "Camera not actively used; AVCaptureEventInteraction not installed"

    Received termination request from [osservice<com.apple.SpringBoard>:10931] on <RBSProcessPredicate <RBSProcessInstancePredicate| [app<com.juniperphoton.PhotonCam]>> with context <RBSTerminateContext| explanation:Capture Application Requirements Unmet: "Camera not actively used; AVCaptureEventInteraction not installed" reportType:CrashLog maxTerminationResistance:Interactive>

The crash log exported from the device has some common information.

It's an EXC_CRASH (SIGKILL) with no termination reason:

    Exception Type: EXC_CRASH (SIGKILL)
    Exception Codes: 0x0000000000000000, 0x0000000000000000
    Termination Reason: RUNNINGBOARD 0

It's triggered by the main thread, which seems to be waiting for an event to process:

    Triggered by Thread: 0
    Thread 0 name: Dispatch queue: com.apple.main-thread
    Thread 0 Crashed:
    0  libsystem_kernel.dylib  0x1ee165788 mach_msg2_trap + 8
    1  libsystem_kernel.dylib  0x1ee168e98 mach_msg2_internal + 80
    2  libsystem_kernel.dylib  0x1ee168db0 mach_msg_overwrite + 424
    3  libsystem_kernel.dylib  0x1ee168bfc mach_msg + 24
    4  CoreFoundation          0x19cbe47f4 __CFRunLoopServiceMachPort + 160
    5  CoreFoundation          0x19cbe3ea0 __CFRunLoopRun + 1212
    6  CoreFoundation          0x19cc36274 CFRunLoopRunSpecific + 588
    7  GraphicsServices        0x1e9d6d4c0 GSEventRunModal + 164
    8  UIKitCore               0x19f783480 -[UIApplication _run] + 816
    9  UIKitCore               0x19f3a9410 UIApplicationMain + 340
    10 UIKitCore               0x19fae4bb0 0x19f394000 + 7670704
    11 PhotonCam               0x1002e7e3c 0x1002cc000 + 114236
    12 dyld                    0x1c2d5ade8 start + 2724

Address size fault on the main thread:

    Thread 0 crashed with ARM Thread State (64-bit):
    ...
    far: 0x0000000000000000 esr: 0x56000080 Address size fault

I once tried to reproduce this issue with the debugger attached, and it says: Terminated due to signal 9.

When the crash or termination happened:

- No AVCaptureSession was running.
- The app was in the foreground and the user was interacting with features like viewing or editing photos. When users exit the camera view (entering the gallery or settings), the camera session is stopped.
- Both TestFlight and Debug builds have the same issue.
- No 3rd-party crash reporter is installed (I deliberately disabled it in the Debug and TestFlight builds).
- The app has adopted LockedCameraCapture, but it was running as the main app target at the time (if not, my app would show an unlock button, so I can confirm this).
- As for memory consumption, there is no JetsamEvent around the crash time.

Device and app information:

- iPhone 16 Pro with iOS 18.2 Beta 3.
- The app is a camera-based app (it's PhotonCam; you can find it on the App Store); its main functionality uses AVFoundation + Core Image + Metal to deliver the camera features.
- It has adopted the Camera Control, AVCaptureEventInteraction, and LockedCameraCapture features.
- If I remember right, it also occurs in the iOS 18.1 release build, but I currently have no such device to confirm. In iOS 17.x the issue never happened.
Regarding this termination, the first thing that comes to mind is the "watchdog" mechanism that terminates processes running under the LockedCameraCapture feature. However, I can confirm that the app was running as the main target from the home screen at the time. Has anybody encountered this kind of issue and found a solution? Thanks in advance.
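For anyone ruling out the obvious: the termination context names AVCaptureEventInteraction specifically, so below is a minimal sketch of installing one, assuming AVKit's handler-based initializer. The view controller name and the capturePhoto() trigger are placeholders, not PhotonCam's actual code.

    import AVKit
    import UIKit

    final class CaptureViewController: UIViewController { // placeholder name
        override func viewDidLoad() {
            super.viewDidLoad()
            // Installing the interaction is what tells the system the app is
            // actively handling hardware capture-button events for the camera.
            let interaction = AVCaptureEventInteraction { [weak self] event in
                if event.phase == .ended {
                    self?.capturePhoto() // placeholder trigger
                }
            }
            view.addInteraction(interaction)
        }

        private func capturePhoto() {
            // Hook into the AVCapturePhotoOutput here.
        }
    }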
1 reply · 0 boosts · 188 views · 4d
iPhone Camera issue
I recently updated both my phones, an iPhone 14 Pro Max and an iPhone 16 Pro Max, to iOS 18.1. After the update, when I open the Camera and switch to SLO-MO, the screen starts flickering, and even after recording, when I play the video, it plays with the same flickering. Is it an iOS update issue or something else?
1 reply · 0 boosts · 65 views · 5d
Rendering YCbCr input using Metal
I would like to take YCbCr CVPixelBuffers from AVCaptureVideoDataOutput, apply some processing in RGB space, render to an MTKView, and pass to AVAssetWriter for recording. Right now, I'm doing this all manually – deswing the incoming data if necessary, choose the right matrix to convert to RGB, apply processing, etc. I also have to convert back to YCbCr before feeding the frames to AVAssetWriter because encoding performs much better if I do. Is there any efficient, built-in way to achieve the same? I can't use AVCaptureVideoPreviewLayer, since I need to do some further processing before display. I can't use AVCaptureVideoDataOutput's videoSettings to get automatic BGRA conversion because that would lose bit depth for 10 bit video formats (and isn't available on all formats anyway). I see these Accelerate functions, but they seemingly don't use the GPU, nor do they support all the formats and bit depths I'd need. I found reference to some undocumented MTLPixelFormats that seem to do exactly what I want, but I don't want to rely on something like this unless it's explicitly endorsed. This would also incur an RGB/YCbCr conversion on every texture read and write, right? Is there anything I'm missing here?
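For readers, the manual GPU path being described usually looks like the sketch below, assuming a biplanar 420 YCbCr pixel buffer ('420v'/'420f'; a 10-bit 'x420' buffer would use .r16Unorm/.rg16Unorm instead): wrap each plane in an MTLTexture via CVMetalTextureCache, then do the matrix conversion in a shader. This doesn't answer the built-in-conversion question; it just shows the baseline being compared against. The texture cache is assumed to have been created once with CVMetalTextureCacheCreate.

    import AVFoundation
    import CoreVideo
    import Metal

    func planeTextures(from pixelBuffer: CVPixelBuffer,
                       cache: CVMetalTextureCache) -> (luma: MTLTexture, chroma: MTLTexture)? {
        var lumaRef: CVMetalTexture?
        var chromaRef: CVMetalTexture?

        // Plane 0: full-resolution luma, one 8-bit channel.
        let lumaW = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
        let lumaH = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)
        CVMetalTextureCacheCreateTextureFromImage(nil, cache, pixelBuffer, nil,
                                                  .r8Unorm, lumaW, lumaH, 0, &lumaRef)

        // Plane 1: half-resolution interleaved CbCr, two 8-bit channels.
        let chromaW = CVPixelBufferGetWidthOfPlane(pixelBuffer, 1)
        let chromaH = CVPixelBufferGetHeightOfPlane(pixelBuffer, 1)
        CVMetalTextureCacheCreateTextureFromImage(nil, cache, pixelBuffer, nil,
                                                  .rg8Unorm, chromaW, chromaH, 1, &chromaRef)

        guard let luma = lumaRef.flatMap(CVMetalTextureGetTexture),
              let chroma = chromaRef.flatMap(CVMetalTextureGetTexture) else { return nil }
        return (luma, chroma)
    }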
0 replies · 0 boosts · 175 views · 1w
Accessing camera from SSO extension
I'm trying to create an "Extensible Enterprise SSO" extension as described in the Introducing Extensible Enterprise SSO tech talk. My SSO extension works fine, but I want to be able to access the camera (via AVFoundation) from within the SSO extension. According to this thread (which I can't seem to reply to), it should be possible to access the camera from within an SSO extension; however, this doesn't work for me. When I try to access the camera, I get the permission dialog, but after accepting, the camera preview is empty and no camera frames are produced. I don't get any errors/warnings in the logs, but it immediately fires an AVCaptureSession.wasInterruptedNotification with AVCaptureSessionInterruptionReasonKey = 1, which corresponds to videoDeviceNotAvailableInBackground. However, the SSO extension's view controller is clearly not in the background, so is this a bug, or are there special rules for requesting camera permission in an SSO extension? The same camera access works fine in the host app, just not inside the extension. Interestingly, accessing the camera in a WKWebView using various webcam test pages doesn't work either. All of these tests have been on iPadOS 18.
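A small sketch for confirming the interruption reason programmatically (session setup omitted); this only verifies the diagnosis above, it doesn't work around it.

    import AVFoundation

    func observeInterruptions(for session: AVCaptureSession) -> NSObjectProtocol {
        NotificationCenter.default.addObserver(
            forName: AVCaptureSession.wasInterruptedNotification,
            object: session,
            queue: .main
        ) { note in
            if let value = note.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
               let reason = AVCaptureSession.InterruptionReason(rawValue: value) {
                // In the case above this is .videoDeviceNotAvailableInBackground (1).
                print("Capture interrupted: \(reason)")
            }
        }
    }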
0 replies · 1 boost · 175 views · 1w
FaceTime camera returns promises before resolving them
The attached strange_behaviour.mp4 video shows how, when running a list of statements to open (startCamera) and close (closeCamera) the camera on a MacBook Pro 2019 using the FaceTime camera, these return their values almost immediately, when in reality the camera is still opening and closing. We believe there might be a queue of statements to run against the camera, and the awaits finish when all the statements have been inserted into the queue instead of when they are actually resolved. We also attached the expected_behaviour.mp4 video, where we replicate the same flow but use an external camera instead of the FaceTime camera. In this video, the promises take a bit longer to return, but they do so only once the camera has actually been opened and closed the requested number of times. The project used in the videos is attached as project.tar.xz. We would like to know if the behaviour in strange_behaviour.mp4 is reproducible on your side, and if there is a way to access the FaceTime camera so it behaves like an external camera (expected_behaviour.mp4). Attachments: https://drive.google.com/drive/folders/1cOeFb5GMbh4mPOeZiZyyevk3N778Kn1M?usp=sharing
0 replies · 0 boosts · 132 views · 2w
iPhone 14 Plus Rear Camera Issues After iOS 18 Update - Green Tint and Limited Functionality
Hi everyone, I'm experiencing a frustrating issue with my iPhone 14 Plus after updating to iOS 18. Ever since the update, the rear camera has become almost completely unusable, while the front camera still works fine. Here's a breakdown of the problems:

- Camera app shows a black screen: when I open the Camera app and switch to the rear camera, the screen is just black, with no visibility of any objects or scenes.
- Limited photo capability: I can only take photos using the 0.5x zoom, and even then the results are abnormal. Every photo taken with the rear camera has a strange green light at the lower bottom, just like the one in the attached photo.
- Unavailable features: all rear camera features, like video, panorama, portrait, and cinematic mode, don't work at all. It's like the camera is only partially functional.
- "Clips" app works normally: interestingly, the "Clips" app is still able to take videos with the rear camera, so it seems like the camera hardware itself is fine. This issue seems specific to the Camera app and any functions within it.

I waited for two updates (iOS 18.0.1 and 18.1) hoping Apple would resolve this, but unfortunately these updates haven't fixed the issue. I even went to the Apple Service Centre, but since my device is out of warranty, they told me that if I want it fixed, I'd have to pay MYR905 (around USD$200+) to replace the rear camera. They diagnosed it as a hardware issue, but I can't help but feel it's related to the iOS 18 update, as the problem started right after updating. I've always taken good care of my device, so this doesn't seem to be due to any physical damage or misuse on my part. Is anyone else experiencing something similar? Could this be a software bug in iOS 18? Any help or insights would be greatly appreciated. Thanks!
1 reply · 0 boosts · 182 views · 2w
Access main camera on Apple Vision Pro
From visionOS 2.0 we can access the Apple Vision Pro's main camera, but only with an Enterprise account, as it is an enterprise-only API. I have a normal developer account, and I want to use the main camera to build a video-call feature in my app using the AVP's main camera. Is it possible to do this with a developer account only? Currently, using that account, I am not able to create the entitlement certificate, as there is no option for it.
2 replies · 0 boosts · 212 views · 2w
Issues with capturing bracketed photos using iPhone 16 Pro
I am experiencing a bug when using an AVCapturePhotoBracketSettings object to capture a bracketed photo sequence on iPhone 16 Pro. Specifically, when I pass in an array of exposure values [-x, 0, +x], where x >= 3, the high-exposure photo capture returns a black image.

Steps to reproduce:

- Run the sample app I have provided on an iPhone 16 Pro.
- Notice that bracketed images captured with EVs set to [-3, 0, +3], [-4, 0, +4], or [-5, 0, +5] return a black image for the high-exposure photo.
- Notice that on other iOS devices (like iPhone 13 Pro), the high-exposure photo is returned with high brightness, as expected.

I have also added two folders to the sample project with screenshots of the bug: iPhone13Pro & iPhone16Pro. Sample project: https://www.icloud.com/iclouddrive/090O_68Z0Nh2UOxmPRwu56Tmw#Focused16ProBracketedCaptureBug
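For reference, a minimal sketch of the kind of capture call being described, assuming an AVCapturePhotoOutput that is already attached to a running session; the ±3 EV biases match the failing case above. Note that bracketedSettings must not exceed the output's maxBracketedCapturePhotoCount.

    import AVFoundation

    func captureBracket(with output: AVCapturePhotoOutput,
                        delegate: AVCapturePhotoCaptureDelegate) {
        let biases: [Float] = [-3, 0, 3]
        let bracketed = biases.map {
            AVCaptureAutoExposureBracketedStillImageSettings
                .autoExposureSettings(exposureTargetBias: $0)
        }
        let settings = AVCapturePhotoBracketSettings(
            rawPixelFormatType: 0, // processed-only bracket, no RAW
            processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc],
            bracketedSettings: bracketed
        )
        output.capturePhoto(with: settings, delegate: delegate)
    }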
1 reply · 0 boosts · 204 views · 2w
Camera does not detect Data Matrix code
Hello, I have a problem reading a 2D Data Matrix code with the camera. In the application, I use AVFoundation to operate the camera and work with 2D codes, and in the vast majority of cases there is no problem reading them. Nothing special. I originally thought it might be a problem in my code, but I got the same result when I tried the Camera app built into iOS: only the Live Text API for text recognition worked. I am attaching the code the camera has a problem with, even though it looks perfectly fine at first glance; a classic handheld 2D code reader reads it just fine. Can someone please explain why the camera, which normally reads these codes at the speed of light, sometimes has a problem with particular codes? Thank you. [Personal Information Edited by Moderator]
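Since the poster's code isn't shown, here is a minimal sketch of the standard AVFoundation Data Matrix setup for comparison; session configuration (adding the camera input, starting the session) is assumed to have happened already.

    import AVFoundation

    func addDataMatrixOutput(to session: AVCaptureSession,
                             delegate: AVCaptureMetadataOutputObjectsDelegate) {
        let output = AVCaptureMetadataOutput()
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
        output.setMetadataObjectsDelegate(delegate, queue: .main)
        // metadataObjectTypes must be a subset of availableMetadataObjectTypes,
        // which is only populated after the output joins a configured session.
        if output.availableMetadataObjectTypes.contains(.dataMatrix) {
            output.metadataObjectTypes = [.dataMatrix]
        }
    }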
1 reply · 0 boosts · 159 views · 3w
com.apple.security.device.camera is being added to a MacCatalyst build Xcode 16.1
Hi, I have been building a Mac Catalyst version of an iOS app for years, using a separate build with a specific .entitlements file that excludes com.apple.security.device.camera. Yet when I now build with Xcode 16.1, that entitlement is included. I have double-checked the signing entitlements for my Mac Catalyst build; they are configured properly. I have checked my .entitlements file to ensure com.apple.security.device.camera is not there. All is as it should be. I have changed nothing; my build flow is the same. App Store Review has prevented the Mac build from being released because com.apple.security.device.camera is set. What can I do to correct this?
1 reply · 0 boosts · 143 views · 3w
Front-Facing Camera Rotation Matrix in ARKit: Consistency, Transformations, and ARFrame.camera Alignment
I'm seeking detailed information about the rotation matrix of the iPhone's front-facing (selfie) camera when using ARKit. Specifically, I need to understand:

- The exact rotation matrix applied to the front-facing camera's output in ARKit.
- Whether this matrix is consistent across all iPhone models, or if there are variations.
- Whether any transformations are applied to align the camera's coordinate system with the device's orientation, particularly in portrait mode.
- How this rotation matrix relates to the transform property of ARFrame.camera.
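As a starting point, a sketch of extracting the rotation part of ARFrame.camera.transform: the transform is the camera-to-world pose, and its upper-left 3x3 is the rotation in question. Whether further display-orientation corrections apply in portrait mode is exactly the open question above.

    import ARKit

    func cameraRotation(from frame: ARFrame) -> simd_float3x3 {
        let t = frame.camera.transform // simd_float4x4, camera-to-world pose
        return simd_float3x3(
            simd_make_float3(t.columns.0),
            simd_make_float3(t.columns.1),
            simd_make_float3(t.columns.2)
        )
    }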
0 replies · 0 boosts · 258 views · Oct ’24
Some problems using AVCaptureControl
I set 3 controls on the AVCaptureSession and then remove them all. The number of controls in the session is indeed 0, but the Camera Control button still shows the previous 3 controls. If it is only 3->2 or 3->1, the button updates normally; 0->3 is OK, but 3->0 is not.

    if (self.captureControl.zoom) {
        if (self.zoomScaleControl) {
            self.zoomScaleControl.enabled = false;
            [_session removeControl:self.zoomScaleControl];
        }
        AVCaptureSlider *zoomSlider = [self.captureControl.zoom fetchCaptureSlider];
        [zoomSlider setActionQueue:dispatch_get_main_queue() action:^(float zoomFactor) {
            @strongify(self);
            if ([self.dataOutputDelegate respondsToSelector:@selector(videoCaptureSession:tryChangeZoomScale:)]) {
                [self.dataOutputDelegate videoCaptureSession:self tryChangeZoomScale:zoomFactor];
            }
        }];
        self.zoomScaleControl = zoomSlider;
    } else {
        if (self.zoomScaleControl) {
            self.zoomScaleControl.enabled = false;
            [_session removeControl:self.zoomScaleControl];
        }
        self.zoomScaleControl = nil;
    }

    if (self.captureControl.exposure) {
        if (self.exposureBiasControl) {
            self.exposureBiasControl.enabled = false;
            [_session removeControl:self.exposureBiasControl];
        }
        AVCaptureSlider *exposureSlider = [self.captureControl.exposure fetchCaptureSlider];
        [exposureSlider setActionQueue:dispatch_get_main_queue() action:^(float bias) {
            @strongify(self);
            if ([self.dataOutputDelegate respondsToSelector:@selector(videoCaptureSession:tryChangeExposureBias:)]) {
                [self.dataOutputDelegate videoCaptureSession:self tryChangeExposureBias:bias];
            }
        }];
        self.exposureBiasControl = exposureSlider;
    } else {
        if (self.exposureBiasControl) {
            self.exposureBiasControl.enabled = false;
            [_session removeControl:self.exposureBiasControl];
        }
        self.exposureBiasControl = nil;
    }

    if (self.captureControl.len) {
        if (self.lenControl) {
            self.lenControl.enabled = false;
            [_session removeControl:self.lenControl];
        }
        ORLenCaptureControlCustomModel *len = self.captureControl.len;
        AVCaptureIndexPicker *picker = [len fetchCaptureSlider];
        [picker setActionQueue:dispatch_get_main_queue() action:^(NSInteger selectedIndex) {
            @strongify(self);
            if ([self.dataOutputDelegate respondsToSelector:@selector(videoCaptureSession:didChangeLenIndex:datas:)]) {
                [self.dataOutputDelegate videoCaptureSession:self didChangeLenIndex:selectedIndex datas:self.captureControl.len.indexDatas];
            }
        }];
        self.lenControl = picker;
    } else {
        if (self.lenControl) {
            self.lenControl.enabled = false;
            [_session removeControl:self.lenControl];
        }
        self.lenControl = nil;
    }

    if ([_session canAddControl:self.zoomScaleControl]) {
        [_session addControl:self.zoomScaleControl];
    } else {
        self.zoomScaleControl = nil;
    }
    if ([_session canAddControl:self.lenControl]) {
        [_session addControl:self.lenControl];
    } else {
        self.lenControl = nil;
    }
    if ([_session canAddControl:self.exposureBiasControl]) {
        [_session addControl:self.exposureBiasControl];
    } else {
        self.exposureBiasControl = nil;
    }
    if (_session.controlsDelegate == nil) {
        [_session setControlsDelegate:self queue:GetCaptureControlQueue()];
    }
0 replies · 0 boosts · 181 views · Oct ’24