Object Capture

Turn photos from your iPhone or iPad into high‑quality 3D models that are optimized for AR using the new Object Capture API on macOS Monterey.

Posts under Object Capture tag

34 Posts
Post · Replies · Boosts · Views · Activity

ObjectCapture from ARKit
We are currently using ObjectCapture with ARKit, and we would like to fix the exposure time, white balance parameters, and ISO. How can we do this? Additionally, we'd like to obtain the following information from ARKit: the white balance parameters (in case we cannot fix them) and the color correction matrices.
0 · 0 · 132 · 1w
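One possible approach to the post above, sketched under the assumption of iOS 16 or later: ARKit exposes the primary camera as an AVCaptureDevice, which supports locking exposure duration, ISO, and white balance, and reading the current white balance gains back. The numeric values below are illustrative placeholders; as far as I know, ARKit does not expose the color correction matrices directly.

import ARKit
import AVFoundation

// Sketch: lock exposure time, ISO, and white balance for an ARKit session.
// configurableCaptureDeviceForPrimaryCamera is iOS 16+ and nil on
// unsupported hardware.
func lockCameraSettings() throws {
    guard let device = ARWorldTrackingConfiguration.configurableCaptureDeviceForPrimaryCamera else {
        return // direct camera configuration not available
    }
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    // Fix exposure time and ISO (placeholder values; clamp to the ranges in
    // device.activeFormat before using them in real code).
    device.setExposureModeCustom(duration: CMTime(value: 1, timescale: 60),
                                 iso: 100,
                                 completionHandler: nil)
    // Read the current white balance gains (the "obtain them" part)...
    let gains = device.deviceWhiteBalanceGains
    print("WB gains:", gains.redGain, gains.greenGain, gains.blueGain)
    // ...and lock white balance at whatever the device is currently using.
    device.setWhiteBalanceModeLocked(with: AVCaptureDevice.currentWhiteBalanceGains,
                                     completionHandler: nil)
}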
Are there other ways to train the model? Training the visionOS tracking model using Create ML on the Mac takes too long each time.
I am developing for Apple Vision Pro to implement object tracking, but each model needs to go through Create ML for training, and the training time is very long. Are there other ways to shorten training time while still producing reference files in the same format? Additionally, can the delay in object tracking be optimized further? Although the refresh rate has been optimized, there is still a noticeable delay.
1 · 0 · 268 · 1w
PhotogrammetrySession on non-Pro iPhone
Hello, I'm creating an app that uses the PhotogrammetrySession class to build 3D objects from photographs (https://developer.apple.com/documentation/realitykit/creating-3d-objects-from-photographs). I'm wondering why this class works only on Pro iPhones (12 Pro, 13 Pro, 14 Pro, 15 Pro, and 16 Pro) and on no non-Pro iPhone. My app does not use LiDAR, so that's not the problem. I thought it could be power-related, but the A18 SoC in the iPhone 16 is more powerful than the A14 Bionic in the iPhone 12 Pro (I could also mention the iPhone 13 Pro and iPhone 14, which both have the A15 Bionic, whereas only the first is compatible). Did I miss something that could explain these restrictions? Is there any plan to make this class usable on every iPhone powerful enough to run it? Thanks in advance.
0 · 0 · 114 · 2w
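The device restriction above can at least be detected at runtime instead of being hard-coded. A minimal sketch, assuming iOS 17+/macOS 12+ and a hypothetical imagesDirectoryURL pointing at the captured photos:

import RealityKit
import Foundation

// Gate reconstruction on the class-level support flag rather than a
// hand-maintained list of device models.
func reconstructIfPossible(imagesDirectoryURL: URL) throws {
    guard PhotogrammetrySession.isSupported else {
        // e.g. offer to send the photos to a Mac for reconstruction instead
        return
    }
    let session = try PhotogrammetrySession(input: imagesDirectoryURL)
    // ... submit requests and observe session.outputs from here
}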
Apple's Object Capture
We are currently using Apple's Object Capture module and wonder if it would be possible to collect the following data:
- Device information
- Current translation / rotation
- Focal length embedded in the image headers
- GPS localization information
- Information about the exposure time
- White balance parameters and the color correction matrices
We also have two additional questions: Is there an option to block close-up accommodation of the camera? Is there a way for the Object Capture module to take a video instead of a series of pictures?
1 · 0 · 249 · Oct ’24
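Some of the data requested above (focal length, exposure time, GPS, and basic white balance tags) is normally embedded in the EXIF headers of the saved image files, so it can be read back with ImageIO. A sketch; whether Object Capture writes every one of these tags, and the color correction matrices in particular, is not something I can confirm:

import ImageIO
import Foundation

// Read EXIF and GPS metadata from one captured image file.
func printMetadata(for url: URL) {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any] else {
        return
    }
    if let exif = props[kCGImagePropertyExifDictionary] as? [CFString: Any] {
        print("Exposure time:", exif[kCGImagePropertyExifExposureTime] ?? "n/a")
        print("Focal length:", exif[kCGImagePropertyExifFocalLength] ?? "n/a")
        print("White balance:", exif[kCGImagePropertyExifWhiteBalance] ?? "n/a")
    }
    if let gps = props[kCGImagePropertyGPSDictionary] as? [CFString: Any] {
        print("GPS:", gps)
    }
}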
Constant color API improvement
I've experimented quite a bit with the new API designed to neutralize image colors using the iPhone flash, and I think the concept is brilliant. The flash could potentially serve as a substitute for a color checker, given our full control over it. However, I believe there are several areas where this API could be improved.

Firstly, the resulting images often appear "unattractive": colors tend to look faded, and the images themselves can be overly bright and washed out, losing the natural ambiance and shadows and introducing unwanted flash reflections. There is also inconsistency in color rendering; for example, yellows sometimes appear unnatural, possibly due to reflections. In some cases, all the colors in the image are completely desaturated or become black and white if another light source does not fully illuminate the scene. Additionally, the shadows cast by the flash are not color-corrected properly, since they fall outside the flash's range.

I think many of these issues could be resolved if we had access to ProRAW images capturing both the ambient light (without flash) and the flash-illuminated scene. With these, we could use specific colors in the image as a reference, similar to a color checker, to create an ICC profile or color transformation matrix to adjust the image colors more globally. This approach could help retain the shadows from the ambient light while still correcting colors to a neutral tone. Access to ProRAW data is crucial for this, as it would provide images without the saturation issues that can affect some colors, and with a linear tone curve. I hope this suggestion makes sense and could help improve the API's effectiveness.
0 · 1 · 307 · Aug ’24
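For context, the API discussed above is the constant-color capture option added to AVFoundation in iOS 18, which can also deliver the unmodified ambient photo alongside the flash-corrected one; that gets partway toward the ambient-plus-flash pair proposed in the post. A rough sketch; treat the exact property names as my best recollection of the iOS 18 API rather than verified signatures:

import AVFoundation

// Sketch: request a constant-color capture plus the normal ambient photo.
func makeConstantColorSettings(output: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings()
    if output.isConstantColorSupported {
        output.isConstantColorEnabled = true   // opt in on the output
        settings.isConstantColorEnabled = true // opt in per capture
        // Also deliver the non-flash photo, useful for preserving ambient
        // shadows or doing a custom blend as suggested above.
        settings.isConstantColorFallbackPhotoDeliveryEnabled = true
    }
    return settings
}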
Object Capture API crashes frequently when starting model generation
I have updated the sample code so that the scan starts generating when 15 photos are captured. I hope I can catch this error so the app won't crash. I really need help on this; thank you in advance!
Hardware Model: iPhone14,2
OS Version: iPhone OS 17.6.1 (21G93)
Exception Type: EXC_BREAKPOINT (SIGTRAP)
Exception Codes: 0x0000000000000001, 0x000000023363518c
Termination Reason: SIGNAL 5 Trace/BPT trap: 5
Terminating Process: exc handler [525]
Triggered by Thread: 0
Thread 0 name:
Thread 0 Crashed:
0 RealityKit_SwiftUI 0x000000023363518c CoveragePointCloudMiniView.interfaceOrientation.getter + 508 (CoveragePointCloudMiniView.swift:0)
1 RealityKit_SwiftUI 0x0000000233634cdc closure #1 in closure #2 in CoveragePointCloudMiniView.body.getter + 124 (CoveragePointCloudMiniView.swift:75)
2 RealityKit_SwiftUI 0x000000023363db9c partial apply for closure #1 in closure #2 in CoveragePointCloudMiniView.body.getter + 20 (:0)
3 SwiftUI 0x0000000195c4bbac closure #1 in withTransaction(::) + 276 (Transaction.swift:243)
4 SwiftUI 0x0000000195c4ba90 partial apply for closure #1 in withTransaction(::) + 24 (:0)
5 libswiftCore.dylib 0x00000001903f8094 withExtendedLifetime<A, B>(::) + 28 (LifetimeManager.swift:27)
6 SwiftUI 0x0000000195b17d78 withTransaction(::) + 72 (Transaction.swift:228)
7 SwiftUI 0x0000000195b17d04 withAnimation(::) + 116 (Transaction.swift:280)
8 RealityKit_SwiftUI 0x0000000233634bfc closure #2 in CoveragePointCloudMiniView.body.getter + 664 (CoveragePointCloudMiniView.swift:73)
9 SwiftUI 0x0000000195bef134 closure #1 in closure #1 in SubscriptionView.Subscriber.updateValue() + 72 (SubscriptionView.swift:66)
10 SwiftUI 0x0000000195b3f57c thunk for @escaping @callee_guaranteed () -> () + 28 (:0)
11 SwiftUI 0x0000000195b3c864 static Update.dispatchActions() + 1140 (Update.swift:151)
12 SwiftUI 0x0000000195b3bedc static Update.end() + 144 (Update.swift:58)
13 SwiftUI 0x0000000195a691fc closure #1 in SubscriptionView.Subscriber.updateValue() + 700 (SubscriptionView.swift:66)
14 SwiftUI 0x0000000195a68eb0 partial apply for thunk for @escaping @callee_guaranteed (@in_guaranteed A.Publisher.Output) -> () + 28 (:0)
15 SwiftUI 0x0000000195a68e78 closure #1 in ActionDispatcherSubscriber.respond(to:) + 76 (SubscriptionView.swift:98)
16 SwiftUI 0x0000000195a68c80 ActionDispatcherSubscriber.respond(to:) + 816 (SubscriptionView.swift:97)
17 SwiftUI 0x0000000195a68938 ActionDispatcherSubscriber.receive(:) + 16 (SubscriptionView.swift:110)
18 SwiftUI 0x0000000195a6786c SubscriptionLifetime.Connection.receive(:) + 100 (SubscriptionLifetime.swift:195)
19 Combine 0x000000019aed29d4 Publishers.Autoconnect.Inner.receive(:) + 52 (Autoconnect.swift:142)
20 Combine 0x000000019aed2928 Publishers.Multicast.Inner.receive(:) + 244 (Multicast.swift:211)
21 Combine 0x000000019aed2828 protocol witness for Subscriber.receive(_:) in conformance Publishers.Multicast<A, B>.Inner + 24 (:0)
....
(FBSScene.m:812)
46 FrontBoardServices 0x00000001aa892844 __94-[FBSWorkspaceScenesClient _queue_updateScene:withSettings:diff:transitionContext:completion:]_block_invoke_2 + 152 (FBSWorkspaceScenesClient.m:692)
47 FrontBoardServices 0x00000001aa8926cc -[FBSWorkspace _calloutQueue_executeCalloutFromSource:withBlock:] + 168 (FBSWorkspace.m:411)
48 FrontBoardServices 0x00000001aa8977fc __94-[FBSWorkspaceScenesClient _queue_updateScene:withSettings:diff:transitionContext:completion:]_block_invoke + 344 (FBSWorkspaceScenesClient.m:691)
49 libdispatch.dylib 0x00000001999aedd4 _dispatch_client_callout + 20 (object.m:576)
50 libdispatch.dylib 0x00000001999b286c _dispatch_block_invoke_direct + 288 (queue.c:511)
51 FrontBoardServices 0x00000001aa893d58 FBSSERIALQUEUE_IS_CALLING_OUT_TO_A_BLOCK + 52 (FBSSerialQueue.m:285)
52 FrontBoardServices 0x00000001aa893cd8 -[FBSMainRunLoopSerialQueue _targetQueue_performNextIfPossible] + 240 (FBSSerialQueue.m:309)
53 FrontBoardServices 0x00000001aa893bb0 -[FBSMainRunLoopSerialQueue performNextFromRunLoopSource] + 28 (FBSSerialQueue.m:322)
54 CoreFoundation 0x0000000191adb834 CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION + 28 (CFRunLoop.c:1957)
55 CoreFoundation 0x0000000191adb7c8 __CFRunLoopDoSource0 + 176 (CFRunLoop.c:2001)
56 CoreFoundation 0x0000000191ad92f8 __CFRunLoopDoSources0 + 340 (CFRunLoop.c:2046)
57 CoreFoundation 0x0000000191ad8484 __CFRunLoopRun + 828 (CFRunLoop.c:2955)
58 CoreFoundation 0x0000000191ad7cd8 CFRunLoopRunSpecific + 608 (CFRunLoop.c:3420)
59 GraphicsServices 0x00000001d65251a8 GSEventRunModal + 164 (GSEvent.c:2196)
60 UIKitCore 0x0000000194111ae8 -[UIApplication run] + 888 (UIApplication.m:3713)
61 UIKitCore 0x00000001941c5d98 UIApplicationMain + 340 (UIApplication.m:5303)
62 SwiftUI 0x0000000195ccc294 closure #1 in KitRendererCommon(:) + 168 (UIKitApp.swift:51)
63 SwiftUI 0x0000000195c78860 runApp(:) + 152 (UIKitApp.swift:14)
64 SwiftUI 0x0000000195c8461c static App.main() + 132 (App.swift:114)
65 SoleFit 0x0000000103046cd4 static SoleFitApp.$main() + 24 (SoleFitApp.swift:0)
66 SoleFit 0x0000000103046cd4 main + 36
67 dyld 0x00000001b52af154 start + 2356 (dyldMain.cpp:1298)
1 · 0 · 321 · Aug ’24
Create 3D models with Object Capture on iPhone vs. on the Mac
1. After comparing the highest-accuracy model generated from photos taken on the phone with the .raw-detail model generated from the same data set on the Mac, the results differ: sometimes the phone model is more accurate, and sometimes the Mac model is. What is the difference between the two? According to WWDC23, the Mac can generate a higher-precision model, yet in actual testing the completeness of the Mac-generated model is sometimes worse than the phone's. Why is that?
2. Is it possible to set the accuracy of the model generated on the phone?
0 · 0 · 530 · Jul ’24
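On question 2 above: on the Mac side, accuracy is chosen per request through PhotogrammetrySession's detail levels (.preview, .reduced, .medium, .full, .raw); as far as I know, the on-device iPhone pipeline offers no equivalent knob. A minimal sketch, with folderURL and outputURL as hypothetical placeholders:

import RealityKit
import Foundation

// Sketch: request the highest-detail (.raw) reconstruction on macOS.
func reconstructRaw(folderURL: URL, outputURL: URL) throws {
    let session = try PhotogrammetrySession(input: folderURL)
    try session.process(requests: [
        .modelFile(url: outputURL, detail: .raw)
    ])
}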
ObjectCapture and ARObjectAnchor
Is it possible to capture both the images required for Object Capture and the scan data required to create an ARObjectAnchor (and be able to align the two to each other)? Perhaps an extension of this WWDC 2020 example that also integrates USDZ object capture (instead of only importing an external one)? https://developer.apple.com/documentation/arkit/arkit_in_ios/content_anchors/scanning_and_detecting_3d_objects?changes=_2
2 · 0 · 561 · Jul ’24
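The ARObjectAnchor half of the question above is still served by the ARKit scanning API from that sample. A sketch of extracting and exporting a reference object while a session runs ARObjectScanningConfiguration; the transform, center, and extent must come from your own bounding-box UI, and how best to align the result with a separate Object Capture pass I can't say:

import ARKit

// Sketch: create and export an ARReferenceObject from a scanning session.
func exportReferenceObject(session: ARSession,
                           transform: simd_float4x4,
                           center: SIMD3<Float>,
                           extent: SIMD3<Float>,
                           to url: URL) {
    session.createReferenceObject(transform: transform,
                                  center: center,
                                  extent: extent) { referenceObject, error in
        guard let referenceObject else {
            print("Scan failed:", error ?? "unknown error")
            return
        }
        do {
            try referenceObject.export(to: url, previewImage: nil)
        } catch {
            print("Export failed:", error)
        }
    }
}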
visionOS Object Capture: get entity position
Hello :) As the title says, I have used RCP with reference objects to capture items in the real world. My next step is to detect how close the user's finger is to that object. I tried to get the entity's position relative to the root, but found that the position, somehow, is always the same regardless of how and where I move the camera or the object. The entity has a child transform with a collision component, which is used to detect a collision when the finger is close enough to calculate the distance, but that fails as well... Any help will be appreciated, thank you.
1 · 0 · 540 · Jul ’24
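One thing worth checking in the post above: position(relativeTo:) with a fixed ancestor returns coordinates in that ancestor's space, which stay constant unless the entity itself moves within it; querying relative to nil returns world space, which does update as the tracked object moves. A small sketch, where fingerTipPosition is a hypothetical value obtained from hand tracking:

import RealityKit
import simd

// Sketch: distance from a tracked entity to the user's fingertip.
func distanceToFinger(entity: Entity, fingerTipPosition: SIMD3<Float>) -> Float {
    // relativeTo: nil = world space; updates as the object is tracked.
    let worldPosition = entity.position(relativeTo: nil)
    return simd_distance(worldPosition, fingerTipPosition)
}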
3D Object Capture not working on iPhone 12 Pro
The 3D Object Capture feature doesn't seem to work on my iPhone 12 Pro. The circle that is supposed to show up when you begin to move around the object doesn't appear, so object capture never starts. It just says 'more light...' or 'move closer', but this doesn't happen on my iPhone 14 Pro; it works perfectly fine there, even with the same lighting. How can this be fixed?
1 · 0 · 522 · Jul ’24
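A quick way to rule out a device-support issue like the one above is the runtime flag on the guided capture session. A minimal sketch, assuming iOS 17+:

import RealityKit

// Sketch: check guided Object Capture support before showing the flow.
func startCaptureIfSupported() {
    if ObjectCaptureSession.isSupported {
        // create the session and present the capture UI
    } else {
        // fall back, e.g. plain photo capture plus reconstruction on a Mac
    }
}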
Object Tracking Training: Objects to avoid
I'm working on training an object-tracking model in Create ML for visionOS on an object that has fan blades, and I'm trying to train while ignoring a section of the geometry. Currently, training can detect the object when the fan blades are in the same orientation as when scanned, but it struggles once they move. I see there is an "objects to avoid" data source that can be added, but after reading the description I don't think it does what I need, though I could be wrong. Is there any way to have the training ignore a part of the geometry that has a significant effect on the silhouette of the object?
0 · 0 · 536 · Jul ’24
Trouble with Core ML Object Tracking for Spherical Objects Using WWDC Sample Code and Object Capture
Hi everyone, I'm working with Core ML for object tracking and have successfully trained a couple of models. However, when I try to use the reference object file in the object-tracking sample code from the WWDC video, it doesn't work. I'm training the model on a single-color plastic spherical object. Could this be the reason for the issue? I also attempted to use USDZ 3D assets that resemble the real object. Do these need to be trained with the Object Capture app to work properly? Speaking of the Object Capture app, my experience hasn't been great: when I scanned my spherical object, the result was far from a sphere; it looked more like mushy dough. Any insights or suggestions would be greatly appreciated!
2 · 0 · 774 · Jul ’24
New Object Capture Sample Code
Hi, Object Capture's original sample code was released last year, and this year there was a talk about adding area mode to it. The talk links to the old Object Capture code - when can I expect to have the new one with area mode, and is there anything I can help you with to have it published faster? Thanks!
3 · 1 · 865 · Jun ’24
Website header disabled on MacBook
Hi, my name is Zubair and I am a blogger. I recently created a website about gold prices in Oman (URL: omangoldprices.com). I made a well-designed header, footer, and everything else compared with my competitors, but the header of my website is disabled and not working on MacBook and MacBook Pro, and the logo does not show on iOS 12.6.1 on an iPhone 6. I am very confused about this; can anyone help me?
0 · 0 · 524 · May ’24
Object Capture to USDZ Scaling
I'm trying to take an object capture and scale it. What I've done so far is create a Reality Composer project, insert the .objcap file into the project, and scale it from 100% to 200%. I then extracted it as a USDZ. It just won't show up in the Xcode preview now, and I'm not sure why. Is there any way to fix this? I'm going crazy trying to find a fix that works.
1 · 0 · 814 · Apr ’24
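As a possible workaround for the post above, the scaling step can be done in code rather than in Reality Composer, sidestepping the re-export entirely. A sketch, with usdzURL standing in for the exported capture:

import RealityKit
import Foundation

// Sketch: load the captured model and scale it to 200% at load time.
func loadScaledModel(usdzURL: URL) throws -> Entity {
    let entity = try Entity.load(contentsOf: usdzURL)
    entity.scale *= 2.0 // 100% -> 200%
    return entity
}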
Meet Object Capture for iOS
Hi,
We are using the "Scanning objects using Object Capture" sample app provided by Apple. It was working fine, but it suddenly started crashing while scanning an object with the bounding-box settings.
We are getting device logs, but they show different reasons for the crash.
Device: iPad Pro / iPhone 15 Pro
iOS: 17.4
Attaching the device log for the crash below; waiting for your response.
-------------------------------------
Translated Report (Full Report Below)
-------------------------------------
Incident Identifier: 0797ADAF-D653-4C92-8AA0-300AA167002B
CrashReporter Key: 48724a3b30ef15e069f513afff7d1aa2e935a520
Hardware Model: iPhone16,1
Process: GuidedCapture [918]
Path: /private/var/containers/Bundle/Application/ACCE6C58-98F0-4DD7-AA6E-732190E0FD30/GuidedCapture.app/GuidedCapture
Identifier: com.example.apple-samplecode.GuidedCaptureH28X75MLUY
Version: 1.0 (1)
Code Type: ARM-64 (Native)
Role: Foreground
Parent Process: launchd [1]
Coalition: com.example.apple-samplecode.GuidedCaptureH28X75MLUY [743]
Date/Time: 2024-03-26 18:23:07.8269 +0530
Launch Time: 2024-03-26 18:20:02.5964 +0530
OS Version: iPhone OS 17.4.1 (21E236)
Release Type: User
Baseband Version: 1.55.04
Report Version: 104
Exception Type: EXC_BREAKPOINT (SIGTRAP)
Exception Codes: 0x0000000000000001, 0x000000022e6577a4
Termination Reason: SIGNAL 5 Trace/BPT trap: 5
Terminating Process: exc handler [918]
Triggered by Thread: 33
Thread 33 name: Dispatch queue: com.apple.coreoc.queues.serial.session
Thread 33 Crashed:
0 CoreOC 0x22e6577a4 0x22e657400 + 932
1 CoreOC 0x22e657588 0x22e657401 + 391
2 CoreOC 0x22e697628 0x22e697221 + 1031
3 CoreOC 0x22e684864 0x22e684431 + 1075
4 CoreOC 0x22e6831ec 0x22e6828d5 + 2327
5 CoreOC 0x22e6c1fb4 0x22e6c1f5d + 87
6 CoreOC 0x22e5ef128 0x22e5ef105 + 35
7 libdispatch.dylib 0x19291113c _dispatch_call_block_and_release + 31
8 libdispatch.dylib 0x192912dd4 _dispatch_client_callout + 19
9 libdispatch.dylib 0x19291a400 _dispatch_lane_serial_drain + 747
10 libdispatch.dylib 0x19291af30 _dispatch_lane_invoke + 379
11 libdispatch.dylib 0x192925cb4 _dispatch_root_queue_drain_deferred_wlh + 287
12 libdispatch.dylib 0x192925528 _dispatch_workloop_worker_thread + 403
13 libsystem_pthread.dylib 0x1e69f8f20 _pthread_wqthread + 287
14 libsystem_pthread.dylib 0x1e69f8fc0 start_wqthread + 7
0 · 0 · 612 · Mar ’24