Integrate iOS device camera and motion features to produce augmented reality experiences in your app or game using ARKit.

ARKit Documentation

Testing ARGeoTrackingConfiguration
I am developing an app using ARKit 4's ARGeoTrackingConfiguration, following https://developer.apple.com/documentation/arkit/content_anchors/tracking_geographic_locations_in_ar. I am outside of the US, so my location is not supported. I simulate a supported location, but the coaching state always stays at .initializing. Is there a way to test geo-tracking apps outside of the US?
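A minimal availability probe may help narrow this down. One caveat (an inference from the docs, not a confirmed answer): geo tracking matches the camera feed against Apple's localization imagery of the physical surroundings, so a simulated GPS fix alone may never get past .initializing.

```swift
import ARKit
import CoreLocation

// Probe geo-tracking availability before running the session.
// checkAvailability(at:) consults Apple's coverage for the coordinate,
// but the running session also needs imagery matching the device's
// real surroundings, which is presumably why a simulated location
// can stay stuck at .initializing.
func probeGeoTracking(at coordinate: CLLocationCoordinate2D) {
    guard ARGeoTrackingConfiguration.isSupported else {
        print("This device doesn't support geo tracking")
        return
    }
    ARGeoTrackingConfiguration.checkAvailability(at: coordinate) { available, error in
        if let error {
            print("Availability check failed: \(error.localizedDescription)")
        }
        print("Geo tracking available at \(coordinate): \(available)")
    }
}
```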
Replies: 1 · Boosts: 0 · Views: 778 · Apr ’21
How is the depth map aligned to the RGB image?
I want to know whether the depth map and the RGB image are perfectly aligned (do both have the same principal point)? If yes, how is the depth map created? The depth map on the iPhone 12 has a 256x192 resolution, as opposed to the RGB image (1920x1440). I am interested in exact pixel-wise depth. Is it possible to get a raw depth map at 1920x1440 resolution? How is the depth map created at 256x192 resolution? Behind the scenes, does the pipeline capture it at 1920x1440 and then resize it to 256x192? I have so many questions because no intrinsic, extrinsic, or calibration data is given for the LiDAR. I would greatly appreciate it if someone could explain the steps from a computer-vision perspective. Many thanks
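Apple doesn't document the LiDAR pipeline internals, but the depth map and capturedImage come from the same ARFrame and camera pose, so a common working assumption (an assumption, not a documented fact) is that they share the same field of view and the RGB intrinsics can simply be rescaled by the resolution ratio, which is the same on both axes (256/1920 = 192/1440 = 2/15). A sketch under that assumption:

```swift
import ARKit

// Rescale the RGB intrinsics to depth-map resolution so each depth
// pixel can be unprojected directly. Assumes the depth map covers the
// same field of view as capturedImage.
func depthIntrinsics(for frame: ARFrame) -> simd_float3x3? {
    guard let depthMap = frame.sceneDepth?.depthMap else { return nil }
    let scale = Float(CVPixelBufferGetWidth(depthMap)) /
                Float(frame.camera.imageResolution.width)   // 256 / 1920
    var K = frame.camera.intrinsics  // calibrated for 1920x1440
    K[0][0] *= scale   // fx
    K[1][1] *= scale   // fy
    K[2][0] *= scale   // cx
    K[2][1] *= scale   // cy
    return K
}
```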
Replies: 6 · Boosts: 2 · Views: 2.4k · Aug ’21
RealityKit PhotogrammetrySession not recognizing depth scale
I'm creating a custom scanning solution for iOS and using the RealityKit Object Capture PhotogrammetrySession API to build a 3D model. I'm finding that the API ignores the depth in the data I send, so the model is not built to scale. The documentation is a little light on how to format the depth, so I'm wondering if someone could take a look at some example files I send to the PhotogrammetrySession. Would you be able to tell me what I'm not doing correctly? https://drive.google.com/file/d/1-GoeR_KMhX_i7-y8M8ElDRrySasOdoHy/view?usp=sharing Thank you!
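Not an official answer, but one way to take file-format questions out of the equation is to construct PhotogrammetrySample values directly and attach the depth buffer yourself. A sketch, assuming depth is expected as kCVPixelFormatType_DepthFloat32 with values in meters (disparity buffers would need converting first):

```swift
import RealityKit
import CoreVideo

// Attach depth to a sample explicitly instead of relying on
// file-embedded depth. The format assumption: DepthFloat32 in meters.
// A float buffer in the wrong units would plausibly produce exactly
// this "ignored scale" symptom.
func makeSample(id: Int, image: CVPixelBuffer, depth: CVPixelBuffer) -> PhotogrammetrySample {
    assert(CVPixelBufferGetPixelFormatType(depth) == kCVPixelFormatType_DepthFloat32,
           "depth should be DepthFloat32 in meters")
    var sample = PhotogrammetrySample(id: id, image: image)
    sample.depthDataMap = depth
    return sample
}
```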
Replies: 1 · Boosts: 0 · Views: 1.8k · Jan ’22
Get distance from (u, v, depth) and intrinsic matrix?
Hello! I am having trouble calculating accurate real-world distances using the camera's returned intrinsic matrix and pixel coordinates/depths captured from the iPhone's LiDAR. For example, in the image below, I set a mug 0.5 m from the phone. The mug is 8.5 cm wide. The intrinsic matrix returned from the phone's AVCameraCalibrationData class has focalx = 1464.9269, focaly = 1464.9269, cx = 960.94916, and cy = 686.3547. Selecting the two pixel locations denoted in the image below, I calculated each one's xyz coordinates using the formula:

x = d * (u - cx) / focalx
y = d * (v - cy) / focaly
z = d

where d is the depth from the appropriate pixel in the depth map; I've verified that both depths were 0.5 m. I then calculated the distance between the two points to get the mug width. This gives me a calculated width of 0.0357 m, or 3.5 cm, instead of the 8.5 cm I was expecting. What could account for this discrepancy? Thank you so much for your help!
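The formula itself is the standard pinhole unprojection; here it is as a runnable check, with hypothetical pixel coordinates since the image isn't included. One thing worth ruling out (a guess, not a diagnosis): the AVCameraCalibrationData intrinsics are calibrated for the full-resolution image, so (u, v) read off a lower-resolution buffer such as the 256x192 depth map must be rescaled before being plugged in, or the computed width shrinks.

```swift
import simd

// Unproject a pixel with known depth using pinhole intrinsics.
// (u, v) must be expressed at the resolution the intrinsics describe.
func worldPoint(u: Float, v: Float, depth d: Float,
                fx: Float, fy: Float, cx: Float, cy: Float) -> SIMD3<Float> {
    SIMD3(d * (u - cx) / fx, d * (v - cy) / fy, d)
}

// Hypothetical pixel picks ~250 px apart at full resolution:
let p1 = worldPoint(u: 830, v: 600, depth: 0.5,
                    fx: 1464.9269, fy: 1464.9269, cx: 960.94916, cy: 686.3547)
let p2 = worldPoint(u: 1080, v: 600, depth: 0.5,
                    fx: 1464.9269, fy: 1464.9269, cx: 960.94916, cy: 686.3547)
print("width:", simd_distance(p1, p2))  // ~0.085 m, the expected mug width
```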
Replies: 5 · Boosts: 0 · Views: 2.3k · May ’22
ARKit: Couldn't load AR reference object from URL or AR resource group
I have an .arobject file that's already tested and working perfectly in Reality Composer as an anchor. But for whatever reason, when I try it either as an AR Resource Group in my assets or by loading it directly from its URL, it always fails (returns nil). I've double-checked all the file/group names and they seem fine, but I couldn't find the error; it's just always nil. This is my code:

```swift
var referenceObject: ARReferenceObject?

if let referenceObjects = ARReferenceObject.referenceObjects(inGroupNamed: "TestAR", bundle: Bundle.main) {
    referenceObject = referenceObjects[referenceObjects.startIndex]
}

if let referenceObject = referenceObject {
    delegate.didFinishScan(referenceObject, false)
} else {
    do {
        if let url = Bundle.main.url(forResource: "Dragonball1", withExtension: "arobject") {
            referenceObject = try ARReferenceObject(archiveURL: url)
        }
    } catch let myError {
        let error = myError as NSError
        print("try \(error.code)")
    }
}
```

Any idea? Thanks
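As a quick sanity check (a guess at the cause, not a confirmed one): a nil result from referenceObjects(inGroupNamed:bundle:) typically means the AR Resource Group isn't included in the app target's asset catalog, or the group name doesn't match exactly. A minimal probe:

```swift
import ARKit

// Print what the bundle can actually see. If the group resolves,
// each contained .arobject should be listed by name.
if let objects = ARReferenceObject.referenceObjects(inGroupNamed: "TestAR",
                                                    bundle: .main) {
    for object in objects {
        print("found reference object:", object.name ?? "unnamed")
    }
} else {
    print("group 'TestAR' not found in Bundle.main asset catalogs")
}
```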
Replies: 1 · Boosts: 1 · Views: 703 · Aug ’22
Reality Composer asset not showing up in test app or Apple's own basic AR file project
Hi, I am not sure what is going on... I have been working on this model for a while in Reality Composer and had no problem testing it that way; it always worked out perfectly. So I imported the file into a brand-new Xcode project: I created a new AR app and used SwiftUI. I actually did it twice, and I also tested the version Apple provides with the box. In Apple's version, the app appears, but the whole part where it tries to detect planes never shows up. So I am confused. I found a question that mentions the error messages I am getting, but I am not sure how to get around it: https://developer.apple.com/forums/thread/691882

```swift
//
//  ContentView.swift
//  AppToTest-02-14-23
//
//  Created by M on 2/14/23.
//

import SwiftUI
import RealityKit

struct ContentView: View {
    var body: some View {
        ARViewContainer().edgesIgnoringSafeArea(.all)
    }
}

struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)

        // Load the "Box" scene from the "Experience" Reality File
        //let boxAnchor = try! Experience.loadBox()
        let anchor = try! MyAppToTest.loadFirstScene()

        // Add the box anchor to the scene
        arView.scene.anchors.append(anchor)

        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}

#if DEBUG
struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
#endif
```

This is what I get at the bottom:

```
2023-02-14 17:14:53.630477-0500 AppToTest-02-14-23[21446:1307215] Metal GPU Frame Capture Enabled
2023-02-14 17:14:53.631192-0500 AppToTest-02-14-23[21446:1307215] Metal API Validation Enabled
2023-02-14 17:14:54.531766-0500 AppToTest-02-14-23[21446:1307215] [AssetTypes] Registering library (/System/Library/PrivateFrameworks/CoreRE.framework/default.metallib) that already exists in shader manager. Library will be overwritten.
2023-02-14 17:14:54.716866-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/suFeatheringCreateMergedOcclusionMask.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.743580-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arKitPassthrough.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.744961-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/drPostAndComposition.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.745988-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arSegmentationComposite.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.747245-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute0.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.748750-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute1.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.749140-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute2.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.761189-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute3.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.761611-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute4.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.761983-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute5.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.762604-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute6.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.763575-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute7.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.764859-0500 AppToTest-02-14-23[21446:1307215] [Foundation.Serialization] Json Parse Error line 18: Json Deserialization; unknown member 'EnableARProbes' - skipping.
2023-02-14 17:14:54.764902-0500 AppToTest-02-14-23[21446:1307215] [Foundation.Serialization] Json Parse Error line 20: Json Deserialization; unknown member 'EnableGuidedFilterOcclusion' - skipping.
2023-02-14 17:14:55.531748-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534559-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534633-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534680-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534733-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534777-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534825-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534871-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534955-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:56.207438-0500 AppToTest-02-14-23[21446:1307383] [Technique] ARWorldTrackingTechnique <0x1149cd900>: World tracking performance is being affected by resource constraints [2]
2023-02-14 17:17:15.741931-0500 AppToTest-02-14-23[21446:1307414] [Technique] ARWorldTrackingTechnique <0x1149cd900>: World tracking performance is being affected by resource constraints [1]
2023-02-14 17:22:07.075990-0500 AppToTest-02-14-23[21446:1308137] [Technique] ARWorldTrackingTechnique <0x1149cd900>: World tracking performance is being affected by resource constraints [1]
```
Replies: 3 · Boosts: 0 · Views: 1.8k · Feb ’23
Texture not applying on RoomPlan wall object (capture data)
We are attempting to update the texture on a node. The code below works correctly when we use a color, but it encounters issues when we attempt to use an image. The image is available in the bundle, and it displays correctly in other parts of our application. This texture is being applied to both the floor and the walls. Please assist us with this issue.

```swift
for obj in Floor_grp[0].childNodes {
    let node = obj.flattenedClone()
    node.transform = obj.transform

    let imageMaterial = SCNMaterial()
    node.geometry?.materials = [imageMaterial]
    node.geometry?.firstMaterial?.diffuse.contents = UIColor.brown

    obj.removeFromParentNode()
    Floor_grp[0].addChildNode(node)
}
```
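A hedged sketch of the image case, assuming the usual culprit: RoomPlan-derived geometry often has texture coordinates that stretch one tile across the whole surface, so the image needs explicit wrap modes and a tiling transform before it shows up at a sensible scale. "WallTexture" is a hypothetical asset name.

```swift
import SceneKit
import UIKit

// Swap the solid color for a tiled image material.
let imageMaterial = SCNMaterial()
imageMaterial.diffuse.contents = UIImage(named: "WallTexture")  // hypothetical asset
imageMaterial.diffuse.wrapS = .repeat
imageMaterial.diffuse.wrapT = .repeat
// Repeat the texture 4x4 across the surface (tune per wall size).
imageMaterial.diffuse.contentsTransform = SCNMatrix4MakeScale(4, 4, 1)
```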
Replies: 1 · Boosts: 0 · Views: 651 · Oct ’23
Photogrammetry failed with crash (assert at line 417)
I'm developing a 3D scanner that runs on an iPad (6th gen, 12-inch). Photogrammetry with ObjectCaptureSession was successful, but my other attempts are not. I've tried photogrammetry with URL inputs, using pictures from AVCapturePhoto. It is strange: if the metadata is not replaced, photogrammetry finishes, but it seems no depth data or gravity info was used (depth and gravity are separate files); if the metadata is injected, the attempt fails. This time I tried photogrammetry with a PhotogrammetrySample sequence and it also failed. The settings are:

camera: back LiDAR camera
image format: kCVPixelFormatType_32BGRA (failed with crash) or HEVC (just failed)
image depth format: kCVPixelFormatType_DisparityFloat32 or kCVPixelFormatType_DepthFloat32
photo settings: isDepthDataDeliveryEnabled = true, isDepthDataFiltered = false, embedded = true

I wonder whether the iPad supports photogrammetry with PhotogrammetrySamples at all (see the sketch below). I've already tested the sample code provided by Apple:
https://developer.apple.com/documentation/realitykit/creating_a_photogrammetry_command-line_app
https://developer.apple.com/documentation/avfoundation/additional_data_capture/capturing_depth_using_the_lidar_camera
https://developer.apple.com/documentation/realitykit/taking_pictures_for_3d_object_capture
What should I do to make photogrammetry succeed?
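For reference, a sketch of the sample-sequence path with depth and gravity attached. Whether this initializer is fully supported on iPad is exactly the open question above, so treat it as an illustration under that assumption; all names are illustrative, and disparity buffers would need converting to depth first.

```swift
import RealityKit
import CoreMotion
import CoreVideo

// Build samples with explicit depth and gravity, then hand the whole
// sequence to the session. Assumes depth buffers are already
// kCVPixelFormatType_DepthFloat32 in meters; a real pipeline would use
// per-shot gravity rather than one shared value.
func runSession(images: [CVPixelBuffer],
                depths: [CVPixelBuffer],
                gravity: CMAcceleration) throws -> PhotogrammetrySession {
    let samples = zip(images, depths).enumerated().map { (index, pair) -> PhotogrammetrySample in
        var sample = PhotogrammetrySample(id: index, image: pair.0)
        sample.depthDataMap = pair.1
        sample.gravity = simd_float3(Float(gravity.x), Float(gravity.y), Float(gravity.z))
        return sample
    }
    return try PhotogrammetrySession(input: samples,
                                     configuration: PhotogrammetrySession.Configuration())
}
```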
Replies: 2 · Boosts: 1 · Views: 638 · Nov ’23
When "zooming" in on the camera, the "joint information" comes out wrong.
Hello. I am a student who has just started studying Swift. I have succeeded in obtaining body-joint information through skeleton.jointLandmarks on the normal screen, but every time I zoom in, the joint overlay is no longer on the human body and drifts sideways and downward. My guess is that the center of the AR image no longer coincides with the center of the phone screen. I've been searching for information for 3 days because of this problem, but I couldn't find a similar case and haven't been able to solve it. If there is a case of someone solving a similar problem, I would appreciate it if you could let me know. The link below shows how I zoomed in on the ARView screen: https://stackoverflow.com/questions/64896064/can-i-use-zoom-with-arview Thank you. Below is where I'm currently having trouble.
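A sketch of the usual mapping, under one assumption about the cause: jointLandmarks are normalized coordinates in the *captured image*, so they must pass through the frame's display transform for the current viewport; if the view is then zoomed with a CGAffineTransform (as in the linked Stack Overflow approach), the overlay must either live inside the zoomed view or apply that same scale itself.

```swift
import ARKit
import UIKit

// Map normalized 2D joint landmarks into view-space points.
// displayTransform converts normalized capture coordinates into
// normalized viewport coordinates; scaling by viewportSize then gives
// points in the (unzoomed) view.
func viewPoints(for body: ARBody2D, frame: ARFrame, viewportSize: CGSize,
                orientation: UIInterfaceOrientation) -> [CGPoint] {
    let transform = frame.displayTransform(for: orientation, viewportSize: viewportSize)
    return body.skeleton.jointLandmarks.map { landmark in
        CGPoint(x: CGFloat(landmark.x), y: CGFloat(landmark.y))
            .applying(transform)
            .applying(CGAffineTransform(scaleX: viewportSize.width,
                                        y: viewportSize.height))
    }
}
```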
Replies: 0 · Boosts: 0 · Views: 485 · Nov ’23
Reality Converter smoothing all geometry
This wasn't happening with older versions of Reality Converter... I open a glTF file created in Cinema 4D and all corners are smoothed, completely transforming the object's geometry. Reality Converter was a very useful tool for quick object testing: save to USDZ and take a look on an iPhone. Unfortunately, with this smoothing it is no longer usable. Does anyone know why this is happening and how to turn it off? This is a screenshot from C4D, and this is the same object imported into Reality Converter.
Replies: 0 · Boosts: 0 · Views: 293 · Nov ’23
USD particles not supported in RealityKit on iOS (*not visionOS*)
RealityKit doesn't appear to support particles. After exporting a particle system from Blender 4.0.1 in standard .usdz format, it renders correctly in Finder and Reality Converter, but when loaded into and anchored in RealityKit... nothing happens. This appears to be a bug in RealityKit. I tried one or more particle instances, and nothing renders.
Replies: 0 · Boosts: 0 · Views: 722 · Nov ’23