Discuss spatial computing on Apple platforms and how to design and build an entirely new universe of apps and games for Apple Vision Pro.

Vuforia Model Target using ARKit
Hi, does anyone have any experience using ARKit to emulate something like Vuforia's Model Target, where it detects a 3D object in the environment corresponding to a 3D model and then overlays the 3D model on top of the real-life object? Is it technically feasible, or is Vuforia the only option? Thanks!
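The closest built-in feature I've found is ARReferenceObject detection: you scan the physical object once (e.g. with ARObjectScanningConfiguration or Apple's scanning sample) to produce an .arobject file, then detect it at runtime. A minimal sketch of what I've been trying — the "AR Resources" group name and the scanned object are assumptions about my own asset catalog:

import ARKit
import RealityKit

// Sketch: detect a previously scanned object and react when it is found.
// Assumes an asset catalog group "AR Resources" containing an .arobject.
func startObjectDetection(on arView: ARView) {
    guard let referenceObjects = ARReferenceObject.referenceObjects(
        inGroupNamed: "AR Resources", bundle: nil) else { return }

    let configuration = ARWorldTrackingConfiguration()
    configuration.detectionObjects = referenceObjects
    arView.session.run(configuration)
    // When ARKit recognises the object it adds an ARObjectAnchor to the
    // session; wrapping that in an AnchorEntity lets you overlay the model.
}

Note that, unlike Vuforia's Model Targets, this requires scanning the real object first — as far as I can tell ARKit can't detect directly from an arbitrary CAD/3D model.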
Replies: 0 · Boosts: 0 · Views: 240 · Activity: Oct ’23
Transitioning from SceneKit to RealityKit - shadows and custom shaders
We have a content creation application that uses SceneKit for rendering. In our application we have a 3D view (non-AR) and an AR "mode" the user can go into; currently we use an SCNView and an ARSCNView to achieve this. Our application targets iOS and macOS (with AR only on iOS).

With visionOS on the horizon, we're trying to bring the tech stack up to date, as SceneKit no longer seems to be actively developed and isn't supported at all on visionOS. We'd like to use RealityKit for 3D rendering on all platforms — macOS, iOS and visionOS — in non-AR and AR mode where appropriate. So far this hasn't been too difficult. The greatest challenge has been adding gesture support to replace the allowsCameraControl option on the SCNView, as there is no such option on ARView.

However, now that we've reached shading, we're hitting a bit of a roadblock. When viewing the scene in non-AR mode, we would like to add a ground plane underneath the object that only displays a shadow — in other words, its opacity would be determined by the light contribution. I've had a dig through the CustomMaterial API and it seems extremely primitive: there doesn't seem to be any way to get light information for a particular fragment, unless I'm missing something?

Additionally, we support a custom shader that we can apply as a material. This custom shader allows the properties of the material to vary depending on the light contribution, light incidence angle, etc. Looking at CustomMaterial, the API seems to be about defining a whole material, whereas I guess we want to customise the BRDF calculation. We achieve this in SceneKit using a series of shader modifiers hooked into the various SCNShaderModifierEntryPoint hooks.

On visionOS the lack of support for CustomMaterial is of course a shame, but I would hope something similar can be achieved with Reality Composer Pro? We can live without the custom material, but the missing shadow catcher is a killer for adoption for us. I'd even accept a different, more limited feature set on visionOS, as long as we can match our existing feature set on existing platforms. What am I missing?
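For context, this is as deep as I've been able to hook in — a minimal sketch wrapping a Metal [[visible]] surface shader in a CustomMaterial ("paintSurface" is my own placeholder function name; the Metal side takes a realitykit::surface_parameters argument):

import Metal
import RealityKit

enum ShaderError: Error { case libraryUnavailable }

// Sketch: a CustomMaterial driven by a Metal surface shader. The shader can
// feed the PBR inputs (base color, roughness, ...) before RealityKit's own
// lighting runs, but it is not handed per-light information.
func makeCustomMaterial() throws -> CustomMaterial {
    guard let device = MTLCreateSystemDefaultDevice(),
          let library = device.makeDefaultLibrary() else {
        throw ShaderError.libraryUnavailable
    }
    let surface = CustomMaterial.SurfaceShader(named: "paintSurface", in: library)
    // .lit keeps RealityKit's standard lighting model.
    return try CustomMaterial(surfaceShader: surface, lightingModel: .lit)
}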
Replies: 1 · Boosts: 1 · Views: 973 · Activity: Oct ’23
[BUG] Gizmo Doesn't Update When Changing 'Up Axis' to 'Z' in Reality Composer Pro
When changing the "Up axis" setting in the Layer Data tab to "Z", the gizmo does not reflect the change; it continues to display as if the up axis were "Y". This results in the gizmo becoming disconnected from the object itself, making it difficult to perform accurate transformations with the gizmo.

Steps to reproduce:
1. Open Reality Composer Pro in the latest Xcode beta.
2. Click on empty space inside your scene.
3. Navigate to the Layer Data tab.
4. Change the "Up axis" setting to "Z".
5. Observe the gizmo's orientation.
Replies: 0 · Boosts: 0 · Views: 416 · Activity: Oct ’23
App's iOS deployment target newer than the iOS version running on the iPhone?
I have an iPhone 8 Plus running iOS 16.7.1. I have made a very simple app, only for personal use, that I install from time to time on my iPhone. It doesn't even need internet access. But I get an error like this: "iPhone's iOS 16.7.1 doesn't match AugmentedRealityApp.app's iOS 17.0 deployment target." I'd like to hear your opinions, or a comment if anyone has tried something similar. Thanks in advance.
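For reference, the relevant build setting looks like this — the setting name is the real one, the value is just an example and needs to be at or below the device's iOS version:

// Target ▸ Build Settings ▸ Minimum Deployments, or in an .xcconfig file:
IPHONEOS_DEPLOYMENT_TARGET = 16.0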
Replies: 0 · Boosts: 0 · Views: 503 · Activity: Oct ’23
Downgrade iPadOS
I have several iPads that have been upgraded to 17.0.3, but I need to be able to restore them to version 16.6.1. We have apps that currently do not work on 17. I have downloaded the 16.6.1 .ipsw file, and every time I try to use it I get "OS cannot be restored on 'iPad'. Personalization failed." Is there any way to get an OS file that would work?
Replies: 0 · Boosts: 0 · Views: 444 · Activity: Oct ’23
Using ARView's project(_:) method to convert to screen coordinates.
I'm trying to understand how to use the project(_:) function provided by ARView to convert 3D model coordinates to 2D screen coordinates, but I'm getting unexpected results. Below is the default Augmented Reality App project, modified to have a single button that, when tapped, should place a circle over the center of the provided cube. However, when the button is pressed, the circle's position does not line up with the cube. I've looked at the documentation for project(_:), but it doesn't give any details about how to convert a point from model coordinates to "the 3D world coordinate system of the scene". Is there better documentation somewhere on how to do this conversion?

// ContentView.swift
import SwiftUI
import RealityKit

class Coordinator {
    var arView: ARView?
    var anchor: AnchorEntity?
    var model: Entity?
}

struct ContentView : View {
    @State var coord = Coordinator()
    @State var circlePos = CGPoint(x: -100, y: -100)

    var body: some View {
        ZStack {
            ARViewContainer(coord: coord).edgesIgnoringSafeArea(.all)
            VStack {
                Spacer()
                Circle()
                    .frame(width: 10, height: 10)
                    .foregroundColor(.red)
                    .position(circlePos)
                Button(action: {
                    showMarker()
                }, label: {
                    Text("Place Marker")
                })
            }
        }
    }

    func showMarker() {
        guard let arView = coord.arView else { return }
        guard let model = coord.model else { return }
        guard let anchor = coord.anchor else { return }
        print("Model position is: \(model.position)")
        // convert position into anchor's space
        let modelPos = model.convert(position: model.position, to: anchor)
        print("Converted position is: \(modelPos)")
        // convert model locations to screen coordinates
        circlePos = arView.project(modelPos) ?? CGPoint(x: -1, y: -1)
        print("circle position is now \(circlePos)")
    }
}

struct ARViewContainer: UIViewRepresentable {
    var coord: Coordinator

    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        coord.arView = arView

        // Create a cube model
        let mesh = MeshResource.generateBox(size: 0.1, cornerRadius: 0.005)
        let material = SimpleMaterial(color: .gray, roughness: 0.15, isMetallic: true)
        let model = ModelEntity(mesh: mesh, materials: [material])
        coord.model = model

        // Create horizontal plane anchor for the content
        let anchor = AnchorEntity(.plane(.horizontal, classification: .any, minimumBounds: SIMD2<Float>(0.2, 0.2)))
        anchor.children.append(model)
        coord.anchor = anchor

        // Add the horizontal plane anchor to the scene
        arView.scene.anchors.append(anchor)

        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}

#Preview {
    ContentView(coord: Coordinator())
}
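For what it's worth, my current best guess at the intended conversion (unverified) is to convert the entity's own origin into world space and project that, since project(_:) appears to expect world coordinates — whereas passing model.position to convert(position:to:) applies the anchor-relative offset a second time:

// Sketch (assumption): pass .zero to convert the entity's own origin;
// `to: nil` means world space. project(_:) expects world coordinates.
let worldPosition = model.convert(position: .zero, to: nil)
if let screenPoint = arView.project(worldPosition) {
    circlePos = screenPoint  // a CGPoint in the ARView's coordinate space
}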
Replies: 1 · Boosts: 0 · Views: 689 · Activity: Oct ’23
Getting Child ModelEntity from Reality Composer Pro
Hi, I have a file in Reality Composer Pro that has a deep hierarchy. I downloaded it from an asset store, so I don't know how it is built. As you can see from the screenshot, I'm trying to access the banana and banana_whole entities as ModelEntity, but I'm not able to load them as ModelEntity in Xcode. I can load them as Entity and show them in the visionOS simulator, but not as ModelEntity, which I need in order to do some operations. What should I do?
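For reference, this is what I've been trying — a sketch where the scene name "Scene" and the RealityKitContent bundle are assumptions from the standard visionOS template:

import RealityKit
import RealityKitContent  // assumption: the standard visionOS content package

// Sketch: dig a named entity out of a deep Reality Composer Pro hierarchy.
func loadBanana() async throws {
    let scene = try await Entity(named: "Scene", in: realityKitContentBundle)
    guard let banana = scene.findEntity(named: "banana") else { return }

    if let model = banana as? ModelEntity {
        // The entity really is a ModelEntity; use it directly.
        print(model.model?.mesh as Any)
    } else if let modelComponent = banana.components[ModelComponent.self] {
        // Otherwise the mesh/materials may live in a ModelComponent on this
        // entity (or a descendant) even though the Swift type is plain Entity.
        print(modelComponent.mesh)
    }
}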
Replies: 2 · Boosts: 0 · Views: 775 · Activity: Oct ’23
Index out of range in CoreRE framework
Hi, I encountered an issue when running the Underwater RealityKit sample app (https://developer.apple.com/documentation/realitykit/building_an_immersive_experience_with_realitykit). The target device was an iPhone 14 Pro running iOS 17.1. The same issue occurred in my own project, so the bug is not in the Underwater app.

The bug itself:
Crashed thread: Render
Func: re::DataArray<re::MeshInstance>::get(re::DataArrayHandle<re::MeshInstance>) const + 80
Message: Index out of range in operator[]. index = 18 446 744 073 709 551 615, maximum = 112

(That index is UInt64.max, i.e. a -1 or invalid handle wrapped around to unsigned.)
Replies: 0 · Boosts: 1 · Views: 331 · Activity: Oct ’23
Exporting scripts to a USDZ file in Reality Composer Pro.
Hello. I've started exploring the new features in Reality Composer Pro and noticed that it now supports adding custom scripts as components on any object in the scene. I'm curious about the following: will these scripts work if I export such a scene to a USDZ file and open it with Apple Quick Look? For instance, I want to add a 3D button and a cube model, and when I press (touch) the button, I want a script component to change the cube's material or material color. Is such functionality possible?
Replies: 0 · Boosts: 0 · Views: 636 · Activity: Oct ’23
Used to be able to open Reality Composer Pro (Dev Tool) on Intel-based Mac, no longer?
Hi all, up until a couple of days ago I was able to open and run Reality Composer Pro on my Intel-based Mac. I tried to open it again this morning and I now receive the notification "Reality Composer is not supported on this Mac". I understand that I will eventually need a new computer with Apple silicon, but it was nice to be able to start exploring Shader Graphs with my existing computer for now. Any suggestions? Perhaps go back to an earlier beta version of Xcode — maybe the latest version disabled my ability to run RCP? I'm running Xcode Version 15.1 beta (15C5042i) on an Intel i7 MacBook Pro. Thanks in advance!
Replies: 1 · Boosts: 0 · Views: 812 · Activity: Nov ’23
Object Capture: Pose Information
Hi, in the newly released Object Capture API, for a PhotogrammetrySession we can get the poses of the saved images, and the same images will be used to create the model. But in the sample project (https://developer.apple.com/documentation/realitykit/guided-capture-sample), only the generated 3D model is saved; for the other request types (pose, poses, bounds, point cloud, and model entity) there is a comment saying // Not supported yet. When will these be available to developers? Can you give us a tentative date at least?
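For context, this is the call site I'd expect to use once it's supported — a sketch where imagesURL and outputURL are placeholders:

import Foundation
import RealityKit

// Sketch: request camera poses alongside the model from a PhotogrammetrySession.
func reconstruct(imagesURL: URL, outputURL: URL) async throws {
    let session = try PhotogrammetrySession(input: imagesURL)
    try session.process(requests: [
        .poses,
        .modelFile(url: outputURL)
    ])
    for try await output in session.outputs {
        if case let .requestComplete(_, result) = output,
           case let .poses(poses) = result {
            print(poses)  // camera transforms for the captured samples
        }
    }
}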
Replies: 1 · Boosts: 0 · Views: 581 · Activity: Nov ’23
ARKitCoachingOverlay translation / localization
Hi, is it possible to localize the text of the ARKitCoachingOverlay? https://miro.medium.com/v2/resize:fit:640/format:webp/1*FDzypCQtuU10Ky203NQr-A.png I'm developing with Unity and use this sample script: https://github.com/Unity-Technologies/arfoundation-samples/blob/main/Assets/Scenes/ARKit/ARKitCoachingOverlay/ARKitCoachingOverlay.cs Ideally, could you describe how I can get translations working for German? Thanks
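For context, the only lever I've found so far is declaring the language in the exported Xcode project's Info.plist — my assumption (unverified) being that the system-provided overlay text follows the app's declared localizations:

<!-- Info.plist: declare German so iOS can localize system UI like the overlay -->
<key>CFBundleLocalizations</key>
<array>
    <string>en</string>
    <string>de</string>
</array>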
Replies: 2 · Boosts: 0 · Views: 307 · Activity: Nov ’23
Reality Composer Pro | UnlitSurface and opacity maps error
Hello all - I'm experiencing a shading error when I have two UnlitSurface shaders that use images for color and opacity. When the shaders are applied to two mesh planes, one placed in front of the other, the shader in front renders but the plane mesh masks out and doesn't render what is behind it. Basically, it looks like the opacity map on the front shader is creating a "mask". I've attached some images here to help explain. Has anyone experienced this error? And how can I go about fixing it? Thx!
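For context, this looks to me like alpha-blended transparency sorting, and the only workaround I've found so far is switching to an alpha cutoff at runtime — a sketch that assumes the opacity lives in the color texture's alpha channel, that UnlitMaterial's opacityThreshold behaves like PhysicallyBasedMaterial's, and where "leafColor" is a placeholder texture name:

import RealityKit

// Sketch: rebuild the plane's material with an alpha cutoff so fragments are
// either fully opaque or discarded, sidestepping blended-transparency sorting.
func makeCutoutMaterial() throws -> UnlitMaterial {
    var material = UnlitMaterial()
    let texture = try TextureResource.load(named: "leafColor")
    material.color = .init(texture: .init(texture))
    material.opacityThreshold = 0.5  // discard fragments below 50% opacity
    return material
}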
Replies: 5 · Boosts: 2 · Views: 1.3k · Activity: Nov ’23