I'm developing a prototype Vision Pro app and would like to render a 3D scene built in Reality Composer Pro on an image anchor in a RealityView. So far I have had no luck making it work and need some guidance to move forward.
I have the image file stored in the assets as shown below:
And here is the source code:
```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct AnchorView: View {
    // Anchor that tracks the reference image "reanchor" in the
    // "AR Resources" group of the asset catalog.
    @State var imageEntity: Entity = {
        let anchorEntity = AnchorEntity(.image(group: "AR Resources", name: "reanchor"))
        return anchorEntity
    }()

    var body: some View {
        RealityView { content in
            do {
                // Add the initial RealityKit content to the image anchor.
                let scene = try await Entity(named: "Scene", in: realityKitContentBundle)
                imageEntity.addChild(scene)
                content.add(imageEntity)
            } catch {
                print("Error occurred when adding RealityView content: \(error)")
            }
        }
    }
}
```
Hello,
Your code looks fine here, as does your AR Resources folder.
My suspicion is that one of the following is true:
- The system is failing to recognize your target image. Try testing with a QR code, for example, to rule that possibility out.
- The scene you are adding to the anchor is at a very large scale, so it is added, but you can't see it because you are inside of it.
- The physical size of your image target specified in the AR Resources folder is very large, causing the scene to appear very far away.
Try looking into some of those possibilities, and let me know if you still can't figure it out!
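To help narrow it down, here is a rough debugging sketch (not tested on a device): it attaches a small red sphere directly to the image anchor, so you can tell whether tracking is working even if the loaded scene is too large or too far away to see. The asset names (`AR Resources`, `reanchor`, `Scene`) are taken from your post; the 0.1 scale factor is an arbitrary value just for experimentation.

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct AnchorDebugView: View {
    @State var imageEntity = AnchorEntity(.image(group: "AR Resources",
                                                 name: "reanchor"))

    var body: some View {
        RealityView { content in
            // A 2 cm sphere placed directly on the anchor. If this appears
            // when you look at the target image, tracking itself is fine and
            // the problem is likely the scene's scale or placement.
            let marker = ModelEntity(mesh: .generateSphere(radius: 0.02),
                                     materials: [SimpleMaterial(color: .red,
                                                                isMetallic: false)])
            imageEntity.addChild(marker)

            if let scene = try? await Entity(named: "Scene",
                                             in: realityKitContentBundle) {
                // Scale the scene down aggressively while debugging; if it
                // becomes visible, the original scale was the issue.
                scene.scale = [0.1, 0.1, 0.1]
                imageEntity.addChild(scene)
            }
            content.add(imageEntity)
        }
    }
}
```

If the sphere shows up but your scene doesn't, adjust the scene's scale (or the physical size of the reference image) until it does.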