I have an immersive space that is rendered using Metal. Is there a way I can position SwiftUI views at coordinates relative to positions in my immersive space?
I know that I can display a volume with RealityKit content alongside my Metal content, but the volume's coordinate system, specifically its bounds, does not coincide with my entire Metal scene.
One approach I thought of would be to open two views in my immersive space. That way, I could simply add Attachments to invisible RealityKit entities in one view at the positions where I have content in my Metal scene.
Unfortunately, it seems that while I can declare an ImmersiveSpace composed of multiple RealityViews:
ImmersiveSpace {
    RealityView { content in
        // load first view
    } update: { content in
        // update
    }
    RealityView { content in
        // load second view
    } update: { content in
        // update
    }
}
That results in two coinciding RealityKit views in the immersive space.
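Within a single RealityView, the attachment placement I have in mind would look roughly like this (the entity position and attachment id are just placeholders):

RealityView { content, attachments in
    // Invisible anchor entity at a position taken from my Metal scene
    // (coordinates here are placeholders).
    let anchor = Entity()
    anchor.position = SIMD3<Float>(0.5, 1.0, -2.0)
    content.add(anchor)

    // Parent the SwiftUI attachment to that entity.
    if let label = attachments.entity(for: "label") {
        anchor.addChild(label)
    }
} attachments: {
    Attachment(id: "label") {
        Text("Hello from SwiftUI")
    }
}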
I cannot, however, do something like this:
ImmersiveSpace {
    CompositorLayer(configuration: ContentStageConfiguration()) { layerRenderer in
        // set up my Metal renderer and stuff
    }
    RealityView { content in
        // set up a view where I could use attachments
    } update: { content in
    }
}
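(For reference, ContentStageConfiguration above is my own Compositor Services configuration type, roughly along these lines:)

import CompositorServices

// Minimal CompositorLayerConfiguration: pick pixel formats and enable
// foveation when the hardware supports it.
struct ContentStageConfiguration: CompositorLayerConfiguration {
    func makeConfiguration(capabilities: LayerRenderer.Capabilities,
                           configuration: inout LayerRenderer.Configuration) {
        configuration.depthFormat = .depth32Float
        configuration.colorFormat = .bgra8Unorm_srgb
        configuration.isFoveationEnabled = capabilities.supportsFoveation
    }
}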
Hey @FallAsleepBear,
This is a great suggestion, but it's currently not supported. It would be great if you could file an enhancement request using Feedback Assistant; please do request this functionality and provide as much detail about your use case as you can!
For a similar effect to attachments, you may find success using ImageRenderer to create images from your SwiftUI views, but if you need user interactivity you'll have to add your own gestures to those images.
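For example, a rough sketch along these lines rasterizes a SwiftUI view and uploads it as a Metal texture you can draw on a quad in your scene (the helper and its options are illustrative, not a specific API):

import SwiftUI
import MetalKit

@MainActor
func makeTexture(for view: some View, device: MTLDevice) -> MTLTexture? {
    // Rasterize the SwiftUI view into a CGImage.
    let renderer = ImageRenderer(content: view)
    renderer.scale = 2.0  // oversample so text stays crisp

    guard let cgImage = renderer.cgImage else { return nil }

    // Upload the image as an MTLTexture for your renderer.
    let loader = MTKTextureLoader(device: device)
    return try? loader.newTexture(cgImage: cgImage, options: [.SRGB: true])
}

You'd then draw that texture on a quad positioned in your Metal scene and re-render it whenever the view's content changes.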
Thanks,
Michael