Hello everyone,
As you know, the boundary for a fully immersive space is 3m × 3m, to keep users safe.
However, I need a fully immersive space with an extended range, because I have quite a large area available and want more room to play.
Would there be any problem with that?
Thank you!
Can anyone provide or point me to example code to fade in / out spotlights over 1 second?
I did not find anything on this topic in the docs:
https://developer.apple.com/documentation/realitykit/spotlight
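In the meantime, here is the kind of thing I have been experimenting with (a rough sketch of my own, assuming RealityKit's SpotLightComponent on platforms where it is available; I have not confirmed this is the recommended approach): step the light's intensity from zero to a target value over one second.
import RealityKit
import Foundation

// Rough sketch (my own experiment, not an official pattern): ramp the intensity of
// an entity's existing SpotLightComponent from 0 to a target value over one second.
@MainActor
func fadeInSpotlight(on entity: Entity,
                     targetIntensity: Float = 6_000,
                     duration: TimeInterval = 1.0) async {
    guard var light = entity.components[SpotLightComponent.self] else { return }
    let steps = 60
    for step in 0...steps {
        let t = Float(step) / Float(steps)        // 0 ... 1
        light.intensity = targetIntensity * t     // linear ramp; apply easing if desired
        entity.components.set(light)
        try? await Task.sleep(nanoseconds: UInt64(duration * 1_000_000_000) / UInt64(steps))
    }
}
Fading out would be the same loop with the ramp reversed.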
I'm trying to hit an API URL; however, I am getting these errors:
(501) Invalidation handler invoked, clearing connection
(501) personaAttributesForPersonaType for type:0 failed with error Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service named com.apple.mobile.usermanagerd.xpc was invalidated: failed at lookup with error 159 - Sandbox restriction."
UserInfo={NSDebugDescription=The connection to service named com.apple.mobile.usermanagerd.xpc was invalidated: failed at lookup with error 159 - Sandbox restriction.}
Received port for identifier response: <(null)> with error:Error Domain=RBSServiceErrorDomain Code=1 "Client not entitled" UserInfo={RBSEntitlement=com.apple.runningboard.process-state, NSLocalizedFailureReason=Client not entitled, RBSPermanent=false}
elapsedCPUTimeForFrontBoard couldn't generate a task port
This is what my Info.plist looks like:
This is the code I'm using to hit the URL and get a response:
func sendMessage() {
    guard let url = URL(string: "https://API_URL") else { return }

    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")

    let body = ["query": message] // creates a dictionary with key:value pair
    request.httpBody = try? JSONSerialization.data(withJSONObject: body) // converts the dictionary to JSON data and sets it as the request body

    isLoading = true
    response = ""

    URLSession.shared.dataTask(with: request) { data, _, error in // initiates async task for sending the request
        DispatchQueue.main.async { // async update of UI on main thread
            isLoading = false
        }
        if let data = data, error == nil, // checks that data was received and that there is no error
           let json = try? JSONSerialization.jsonObject(with: data) as? [String: String] { // parses the JSON response
            DispatchQueue.main.async {
                response = json["response"] ?? "No response"
            }
        }
    }.resume() // starts the network request
}
Can anyone help me understand what the errors are and why I'm not able to get the response back?
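For what it's worth, here is a small diagnostic variant I am planning to try (same request as above; it just logs the transport error, HTTP status, and raw body so the failure is visible instead of silently falling through when parsing fails):
URLSession.shared.dataTask(with: request) { data, response, error in
    if let error = error {
        print("Transport error:", error.localizedDescription) // e.g. connectivity or ATS issues
        return
    }
    if let http = response as? HTTPURLResponse {
        print("HTTP status:", http.statusCode)                // non-2xx means the server rejected the request
    }
    if let data = data, let body = String(data: data, encoding: .utf8) {
        print("Raw body:", body)                              // shows why JSON parsing might fail
    }
}.resume()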
We appear to be experiencing a bug with the latest visionOS beta: we are attempting to play back a video with a transparent background in the app. In the previous beta, playback worked as expected and the transparent parts of the video were transparent. In the latest beta, the background appears black. The view we are using is a SwiftUI-wrapped version of AVPlayerViewController; we have narrowed the bug down to occurring only when playback is presented in the embedded experience mode. If playback is done in the expanded experience, it works as expected.
This has only been visible on an actual device; we have been unable to replicate the behaviour in the simulator using the latest Xcode 16.0 beta (beta 5, 16A5221g).
Here is a sample project that shows off the bug.
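For context, a minimal sketch of the kind of SwiftUI wrapper we are using (simplified and with illustrative names, not the exact code from the sample project):
import SwiftUI
import AVKit

// Simplified sketch of the wrapper: AVPlayerViewController hosted in SwiftUI,
// with a clear background so the video's alpha can show through.
struct TransparentPlayerView: UIViewControllerRepresentable {
    let player: AVPlayer

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        controller.view.backgroundColor = .clear
        return controller
    }

    func updateUIViewController(_ controller: AVPlayerViewController, context: Context) {}
}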
Can we get the raw sensor data from the Apple Vision Pro?
In my Volume, there is a RealityView that includes lighting effects. However, when the user drags the window's position back and forth, the farther the volume is from the user, the brighter the light effect becomes. (I believe this may be a bug in the beta.)
Note: The volume windowGroup has the .defaultWorldScaling(.dynamic) property.
Hello,
I'm trying to stream stereoscopic side-by-side (SBS) video on the Apple Vision Pro. I see that AVPlayerViewController supports MV-HEVC video playback, but it's not clear how to play SBS video on the Apple Vision Pro.
Are there any docs or examples you can share?
For my use case, SBS is the only format I can support.
In visionOS 2, there is a feature that lets users raise their hand to display the Home button. However, this functionality conflicts with the interaction required in the mixed immersive space used by my application, so I am looking for a way to disable it.
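One thing I am considering trying, though I have not confirmed that it affects the hand-raise gesture specifically, is SwiftUI's .persistentSystemOverlays(.hidden) modifier, which asks the system to hide persistent overlays for a scene (the space ID and view name below are placeholders):
// Untested idea on my side: request that system overlays be hidden while the
// immersive content is shown. The system treats this as a preference, not a guarantee.
ImmersiveSpace(id: "mixedSpace") {          // "mixedSpace" is a placeholder ID
    MyMixedImmersiveView()                  // placeholder view name
        .persistentSystemOverlays(.hidden)
}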
AFAIK there's no way to programmatically detect when an ImmersiveSpaceContent is dismissed by a user (i.e. by pressing the home button).
By comparison, ImmersiveView has .onAppear() and .onDisappear():
ImmersiveSpace(id: appModel.immersiveSpaceID) {
    ImmersiveView()
        .environment(appModel)
        .onAppear {
            appModel.immersiveSpaceState = .open
        }
        .onDisappear {
            appModel.immersiveSpaceState = .closed
        }
}
In contrast:
// No similar callbacks here:
struct MyImmersiveSpace: ImmersiveSpaceContent {
    var body: CompositorLayer { /* ... */ }
}
I am testing RealityView on a Mac, and I am having trouble controlling the lighting.
I initially add a red cube, and everything is fine. (See figure 1.)
I then activate a skybox with a star field; the star field appears, and the red cube is then lit only by the star field.
Then I deactivate the skybox, expecting the original lighting to return, but the cube continues to be lit by the skybox. The background no longer shows the skybox, but the cube is never lit the way it originally was.
Is there a way to return the lighting of the model to the original lighting I had before adding the skybox?
I seem to recall ARView's environment property had both a lighting.resource and a background, but I don't see both of those properties in RealityViewCameraContent's environment.
Sample code for 15.1 Beta (24B5024e), Xcode 16.0 beta (16A5171c)
struct MyRealityView: View {
    @Binding var isSwitchOn: Bool
    @State private var blueNebulaSkyboxResource: EnvironmentResource?

    var body: some View {
        RealityView { content in
            // Create a red cube 10cm on a side
            let mesh = MeshResource.generateBox(size: 0.1)
            let simpleMaterial = SimpleMaterial(color: .red, isMetallic: false)
            let model = ModelComponent(
                mesh: mesh,
                materials: [simpleMaterial]
            )
            let redBoxEntity = Entity()
            redBoxEntity.components.set(model)
            content.add(redBoxEntity)

            // Load skybox
            let blueNeb2Name = "BlueNeb2"
            blueNebulaSkyboxResource = try? await EnvironmentResource(named: blueNeb2Name)
        } update: { content in
            if (blueNebulaSkyboxResource != nil) && (isSwitchOn == true) {
                content.environment = .skybox(blueNebulaSkyboxResource!)
            } else {
                content.environment = .default
            }
        }
        .realityViewCameraControls(CameraControls.orbit)
    }
}
Figure 1 (default lighting before adding the skybox):
Figure 2 (after activating skybox with star field; cube is lit by / reflects skybox):
Figure 3 (removing skybox by setting content.environment to .default, cube still reflects skybox; it is hard to see):
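One workaround I have been considering, though I have not verified it, is to give the cube its own image-based light so its look does not depend on content.environment (the "StudioLighting" resource name below is just a placeholder; this would go in the make closure after creating redBoxEntity):
// Untested workaround sketch: light the cube with an explicit image-based light,
// decoupling it from whatever content.environment is currently set to.
if let ibl = try? await EnvironmentResource(named: "StudioLighting") {   // placeholder resource
    let lightEntity = Entity()
    lightEntity.components.set(ImageBasedLightComponent(source: .single(ibl)))
    content.add(lightEntity)
    redBoxEntity.components.set(ImageBasedLightReceiverComponent(imageBasedLight: lightEntity))
}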
We’re looking to extend the capabilities of our Apple Vision Pro app to properly support the spatial playback of 180° 3D immersive videos. Currently, when these videos are played back, they are projected onto the entire 360° sphere, which results in a distorted and less-than-optimal experience for the user.
Our goal is to ensure that the 180° video content is correctly displayed within the horizontal hemisphere only, rather than across the full sphere. We’re unsure of the best approach to achieve this and would greatly appreciate your guidance.
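For reference, this is roughly the direction we have been experimenting with so far (a simplified sketch, not the code we would submit for review): building an inward-facing hemisphere mesh whose UVs span only the front 180°, then texturing it with a VideoMaterial.
import RealityKit
import AVFoundation
import Foundation

// Sketch: generate a hemisphere that covers the front 180° horizontally, with UVs
// mapping the half-equirectangular frame, and texture it with a VideoMaterial.
@MainActor
func makeHemisphereVideoEntity(player: AVPlayer,
                               radius: Float = 10,
                               slices: Int = 64,
                               stacks: Int = 32) throws -> ModelEntity {
    var positions: [SIMD3<Float>] = []
    var uvs: [SIMD2<Float>] = []
    var indices: [UInt32] = []

    for stack in 0...stacks {
        let v = Float(stack) / Float(stacks)     // 0 (bottom) ... 1 (top)
        let phi = (v - 0.5) * .pi                // latitude: -π/2 ... π/2
        for slice in 0...slices {
            let u = Float(slice) / Float(slices) // 0 ... 1 across the front 180°
            let theta = (u - 0.5) * .pi          // longitude: -π/2 ... π/2
            positions.append(SIMD3(radius * cos(phi) * sin(theta),
                                   radius * sin(phi),
                                   -radius * cos(phi) * cos(theta)))
            uvs.append(SIMD2(u, 1 - v))
        }
    }
    for stack in 0..<stacks {
        for slice in 0..<slices {
            let a = UInt32(stack * (slices + 1) + slice)
            let b = a + UInt32(slices + 1)
            // Wound to face inward toward the viewer; flip the order if it renders inside-out.
            indices += [a, a + 1, b, a + 1, b + 1, b]
        }
    }

    var descriptor = MeshDescriptor(name: "hemisphere")
    descriptor.positions = MeshBuffer(positions)
    descriptor.textureCoordinates = MeshBuffer(uvs)
    descriptor.primitives = .triangles(indices)

    let mesh = try MeshResource.generate(from: [descriptor])
    let material = VideoMaterial(avPlayer: player)
    return ModelEntity(mesh: mesh, materials: [material])
}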
Would it be possible for your team to review our code and provide us with the necessary steps or adjustments needed to achieve the desired playback results?
Case-ID: 8729125
Thank you for your assistance.
If I have a file or a file URL, how can I tell whether it is a spatial photo, a panorama photo, or a spatial video? Apple's Photos app can do it.
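For the video case, the one signal I am aware of (an assumption on my part, and it does not cover spatial or panorama photos, which would need metadata checks via ImageIO) is that spatial videos carry stereo multiview video, which AVFoundation exposes as a media characteristic:
import AVFoundation

// Partial sketch: a spatial video's video track reports the stereo multiview characteristic.
func isSpatialVideo(at url: URL) async throws -> Bool {
    let asset = AVURLAsset(url: url)
    let tracks = try await asset.loadTracks(withMediaType: .video)
    for track in tracks {
        let characteristics = try await track.load(.mediaCharacteristics)
        if characteristics.contains(.containsStereoMultiviewVideo) {
            return true
        }
    }
    return false
}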
In the example code provided in the tutorial, the following error is thrown when attempting to store actions in an animation library on the root, specifically when trying to add actions. Is there another way to do this? The example code provided does not compile.
Hi, I would like to put a video in an ImmersiveSpace, but when people tap the maximize button in the top-left corner, the app crashes. Is there a way to remove that button?
I need it to be in an ImmersiveSpace because I would like to be able to instantiate it wherever I want.
I have designed a 3D object and exported it as a USDZ. I also 3D printed said object. I want to use the object as a 3D trigger for an AR experience I am building. My question is: is there a process that would let me take the 3D .usdz file and convert it to a .arobject or a .objcap medium/low-density point cloud to use as an AR trigger? Because I do have the 3D print of the object, I did use the "scan" option when setting up my scene, but the "resolution"/fidelity seems really low, and the results I get are just mediocre.
I would love to take the 3D USDZ that I already have and use it to generate a file that can be used as a 3D trigger. Is this possible, or is there a process to do this? I am able to take the 3D scan in Reality Composer (which is exported as a .objcap file), send it to Reality Converter on my Mac, and make a USDZ from it. I am looking for a way to go the other way: .usdz > .objcap or .arobject.
I am trying to make an experience that mimics projection mapping, but in AR. I have a 3D object I built and textured in Substance Painter. I also printed this object in a base gray color. I want to use the 3D print of the object as an AR trigger that would start a scene, placing/overlaying/projection-mapping the textured 3D model over the gray 3D-printed model. Ideally, the mapped 3D model would be spatially attached to the 3D print and move with it when the object is handled.
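For context, the trigger side itself would be the standard ARKit object-detection setup (sketched below with a placeholder resource group name); my question is only about producing the .arobject from the USDZ in the first place.
import ARKit

// Standard ARKit object detection, for context only. "ScannedObjects" is a placeholder
// asset-catalog resource group containing .arobject files.
func startObjectDetection(in sceneView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.detectionObjects = ARReferenceObject.referenceObjects(
        inGroupNamed: "ScannedObjects", bundle: nil) ?? []
    sceneView.session.run(configuration)
}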
Hi,
We are currently building an app for immersive experiences of our custom content. This is displayed from a video on custom geometry in the immersive space on the Vision Pro.
I have enabled the AVPlayerViewController system controls that detach when entering the immersive space, as in the sample:
https://developer.apple.com/documentation/visionos/building-an-immersive-media-viewing-experience
For our case, we do not need the 2D screen showing after entering the immersive space, only the environment.
So my question is: how can we remove the screen with the video but keep the controls, like in the Apple TV app's immersive experiences?
Thanks in advance
So I am tracking 2 objects in my scene and spawning a tiny arrow on each of the objects (this part is working as intended).
Inside my scene I have added Collision Components and Physics Body Components to each of the arrows.
I want to detect when a collision occurs between the 2 arrow entities. I have made the collision boxes big enough that they should definitely be overlapping; however, I am not able to detect when the collision occurs.
This is the code that I use for the scene:
import SwiftUI
import RealityKit
import RealityKitContent

struct DualObjectTrackingTest: View {
    @State private var subscription: EventSubscription?

    var body: some View {
        RealityView { content in
            if let immersiveContentEntity = try? await Entity(named: "SceneFind.usda", in: realityKitContentBundle) {
                content.add(immersiveContentEntity)
                print("Collision check started")
            }
        } update: { content in
            if let arrow = content.entities.first?.findEntity(named: "WhiteArrow") as? ModelEntity {
                let subscription = content.subscribe(to: CollisionEvents.Began.self, on: arrow) { collisionEvent in
                    print("Collision has occured")
                }
            }
        }
    }
}
All I see in my console logs is "Collision check started", and then, whenever I move the 2 objects close enough that the collision boxes overlap, I don't see any updates in the logs.
Can anyone give me some further guidance/resources on this?
Thanks again!
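For reference, one pattern I am considering (a sketch reusing the names above; I have not confirmed this is the recommended approach): subscribe once in the make closure and store the EventSubscription in the @State property so it stays alive, rather than creating it in the update closure.
RealityView { content in
    if let immersiveContentEntity = try? await Entity(named: "SceneFind.usda", in: realityKitContentBundle) {
        content.add(immersiveContentEntity)
        // Subscribe once, scene-wide (on: nil), and keep the token in @State so the
        // subscription is not released when this closure returns.
        subscription = content.subscribe(to: CollisionEvents.Began.self, on: nil) { event in
            print("Collision between \(event.entityA.name) and \(event.entityB.name)")
        }
    }
}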
Is there any way to place 3D objects, maybe using ARKit or MetalKit, in a video?
I have tried extracting frames from the video, drawing a cube using an SCNNode, rendering it into a UIImage, and then gathering all the images and creating a video.
But this is not a feasible solution, as it creates a huge memory spike and ultimately triggers a memory warning.
So is there any other way to draw 3D objects onto a video file?
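One direction I am exploring (a sketch of the writing side only; the actual frame compositing is left as a callback, and I have not confirmed this is the best approach): stream frames to an AVAssetWriter one at a time instead of keeping every rendered UIImage in memory.
import AVFoundation
import CoreVideo
import Foundation

// Sketch: frames are composited one at a time via the renderFrame callback and
// appended immediately, so no array of rendered images is kept in memory.
func writeComposedVideo(to outputURL: URL,
                        size: CGSize,
                        frameCount: Int,
                        fps: Int32 = 30,
                        renderFrame: (Int, CVPixelBuffer) -> Void) throws {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: Int(size.width),
        AVVideoHeightKey: Int(size.height)
    ])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input,
        sourcePixelBufferAttributes: [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
        ])
    writer.add(input)
    guard writer.startWriting() else { throw writer.error ?? CocoaError(.fileWriteUnknown) }
    writer.startSession(atSourceTime: .zero)

    for frame in 0..<frameCount {
        // Reuse pixel buffers from the adaptor's pool instead of allocating new images.
        var pixelBuffer: CVPixelBuffer?
        guard let pool = adaptor.pixelBufferPool,
              CVPixelBufferPoolCreatePixelBuffer(nil, pool, &pixelBuffer) == kCVReturnSuccess,
              let buffer = pixelBuffer else { continue }

        renderFrame(frame, buffer)   // draw the video frame plus the 3D overlay into the buffer

        while !input.isReadyForMoreMediaData { Thread.sleep(forTimeInterval: 0.001) }
        if !adaptor.append(buffer, withPresentationTime: CMTime(value: CMTimeValue(frame), timescale: fps)) {
            break
        }
    }

    input.markAsFinished()
    writer.finishWriting { }
}
The 3D overlay itself could be rendered into the pixel buffer with SceneKit or Metal offscreen rendering; the point of the sketch is only to avoid accumulating full-resolution images.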
We've recently discovered that our app crashes on startup on the latest visionOS 2.0 beta 5 (22N5297g) build. In fact, the entire field of view would dim down and visionOS would then restart, showing the Apple logo. Interestingly, no app crash is reported by Xcode during debug.
After investigation, we have isolated the issue to a specific USDZ asset in our app. Loading it in a sample, blank project also causes visionOS to reliably crash, or become extremely unresponsive with rendering artifacts everywhere.
This looks like a potentially serious issue. Even if the asset is problematic, loading it should not crash the entire OS. We have filed feedback FB14756285, along with a demo project. Hopefully someone can take a look. Thanks!
I have updated the sample code so that the scan will start generating when 15 photos are captured. I hope I can catch this error so the app won't crash... I really need help on this, and thank you in advance!
Hardware Model: iPhone14,2
OS Version: iPhone OS 17.6.1 (21G93)
Exception Type: EXC_BREAKPOINT (SIGTRAP)
Exception Codes: 0x0000000000000001, 0x000000023363518c
Termination Reason: SIGNAL 5 Trace/BPT trap: 5
Terminating Process: exc handler [525]
Triggered by Thread: 0
Thread 0 name:
Thread 0 Crashed:
0 RealityKit_SwiftUI 0x000000023363518c CoveragePointCloudMiniView.interfaceOrientation.getter + 508 (CoveragePointCloudMiniView.swift:0)
1 RealityKit_SwiftUI 0x0000000233634cdc closure #1 in closure #2 in CoveragePointCloudMiniView.body.getter + 124 (CoveragePointCloudMiniView.swift:75)
2 RealityKit_SwiftUI 0x000000023363db9c partial apply for closure #1 in closure #2 in CoveragePointCloudMiniView.body.getter + 20 (:0)
3 SwiftUI 0x0000000195c4bbac closure #1 in withTransaction(::) + 276 (Transaction.swift:243)
4 SwiftUI 0x0000000195c4ba90 partial apply for closure #1 in withTransaction(::) + 24 (:0)
5 libswiftCore.dylib 0x00000001903f8094 withExtendedLifetime<A, B>(::) + 28 (LifetimeManager.swift:27)
6 SwiftUI 0x0000000195b17d78 withTransaction(::) + 72 (Transaction.swift:228)
7 SwiftUI 0x0000000195b17d04 withAnimation(::) + 116 (Transaction.swift:280)
8 RealityKit_SwiftUI 0x0000000233634bfc closure #2 in CoveragePointCloudMiniView.body.getter + 664 (CoveragePointCloudMiniView.swift:73)
9 SwiftUI 0x0000000195bef134 closure #1 in closure #1 in SubscriptionView.Subscriber.updateValue() + 72 (SubscriptionView.swift:66)
10 SwiftUI 0x0000000195b3f57c thunk for @escaping @callee_guaranteed () -> () + 28 (:0)
11 SwiftUI 0x0000000195b3c864 static Update.dispatchActions() + 1140 (Update.swift:151)
12 SwiftUI 0x0000000195b3bedc static Update.end() + 144 (Update.swift:58)
13 SwiftUI 0x0000000195a691fc closure #1 in SubscriptionView.Subscriber.updateValue() + 700 (SubscriptionView.swift:66)
14 SwiftUI 0x0000000195a68eb0 partial apply for thunk for @escaping @callee_guaranteed (@in_guaranteed A.Publisher.Output) -> () + 28 (:0)
15 SwiftUI 0x0000000195a68e78 closure #1 in ActionDispatcherSubscriber.respond(to:) + 76 (SubscriptionView.swift:98)
16 SwiftUI 0x0000000195a68c80 ActionDispatcherSubscriber.respond(to:) + 816 (SubscriptionView.swift:97)
17 SwiftUI 0x0000000195a68938 ActionDispatcherSubscriber.receive(:) + 16 (SubscriptionView.swift:110)
18 SwiftUI 0x0000000195a6786c SubscriptionLifetime.Connection.receive(:) + 100 (SubscriptionLifetime.swift:195)
19 Combine 0x000000019aed29d4 Publishers.Autoconnect.Inner.receive(:) + 52 (Autoconnect.swift:142)
20 Combine 0x000000019aed2928 Publishers.Multicast.Inner.receive(:) + 244 (Multicast.swift:211)
21 Combine 0x000000019aed2828 protocol witness for Subscriber.receive(_:) in conformance Publishers.Multicast<A, B>.Inner + 24 (:0)
....
(FBSScene.m:812)
46 FrontBoardServices 0x00000001aa892844 __94-[FBSWorkspaceScenesClient _queue_updateScene:withSettings:diff:transitionContext:completion:]_block_invoke_2 + 152 (FBSWorkspaceScenesClient.m:692)
47 FrontBoardServices 0x00000001aa8926cc -[FBSWorkspace _calloutQueue_executeCalloutFromSource:withBlock:] + 168 (FBSWorkspace.m:411)
48 FrontBoardServices 0x00000001aa8977fc __94-[FBSWorkspaceScenesClient _queue_updateScene:withSettings:diff:transitionContext:completion:]_block_invoke + 344 (FBSWorkspaceScenesClient.m:691)
49 libdispatch.dylib 0x00000001999aedd4 _dispatch_client_callout + 20 (object.m:576)
50 libdispatch.dylib 0x00000001999b286c _dispatch_block_invoke_direct + 288 (queue.c:511)
51 FrontBoardServices 0x00000001aa893d58 FBSSERIALQUEUE_IS_CALLING_OUT_TO_A_BLOCK + 52 (FBSSerialQueue.m:285)
52 FrontBoardServices 0x00000001aa893cd8 -[FBSMainRunLoopSerialQueue _targetQueue_performNextIfPossible] + 240 (FBSSerialQueue.m:309)
53 FrontBoardServices 0x00000001aa893bb0 -[FBSMainRunLoopSerialQueue performNextFromRunLoopSource] + 28 (FBSSerialQueue.m:322)
54 CoreFoundation 0x0000000191adb834 CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION + 28 (CFRunLoop.c:1957)
55 CoreFoundation 0x0000000191adb7c8 __CFRunLoopDoSource0 + 176 (CFRunLoop.c:2001)
56 CoreFoundation 0x0000000191ad92f8 __CFRunLoopDoSources0 + 340 (CFRunLoop.c:2046)
57 CoreFoundation 0x0000000191ad8484 __CFRunLoopRun + 828 (CFRunLoop.c:2955)
58 CoreFoundation 0x0000000191ad7cd8 CFRunLoopRunSpecific + 608 (CFRunLoop.c:3420)
59 GraphicsServices 0x00000001d65251a8 GSEventRunModal + 164 (GSEvent.c:2196)
60 UIKitCore 0x0000000194111ae8 -[UIApplication run] + 888 (UIApplication.m:3713)
61 UIKitCore 0x00000001941c5d98 UIApplicationMain + 340 (UIApplication.m:5303)
62 SwiftUI 0x0000000195ccc294 closure #1 in KitRendererCommon(:) + 168 (UIKitApp.swift:51)
63 SwiftUI 0x0000000195c78860 runApp(:) + 152 (UIKitApp.swift:14)
64 SwiftUI 0x0000000195c8461c static App.main() + 132 (App.swift:114)
65 SoleFit 0x0000000103046cd4 static SoleFitApp.$main() + 24 (SoleFitApp.swift:0)
66 SoleFit 0x0000000103046cd4 main + 36
67 dyld 0x00000001b52af154 start + 2356 (dyldMain.cpp:1298)