Objects do not behave properly with indirect pinch in a Mixed Reality Environment

In Mixed Reality mode there are strange issues with indirect pinches on objects.

If a user selects an object with an indirect pinch and then walks around, or moves and re-orients their body while maintaining the pinch, the object moves as if some scalar were being applied to its motion, causing it to behave in ways that are extremely counter-intuitive compared to other MR devices.

If a user indirect-pinches an object and then walks forward, the object flies away from the user, faster than they are walking. If a user indirect-pinches an object and then walks backward, the object flies toward and eventually past the user, faster than they are walking. If a user indirect-pinches an object and then turns around, the object rotates around some unknown pivot with some added scalar, resulting in very strange behavior.
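
For context, on the native side this is the standard targeted DragGesture pattern on visionOS. A minimal sketch of the kind of handler involved (the entity setup and names here are illustrative, not our production code):

```swift
import SwiftUI
import RealityKit

struct PinchDragView: View {
    var body: some View {
        RealityView { content in
            // Illustrative entity: a cube the user can indirect-pinch and drag.
            let cube = ModelEntity(
                mesh: .generateBox(size: 0.2),
                materials: [SimpleMaterial(color: .yellow, isMetallic: false)]
            )
            cube.position = [0, 1.2, -1.0]
            // Required so the entity can receive indirect pinch input.
            cube.components.set(InputTargetComponent())
            cube.generateCollisionShapes(recursive: true)
            content.add(cube)
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    // Follow the pinch: convert the gesture location from the
                    // gesture's local space into the entity's parent space.
                    guard let parent = value.entity.parent else { return }
                    value.entity.position = value.convert(
                        value.location3D, from: .local, to: parent
                    )
                }
        )
    }
}
```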

Here are some examples of the issue in action. The first video uses Unity's PolySpatial SDK. The second video uses an entirely native stack of SwiftUI and RealityKit, with no Unity at all.

For some reason I am not allowed to link videos here from Drive or Gyazo, so I am including them in plaintext for now. If someone could tell me how to upload video examples of what I am describing directly to these forums, I would appreciate it.

First video, showing the issue in Unity with the PolySpatial SDK:

https://i.gyazo.com/95788cf9d4587c167b544db031fbf412.mp4

Second video, showing the issue in a native-only stack with RealityKit and SwiftUI:

https://drive.google.com/file/d/1mgt8TXJiopbm6qdJw2rFG0geam0irnMn/view?usp=sharing

Unity forum bug discussion which, after investigation, confirmed that this issue exists on the native platform:

https://discussions.unity.com/t/objects-do-not-behave-properly-when-manipulated-in-an-mr-space/1482439

For a Mixed Reality environment, where a user may want to move around their space while using indirect pinches to manipulate and "carry" objects with them, this is a big issue.

Thank you

Answered by DTS Engineer in 797715022

Hello @Tyro_0,

Transforming entities with gestures is not built-in behavior; the issues demonstrated in the videos you linked to are specific to those implementations.

This sample project demonstrates an implementation that does not have that issue: https://developer.apple.com/documentation/realitykit/transforming-realitykit-entities-with-gestures
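
For anyone following along, here is a heavily simplified sketch in the spirit of that sample's drag handling, applying the drag as a delta rather than snapping the entity to the absolute pinch location (this is not the sample's actual code, and the entity setup is omitted):

```swift
import SwiftUI
import RealityKit

struct GestureTransformView: View {
    // Entity position captured when a drag begins (hypothetical state).
    @State private var dragStartPosition: SIMD3<Float>? = nil

    var body: some View {
        RealityView { _ in
            // Entity setup omitted; assume an input-target entity is added here.
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    guard let parent = value.entity.parent else { return }
                    // Cache the entity's position once, when the gesture begins.
                    if dragStartPosition == nil {
                        dragStartPosition = value.entity.position
                    }
                    // Apply the drag as a delta in the parent's space rather
                    // than moving the entity to the absolute pinch location.
                    let current = value.convert(value.location3D, from: .local, to: parent)
                    let start = value.convert(value.startLocation3D, from: .local, to: parent)
                    value.entity.position = dragStartPosition! + (current - start)
                }
                .onEnded { _ in dragStartPosition = nil }
        )
    }
}
```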

Best regards,

Greg

This sample project is set up for a bounded volume. I took the sample project you provided, changed it to an unbounded immersive space, and am still seeing issues. Our application targets an unbounded volume in Mixed Reality space.

While I don't see the exact same issues that are present in my initial videos, I do still see counter-intuitive behaviors in object manipulation with your sample project in an unbounded Mixed Reality space.

Objects still rotate oddly and have other minor issues when being manipulated in an unbounded immersive space. While this scene has numerous objects set up with different capabilities, I focused on testing the "Yellow Cube". When testing this object, which supports translation via dragging, rotation, and scaling, I still see several issues that cause it to behave in ways that are not intuitive compared to other platforms that support Mixed Reality interactions.

I will provide video later when I am in a space I can record in.

Hello @Tyro_0,

That sample project has a few properties that configure how orientation behaves. The distinction between a Volume and an ImmersiveSpace isn't core to how that sample's interaction behaves.

In any case, the main point is that entity-gesture transformation isn't built-in behavior; it is up to you to define it for your use case. If you believe there is some standard behavior that the SDK should provide, you are welcome to file an enhancement request for that behavior using Feedback Assistant.

Best regards,

Greg

Yes, we are aware that fundamentally the interaction should behave the same within a bounded space as in an unbounded space. But there simply is not enough room within most bounded views to see the issues we are describing; you need an actual large physical space to move around in, which is why we mentioned that we changed it to an unbounded space. The issue we are seeing seems to accumulate and worsen over time and distance traveled. We can also appreciate the point that this isn't "built-in behavior".

The point we are trying to make is that your sample project code looks like it should work one way: if a user indirectly manipulates an object, that object moves in relation to the user's hand with a set offset.
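
In code terms, the behavior we expect is roughly the following (a sketch with hypothetical names, not our actual implementation): capture the offset between the pinch point and the object once, then preserve it for the duration of the pinch.

```swift
import SwiftUI
import RealityKit

struct CarryObjectView: View {
    // Offset between the pinch point and the entity, captured at grab time.
    @State private var grabOffset: SIMD3<Float>? = nil

    var body: some View {
        RealityView { _ in
            // Entity setup omitted; assume an input-target entity is added here.
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    guard let parent = value.entity.parent else { return }
                    let pinch = value.convert(value.location3D, from: .local, to: parent)
                    // Capture the offset once, when the pinch begins.
                    if grabOffset == nil {
                        grabOffset = value.entity.position - pinch
                    }
                    // The object tracks the pinch at a fixed offset, however
                    // the user walks or re-orients, until the pinch ends.
                    value.entity.position = pinch + grabOffset!
                }
                .onEnded { _ in grabOffset = nil }
        )
    }
}
```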

But that is not the actual behavior that we observe on the headset.

We are aware that object manipulation is up to us to implement. But when your very simple sample code behaves differently on the device than one would expect, it suggests that something deeper is incorrect about how interactions are handled on the device across distances.

We have filed a report in Feedback Assistant.

Thank you for your time and responses
