Methodology: Unity, OpenXR, Oculus Integration, Auto Hand
MetaFrame is a high-level, component-based embodied MR interface system. It enables physics-based bodily interaction with virtual entities, simulating the way we naturally interact with physical ones. Using hand tracking, I made physical hand gestures (touch, throw, pinch, poke, grab, etc.) act on intangible virtual objects under the laws of physics (gravity, friction, momentum, etc.). The system also integrates computationally augmented, surreal modalities such as teleportation, distance grab, ray casting, and eye-gaze triggers. Although these enhanced interactions are not feasible in the physical realm, they offer a natural and comfortable embodied interface in virtual space.
*Documentation, additional interaction modalities, and a measurement system are currently in development.
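To illustrate the physics-based grab-and-throw behavior described above, here is a minimal Unity sketch, not the actual MetaFrame implementation. The component, its fields, and the IsPinching() query are hypothetical placeholders for the tracked-hand data that a hand-tracking SDK (e.g., Oculus Integration or Auto Hand) would supply; it assumes the hand object carries a kinematic Rigidbody and a trigger collider.

```csharp
using UnityEngine;

// Sketch: physics-based grab and throw driven by a tracked hand.
// Attach to a hand object that has a kinematic Rigidbody and a trigger
// collider. IsPinching() is a placeholder for the real SDK gesture query.
public class PhysicsGrabber : MonoBehaviour
{
    [SerializeField] private float throwVelocityScale = 1.2f; // tunes throw feel

    private Rigidbody heldBody;
    private Vector3 lastPosition;
    private Vector3 handVelocity;

    private void FixedUpdate()
    {
        // Estimate hand velocity from frame-to-frame motion so a released
        // object inherits the hand's momentum (a natural throw).
        handVelocity = (transform.position - lastPosition) / Time.fixedDeltaTime;
        lastPosition = transform.position;

        if (heldBody != null)
        {
            // While held, drive the object kinematically to follow the hand.
            heldBody.MovePosition(transform.position);
            heldBody.MoveRotation(transform.rotation);
        }
    }

    private void OnTriggerStay(Collider other)
    {
        // Grab: a pinch while the hand overlaps a rigidbody picks it up.
        if (heldBody == null && IsPinching() && other.attachedRigidbody != null)
        {
            heldBody = other.attachedRigidbody;
            heldBody.isKinematic = true; // suspend physics while grabbed
        }
    }

    private void Update()
    {
        // Release: restore physics and hand over the tracked hand's momentum,
        // so gravity and friction take over from here.
        if (heldBody != null && !IsPinching())
        {
            heldBody.isKinematic = false;
            heldBody.velocity = handVelocity * throwVelocityScale;
            heldBody = null;
        }
    }

    private bool IsPinching()
    {
        // Placeholder: wire to the SDK's pinch/grab query
        // (e.g., a pinch-strength threshold from the hand-tracking runtime).
        return false;
    }
}
```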
Embodied Interaction
Tracked-hand interaction, gesture detection, locomotion through hand gestures, etc. A sketch of a gesture-driven locomotion component follows below.
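As one example of combining the modalities above, the following hedged sketch pairs a gesture with ray casting to implement teleport locomotion: a ray is cast from the tracked hand while the user pinches, and releasing the pinch moves the player rig to the hit point. GestureTeleport, its fields, and GetPinchStrength() are hypothetical; in a real project the pinch query would come from the hand-tracking SDK (for instance, Oculus Integration's OVRHand.GetFingerPinchStrength).

```csharp
using UnityEngine;

// Sketch: gesture-driven teleport locomotion. Hold a pinch to aim a ray
// from the tracked hand; release the pinch to commit the teleport.
public class GestureTeleport : MonoBehaviour
{
    [SerializeField] private Transform playerRig;        // XR rig root to move
    [SerializeField] private LayerMask teleportSurfaces; // walkable layers only
    [SerializeField] private float pinchThreshold = 0.8f;

    private bool wasPinching;
    private Vector3? target;

    private void Update()
    {
        bool pinching = GetPinchStrength() > pinchThreshold;

        if (pinching)
        {
            // Aim with the hand: cast a forward ray from the tracked hand pose
            // and remember the last valid surface hit as the teleport target.
            if (Physics.Raycast(transform.position, transform.forward,
                                out RaycastHit hit, 20f, teleportSurfaces))
            {
                target = hit.point; // a preview marker/arc would render here
            }
        }
        else if (wasPinching && target.HasValue)
        {
            // Gesture released: commit the teleport.
            playerRig.position = target.Value;
            target = null;
        }

        wasPinching = pinching;
    }

    private float GetPinchStrength()
    {
        // Placeholder: replace with the SDK's pinch query, e.g. Oculus
        // Integration's OVRHand.GetFingerPinchStrength(HandFinger.Index).
        return 0f;
    }
}
```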