XR Human Inputs

Unity

Human Inputs, like hands & eyes, increase the sense of presence in immersive environments while decreasing the reliance on controllers.

The goal is to feel obvious.

We physically engage with our natural environment countless times per day, but we interface with our digital lives through the proxies of mouse and keyboard. As technology becomes increasingly aware of what our hands, eyes and bodies are doing, we can reduce the weight we place on those proxies in favor of interactions that are more in line with our desires.

Mastery of any tool or system is gated by the knowledge required, retained and refined to execute desired tasks. The traditional software approach is to present the user with explicit choices through UI that enable them to change the state of the system. An alternative approach is to identify a user’s implicit desire through context and action, shifting the goal towards understanding intent so that we can serve results with the least amount of supplementary knowledge (the best manual is none). While we can’t guarantee success, we can aim to educate along the way.

Hand Draw & Duplicate. A simple set of gestures that aim to cover a common set of tasks.

Hand Tracking on HoloLens 2. Distance event triggers (between Thumb & Pinky tips)
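To make that trigger concrete, here is a minimal sketch assuming MRTK 2's HandJointUtils on HoloLens 2; the 5 cm threshold and the event names are illustrative placeholders, not the project's actual values.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;
using UnityEngine.Events;

// Fires events when the thumb and pinky tips cross a distance threshold.
// A minimal sketch assuming MRTK 2's HandJointUtils; the threshold and
// event names are illustrative placeholders.
public class ThumbPinkyDistanceTrigger : MonoBehaviour
{
    [SerializeField] private Handedness hand = Handedness.Right;
    [SerializeField] private float triggerDistance = 0.05f; // meters (assumed)

    public UnityEvent OnTriggerStart; // tips moved within the threshold
    public UnityEvent OnTriggerEnd;   // tips moved apart again

    private bool isTriggered;

    private void Update()
    {
        if (HandJointUtils.TryGetJointPose(TrackedHandJoint.ThumbTip, hand, out MixedRealityPose thumb) &&
            HandJointUtils.TryGetJointPose(TrackedHandJoint.PinkyTip, hand, out MixedRealityPose pinky))
        {
            float distance = Vector3.Distance(thumb.Position, pinky.Position);

            if (!isTriggered && distance < triggerDistance)
            {
                isTriggered = true;
                OnTriggerStart?.Invoke();
            }
            else if (isTriggered && distance > triggerDistance)
            {
                isTriggered = false;
                OnTriggerEnd?.Invoke();
            }
        }
    }
}
```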

Hand & Eye Tracking on HoloLens 2. (The visible offset is an artifact of recording through the Device Portal.)

While technologies for hand and eye tracking have been around for a while, their limits and abilities still vary widely across devices. As the technology matures, there are plenty of explorations to be made in anticipation of where we hope it will land, so that the interactions serve the ways we already use our eyes and hands.

Results

Eye Tracking on HoloLens 2. Linger affordances (circle timer) create user feedback to assist in controlling object reveals with eye gaze.
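A minimal sketch of how such a linger affordance can work, assuming MRTK 2's eye gaze provider; the dwell duration, decay rate and radial-fill Image are assumptions.

```csharp
using Microsoft.MixedReality.Toolkit;
using UnityEngine;
using UnityEngine.Events;
using UnityEngine.UI;

// Linger-to-reveal: fills a circular timer while eye gaze stays on this
// object, then fires a reveal event. A sketch assuming MRTK 2's eye gaze
// provider; the dwell time and the radial-fill Image are illustrative.
public class GazeLingerReveal : MonoBehaviour
{
    [SerializeField] private float dwellSeconds = 1.5f; // assumed duration
    [SerializeField] private Image circleTimer;         // Image with Filled type

    public UnityEvent OnReveal;

    private float dwell;
    private bool revealed;

    private void Update()
    {
        var gaze = CoreServices.InputSystem?.EyeGazeProvider;
        bool gazed = gaze != null && gaze.GazeTarget == gameObject;

        // Accumulate while gazed; decay (rather than reset) when gaze leaves.
        dwell = gazed ? dwell + Time.deltaTime
                      : Mathf.Max(0f, dwell - Time.deltaTime * 2f);

        if (circleTimer != null)
            circleTimer.fillAmount = dwell / dwellSeconds;

        if (!revealed && dwell >= dwellSeconds)
        {
            revealed = true;
            OnReveal?.Invoke();
        }
    }
}
```

Decaying instead of resetting keeps brief gaze slips from restarting the timer, which tends to feel more forgiving.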

Gesture-based drawing, grab, duplicate & erase.

Hand Drawing

Quest Hand Tracking. Draw, duplicate & erase
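A bare-bones version of the pinch-to-draw half of this, sketched against the Oculus Integration's OVRHand API; the stroke width, point spacing and use of PointerPose as the draw point are assumptions.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Pinch-to-draw: while the index pinch is held, append the pointer
// position to a LineRenderer stroke. A sketch assuming the Oculus
// Integration's OVRHand; spacing and width values are illustrative.
public class PinchDraw : MonoBehaviour
{
    [SerializeField] private OVRHand hand;
    [SerializeField] private Material strokeMaterial;
    [SerializeField] private float minPointSpacing = 0.005f; // meters (assumed)

    private LineRenderer currentStroke;
    private readonly List<Vector3> points = new List<Vector3>();

    private void Update()
    {
        bool pinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);
        Vector3 tip = hand.PointerPose.position; // approximate draw point

        if (pinching && currentStroke == null)
            BeginStroke();

        // Only add a point once the hand has moved far enough.
        if (pinching && (points.Count == 0 ||
            Vector3.Distance(points[points.Count - 1], tip) > minPointSpacing))
        {
            points.Add(tip);
            currentStroke.positionCount = points.Count;
            currentStroke.SetPositions(points.ToArray());
        }

        if (!pinching && currentStroke != null)
            currentStroke = null; // end the stroke; leave it in the scene
    }

    private void BeginStroke()
    {
        var go = new GameObject("Stroke");
        currentStroke = go.AddComponent<LineRenderer>();
        currentStroke.material = strokeMaterial;
        currentStroke.widthMultiplier = 0.005f;
        currentStroke.useWorldSpace = true;
        points.Clear();
    }
}
```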

HoloLens 2. AR Drawing with Body Occlusion

Another point cloud based pursuit was to modify the object used for each ‘point’ based on its color (similar to ASCII art).

ARKit with LiDAR. Point Cloud Pointillism.

ARKit with LiDAR. Point Cloud Pointillism with Emojis.
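The color-to-object mapping can be as simple as ranking a glyph ramp by brightness, the way ASCII art does, or picking the nearest entry in a small emoji palette. A hypothetical sketch; the ramp and palette below are illustrative, not the values used here.

```csharp
using UnityEngine;

// Maps a point-cloud point's color to a glyph, ASCII-art style:
// darker points get denser glyphs, lighter points get sparser ones.
// A hypothetical sketch; the ramp and emoji palette are illustrative.
public static class PointGlyphs
{
    // Density ramp from dense to sparse (classic ASCII-art ordering).
    private const string Ramp = "@%#*+=-:. ";

    public static char GlyphForColor(Color c)
    {
        // Perceptual luminance, 0 (black) to 1 (white).
        float luma = 0.2126f * c.r + 0.7152f * c.g + 0.0722f * c.b;
        int i = Mathf.Clamp(Mathf.FloorToInt(luma * Ramp.Length), 0, Ramp.Length - 1);
        return Ramp[i];
    }

    // Emoji variant: pick the palette entry whose color is nearest in RGB.
    private static readonly (Color color, string emoji)[] Palette =
    {
        (Color.red, "🍎"), (Color.green, "🌿"), (Color.blue, "💧"),
        (Color.yellow, "🌟"), (Color.white, "☁️"), (Color.black, "🌑"),
    };

    public static string EmojiForColor(Color c)
    {
        string best = Palette[0].emoji;
        float bestDist = float.MaxValue;
        foreach (var (color, emoji) in Palette)
        {
            Vector3 d = new Vector3(c.r - color.r, c.g - color.g, c.b - color.b);
            if (d.sqrMagnitude < bestDist) { bestDist = d.sqrMagnitude; best = emoji; }
        }
        return best;
    }
}
```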

Eye Tracking

Exploring Hand & Eye tracking on the Meta Quest Pro. Previous hand tracking systems would fail when hands overlapped, so the Clap-to-Delete gesture tests this hand-over-hand limit.

Meta Quest Pro
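A crude way to probe that overlap case is to watch the distance between the two palms and call it a clap when they close quickly. A sketch using plain Unity transforms (e.g. the tracked hand anchors wired up in the inspector); all thresholds are guesses.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Clap-to-Delete probe: fires when the two palms close below a distance
// threshold at sufficient approach speed, exercising the hand-over-hand
// overlap case. A sketch; palm transforms (e.g. the left/right tracked
// hand anchors) are assigned in the inspector, and thresholds are guesses.
public class ClapDetector : MonoBehaviour
{
    [SerializeField] private Transform leftPalm;
    [SerializeField] private Transform rightPalm;
    [SerializeField] private float clapDistance = 0.08f;  // meters (assumed)
    [SerializeField] private float minApproachSpeed = 1f; // m/s (assumed)
    [SerializeField] private float cooldown = 0.5f;       // seconds

    public UnityEvent OnClap; // e.g. wired to delete the selected stroke

    private float previousDistance;
    private float lastClapTime = -999f;

    private void Update()
    {
        float distance = Vector3.Distance(leftPalm.position, rightPalm.position);
        float approachSpeed = (previousDistance - distance) / Time.deltaTime;
        previousDistance = distance;

        if (distance < clapDistance &&
            approachSpeed > minApproachSpeed &&
            Time.time - lastClapTime > cooldown)
        {
            lastClapTime = Time.time;
            OnClap?.Invoke();
        }
    }
}
```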
