
Where AR Immersion Meets Human-First Design

What's in a Name?

“That which we call a rose by any other name would smell as sweet.” AR, XR, MR, or Spatial: the labels may shift, but the essence remains.

At their core, all face-worn devices serve two functions: they display digital content and enable the user to interact. Their ability to perform these tasks ultimately determines their value. 

While the category name may change, what truly matters are two key dimensions: display immersion and input functionality. The GUI ultimately determines how seamlessly display and input integrate.


Juston Payne - Director of Product Management, XR at Google [AWE USA 2025]

Tying Display and GUI

Navigation and pointing are the foundations of human-computer interaction. Navigation moves us through digital space, while pointing lets us select and manipulate elements. Together, they define how we engage with a graphical user interface. Here’s how we’ll classify glasses for today’s discussion:

  • AI Glasses – Screenless eyewear that plays audio and connects to an AI agent for contextual information. Meta Ray-Ban, Bose Frames, Amazon Echo Frames.

  • AR Glasses – Binocular or monocular displays that either act only as a HUD (Heads-Up Display) or overlay digital elements—like text, navigation, or checklists. EverySight Raptor, Google Glass Enterprise Edition 2, RayNeo Air 3s, Rokid Max 2.

  • MR Glasses – Fully immersive, high-resolution devices that blend virtual and physical environments for apps, media, and collaboration. Apple Vision Pro, Snap Spectacles 2024, Magic Leap 2.

Left to right: Meta Ray-Ban glasses, Google Glass Enterprise Edition, Snap Spectacles 2024

Binding Input and GUI

Input for each sub-category can be matched to one of the three most common pointing-device paradigms: the gaming controller, the directional pad, and the computer mouse:

  • AI Glasses –  best suited to a controller-style input (like a TV remote), with buttons mapped to simple functions such as play, pause, or next. Think iPod.

  • AR Glasses – pair well with a directional pad or trackpad input, moving up, down, left, or right, with extra buttons for select or back. Think Apple TV.

  • MR Glasses – benefit most from mouse-like input, where users move a pointer, click icons, and use pinch-and-drag for rich interaction. Think PC.
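The three pairings above can be sketched as a small lookup table. This is an illustration only; the names and structure are assumptions for clarity, not a real device API:

```python
# Illustrative sketch: mapping each smart-glasses category to its
# best-suited input paradigm, as described above. All identifiers
# here are hypothetical.

GLASSES_INPUT = {
    "AI": {"paradigm": "controller", "analogy": "iPod",
           "actions": ["play", "pause", "next", "previous"]},
    "AR": {"paradigm": "d-pad", "analogy": "Apple TV",
           "actions": ["up", "down", "left", "right", "select", "back"]},
    "MR": {"paradigm": "mouse", "analogy": "PC",
           "actions": ["point", "click", "pinch-and-drag"]},
}

def suited_input(category: str) -> str:
    """Return the pointing paradigm best suited to a glasses category."""
    return GLASSES_INPUT[category]["paradigm"]
```

The point of the table is the one-to-one pairing: richer displays demand richer pointing, so the paradigm column grows from discrete buttons to a free cursor as immersion increases.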


Gesturing a 'PC-mouse' - controlling a cursor using wrist movement 

Replacing Buttons and Touch with Micro-Gestures

Since display and input are tied together by the GUI, each smart glasses category needs its own input scheme.

  • For AI glasses, discrete gestures like index or middle finger taps, double-taps, or wrist orientation can control simple functions such as play/pause, select/back, or next/previous. [see thumbnail 1 below]

    thumbnail 1: Gesturing a controller - discrete finger movements and taps for AI glasses or no-display devices

  • AR glasses benefit from wrist flicks and fingertip pressure, enabling directional navigation, similar to arrow keys or trackpads, combined with taps for selection. [see thumbnail 2 below]

    thumbnail 2: Gesturing a virtual directional pad - flicks and taps for binocular and large-display devices

  • With MR glasses, interaction reaches its most immersive form. Mudra Air-Touch combines spatial pointing with subtle neural gestures, letting users select, drag, or manipulate digital elements in mid-air, without ever touching a surface. [see thumbnail 3 below]

    thumbnail 3: Gesturing a PC-mouse - navigation using wrist movement and pointing via micro-gestures

-----

Mudra Link - All the Input You Need for AR

The Mudra Link is an EMG wristband that interprets electrical signals from the wrist, turning hand and finger micro-gestures into effortless control of digital devices. It offers three control modes:

  1. Mouse mode – use wrist movement and micro-gestures to move a pointer (↗️), tap (👌), and scroll and swipe (↕️, ↔️), just like a PC mouse.

  2. D-pad mode – use wrist flicks and taps that function as keystrokes (⬅️, ➡️, ⬆️, ⬇️) and (↩, 🔙), as if using a virtual trackpad.

  3. Gesture Mapper – customize micro-gestures to create a controller experience (e.g. music ▶, ⏸, ⏭, ⏮), just like the iPod.
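As a rough illustration of the Gesture Mapper idea, a user-defined mapping from micro-gestures to controller actions might look like the sketch below. Gesture and action names are assumptions for the example, not Mudra's actual SDK:

```python
# Hypothetical sketch of a gesture-to-action mapping in the spirit of
# Gesture Mapper mode. All gesture and action names are invented for
# illustration; they are not Mudra Link API identifiers.

DEFAULT_MAP = {
    "index_tap":        "play_pause",
    "double_tap":       "next_track",
    "middle_tap":       "previous_track",
    "wrist_flick_up":   "volume_up",
    "wrist_flick_down": "volume_down",
}

def handle_gesture(gesture: str, mapping: dict = DEFAULT_MAP):
    """Translate a detected micro-gesture into a controller action.

    Returns None for unmapped gestures, so noise or unassigned
    movements simply do nothing.
    """
    return mapping.get(gesture)
```

Because the mapping is just data, remapping a gesture (say, swapping double-tap from next-track to volume-up) means editing one dictionary entry rather than changing any control logic.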

 
 
 


ABOUT US

Wearable Devices Ltd. develops a non-invasive neural input interface for controlling digital devices using subtle finger movements.

 

We believe that neural-based interfaces will become as ubiquitous as wearable computing and digital devices in general, just as the touchscreen became the universal input method for smartphones.
