
Four Ways to Control AR Glasses

AR and AI glasses are coming! But without a touchscreen in our hands, how will we interact with them? 


Tomorrow’s interface won’t live on a phone screen; instead, it will overlay information and experiences seamlessly onto our world. 

Maps, messages, and apps will appear as part of what we see, creating a continuous experience, freeing us from constantly holding a phone, and showing exactly what we need, when we need it. 

Here are the four input methods vying for dominance.


[1] Temple Area Touchpad.

"Touch Control, Built into the Frame"

Temple touchpads are slim, touch-sensitive strips built directly into the frame, usually positioned on the arm near the temple, just above the ear.

Using taps, swipes, and long-presses, users navigate menus, adjust settings, and trigger actions without the need for a separate controller.

Because the touchpad is part of the glasses’ physical frame, it’s always within reach and doesn’t add bulk. This form is already used in devices like the Rokid Max, offering a low-profile and familiar way of interaction while wearing the glasses.
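A touchpad interface like this boils down to mapping a small vocabulary of touch events to UI actions. The sketch below is purely illustrative; the event names and actions are assumptions, not any vendor's actual API.

```python
# Hypothetical mapping from temple-touchpad events to UI actions.
# Event and action names are illustrative, not a real device SDK.
ACTIONS = {
    "tap": "select",
    "double_tap": "back",
    "swipe_forward": "next_item",
    "swipe_back": "prev_item",
    "long_press": "open_menu",
}

def handle_touch(event: str) -> str:
    """Translate a raw touchpad event into a UI action; unknown events are ignored."""
    return ACTIONS.get(event, "ignore")
```

The appeal of this input method is exactly how small that vocabulary is: a handful of one-finger gestures covers most navigation.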


RayNeo X2 temple area touchpad. Source: Marton Barcza of TechAltar.

[2] Handheld Controller

"A Familiar Input, Repurposed for AR"

Handheld controllers for AR glasses serve as companion devices, typically connecting over Bluetooth or proprietary links. Once paired, they become the primary navigation tool for moving through menus, selecting items, and triggering actions on the display. 

The XREAL Beam uses a touch-sensitive surface for directional input, while devices like the Rokid Station rely on physical buttons. These controllers often include haptic feedback and are designed to integrate tightly with the glasses’ software layer, ensuring consistent, responsive control. 
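A touch-sensitive surface like the Beam's typically quantizes a finger drag into a 4-way directional input, with a dead zone to reject accidental touches. This is a minimal sketch of that idea; the thresholds and function names are assumptions, not XREAL's implementation.

```python
import math

def classify_swipe(dx: float, dy: float, dead_zone: float = 0.1) -> str:
    """Quantize a drag vector (dx, dy) into a 4-way direction.

    Drags shorter than the dead zone are treated as noise.
    Axis convention (assumed): +x is right, +y is down.
    """
    if math.hypot(dx, dy) < dead_zone:
        return "none"
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```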

Rokid Station, a hand-held controller & compute in one.

[3] Gesture Recognition Camera

"When You Vision Works, so Do Your Gestures"

Gesture-recognition cameras detect and interpret hand movements. By tracking motions like taps and swipes, users can control interfaces without touching any device. This method typically relies on depth-sensing cameras to create a 3D map of the space in front of the user.

The tracking zone, however, is limited to roughly a 40 to 70 degree field of view extending outward from the glasses, and gestures must stay within line of sight. Performance is also affected by lighting conditions. Integrating cameras into the frame adds bulk and weight, while continuous processing places high demands on computing resources and battery life.

Despite these limitations, gesture cameras offer a hands-free way to interact, making the experience feel more spatial and responsive.
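The field-of-view constraint above is simple geometry: a hand is only trackable if the ray from the camera to the hand falls inside the camera's viewing cone. A minimal sketch, assuming a camera-centered frame with +z pointing forward (the function and coordinate convention are illustrative):

```python
import math

def in_tracking_zone(hand_xyz, fov_deg: float = 70.0) -> bool:
    """Return True if a hand position (meters, camera frame, +z forward)
    lies inside the camera's conical field of view. FOV value illustrative."""
    x, y, z = hand_xyz
    if z <= 0:  # behind or beside the glasses: never trackable
        return False
    # Angle between the forward axis and the ray toward the hand.
    angle = math.degrees(math.atan2(math.hypot(x, y), z))
    return angle <= fov_deg / 2
```

With a 70° cone, a hand half a meter straight ahead is tracked, while one held far off to the side at waist level is not, which is why gesture systems push users to keep their hands raised in front of them.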

Spectacles by Snap, featuring gesture cameras. Source: GamertagVR.

[4] Neural Wristband

Made possible by recent advances in AI, sensor technology, and flexible PCBs, neural wristbands mark a new frontier in human-computer input.

Worn like a fitness band, they detect neural signals generated by hand gestures and finger movements. These signals, originating from the forearm’s muscles, are translated into digital actions.

Because the sensors read intent rather than visible motion, micro-gestures can be used even in complete darkness or with hands tucked inside a pocket.

For AR glasses, this brings a leap in UX, UI, and HCI. Interfaces become easier to navigate, less tiring, and more discreet, while neural input removes the need for bulky controllers or constrained touchpads. The result is interaction that feels natural, precise, and seamlessly woven into everyday life.
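Under the hood, a pipeline like this typically segments the raw muscle-signal stream into short windows and decides which ones contain gesture activity before classification. The sketch below uses a simple RMS-energy threshold as a stand-in; real systems use trained machine-learning models, and every number here is made up for illustration.

```python
def rms(window) -> float:
    """Root-mean-square energy of one window of samples."""
    return (sum(s * s for s in window) / len(window)) ** 0.5

def detect_activity(samples, window: int = 8, threshold: float = 0.3):
    """Return start indices of non-overlapping windows whose RMS energy
    exceeds the threshold (a toy stand-in for a gesture classifier)."""
    active = []
    for i in range(0, len(samples) - window + 1, window):
        if rms(samples[i:i + window]) > threshold:
            active.append(i)
    return active
```

Only the windows flagged as active would be passed on to a gesture classifier, which keeps continuous sensing cheap on battery.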

Mudra Link - AR interaction using familiar gestures in comfortable body postures


ABOUT US

Wearable Devices Ltd. develops a non-invasive neural input interface for controlling digital devices using subtle finger movements.

 

We believe that neural-based interfaces will become as ubiquitous as wearable computing and digital devices in general, just as the touchscreen became the universal input method for smartphones.

© All rights reserved to Wearable Devices Ltd
