10 Years in the Making - a CTO x CSO Discussion
- Shmuel Barel

- Dec 26, 2025
- 3 min read
How helping a nervous student on his first day at work spiked the signal that formed the team behind the world’s first neural wristband.
The discussion explores how raw wrist signals were gradually transformed into structured data, algorithms, and machine-learning models capable of recognizing intent before movement.
🧠 Early Exposure to BCI Concepts
Early encounters with brain–computer interfaces revealed a simple truth: breakthrough technology often begins in imperfect form. The first experiments were limited and fragile, yet they hinted at a future where intention itself could become an input.
As research advanced, powerful demonstrations began to emerge. Patients with severe paralysis, equipped with risky yet transformative implants, were able to play video games, control robotic arms, and even use a mechanical hand to feed themselves.
From those early glimpses, a new idea took shape: neural signals could be decoded into action, and this capability could move from implants to a wrist wearable, where people have naturally worn technology for generations.
-----
◎ Prototype Beginnings
An early prototype, presented when applying for a small government grant, was completely handmade — genuine maker work in the spirit of the 1970s.
These first prototypes used basic sensors connected to Arduino boards to capture electrical signals from the wrist, forming the foundation for early experiments in neural input.
📟 Hardware Meets Algorithms
The main challenge was interpreting raw data and turning noise into patterns that reflected human intent. Early filters and classifiers from academic research laid the foundation for an AI-driven neural engine.
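To make the idea of "turning noise into patterns" concrete, here is a minimal sketch, assuming nothing about Mudra's actual pipeline: a moving-average filter smooths a noisy wrist signal, and a simple energy threshold stands in for the early classifiers that flagged intentional muscle activity.

```python
# Illustrative sketch only: a moving-average filter plus an energy
# threshold, standing in for the early filters and classifiers the
# text describes. Real bio-signal pipelines are far more involved.

def moving_average(signal, window=5):
    """Smooth a 1-D signal with a simple moving-average filter."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)
        chunk = signal[lo:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def classify_window(samples, threshold=0.5):
    """Label a window 'intent' if its mean absolute amplitude
    exceeds the threshold, else 'rest'."""
    energy = sum(abs(s) for s in samples) / len(samples)
    return "intent" if energy > threshold else "rest"

# Toy example: quiet baseline followed by a burst of activity.
raw = [0.05, -0.04, 0.06, -0.05, 1.2, -1.1, 1.3, -1.0]
smooth = moving_average(raw, window=3)
print(classify_window(raw[:4]))  # rest
print(classify_window(raw[4:]))  # intent
```

The window size and threshold here are arbitrary; in practice both would be tuned per sensor and per user.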
As deep learning advanced, the team applied machine learning to bio-signals, building adaptive models that continuously learned from data. Each iteration improved gesture recognition, latency, and user adaptation, evolving static classifiers into intelligent, self-learning systems.
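One way to picture the "self-learning" behaviour described above is a nearest-centroid gesture classifier whose centroids are nudged toward each new labelled sample. This is a hypothetical sketch of online adaptation, not the team's actual models:

```python
# Hypothetical sketch of online adaptation: a nearest-centroid
# gesture classifier whose per-gesture centroids drift toward each
# new labelled sample, so the model keeps adjusting to the user.

class AdaptiveGestureClassifier:
    def __init__(self, learning_rate=0.2):
        self.centroids = {}          # gesture name -> feature vector
        self.lr = learning_rate

    def update(self, gesture, features):
        """Nudge the gesture's centroid toward a new sample."""
        if gesture not in self.centroids:
            self.centroids[gesture] = list(features)
            return
        c = self.centroids[gesture]
        for i, f in enumerate(features):
            c[i] += self.lr * (f - c[i])

    def predict(self, features):
        """Return the gesture whose centroid is nearest (squared
        Euclidean distance)."""
        def dist(c):
            return sum((a - b) ** 2 for a, b in zip(c, features))
        return min(self.centroids, key=lambda g: dist(self.centroids[g]))

clf = AdaptiveGestureClassifier()
clf.update("pinch", [1.0, 0.1])
clf.update("tap",   [0.1, 1.0])
print(clf.predict([0.9, 0.2]))  # pinch
```

Because `update` keeps running after deployment, the decision boundaries track the individual wearer over time, which is the essence of the adaptation the text describes.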
This fusion of precise hardware and adaptive algorithms became the core of Mudra’s neural technology, enabling software that learns from the user and translates neural activity into natural, effortless interaction.
A Broader Canvas
Developing algorithms for neural control from the wrist, however, posed a unique challenge. Unlike computer vision or speech recognition, where vast datasets already exist, wrist-based neural input began with no reference material. Every signal, gesture, and movement had to be captured, labeled, and modeled from scratch. Building that foundational dataset became as critical as the algorithms themselves.
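A labelled record in such a from-scratch dataset might look like the following. Every field name here is illustrative, not the company's actual schema:

```python
# Hypothetical schema for a labelled wrist-signal sample; all field
# names are illustrative, not Wearable Devices' real format.
from dataclasses import dataclass

@dataclass
class GestureSample:
    user_id: str
    gesture: str              # e.g. "pinch", "tap"
    posture: str              # arm position during capture
    samples: list             # raw sensor readings
    sample_rate_hz: int = 500

dataset = [
    GestureSample("u01", "pinch", "arm-resting", [0.1, 0.8, 0.7]),
    GestureSample("u01", "tap", "arm-raised", [0.0, 1.1, 0.2]),
]

# Simple per-gesture count, the kind of sanity check a labelling
# pipeline would run to keep classes balanced.
counts = {}
for s in dataset:
    counts[s.gesture] = counts.get(s.gesture, 0) + 1
print(counts)  # {'pinch': 1, 'tap': 1}
```

The point of capturing posture and sampling rate alongside the raw signal is exactly the lesson in the paragraph above: with no public reference datasets, every variable that affects the signal has to be recorded from day one.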
Both hardware and software were designed with a single shared goal: to deliver a seamless user experience. The system had to recognize a familiar and intuitive gesture set while allowing users to remain in natural, comfortable postures. By combining high-fidelity sensing with adaptive algorithms, the technology transforms complex neural activity into effortless, everyday control.

-----
Algorithm Development at Wearable Devices
Wearable Devices Ltd. develops advanced bio-signal algorithms built on a deep understanding of neural pattern recognition. Early work in noise filtering and signal classification established the framework for today’s adaptive machine-learning models.
As proprietary datasets expand, our deep-learning architectures advance beyond static code, enabling continuous learning and personalization. We develop automated methods for collecting and annotating user data to ensure scalable, high-quality training inputs. This intelligence forms the core of wearable neural input and drives our growth from research innovation to commercial deployment.
To further explore the evolution of algorithms for neural wristband hardware, you are welcome to watch the full video.
-----
Why This Matters Now
The ecosystem is aligning.
The smartphone era generated over $8 trillion in hardware and app revenue, powered by the touchscreen, a universal interface anyone could use.
Today, the neural band is emerging as the next interface that will define how humans interact with technology. Consumer brands, platform developers, and OEMs are already shaping this new input standard.
Staying close to this shift means staying ahead, before the next interface becomes the norm.


