Algorithm Journey from PC to IoT — A Signal Processing Pipeline with TensorFlow 2.x on a Smartwatch

The Challenge — acquire bio-potentials from the surface of the skin, through the medium of an electrode, then process those data streams and classify each pattern of signals as the correct finger movement.

Nowadays, when you’re developing technology, especially consumer electronics, you want the best of the best in terms of user experience. At Wearable Devices we are developing Mudra, a neural input technology in the form factor of a watch strap which acts as an extension of the hand into the digital world. It detects neural signals from the wrist and translates them into control functions on digital devices. An intuitive and natural HMI is the missing piece in the puzzle for smartwatches and smart glasses. Just as the iPhone would never have caught on without the touchscreen, these technologies need natural input to achieve mass adoption.

The Mudra device detects subtle finger movements such as individual movements of the Index finger and the Thumb, a soft tap of the Middle finger on the Thumb, and a Double Tap. The device also detects gradations of fingertip pressure applied between the Index finger and the Thumb.

Control of devices is accomplished by binding the right function based on the user experience (you can read more here). Ideally you’d want to complete this entire process with practically no latency, on any user physiology, and as a standalone device — i.e., using very low compute on the CPU/GPU/NPU, such as those found in smartwatches.

Huawei Watch 1 running the pipeline

Consumers today demand simplicity and accuracy, a “plug & play, it simply works!” experience. So, in order to make the experience good enough for today’s expectations from consumer electronics, Mudra must perform almost flawlessly. Fortunately for us, there are amazing open-source software solutions for building futuristic technology. I’d like to talk about some of these technologies and explain how we got them to work on Android Wear with low compute capabilities, without sacrificing latency or performance in general.

The fun part of developing technology from the ground up is that you can control the entire process, from designing the analog sensors, to customizing data acquisition and annotation, all the way up to the final experience. The only constraint is that the software design must be performant enough to run on a smartwatch. To this end we have at our disposal several amazing tools.

I’d like to talk about 3 such tools that I find groundbreaking (especially for IoT):

  1. TensorFlow 2.0 / TFLite

  2. The Eigen C++ linear algebra library

  3. ctypes

Let’s start with TensorFlow 2.0. Our pipeline includes various stages, some deep-learning based and others not. We use deep learning for calibrating Mudra to a specific user and for accomplishing various tasks, in the spirit of HydraNet. The advantage of this approach is twofold: we can achieve higher accuracy by sharing weights between tasks, and we keep the computation overhead low. Such models can be built using the model subclassing API.
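To make the idea concrete, here is a minimal HydraNet-style model built with the Keras subclassing API: one shared trunk feeding two task heads, one for gesture classification and one for fingertip pressure. All layer sizes, names, and head tasks are illustrative assumptions, not our production architecture.

```python
import tensorflow as tf


class TinyHydraNet(tf.keras.Model):
    """Hypothetical multi-task sketch: shared trunk, per-task heads."""

    def __init__(self, num_gestures=4):
        super().__init__()
        # Shared feature extractor: its weights serve every task,
        # which keeps the total compute overhead low.
        self.trunk = tf.keras.Sequential([
            tf.keras.layers.Conv1D(16, 5, activation="relu"),
            tf.keras.layers.GlobalAveragePooling1D(),
            tf.keras.layers.Dense(32, activation="relu"),
        ])
        # Task-specific heads consume the shared representation.
        self.gesture_head = tf.keras.layers.Dense(num_gestures, activation="softmax")
        self.pressure_head = tf.keras.layers.Dense(1, activation="sigmoid")

    def call(self, x):
        features = self.trunk(x)
        return {
            "gesture": self.gesture_head(features),
            "pressure": self.pressure_head(features),
        }


model = TinyHydraNet()
# Batch of 2 signal windows, 128 samples each, 3 electrode channels (made-up sizes).
out = model(tf.zeros([2, 128, 3]))
```

Because both heads backpropagate into the same trunk, each task acts as a regularizer for the others, which is where the accuracy benefit of weight sharing comes from.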

TensorFlow 2.0 works very well with TFLite, which works like magic on smartwatches. It also runs very quickly on some micro-controllers we’ve tested, including the STM32H743. Our models are blazingly fast even on a single-core ARM Cortex-M7, a modern, low-power micro-controller with great TFLite support. This is very helpful since the Apple Watch Series 3 introduced standalone cellular connectivity, and the compute capabilities of such standalone devices are constrained. The ability to deploy a model and run the same ops without having to code them yourself gives startups the flexibility to deploy cutting-edge models quickly and iterate on that success.
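The deployment flow is the standard TFLite one: convert the trained Keras model to a FlatBuffer, then execute it with the TFLite Interpreter, the same runtime shape that ships on-device. A sketch using a tiny stand-in model (our real models are proprietary):

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in model; shapes are illustrative only.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 3)),
    tf.keras.layers.Conv1D(8, 5, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(4, activation="softmax"),
])

# Convert to a TFLite FlatBuffer; on a real project you would write
# these bytes to a .tflite file and bundle it with the app/firmware.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_bytes = converter.convert()

# Run the converted model with the Interpreter.
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
outp = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.zeros((1, 128, 3), dtype=np.float32))
interpreter.invoke()
probs = interpreter.get_tensor(outp["index"])  # shape (1, 4), sums to ~1
```

The converter also supports post-training quantization, which is usually the next step when targeting micro-controllers like the Cortex-M7.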

Second, we utilize the NEON SIMD unit for various operations in C++, including our custom Wavelet Decomposition. Wavelets are a great method for analyzing non-stationary processes. They usually outperform other methods when your signals have a very specific form or “basis” (for example, radiology images and aerial photography, not only bio-potentials…). To get this decomposition right, we love the open-source linear algebra library Eigen. It has a great API and allows you to write highly readable code which runs across multiple architectures, including ARM. This makes it a great candidate for many smartwatches (a lot of which are based on ARMv7 or Cortex-M7 cores). One can use this library as a powerful pre- and post-processing tool.
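To make the wavelet idea concrete, here is one level of the simplest member of the family, the Haar transform, sketched in NumPy for brevity (our actual decomposition is custom, more involved, and lives in C++ on top of Eigen). Each level splits the signal into a low-pass approximation and a high-pass detail band:

```python
import numpy as np


def haar_dwt_level(signal):
    """One level of the Haar discrete wavelet transform.

    Returns (approximation, detail) coefficients. A minimal sketch,
    not the custom decomposition described in the text.
    """
    x = np.asarray(signal, dtype=float)
    if len(x) % 2:                       # pad odd-length inputs
        x = np.append(x, x[-1])
    even, odd = x[0::2], x[1::2]
    approx = (even + odd) / np.sqrt(2)   # low-pass: scaled local averages
    detail = (even - odd) / np.sqrt(2)   # high-pass: scaled local differences
    return approx, detail


a, d = haar_dwt_level([4.0, 4.0, 2.0, 0.0])
# a == [8/sqrt(2), 2/sqrt(2)], d == [0, 2/sqrt(2)]
```

The 1/√2 normalization makes the transform orthonormal, so signal energy is preserved across levels; applying `haar_dwt_level` recursively to the approximation band yields the full multi-level decomposition.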

Third, we really love to simulate our algorithms’ accuracy. Anyone who works on such simulations, especially data science projects, knows how tightly coupled they are with Python. Python has become the go-to language for simulation and visualization. However, what do you do when you’re relying on high-performance, low-latency C++ code for an embedded application? A great tool for saving time without having to write two separate code bases (i.e., simulation vs. embedded) is ctypes.

ctypes is an amazing tool. You write your code once in C++, wrap it into a .dll or .so library, and access that library from Python. ctypes gives you the best of both worlds: high-performance code together with the ability to plot visualizations, create custom analysis tools, and read various database files quickly and easily. I find it much easier than alternatives such as Boost.Python, since no C++ wrappers are necessary.
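As a minimal illustration of the pattern, the snippet below loads the standard math library instead of a proprietary .so and calls a native function directly. Wrapping your own C++ code works the same way once it is compiled into a shared library with a plain-C interface, e.g. `g++ -O2 -shared -fPIC pipeline.cpp -o libpipeline.so` with the exported functions declared `extern "C"` (file and function names here are hypothetical).

```python
import ctypes
import ctypes.util

# Load a shared library. For your own code this would be something like
#   lib = ctypes.CDLL("./libpipeline.so")
# Here we use the system math library so the example is self-contained.
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the signature so ctypes marshals C doubles correctly;
# without this, ctypes assumes int arguments and return values.
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

result = libm.sqrt(2.0)  # calls the native sqrt, no wrapper code needed
```

The same embedded C++ routines can then be driven from Python notebooks for simulation and plotting, which is exactly the single-code-base workflow described above.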

The above tools are used by many people in the tech space. However, at Wearable Devices we use all of them, quite extensively, including the advanced features. Since each hardware manufacturing iteration is costly, we collect our data in-house and carefully simulate every aspect of performance and accuracy that might affect the user. Luckily, we can rely on the availability of some wonderful open-source tools to drive us towards success.

A concept video of Mudra for a Smartwatch

Wearable Devices Ltd develops a non-invasive neural interface for the wrist that allows revolutionary new input experiences and applications for the digital world. The Mudra technology enables control of digital devices using subtle finger movements and the analysis of neural signals. We use proprietary sensors and Deep Learning algorithms to decipher the neural code and detect which finger the user moved. Each finger movement is bound to a control function in order to create the most instinctive, intuitive and natural user experience for wearable computers — smartwatches, smart glasses and head mounted devices.

A few words about myself: I am the co-founder and CTO at Wearable Devices. I held lead algorithms engineer positions in the Israeli high-tech industry and had the opportunity to work with extremely talented individuals in Academia. My background involves machine learning and signal/image processing. I hold an MSc in Applied Mathematics and a BSc in Electrical Engineering.

Original blog post was published on Medium