A wrist-worn device could translate nerve signals into ‘digital commands.’
Last year, Facebook showed off Project Aria, the research project that will help the company create augmented reality glasses. At the time, Facebook presented a vision in which such a device could eventually take over many of the functions we currently use smartphones for, like calling friends or looking up directions. Now, the company is offering a new peek into how users could ultimately control AR: via their wrists.
The idea, according to researchers at Facebook’s Reality Labs, is to use a technique known as electromyography, or EMG, which can detect nerve signals as they travel through the wrist. A wrist-worn device with specialized sensors would be able to interpret these signals and translate them into “digital commands” that can then be used to control a device or AR interface.
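To make the idea concrete, here is a minimal, purely illustrative sketch of that pipeline: a window of raw EMG samples per sensor channel is reduced to an activity level, and strong activity on a finger-related channel is decoded as a command. Every name and threshold here (`index_flexor`, `decode_command`, the 0.5 cutoff) is a hypothetical assumption for illustration, not Facebook's actual system.

```python
import math

def rms(samples):
    """Root-mean-square amplitude of one window of EMG samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def decode_command(channels, threshold=0.5):
    """Map per-channel EMG activity to a coarse digital command.

    channels: dict of sensor name -> list of samples for one time window.
    Returns "click" if the (hypothetical) index-finger channel is the
    strongest and clears the threshold, else "rest".
    """
    activity = {name: rms(samples) for name, samples in channels.items()}
    strongest = max(activity, key=activity.get)
    if activity[strongest] < threshold:
        return "rest"  # no muscle activation worth acting on
    return "click" if strongest == "index_flexor" else "rest"

# A burst of activity on the index-finger channel decodes as a "click".
window = {
    "index_flexor": [0.9, -0.8, 1.1, -1.0],
    "wrist_extensor": [0.1, -0.1, 0.05, -0.05],
}
print(decode_command(window))  # -> click
```

A real system would replace the threshold rule with a trained classifier over many channels, but the shape of the problem is the same: continuous signals in, discrete commands out.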
“This is not akin to mind reading,” Facebook explains in a blog post. “Think of it like this: You take many photos and choose to share only some of them. Similarly, you have many thoughts and you choose to act on only some of them. When that happens, your brain sends signals to your hands and fingers telling them to move in specific ways in order to perform actions like typing and swiping. This is about decoding those signals at the wrist (the actions you’ve already decided to perform) and translating them into digital commands for your device.”
One advantage of using such a system, Facebook says, is that EMG is so precise it can “understand finger motion of just a millimeter.” Eventually, it might not even be necessary to move a finger at all as long as there’s an “intention” to do so. This precision could also potentially make navigating AR interfaces a much faster experience than the way we currently interact with technology. For example, Facebook’s researchers say that EMG could enable people to type on a virtual keyboard at a higher speed than what’s possible on a mechanical one.
For now, though, Facebook is still fine-tuning the basics of interacting with EMG. The company showed off an interaction it’s calling the “intelligent click,” which allows users to “click” on a menu by subtly moving their fingers. The interface may also adapt based on contextual information and what it knows about you, like queuing up a playlist when you’re about to go for a run. “The system will be able to make deep inferences about what you might want to do in various situations based on the information you choose to share about yourself and your surroundings,” says Sean Keller, research director at Facebook Reality Labs.
Keller and other researchers emphasized that this work is still at a very early stage, and that any kind of consumer-level device is still years away. And there are other issues the company will need to address beyond the mechanics of how wrist-controlled AR will operate, namely the immense privacy concerns that accompany an always-on, Facebook-run AR platform. But it does offer an intriguing glimpse into how Facebook is thinking about the future of augmented reality and what might one day be possible.