In Episode 7 of Boz To The Future, Reality Labs VP and incoming Meta CTO Andrew “Boz” Bosworth speaks with RL Director of Neuromotor Interfaces Thomas Reardon about the more intuitive inputs we’re building for future computing platforms like AR glasses. For generations, humans have adapted to machines, learning whatever inputs they offered. In the future, technologies like electromyography (EMG) at the wrist will help decode the electrical signals our muscles produce during very small movements of our hands and fingers, and eventually even from just the intention to perform a gesture. Those signals can then translate physical actions we’ve already decided to perform into digital commands that directly control a device. It’s a much faster way to act on the instructions you already send to your device today when you tap to select a song on your phone, click a mouse, or type on a keyboard. It’s exciting work that could transform the way we interact with computers.
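To make the idea concrete, here is a minimal, purely illustrative sketch of how a decoded muscle signal might be turned into a digital command. Everything in it is hypothetical: the gesture labels, the RMS threshold, and the command mapping are stand-ins, and real wrist-based EMG decoding relies on trained machine-learning models rather than a fixed threshold.

```python
# Toy EMG-to-command pipeline (illustrative only, not Meta's method).
import math
from typing import List, Optional

def rms(window: List[float]) -> float:
    """Root-mean-square amplitude of one window of EMG samples."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def classify_gesture(window: List[float], threshold: float = 0.5) -> str:
    """Map a window's muscle-activity level to a toy gesture label."""
    return "pinch" if rms(window) > threshold else "rest"

# Hypothetical mapping from decoded gestures to digital commands,
# analogous to tapping to select a song or clicking a mouse.
COMMANDS = {"pinch": "select", "rest": None}

def decode(window: List[float]) -> Optional[str]:
    return COMMANDS[classify_gesture(window)]

if __name__ == "__main__":
    quiet = [0.01, -0.02, 0.03, -0.01]    # low muscle activity
    active = [0.9, -0.8, 1.1, -0.95]      # strong activation, e.g. a pinch
    print(decode(quiet))    # -> None (no command issued)
    print(decode(active))   # -> "select"
```

The point of the sketch is simply the shape of the pipeline: sense muscle activity, classify it as a gesture, and emit the same kind of command a tap or click produces today.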
For more on the series, visit facebook.com/boztothefuturepod. For feedback and suggestions, drop Boz a message @boztank on Instagram or Twitter.