
It first appeared on March 9 as a tweet on Andrew Bosworth’s timeline, the little corner of the internet that these days offers a rare glimpse into the mind of a Facebook executive. Bosworth, who leads Facebook’s augmented and virtual reality research labs, had just shared a blog post outlining the company’s 10-year vision for the future of human-computer interaction. Then, in a follow-up tweet, he shared a photo of a previously unseen wearable device. Facebook’s vision for the future of computer interaction seems to involve something that looks like an iPod Mini on your wrist.
Facebook owns much of our social experience and some of the world’s most popular messaging apps – for better or for worse. Every time the company dips into hardware, it gets noticed, whether it’s a very good VR headset or a video chat device that follows you around the room. And it sparks not only intrigue but also questions: why does Facebook want to own this new computing paradigm?
In this case, the unanswered questions are less about the hardware itself and more about the research behind it – and whether the new interactions Facebook envisages will only deepen our ties to Facebook. (Answer: probably.) In a media briefing earlier this week, Facebook executives and researchers presented an overview of this technology. In the simplest terms, Facebook has been testing new computing inputs using a sensor-packed wrist wearable.
It is an electromyography device, which means it translates the electrical signals from motor nerves into digital commands. Once it’s on your wrist, you can simply flick your fingers in space to control virtual inputs, whether you’re wearing a VR headset or interacting with the real world. You can also ‘train’ it to sense the intention of your finger movements, so that actions happen even when your hands are completely still.
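To make that idea more concrete, here is a minimal, purely hypothetical sketch of how a window of multi-channel EMG readings might be mapped to a discrete input command. Nothing here reflects Facebook’s actual system; the gesture names, channel counts, and nearest-template approach are assumptions chosen only for illustration.

import numpy as np

# Hypothetical sketch (not Facebook's system): map a window of multi-channel
# EMG samples to a discrete input command by comparing simple per-channel
# energy features against gesture templates learned in a short training phase.

def emg_features(window: np.ndarray) -> np.ndarray:
    # window has shape (samples, channels); return root-mean-square energy
    # per channel, a common feature for muscle-signal classification.
    return np.sqrt(np.mean(window ** 2, axis=0))

def train_templates(examples: dict) -> dict:
    # Average the features of each labeled example window per gesture --
    # roughly the per-user calibration step researchers describe.
    return {gesture: np.mean([emg_features(w) for w in windows], axis=0)
            for gesture, windows in examples.items()}

def classify(window: np.ndarray, templates: dict) -> str:
    # Return the gesture whose template is closest to this window's features.
    feats = emg_features(window)
    return min(templates, key=lambda g: np.linalg.norm(feats - templates[g]))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Fake training data: two invented gestures, 16 channels, 200-sample windows.
    examples = {
        "pinch": [rng.normal(0.0, 1.0, (200, 16)) for _ in range(5)],
        "flick": [rng.normal(0.0, 3.0, (200, 16)) for _ in range(5)],
    }
    templates = train_templates(examples)
    test_window = rng.normal(0.0, 3.0, (200, 16))
    print(classify(test_window, templates))  # prints "flick" for this fake data

A real system would of course rely on far more sophisticated models, but the basic shape – per-user calibration followed by continuous decoding of intent – is what the researchers describe.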

This wrist wearable doesn’t have a name yet. It’s just a concept, and there are different versions of it, some of which include haptic feedback. Bosworth says it could take five to ten years for the technology to become widely available.
It all ties into Facebook’s plans for virtual and augmented reality, technologies that can sometimes leave the user feeling a distinct lack of agency when it comes to their hands. Put on a VR headset and your hands disappear completely. Pick up a pair of hand controllers and you can play games or grab virtual objects, but you lose the ability to take notes or draw with precision. Some AR or ‘mixed reality’ headsets, like Microsoft’s HoloLens, have cameras that detect spatial gestures, so you can make certain hand signals and the headset interprets them … which sometimes works. Facebook is therefore using this EMG wearable in its virtual reality lab to see whether such a device enables more precise hand-based interactions with computers.
But Facebook has visions for this wrist technology beyond AR and VR, Bosworth says. “If you really had access to an interface that allowed you to type or use a mouse – without physically typing or using a mouse – you could use it anywhere.” The keyboard is a great example, he says; this wrist computer is just another means of intentional input, except you can take it everywhere.
Bosworth also offers the microwave in your kitchen as a use case – while making clear that Facebook is not building a microwave. Household appliances all have different interfaces, so why not train such an appliance to understand you when you want to cook something at medium power for ten minutes?
In the virtual demo Facebook gave earlier this week, a gamer was shown wearing the wrist device and controlling a character in a rudimentary video game on a flat screen, without moving his fingers at all. These kinds of demonstrations tend to invite (forgive the pun) descriptions of the technology as mind reading, which Bosworth insisted it is not. In this case, he said, the brain generates signals identical to those that would make the thumb move, but the thumb does not move. The device captures an expression of intent to move the thumb. “We do not know what is happening in the brain, which is full of thoughts and ideas. We do not know what happens before someone sends a signal down the wire.”
Bosworth also stressed that this wrist wearable is different from the invasive implants used in a 2019 brain-computer interface study that Facebook worked on with the University of California, San Francisco; and it differs from Neuralink, Elon Musk’s wireless implant that would theoretically let people send neuroelectric signals directly to digital devices. In other words, Facebook is not reading our minds, even though it already knows a lot about what is going on in our heads.
According to researchers, there is still a lot of work to be done before EMG sensors can serve as virtual input devices. Precision is a big challenge. Chris Harrison, director of the Future Interfaces Group in the Human-Computer Interaction Institute at Carnegie Mellon University, points out that every person’s nerves are a little different, as are the shapes of our arms and wrists. “There is always a calibration process that needs to take place with any muscle-sensing system or BCI system. It really depends on where the computational intelligence is,” says Harrison.

And even with haptic feedback built into these devices, as Facebook has done with some of its prototypes, there is the risk of visuo-haptic mismatches, where the user’s visual experience, whether in AR, VR, or real space, doesn’t match the haptic response. These points of friction can make this human-computer interaction feel frustratingly unreal.
Even if Facebook can overcome these obstacles in its research labs, there is still the question of why Facebook – largely a software company – wants to own this new computing paradigm. And should we trust it – this extremely powerful technology company with a record of sharing user data in ‘exchange for other equally or more valuable things’, as Fred Vogelstein of WIRED wrote in 2018? A more recent report in MIT Technology Review highlighted how a Facebook team tasked with ‘responsible AI’ was undermined by the company’s relentless pursuit of growth.
Facebook executives said this week that these new human-computer interaction devices will do as much of the computing work on the device itself as possible, meaning the information isn’t shared to the cloud; but Bosworth wouldn’t commit to how much data might eventually be shared with Facebook or how that data would be used. The whole thing is a prototype, so there is nothing substantial to pick apart yet, he says.
“Sometimes these companies have piles of cash big enough to basically invest in these big R&D projects, and they will take losses on such things if it means they could be forerunners in the future,” says Michelle Richardson, director of the Data and Privacy Project at the nonprofit Center for Democracy and Technology. “But with companies of any size, any product, it is so difficult to retrofit it once it has been built. So anything that can start the conversation about this before the devices are built is a good thing.”
Bosworth says Facebook wants to lead this next paradigm shift in computing because the company sees technology like this as fundamental to connecting people. If anything, the past year has highlighted the importance of connection that feels like you’re there in person, Bosworth says. He also seems to believe he can earn the required trust by not “surprising” customers. “You say what you do, you set expectations, and you meet those expectations over time,” he says. “Trust arrives on foot and leaves on horseback.” Rose-colored AR glasses, activated.
This story originally appeared on wired.com.