Science & Technology

Zuckerberg Announces Meta Wearables That Read Brain Signals

Mark Zuckerberg hinted that the new neural technology Meta is currently developing will be “pretty wild,” able to interpret brain signals to control computers and other smart devices.

In an April 18 interview with YouTuber Roberto Nickson, Facebook co-founder Mark Zuckerberg revealed that Meta is making progress on its first “consumer neural interfaces” – non-invasive wearable devices that can “read neural signals that your brain sends through your nerves to your hand to basically move it in different subtle ways.”

The wristband uses electromyography (EMG) to interpret these signals and translate them into commands for controlling devices. The technology is somewhat reminiscent of Elon Musk’s Neuralink brain chip, but its key difference, one that could prove decisive for mass adoption, is its non-invasive nature: instead of “jacking” something into your brain, Meta proposes simply wearing a device on your wrist.
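The article does not explain how such decoding works under the hood. Purely as an illustration, and assuming a hypothetical eight-channel wristband, made-up gestures, and a toy nearest-centroid classifier (none of which reflect Meta’s actual, unpublished pipeline), a minimal Python sketch of the typical surface-EMG flow – sample a window, extract features, classify a gesture, emit a command – might look like this:

import numpy as np

# Hypothetical sketch only: Meta has not published its decoding pipeline.
# A surface-EMG wristband samples electrical activity from electrodes around
# the wrist; features are extracted from each time window; a trained
# classifier maps the window to a gesture, which becomes a device command.

N_CHANNELS = 8    # assumed electrode count around the wrist
WINDOW = 200      # samples per decoding window (e.g., 100 ms at 2 kHz)

GESTURE_TO_COMMAND = {"pinch": "select", "swipe": "scroll", "rest": "idle"}

def extract_features(window):
    # Root-mean-square amplitude per channel, a common simple EMG feature.
    return np.sqrt(np.mean(window ** 2, axis=0))

class NearestCentroidDecoder:
    # Toy stand-in for a trained gesture classifier.
    def __init__(self):
        self.centroids = {}

    def fit(self, examples):
        # examples: gesture label -> array of shape (n_windows, WINDOW, N_CHANNELS)
        for label, windows in examples.items():
            feats = np.stack([extract_features(w) for w in windows])
            self.centroids[label] = feats.mean(axis=0)

    def predict(self, window):
        feats = extract_features(window)
        return min(self.centroids,
                   key=lambda g: np.linalg.norm(feats - self.centroids[g]))

# Synthetic data stands in for real recordings.
rng = np.random.default_rng(0)
train = {g: rng.normal(mu, 0.1, (20, WINDOW, N_CHANNELS))
         for g, mu in [("pinch", 0.8), ("swipe", 0.4), ("rest", 0.05)]}
decoder = NearestCentroidDecoder()
decoder.fit(train)

live = rng.normal(0.8, 0.1, (WINDOW, N_CHANNELS))   # simulated "pinch" window
print(GESTURE_TO_COMMAND[decoder.predict(live)])    # -> "select"

In a real product, the classifier would be a model trained on large amounts of recorded EMG data rather than a handful of synthetic centroids, but the window-to-command structure stays the same.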

Meta has been testing and studying this “wrist-based interaction” since 2021 within Facebook Reality Labs Research. In practical terms, the interpreted neural signals could be used to seamlessly control computers, smart glasses, VR or AR headsets, and other devices.

According to Zuckerberg, the new technology is still in its infancy. The firm has not yet developed a prototype for public presentation, but internal testing is well underway and bringing exciting results.

Earlier this year, the Meta CEO also commented that turning this neural wristband into a consumer product might take just a few years. He hinted that the company is also experimenting with artificial intelligence to overcome the limitations of camera-based gesture tracking.

One of the first use cases may be connecting neural wristbands to Meta’s Ray-Ban smart glasses, into which the firm is currently integrating AI. “We’re really close to having multi-modal AI […] so you don’t just ask it a question with text or voice; you can ask it about things going on around you, and it can see what’s going on and answer questions […] that’s pretty wild,” explained Zuckerberg.

Meta’s active research and development in AI has also resulted in an updated version of the Meta AI virtual assistant, which operates across the firm’s applications and glasses and is now powered by the new Llama 3 model. Meta Chief Product Officer Chris Cox said that Llama 3 leads the industry on several LLM benchmarks.

Nina Bobro

Nina is passionate about financial technologies and environmental issues, reporting on industry news and the most exciting projects built around the intersection of fintech and sustainability.