There is a neural interface at CES that reads signals from your brain
Imagine controlling a computer or playing a game without using your fingers, voice or eyes. It sounds like science fiction, but it gets a little more real every day thanks to a handful of companies making technologies that detect neural activity and convert those measurements into signals that computers can read.
One such company, NextMind, has been offering its version of mind-reading technology to developers for over a year. First unveiled at CES in Las Vegas, the company’s neural interface is a black circle that can read brain waves when attached to the back of a user’s head. The device isn’t quite ready for prime time just yet, but it’s destined to make its way into consumer goods sooner rather than later.
Neural interfaces are already here
Neural interfaces have the potential to support a wide range of activities in many contexts. A company called Mudra, for example, has developed a band for the Apple Watch that lets users interact with the device simply by moving their fingers – or even just intending to move them. Someone wearing the band can browse music or take calls without interrupting whatever they're doing. It also opens up huge opportunities for making technology accessible to people with disabilities who struggle with other user interfaces.
NextMind has bet on virtual reality and augmented reality. Tech reporter Scott Stein paired the device, which sells for $399, with an Oculus Quest. He said the experience of playing a game in which he blew up aliens' heads using only his thoughts was "hard, but also fascinating." NextMind indicates what a user can select by gently flashing sections of the screen. The user makes a selection by "clicking" on one of these flashing sections – and here, clicking means focusing on it, sometimes for a while.
Stein described NextMind as an imperfect form of input, but he also said the device was good enough to determine which section of the screen he was trying to select. "On a field of about five flashing 'buttons' on the screen, it really knew what I was looking at," he said.
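NextMind hasn't published its algorithm, but the flashing-buttons design resembles a well-known research technique: each button flickers at its own frequency, and the visual-cortex signal of a user staring at one button carries that button's flicker rate, which can be picked out by spectral analysis. Here is a minimal, purely illustrative sketch of that idea using synthetic data – the sample rate, flicker frequencies, and noise level are all made up for the example, not NextMind's actual parameters.

```python
import numpy as np

def detect_focus(signal, fs, button_freqs):
    """Return the index of the flicker frequency with the most spectral
    power in the signal (a toy classifier for flashing-button selection)."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    powers = []
    for f in button_freqs:
        # Sum the power in a narrow band around each button's flicker rate
        band = (freqs > f - 0.5) & (freqs < f + 0.5)
        powers.append(spectrum[band].sum())
    return int(np.argmax(powers))

fs = 250                       # hypothetical sample rate, in Hz
t = np.arange(0, 4, 1.0 / fs)  # four seconds of data
rng = np.random.default_rng(0)
# Synthetic "brain signal": the user attends the 12 Hz button, plus noise
eeg = np.sin(2 * np.pi * 12 * t) + 0.8 * rng.standard_normal(t.size)

buttons = [8.0, 10.0, 12.0, 15.0, 20.0]  # five flashing "buttons"
print(detect_focus(eeg, fs, buttons))    # prints 2, the 12 Hz button's index
```

Real devices face much noisier signals and need calibration per user, which is why Stein reports that selections sometimes required focusing "for a while."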
Like it or not, machines that can literally read the human brain are coming to consumer electronics. What's the worst that could happen?