MIT's AlterEgo Headset Can Read Your Thoughts Using Internal Verbalisation

The holy grail of brain-computer interfaces (BCIs) is the ability to read a user's thoughts, opening up a host of applications in which the mind directly controls devices and information. A team of researchers at the Massachusetts Institute of Technology's (MIT) Media Lab has taken a step along that path, unveiling a prototype headset that can pick up subvocalisation - the natural process of internal verbalisation, or inner speech, usually performed while reading to aid cognition and characterised by minuscule movements in the larynx and other muscles associated with speech. Called AlterEgo, the wearable uses electrodes to detect neuromuscular signals in the jaw and face, transcribing words the user does not speak aloud but reads or otherwise subvocalises.

The prototype AlterEgo headset, developed by the team at MIT's Media Lab led by Arnav Kapur, uses machine learning (ML) to correlate neuromuscular signals with the words it has been trained on, so that it can identify those words reliably later. After training, the ML system had an average transcription accuracy of about 92 percent, a figure Kapur says will improve with further training. Also described as a wearable silent-speech interface, the AlterEgo headset can use as few as four electrodes on either side of the mouth and jaw to consistently pick up the neuromuscular signals needed to distinguish subvocalised words. The electrodes are paired with bone-conduction headphones that convey information to the user through vibrations, without "interrupting conversation or otherwise interfering with the user's auditory experience".
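MIT has not published AlterEgo's model code, but the broad idea - mapping features of multi-channel muscle signals to a small vocabulary with a trained classifier - can be illustrated with a minimal, entirely synthetic sketch. Apart from the four-channel figure above, everything here (the signal shapes, features, and classifier choice) is an assumption for illustration, not the team's actual pipeline:

```python
# Hypothetical sketch: classifying subvocalised words from multi-channel
# neuromuscular (EMG-like) signals. The data is synthetic and the feature
# and classifier choices are assumptions, not MIT's actual system.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

RNG = np.random.default_rng(0)
VOCAB = ["up", "down", "left", "right", "select"]  # small, task-specific vocabulary
N_CHANNELS = 4   # the headset can work with as few as four electrodes
WINDOW = 250     # samples per word utterance (assumed)

def synthetic_word_signal(word_idx: int) -> np.ndarray:
    """Fake a word-specific muscle-activation pattern plus noise."""
    t = np.linspace(0, 1, WINDOW)
    base = np.sin(2 * np.pi * (word_idx + 1) * t)   # word-dependent shape
    channels = base[None, :] * RNG.uniform(0.5, 1.5, (N_CHANNELS, 1))
    return channels + RNG.normal(0, 0.3, channels.shape)

def features(signal: np.ndarray) -> np.ndarray:
    """Simple per-channel summaries: mean absolute value, RMS, zero crossings."""
    mav = np.mean(np.abs(signal), axis=1)
    rms = np.sqrt(np.mean(signal ** 2, axis=1))
    zc = np.mean(np.diff(np.sign(signal), axis=1) != 0, axis=1)
    return np.concatenate([mav, rms, zc])

# Build a labelled training set: many repetitions of each word.
X = np.array([features(synthetic_word_signal(i)) for i in range(len(VOCAB)) for _ in range(80)])
y = np.array([i for i in range(len(VOCAB)) for _ in range(80)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC())
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2%}")
```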

The team has so far used the AlterEgo headset to assist with several tasks, such as silently reporting an opponent's chess moves and receiving computer-recommended responses, or supplying the answers to large addition or multiplication problems (a flow sketched below). This makes it a sort of intelligence-augmentation device, unobtrusively providing answers to computationally demanding problems. Other applications include controlling interfaces, with subvocalised commands such as "up", "down", or "select" being picked up.
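As a toy illustration of the arithmetic-assistant flow, here is a hedged sketch in which recognised word tokens are evaluated left to right and the result would then be read back over the bone-conduction channel. The token format and evaluation scheme are assumptions for illustration, not the real application:

```python
# Hypothetical arithmetic-assistant flow: the user subvocalises an expression
# word by word; the device evaluates it and would speak the answer back.
# The word-token format is an assumption, not AlterEgo's actual protocol.
import operator

OPS = {"plus": operator.add, "times": operator.mul}
DIGITS = {w: i for i, w in enumerate(
    ["zero", "one", "two", "three", "four",
     "five", "six", "seven", "eight", "nine"])}

def evaluate(tokens: list[str]) -> int:
    """Left-to-right evaluation of e.g. ['seven', 'times', 'eight']."""
    result = DIGITS[tokens[0]]
    for op_word, num_word in zip(tokens[1::2], tokens[2::2]):
        result = OPS[op_word](result, DIGITS[num_word])
    return result

print(evaluate(["seven", "times", "eight"]))                 # -> 56
print(evaluate(["nine", "plus", "three", "times", "two"]))   # left-to-right -> 24
```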

Current training models target tasks with limited vocabularies of about 20 words each. In their usability study with the AlterEgo headset, the team had 10 subjects spend about 15 minutes each customising the arithmetic application to their own neurophysiology, then another 90 minutes using it to execute computations.
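That per-user calibration step might look something like the following sketch. Here `record_utterance` is a hypothetical stand-in for the headset's signal capture, and the nearest-neighbour classifier is an assumption rather than the team's published method:

```python
# Hypothetical per-user calibration, mirroring the study's protocol: collect a
# few labelled repetitions of each vocabulary word from a new wearer, then fit
# a classifier on that user's examples alone. record_utterance is a stand-in
# for the headset's signal capture; the kNN classifier is an assumption.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def calibrate(vocab, record_utterance, reps_per_word=20):
    """Fit a fresh classifier on one user's recorded feature vectors."""
    X, y = [], []
    for label, word in enumerate(vocab):
        for _ in range(reps_per_word):
            X.append(record_utterance(word))  # prompt the wearer to subvocalise `word`
            y.append(label)
    clf = KNeighborsClassifier(n_neighbors=5)
    clf.fit(np.asarray(X), np.asarray(y))
    return clf

# Demo with a fake recorder that returns random 12-dimensional feature vectors:
rng = np.random.default_rng(1)
clf = calibrate(["up", "down", "select"], lambda word: rng.normal(size=12), reps_per_word=5)
```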
"The motivation for this was to build an IA device - an intelligence-augmentation device," said Kapur, a graduate student at the MIT Media Lab. "Our idea was: Could we have a computing platform that's more internal, that melds human and machine in some ways and that feels like an internal extension of our own cognition?"

"We basically can't live without our cellphones, our digital devices," added Pattie Maes, a professor of media arts and sciences and Kapur's thesis advisor. "But at the moment, the use of those devices is very disruptive. If I want to look something up that's relevant to a conversation I'm having, I have to find my phone and type in the passcode and open an app and type in some search keyword, and the whole thing requires that I completely shift attention from my environment and the people that I'm with to the phone itself. So, my students and I have for a very long time been experimenting with new form factors and new types of experience that enable people to still benefit from all the wonderful knowledge and services that these devices give us, but do it in a way that lets them remain in the present." (Via Gadget360)
