By Derrick Harris
September 24, 2015

You know the TV-show trope where someone is able to read other people’s minds just by touching them? Well, scientists have devised a way to let people actually read minds—in a sense—and they can do it even when they’re nowhere near each other.

Research out of the University of Washington, published Wednesday in the journal PLOS ONE, shows how people sitting nearly a mile apart were able to play a question-and-answer game using only their minds. While the study is not proof that people will soon be carrying on long, non-verbal conversations with each other, its underlying technology could prove valuable in burgeoning fields such as virtual reality and real-time language translation, or even in managing disorders such as ADHD.

The way it works, essentially, is that each person wears a cap that reads their brain waves, which are measured by an electroencephalography, or EEG, machine. One person asks the other a yes-or-no question via computer, which the other answers by focusing on the “yes” or “no” section of his computer monitor. The answer travels back to the questioner via transcranial magnetic stimulation: a “yes” delivers a pulse strong enough to stimulate the visual cortex and make the recipient see a flash of light, while a “no” does not.
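To make that signal chain concrete, here is a minimal sketch of one question-and-answer round in Python. It is an illustration of the protocol as described above, not the researchers’ actual software: the function names and the 90% reliability figure are assumptions.

```python
import random

def decode_eeg_choice(respondent_answer: bool, reliability: float = 0.9) -> bool:
    """Stand-in for the EEG step: the respondent focuses on the 'yes' or 'no'
    region of the screen, and the decoder recovers that choice with some
    assumed reliability (a hypothetical 90% here)."""
    return respondent_answer if random.random() < reliability else not respondent_answer

def trigger_tms(decoded_answer: bool) -> bool:
    """Stand-in for the stimulation step: only a decoded 'yes' fires a pulse
    strong enough to make the questioner perceive a flash of light."""
    return decoded_answer  # True -> flash perceived, False -> no flash

def one_round(respondent_answer: bool) -> bool:
    """One exchange: EEG decode, then TMS delivery.
    The questioner interprets 'flash seen' as 'yes'."""
    return trigger_tms(decode_eeg_choice(respondent_answer))

# Example: the respondent's true answer is 'yes'.
print("questioner saw a flash:", one_round(True))
```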

In this case, the questions and answers were pretty limited. The person asking the questions was shown a list of eight objects, one of which appeared on screen in front of the person answering. After asking three questions from a predetermined list and receiving (or not receiving) the stimuli, the questioner had to guess which object the respondent was looking at. Questioners guessed correctly 72% of the time, although the paper notes that accuracy would have exceeded 90% had respondents not occasionally answered incorrectly, and had questioners not occasionally misread the stimuli they received.
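Those numbers line up with some simple arithmetic: three yes-or-no questions carry just enough information to pick out one of eight objects (two possibilities per question, so 2 × 2 × 2 = 8), but an error in any single round corrupts the final guess. The back-of-envelope simulation below (an idealized model with an assumed flat 90% per-round reliability, not the paper’s analysis) shows how those errors compound to roughly the reported success rate, since 0.9 × 0.9 × 0.9 ≈ 73%.

```python
import random

N_OBJECTS = 8            # 2**3 objects, one per pattern of three yes/no answers
PER_ROUND_ACC = 0.9      # assumed chance each yes/no round is transmitted correctly
TRIALS = 100_000

def play_game() -> bool:
    """Simulate one game: the target object is encoded as three bits,
    each transmitted with some chance of being flipped by an error."""
    target = random.randrange(N_OBJECTS)
    received_bits = []
    for bit in range(3):
        true_answer = (target >> bit) & 1
        # A round fails if the respondent errs or the questioner misreads the flash.
        received = true_answer if random.random() < PER_ROUND_ACC else 1 - true_answer
        received_bits.append(received)
    guess = sum(b << i for i, b in enumerate(received_bits))
    return guess == target

wins = sum(play_game() for _ in range(TRIALS))
print(f"simulated success rate: {wins / TRIALS:.1%}")  # about 0.9**3 = 72.9%
```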

[Diagram: how the experiment was set up. Courtesy of PLOS ONE / University of Washington]

While any controlled experiment has factors that limit real-world viability—in this case, for example, the physical setup (specialized caps, EEG machines and stimulus servers on each end) and the limited yes-no interactions—it’s not too difficult to envision some powerful applications for this type of technology in situations where mere words wouldn’t work. The PLOS ONE paper suggests improved communications for people who cannot speak, or between people who don’t speak the same language.

In a University of Washington news story on the experiment, one of the researchers, Chantel Prat, suggests this type of technology could transmit brain states from person to person. For example, she said, a student without ADHD might be able to help a student with ADHD concentrate by sending signals that stimulate the appropriate areas of that student’s brain. The same capability might apply to a broad range of other disorders whose symptoms we would like to keep in check in certain situations.

However, pulling off these ideas at any significant scale would likely require some major investments in both the gear involved (everything would probably have to fit in a small, comfortable package, for example) and in the methods for actually sending the proper signals between devices. So maybe a realistic first application would be something like virtual reality, where companies such as Facebook (FB) and Microsoft (MSFT) are already investing a lot of money, including on headset technology that’s both functional and feasible to wear.

The prospect of letting headset-wearers read each other’s minds might be the kind of game-changing technology that convinces these companies to fund its development across the finish line.
