The Neuroscience of Mind-Control Gaming

In late 2016, Boston-based startup Neurable, the brainchild of neuroscientist Ramses Alcaide and a collective of fellow alumni from the University of Michigan, received a $2 million investment to develop software that makes it possible for users to control virtual and augmented reality with their minds. In other words, a “brain mouse” connecting your intentions to a computational device. Think right, go right. Think left, go left.

Neurable’s first project, the short video game demo “Awakening,” is a proof of concept for this thought experiment. “Awakening” is played using a VR headset fitted with an internal cap of electrodes that comb through your hair and down to the scalp, reading the electrical activity occurring at the level of your cerebral cortex – the top layer of your brain. In regular bursts, patterns of electrical activity emerge. Neurons firing in wave-like unison are detectable through electroencephalography (EEG), a non-invasive method of measuring their voltage fluctuations.
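To make that concrete, here is a toy Python sketch – entirely illustrative, not Neurable’s code – of how rhythmic, synchronized firing shows up as a clear peak in the frequency spectrum of a noisy voltage trace:

```python
# A toy illustration (not Neurable's code): rhythmic, synchronized
# neural firing shows up as a peak in the EEG voltage spectrum.
import numpy as np

fs = 256                       # sampling rate in Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)  # two seconds of samples

# Synthetic "EEG": a 10 Hz alpha-like oscillation buried in noise.
rng = np.random.default_rng(42)
eeg = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)

# The Fourier transform reveals the rhythm as a spike near 10 Hz.
spectrum = np.abs(np.fft.rfft(eeg))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
print(f"Dominant frequency: {freqs[spectrum.argmax()]:.1f} Hz")
```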

“What we do is look for discrete brainwaves that we then leverage for control,” says Neurable’s Adam Molnar.

Imagine three items in front of you. As you focus your attention on a toy block, it begins to pulse with light. As it does, your brain subconsciously registers its particular pattern of flashes, and certain neurons begin to “fire” in response. Neurable’s software processes this noisy EEG data, finds the signal within it, and translates it into a game command: That’s the one I want.
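That selection step can be sketched in a few lines of Python. What follows is a generic illustration of evoked-response decoding, not Neurable’s proprietary pipeline; the function name, the data shapes, and the numbers are all assumptions made for the example:

```python
# A schematic sketch of the selection step (not Neurable's actual
# pipeline): average the EEG snippets time-locked to each object's
# flashes, then pick the object whose flashes evoked the strongest
# response. Shapes and values here are illustrative assumptions.
import numpy as np

def pick_target(epochs_per_object):
    """epochs_per_object: dict mapping object name -> array of shape
    (n_flashes, n_samples), one EEG snippet per flash of that object."""
    scores = {}
    for name, epochs in epochs_per_object.items():
        evoked = epochs.mean(axis=0)         # averaging suppresses noise
        scores[name] = np.abs(evoked).max()  # strength of evoked response
    return max(scores, key=scores.get)       # "That's the one I want."

rng = np.random.default_rng(0)
noise = lambda: rng.normal(0, 1, (20, 128))
epochs = {"block": noise() + 0.8,  # attended object: stronger response
          "balloon dog": noise(),
          "ball": noise()}
print(pick_target(epochs))  # expected: "block"
```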

It isn’t science fiction, although it’s fitting that you play as a character who wouldn’t be out of place in one.

“Awakening” starts with a warning over the speaker system: “Wake up. This is not a test.” You play as a psychokinetically gifted child and government prisoner – think Eleven from “Stranger Things.” You’re locked inside an unassuming examination room with a few toys scattered nearby: a block, a balloon dog, and a ball. Each lights up in a pulsating beam, rising into the air and slowly rotating until you turn your focus to the next. Hanging from the opposite wall is a mirror which, if smashed, will reveal a hidden keypad and a way out. The aim is to escape the room. The trick? No hands, only thoughts.

Neurable’s mind-reading technology is better thought of as a simple brain-reading machine. Its VR headset, outfitted with six bulky electrodes that detect the wearer’s basic intentions, appears just a little more alien than your standard-issue HTC Vive. The signal that emerges from such a brain-computer interface – a BCI for short – can’t tell if you’re hungry or if you’re visualizing your bank PIN, but it can capture the user’s intent from a constrained set of options.

“The interaction method works like a binary click,” says Molnar. “So either ‘yes’ or neutral, very much like a computer mouse.” But an EEG isn’t necessarily the most robust technology to work with.
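In code, that interaction model is nothing exotic. Here is a minimal sketch, with an invented confidence threshold (Neurable’s actual decision rule isn’t public):

```python
# A minimal sketch of the "binary click" idea: a decoder's confidence
# score is mapped to either a "select" event or nothing at all. The
# threshold value is an illustrative assumption, not Neurable's.
def to_click(score: float, threshold: float = 0.9) -> str:
    # Only fire when the decoder is confident; otherwise stay neutral,
    # just as a mouse does nothing until you actually press the button.
    return "select" if score >= threshold else "neutral"

print(to_click(0.95))  # select
print(to_click(0.40))  # neutral
```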

“There are many difficulties working with EEG,” Molnar continues. “It is difficult to make sense of brain activity that propagates from the neurons speaking to each other, through the skull, through one’s scalp, and just barely into the EEG sensor. Generally, EEG data is very noisy, in the sense that it is very hard to get a clear signal for what you are looking for. Anything from a smile to an eye-blink (called muscle artifacts) can mess with EEG signals that are already difficult to pick up.”
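One common, generic defense against such muscle artifacts – and a step many EEG pipelines employ, though whether Neurable does is not public – is simple amplitude-based epoch rejection: blinks and jaw clenches dwarf cortical signals, so any snippet that swings too large is thrown away. The 100 µV cutoff below is a conventional rule of thumb, not Neurable’s setting:

```python
# Generic artifact rejection sketch: blinks and jaw clenches are far
# larger than cortical signals, so epochs whose peak amplitude exceeds
# a threshold are simply discarded. 100 uV is a common rule of thumb.
import numpy as np

def reject_artifacts(epochs, max_uv=100.0):
    """epochs: array of shape (n_epochs, n_samples), in microvolts.
    Returns only the epochs free of large-amplitude artifacts."""
    peak = np.abs(epochs).max(axis=1)
    return epochs[peak < max_uv]
```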

It’s for this reason that most laboratories use “wet electrodes,” or electrodes dipped in a conductive gel, to enhance conductivity and pick up clearer EEG signals. However, Neurable has been experimenting with various methods of improving the process while using dry electrodes, which, unlike their gel counterparts, can be used more easily outside of a lab setting.

“Neurable has developed proprietary algorithms that deal with signal processing, which allow us to understand brain activity with less data, more reliably,” Molnar continues. “Because of this, we can use fewer sensors that are dry, so without the laborious task of applying conductive gel, while allowing low to moderate levels of physical activity, i.e. walking, moving your head, something that is quite difficult for most other EEG applications.”

When the company first began, it was using a cap with 32 wet electrodes to do the job, in a process that required 30 minutes of calibration – a regimen of rehearsed mental exercises that lets the AI program classify your brain patterns and generate the desired movements. More recently, Neurable has reduced this to six dry EEG sensors, which require a calibration time of just two minutes.
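What might such a calibration look like in code? Here is a hedged sketch using a standard linear classifier on synthetic labeled epochs; Neurable’s algorithms are proprietary, so every detail below is an assumption made for illustration:

```python
# A generic calibration sketch (not Neurable's method): the user
# rehearses known targets, a classifier is fit on the labeled epochs,
# and that classifier later decodes intent from fresh EEG data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
# Fake calibration data: 60 labeled epochs, each reduced to 6 sensor
# features; label 1 = "attended flash", label 0 = "ignored flash".
X = rng.normal(0, 1, (60, 6))
y = rng.integers(0, 2, 60)
X[y == 1] += 1.0  # attended epochs carry a stronger evoked response

clf = LinearDiscriminantAnalysis().fit(X, y)  # the "calibration"

# After calibration, new epochs can be decoded in real time.
new_epoch = rng.normal(0, 1, (1, 6)) + 1.0
print(clf.predict_proba(new_epoch)[0, 1])  # confidence it's a "yes"
```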

Neurable isn’t the only team testing the waters of BCI. In the past few years, a number of tech industry stalwarts have turned their attention toward connecting minds and machines. In 2017, Facebook announced that a team of 60 engineers was working on keyboard-less typing, via a device that scans your brain to detect your internal dialogue and translate it to text. Likewise, it was just this September that Elon Musk, under a ganja fog, told podcast host Joe Rogan of his plans to merge AI with the brain.

“How much smarter are you with a phone or computer or without? You’re vastly smarter, actually,” Musk said at the time. “You can answer any question pretty much instantly. You can remember flawlessly. Your phone can remember videos [and] pictures perfectly. Your phone is already an extension of you. You’re already a cyborg. Most people don’t realize you’re already a cyborg. It’s just that the data rate … it’s slow, very slow. It’s like a tiny straw of information flow between your biological self and your digital self. We need to make that tiny straw like a giant river, a huge, high-bandwidth interface.”

But in spite of all this hopeful futurism, BCI is likely still a few decades away from bringing our cyborg fantasies to life. “BCI needs to be integrated into your body for you to become a cyborg,” says Dr. Heather Read, a neuroscientist and professor in Psychological Sciences and Biomedical Engineering at the University of Connecticut. By comparison, EEG-based BCI is more like riding a bicycle: an extension of ourselves that propels us faster than we could run. “No one thinks twice about using a bicycle as an extension of your own body to move forward faster.”

It may be surprising that the technology necessary for bridging the wall between mind and machine has existed in some form since the era of rotary-dial telephones and computers the size of living rooms. The term “brain-computer interface” was first coined in 1973 by Jacques Vidal, Emeritus Professor and founding member of UCLA’s Computer Science department, who posited that a direct communication pathway could be created between the brain and an external device.

“EEG has been around for decades,” says Stanford neuroscientist Sharlene Nicole Flesher. “I honestly don’t expect to see any improvement to things like EEG tech in the next 10 or even 20 years. While the advent of machine learning and AI can sort of help, the signals available from EEG are probably at or near their peak performance right now.”

Since 2017, Dr. Read has served as director of the school’s Brain Computer Interface Core, where, in collaboration with MIT, her team is developing a technology that can read out and modulate human brain activity with timing precision 100 times faster than human reaction time. Unlike Dr. Flesher, she maintains that the ability to read out brain and behavioral activity will get faster and more accurate over the next two decades.

“This timing precision will make possible a whole new area of real-time neuromodulation to augment learning while using VR/AR and could be used to provide neuromodulation therapy,” she says. “Combined with high-speed wireless interfaces, this will help make BCI vastly more fluid, effective and fun.”

But for both Flesher and Read, the true future of brain-computer interfaces lies inside the head, in devices implanted directly into the brain.

“The technical improvements I expect to see [in the future] are in electrode design, which would make the idea of having something in your brain less crazy,” says Flesher. Indeed, the idea of putting something into your brain for fun does seem like a big ask. Invasive brain surgery remains the domain of the medical community in 2018, and modern BCI implants are designed to help restore lost critical functions. But according to Flesher, invasive brain implants are eventually likely to be adopted by the entertainment industry. “The idea of implanted sensors for things like gaming/convenience is maybe on the scale of 40-50 years away,” she says.

According to Read, we might not even have to wait that long for our telekinetic cyborg powers.

“With enough vision and funding in 20-30 years, when the benefits outweigh the risks, it will be feasible to have sensors surgically implanted inside the brain and body allowing you to ‘telekinetically’ control devices and cars and other machines.”
