Rise of the mind-reading machines
22/05/2017

Mind reading machines: What will communication look like in 20 years? (Credit: vitstudio/Depositphotos)

So you made your way to this article, but how did you do it? Did your motor cortex fire up the muscle fibers in your fingers to click on a particular area of the screen, prompting the CPU inside your device to load up this page? One day that could all seem decidedly archaic. That's because some smart people are investing big time and money into computers that can read your thoughts as they are conceived. The goal is to have machines that know what you want and will give you the information you need before you could literally lift a finger. But how far off might such a future be? Let's take a look at the current state of these brain-computer interfaces, and the challenges that remain in getting them inside our heads.

Brain-computer interfaces (BCIs) have actually been in the works for decades, but sometimes it takes a billionaire who likes landing rockets on floating pads in the ocean to make an audacious technology actually seem possible. Elon Musk generated quite a buzz when he revealed that he was working on such a thing (more on that later), but in fact, the basis for these mind-reading machines has its roots in neuroscience research from almost a century ago.

In 1924, German psychiatrist Hans Berger made the first ever EEG (electroencephalogram) recordings during neurosurgery on a 17-year-old boy. What Berger later described as "alpha and beta" waves would soon be recognized as electrical activity that was, and still is, of huge assistance to physicians working to detect brain disorders.

By attaching electrodes to the scalp and having the measured brainwaves appear onscreen as a graph, physicians can look out for abnormalities and gain insights into the health of the brain. Rapid spikes might be indicative of epilepsy or seizures, for example, while slower waves may be the result of a tumor or stroke. Alzheimer's, narcolepsy and brain damage are other examples of conditions that can be surveyed by EEG.
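To make the idea concrete, here is a minimal Python sketch of the sort of band-power calculation that underlies this kind of EEG review: it estimates power in Berger's alpha (8–12 Hz) and beta (13–30 Hz) bands for a single channel. The 256 Hz sampling rate and the random stand-in signal are assumptions for illustration only; nothing here is a clinical tool.

```python
# Minimal sketch: alpha/beta band power from one EEG channel.
# Assumes a 1-D signal sampled at 256 Hz; the data below is random noise
# standing in for a real recording.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz


def band_power(signal, low, high, fs=FS):
    """Integrate spectral power between low and high Hz (Welch's method)."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= low) & (freqs <= high)
    return np.trapz(psd[mask], freqs[mask])


rng = np.random.default_rng(0)
eeg = rng.normal(size=FS * 10)  # 10 seconds of stand-in "EEG"

alpha = band_power(eeg, 8, 12)   # Berger's alpha rhythm
beta = band_power(eeg, 13, 30)   # beta rhythm
print(f"alpha power: {alpha:.4f}, beta power: {beta:.4f}")
```

Real EEG review works across many channels, with careful artifact handling, but the band-power idea is the same.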

Toward direct brain-computer communication

In the 1970s, an electrical engineer from Belgium named Jacques Vidal started to wonder whether these electrical signals could be used for applications beyond the medical realm. His 1973 peer-reviewed paper, "Toward direct brain-computer communication," was the first to describe a brain-computer interface (he is now credited with coining the term), and it explored the feasibility of pulling electrical signals from the brain and converting them into commands for a computer.

"Can these observable electric brain signals be put to work as carriers of information in man-computer communication or for the purpose of controlling such external apparatus as prosthetic devices or spaceships?" wrote the retired air force lieutenant. "Even on the sole basis of the present states-of-the-art of computer science and neurophysiology, one may suggest that such a feat is potentially around the corner."

Prosthetics

That corner may have taken a little longer to round than Vidal guessed, but his ideas on how BCIs could be used are proving quite prescient.

At the 2014 FIFA World Cup in Brazil, an international collaboration of scientists making up The Walk Again Project demonstrated their latest advance in assisted mobility technology: a brain-controlled exoskeleton. Using a set of non-invasive electrodes to read brain signals and relay commands to the lightweight exoskeleton, a paraplegic man completed the symbolic kick-off for the tournament.

We have also seen scientists progress toward mobility solutions by drawing data from non-invasive EEG devices to reconstruct 3D hand and leg movements, enable a paraplegic to walk again using his own paralyzed limbs and allow a quadriplegic woman to eat chocolate with a mind-controlled robotic arm.

Taking flight

And spaceships? Alright, we're not there yet, but NASA is exploring the possibilities. In 2013, the space agency teamed up with scientists from the University of Essex on a project in which two subjects controlled a virtual spaceship using BCIs. The study was designed to explore the potential of using BCIs to control planetary rovers, though that kind of thing remains a long way off.

In the meantime, drones aren't a bad compromise, right? Unmanned aircraft have become quite a popular testbed for BCI technologies. We have seen mind-controlled quadcopters and fixed-wing drones, with some even adding a competitive flavor to the mix to really nudge things along.

In April last year, neuroscientists at the University of Florida held the first Brain Drone Race, an event that asks pilots to will their drones across the finish line using only their thoughts. The technology involved here takes brain signals collected by EEG devices and converts them into control inputs for drones. So rather than pushing left on a joystick, you only have to think about pushing left.
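In outline, that control loop is simple: extract features from the EEG, classify them as an intent, and translate the intent into a flight command. The Python sketch below shows the shape of such a pipeline; the toy linear classifier, the 16-element feature vector and the send_command() placeholder are all assumptions, not the race organizers' actual software. A real racer would use a trained decoder and the drone maker's SDK.

```python
# Minimal sketch of an EEG-to-drone control loop.
# The classifier weights and send_command() transport are hypothetical
# placeholders; a real system would use a trained decoder and a vendor SDK.
import numpy as np

COMMANDS = {0: "hover", 1: "left", 2: "right", 3: "forward"}


def classify_intent(features, weights, bias):
    """Toy linear classifier: map an EEG feature vector to one of four commands."""
    scores = weights @ features + bias
    return int(np.argmax(scores))


def send_command(command: str) -> None:
    # Placeholder: a real implementation would call the drone's SDK here.
    print(f"drone <- {command}")


rng = np.random.default_rng(1)
weights = rng.normal(size=(4, 16))  # 4 commands, 16 EEG features (assumed)
bias = np.zeros(4)

for _ in range(3):
    features = rng.normal(size=16)  # stands in for band-power features
    send_command(COMMANDS[classify_intent(features, weights, bias)])
```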

But more than a purely competitive spectacle, the Brain Drone Race was an attempt to inspire further developments in the BCI area, with a view to one day using the devices in everyday life. And the scientist behind the event, Juan Gilbert – chair of computer and information science and engineering at the University of Florida – tells us that they are making some good progress.

"We are planning Brain-Drone Race II in a couple of weeks and we have started some projects," he tells New Atlas. "We have a project called Brainwords where we are trying to use the BCI as an authentication device; imagine using your thoughts as your passwords. We have a project sponsored by Lenovo to play the drums with your thoughts. We are also working on the design of a new BCI that is easier to use by the general population. We have a project on building tools that make the BCI easier to use for app development and we are doing research on the BCI for monitoring your brain activity, or what's called quantified-self."

As it stands, non-invasive BCIs like EEG caps need to read the electrical signals through layers of skull and tissue, so there is a lot of noise to sort through, which does limit their use. For the clearest signals and truly game-changing potential, you need to get closer to the source.
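A first line of defense against that noise is basic filtering before any decoding is attempted. The Python sketch below shows a typical clean-up pass; the 256 Hz sampling rate, the 50 Hz mains frequency and the 1–40 Hz band of interest are all assumed for illustration.

```python
# Minimal sketch of cleaning up a noisy scalp EEG channel:
# a notch filter for mains hum, then a band-pass for the range of interest.
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

FS = 256  # assumed sampling rate in Hz


def clean_scalp_eeg(signal, fs=FS):
    """Remove 50 Hz mains interference, then keep roughly 1-40 Hz."""
    b_notch, a_notch = iirnotch(w0=50, Q=30, fs=fs)
    signal = filtfilt(b_notch, a_notch, signal)
    b_band, a_band = butter(4, [1, 40], btype="bandpass", fs=fs)
    return filtfilt(b_band, a_band, signal)


rng = np.random.default_rng(4)
noisy = rng.normal(size=FS * 5)  # 5 seconds of stand-in data
noisy += 0.5 * np.sin(2 * np.pi * 50 * np.arange(FS * 5) / FS)  # fake mains hum
cleaned = clean_scalp_eeg(noisy)
print(cleaned.shape)
```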

Insane in the membrane

They require surgery and carry a risk of infection, but BCIs that are implanted inside the head, in direct contact with the surface of the brain, offer the best signal quality. And this approach has allowed scientists to do some truly remarkable things.

Dr Ali Rezai (left) with his patient Ian Burkhart

Back in 2014, Dr Ali Rezai, the director of Ohio State University's Center for Neuromodulation, implanted a tiny 4 x 4 mm microchip on the surface of Ian Burkhart's motor cortex. The 26-year-old Burkhart had suffered a diving injury at age 19 that left him quadriplegic. The doctor's hope was that this chip, when used with purpose-made algorithms and an electrical sleeve to stimulate muscles in the arm, would allow them to bypass the damaged spinal cord and use Burkhart's thoughts to control his fingers and hands.

"The results are excellent," Rezai now tells New Atlas. "Ian is the first human who was able to move his own hand and arm using his thoughts. He initially achieved rough movements of the wrist and hand. Over the past two-and-a-half years, Ian has exceeded our expectations and is able to perform increasingly complex movements that he could not have imagined ever doing again, such as a rapidly opening and closing his hands; moving fingers; grabbing and holding objects like a cup, toothbrush, phone, key and credit card; opening and close a jar; stirring a cup of coffee; pouring from a bottle; holding a phone; feeding and grooming and even playing a video game."

Another recent example involves a man paralyzed from the shoulders down regaining control of his paralyzed muscles, again by bypassing the injured spinal cord. To do this, scientists implanted two aspirin-sized, 96-channel electrode arrays into his motor cortex and connected another set of electrodes to his arm. Then, with some training, simply thinking about moving his arm or hand produced brain signals that could be translated into electric pulses, triggering the desired muscle movements in his arm.
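Both of these systems boil down to the same decode-and-stimulate loop: estimate what movement the motor cortex is asking for, then drive the right muscles. The Python sketch below illustrates that loop with a simple linear mapping from 96 channels of firing rates to eight stimulation sites; the decoder matrix, the channel and muscle counts, and the simulated spike counts are illustrative assumptions, not the algorithms used in the studies above.

```python
# Minimal sketch of a decode-and-stimulate loop: per-channel firing rates
# from a 96-channel array are mapped to stimulation levels for arm muscles.
# The linear decoder is an illustrative stand-in for a trained model.
import numpy as np

N_CHANNELS = 96  # electrodes in one implanted array
N_MUSCLES = 8    # illustrative number of stimulation sites on the arm

rng = np.random.default_rng(2)
decoder = rng.normal(scale=0.1, size=(N_MUSCLES, N_CHANNELS))  # learned offline in practice


def decode_stimulation(firing_rates):
    """Turn a vector of per-channel firing rates into stimulation intensities (0-1)."""
    raw = decoder @ firing_rates
    return np.clip(raw, 0.0, 1.0)  # clamp to the stimulator's safe range


firing_rates = rng.poisson(lam=5, size=N_CHANNELS)  # simulated spikes per bin
print(decode_stimulation(firing_rates))
```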

So the BCIs of today are already impacting the lives of disabled people in a very real way. But Elon Musk imagines machines that go well beyond that.

A brain-computer interface anybody can use?

It was around 2008 that consumer-focused EEG devices began to trickle out of the lab and into the market. The NeuroSky and Emotiv headsets were two of the early players on the scene, and each promised to bring the concept of mind-controlled video games to the public.

These days, those non-invasive EEG devices are marketed more as ways to monitor brain health, like Fitbits for your noggin. They have been joined by others such as the iBrain, which Stephen Hawking tested back in 2012, and the Muse, which displays EEG data on a mobile device. But it is difficult to see the general public wandering the streets in EEG caps, or with cords sticking out of their heads, as current implantable BCIs require.

For this kind of technology to become ubiquitous it will need to be much more discreet, and that's exactly what Elon Musk has set out to build. With the launch of his company Neuralink, the entrepreneur aims to develop a type of wireless BCI that sits inside the brain, monitors your brainwaves, and is capable of uploading and downloading thoughts and information. Treating neurological conditions is part of the picture, but ultimately, Musk's real motivation is supplementing human intelligence to save humanity as a whole.

"Under any rate of advancement in AI, we will be left behind by a lot," he said last summer. "The benign situation with ultra-intelligent AI is that we would be so far below in intelligence we'd be like a pet, or a house cat. I don't love the idea of being a house cat."

And that's if the machines decided to keep us around. If they didn't ...

"It could be as simple as something like getting rid of spam," Musk continued. "What's the easiest way to clean your inbox? But instead of getting rid of spam, it gets rid of humans."

Becoming superhuman

"We already have a digital tertiary layer in a sense, in that you have your computer or your phone or your applications," Musk tells the blog Wait But Why, in an expansive interview explaining the basis for his Neuralink venture.

"You can ask a question via Google and get an answer instantly," he added. "You can access any book or any music. With a spreadsheet, you can do incredible calculations ... You can video chat with someone in freaking Timbuktu for free. This would've gotten you burnt for witchcraft in the old days. You can record as much video with sound as you want, take a zillion pictures, have them tagged with who they are and when it took place. You can broadcast communications through social media to millions of people simultaneously for free. These are incredible superpowers that the President of the United States didn't have twenty years ago."

The way Musk sees it, Neuralink's BCIs will connect that digital tertiary layer directly to the brain – no need to type commands with your fingers or thumbs on phones, laptops or tablets; these machines will already be way ahead of you. But what kind of challenges will need to be overcome for that to happen? For starters, you'll need a device that can transmit a lot of data back and forth wirelessly from inside the brain. Today's implantable BCIs involve wires sticking out of the subject's head, but there is some exciting progress being made on versions that don't.

Last year, scientists at the University of California, Berkeley showed off their so-called Neural Dust: tiny wireless sensors that can go inside the body and track nerve and muscle signals in real time. Well, that's the idea, anyway.

The 1 mm cubes contain piezoelectric crystals that turn ultrasound vibrations from outside the body into electricity that powers an onboard transistor. This transistor sits against the nerve and measures electrical activity, relaying any voltage spikes to the ultrasound device outside the body for analysis. These have already been implanted into the muscles and peripheral nerves of rats, but the researchers hope further down the track they can make their way into the human head.
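On the receiving side, the job of the external ultrasound unit boils down to spotting nerve-activity spikes in the signal reflected back from the sensor. The Python sketch below fakes that step with a simulated trace and a simple threshold detector; the sampling rate, spike shape and threshold are assumptions for illustration, not details of the Berkeley system.

```python
# Minimal sketch of the readout concept: look for voltage spikes riding on
# the demodulated backscatter signal. Signal model and threshold are assumed.
import numpy as np

FS = 10_000  # effective sampling rate of the demodulated signal (assumed)
rng = np.random.default_rng(3)

trace = rng.normal(scale=0.02, size=FS)  # one second of background noise
spike_times = [1200, 4800, 7600]         # simulated nerve-activity events
for t in spike_times:
    trace[t:t + 20] += 0.5               # each spike briefly lifts the signal

threshold = trace.mean() + 5 * trace.std()
onsets = np.flatnonzero(np.diff((trace > threshold).astype(int)) == 1) + 1
print("spike onsets (samples):", onsets.tolist())
```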

"The technology is not really there yet to get to the 50 micron target size, which we would need for the brain and central nervous system," Jose Carmena, a neuroscientist who helped develop Neural Dust, said at the time. "Once it's clinically proven, however, neural dust will just replace wire electrodes. This time, once you close up the brain, you're done."

This kind of thing would still involve brain surgery, and for the technology to reach mass adoption in the way Musk hopes, it would need to be streamlined further, to resemble something more like laser eye surgery.

To read the minds, you first have to change the minds

Assuming that the wireless problem and the bandwidth problem – along with other complex challenges such as biocompatibility – can be solved, is everyone going to be cool with having these chips planted inside their heads?

"The use of chip-based BCIs is not just a science and technology challenge, it is also a people's perception and willing to adapt challenge," the University of Florida's Juan Gilbert tells us. "People without health problems may not be willing to put a chip in their brain because of the 'Hollywood effect.' They would think that the government may try to control them or steal their thoughts. Therefore, Elon and this new company should have a team of researchers concentrating on ethnography studies to understand how, when and why people would use a chip-based BCI."

Musk first imagines that people will use Neuralink's technology to treat brain injuries, such as strokes, problems relating to mobility (like those outlined above), and others resulting from aging, such as the loss of memory. From there, he anticipates that things could move rather quickly.

"I think we are about eight to 10 years away from this being usable by people with no disability … It is important to note that this depends heavily on regulatory approval timing and how well our devices work on people with disabilities," he tells Wait But Why.

What does this future look like?

So let's fast forward 10, 20, 40 years down the track, whenever it might be that only total Luddites would dare walk around without BCIs inside their heads. What are we doing? What would communication look like? Do we even need to speak anymore?

"If I were to communicate a concept to you, you would essentially engage in consensual telepathy," says Musk. "You wouldn't need to verbalize unless you want to add a little flair to the conversation or something, but the conversation would be conceptual interaction on a level that's difficult to conceive of right now."

So what Musk is essentially describing is a completely different kind of communication, one that is impossible for us to wrap our stupid, non-computer-enhanced heads around. Such a platform wouldn't just make typing by finger on a mobile phone old hat; it would do the same to speech, our primary means of communication for tens of thousands of years.

All of the thoughts rattling around in your head amount to much more information than can be instantly conveyed in English, French or Mandarin: all the nuanced emotions, half-baked ideas, fleeting moments of inspiration, adrenaline, excitement and fear. Planting a brain-reading device inside your head could open up entirely new ways of expressing yourself.

What does it feel like to have a close shave with death? Score the winning touchdown in a Super Bowl? Experience true love and have your heart broken? How about other experiences that defy words? That Musk sees this future as not only possible, but essential for our survival, is a little unsettling, but hey, it beats becoming a house cat.

