
A new brain implant lets paralysed people write with their minds

For decades now, scientists have been recording the impulses our brain generates whenever we do something—be that moving, speaking or simply sensing—not only to understand and treat brain diseases but also to help people with disabilities. Using brain-computer interfaces (BCIs), researchers have been aiming to restore movement in people with paralysis and potentially treat neurological and psychiatric diseases. And it certainly looks like they’re getting somewhere.

A study conducted by a team at Stanford University and published in Nature reports on a brain implant that could allow people with impaired limb movement to communicate via text formulated in their mind, no hands needed. How does it work exactly?

When coupled with electrodes implanted in the brain, artificial intelligence software was able to ‘read’ the thoughts of a man with full-body paralysis as he imagined writing letters by hand. The BCI transformed his imagined letters and words into text that appeared on a computer screen—a form of “mental handwriting,” as Scientific American calls it.

The technology could benefit millions of people worldwide who are unable to type or speak because of impaired limbs or vocal muscles. Previously, co-senior study author Krishna Shenoy had worked on decoding neural patterns associated with speech, as well as imagined arm movements, so that people with paralysis could move a cursor across an on-screen keyboard to type out letters. However, that technique only allowed participants to type around 40 characters per minute, far below the average keyboard typing speed of around 190 to 200 characters per minute.
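To picture how that earlier cursor approach works: point-and-click BCIs typically map instantaneous firing rates from the implanted electrodes to a cursor velocity, often with a Kalman-filter-style decoder. The sketch below is a deliberately minimal stand-in, fit on entirely synthetic data; nothing in it comes from the Stanford system itself.

```python
# Toy linear velocity decoder: firing rates -> 2D cursor velocity.
# Real cursor BCIs usually use Kalman-filter decoders; this least-squares
# version, on made-up data, only illustrates the shape of the idea.
import numpy as np

rng = np.random.default_rng(1)
n_electrodes, n_samples = 96, 5000
true_map = rng.normal(0, 0.1, (n_electrodes, 2))   # hidden tuning: rates -> velocity
rates = rng.poisson(10, (n_samples, n_electrodes)).astype(float)
vel = rates @ true_map + rng.normal(0, 1, (n_samples, 2))

W, *_ = np.linalg.lstsq(rates, vel, rcond=None)    # fit the decoder
path = np.cumsum(rates[:50] @ W, axis=0)           # integrate velocity to position
print("decoded cursor displacement after 50 steps:", path[-1].round(1))
```

Typing with a decoder like this means steering the cursor letter by letter across an on-screen keyboard, which is part of why it tops out at around 40 characters per minute.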

Shenoy’s recent work with his team focused on imagined handwriting as a way to speed up communication, which the researchers hope will eventually reach, at the very least, smartphone texting rates. Their technique allowed the study subject, who was 65 years old at the time of the research, to mentally type 90 characters per minute. Although that’s still short of the 190-character-per-minute mark, it’s not far from average for senior texters, “who can typically type around 115 characters per minute on a smartphone,” according to Scientific American.

The study participant had suffered a spinal cord injury in 2007, and had lost most movement below his neck. In 2016, Stanford neurosurgeon Jaimie Henderson, co-senior author of the paper, implanted two small BCI chips into the man’s brain. Each of the chips had 100 electrodes capable of sensing neuronal activity. They were implanted in a region of the motor cortex that controls movement of the arms and hands, allowing the researchers to profile brain-activity patterns associated with written language.
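The decoding step itself was handled by a recurrent neural network trained on those patterns. As a loose illustration of the underlying problem (classify a short window of multi-channel activity as a character), here is a toy version on synthetic data: the 200 channels stand in for the two implanted arrays, and a plain linear classifier stands in for the study’s neural network.

```python
# Toy character decoder: classify a window of multi-channel firing rates.
# 200 channels stand in for the two 100-electrode arrays; a logistic
# regression stands in for the paper's recurrent network, and every
# number below is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_channels, n_bins = 200, 20                 # ~20 time bins per imagined letter
chars = list("abcde")                        # tiny alphabet for the demo

templates = {c: rng.poisson(5.0, (n_channels, n_bins)) for c in chars}
X, y = [], []
for c in chars:
    for _ in range(100):                     # 100 noisy repetitions per letter
        trial = templates[c] + rng.normal(0, 2, (n_channels, n_bins))
        X.append(trial.ravel())              # flatten the window into features
        y.append(c)
X, y = np.array(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=2000).fit(X_tr, y_tr)
print("held-out character accuracy:", clf.score(X_te, y_te))
```

The real task is much harder than this sketch suggests: imagined letters arrive in a continuous stream, without clean trial boundaries, which is part of what the recurrent network has to handle.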

Not only could this technology’s recent success help restore communication in people who are severely paralysed, but it could also light the way for more progress in intracortical brain-computer interfaces. But as Mijail D. Serruya, an assistant professor of neurology at Thomas Jefferson University who studies BCIs in stroke recovery but was not involved in the research, told Scientific American, “Why not teach the person a new language based on simpler elementary gestures, similar to stenography chords or sign language?”

Through his question, Serruya highlighted the fact that focusing on restoring communication via written letters may not be the most efficient means of doing so. In fact, although translating the brain’s control over handwriting is a significant first step in reclaiming someone’s ability to communicate, decoding what that person actually intends to say is still a major challenge researchers face.

For now, given that we generate speech much more quickly than we write or type, it’s hard to predict when the researchers’ method will be translated into a real device that anyone can buy. “We hope it’s within years and not decades!” said Frank Willett, lead author of the paper and a research scientist at Stanford’s Neural Prosthetics Translational Laboratory. Meanwhile, Elon Musk has made a monkey play Pong telepathically. Priorities, am I right?


Busted! AI can figure out who your secret crush is by analysing your brain activity data

I’ve said it before and I’ll say it again: artificial intelligence (AI) is already on its way to altering every little aspect of our lives—even the ones we never ever asked it to come close to. Forget about AI art and robots delivering sermons, AI now wants to infiltrate the world of dating, to help you find out exactly who you find attractive.

Researchers from the University of Helsinki and the University of Copenhagen have managed to teach AI which faces an individual person finds attractive, and how to generate artificial portraits in response. The study, titled Brain-computer interface for generating personally attractive images, was conducted on 30 volunteers who agreed to have the electrical activity in their brain monitored while they looked at artificial portraits created by a generative adversarial network (GAN) trained on thousands of images of real celebrities. Think ThisCatDoesNotExist.com, only with hot (but fake) contestants.

Speaking to Neuroscience News, Michiel Spapé, a senior researcher at the University of Helsinki, explained: “It worked a bit like the dating app Tinder. The participants ‘swiped right’ when coming across an attractive face. Here, however, they did not have to do anything but look at the images. We measured their immediate brain response to the images.”
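In EEG terms, ‘measuring the immediate brain response’ typically means examining the event-related potential in the few hundred milliseconds after an image appears, then training a classifier to tell ‘attractive’ responses from the rest. The sketch below is a hedged stand-in on invented data; the study’s exact pipeline may well differ.

```python
# Toy single-trial EEG labelling: was this face's brain response 'attractive'?
# Everything here is synthetic; a standard linear classifier stands in
# for whatever model the study actually used.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_trials, n_channels, n_times = 400, 32, 60   # ~0.3 s of EEG after each image
labels = rng.integers(0, 2, n_trials)         # 1 = 'attractive' response

# Synthetic ERPs: 'attractive' trials get a small late deflection,
# mimicking an evoked-potential difference between conditions.
eeg = rng.normal(0, 1, (n_trials, n_channels, n_times))
eeg[labels == 1, :, 40:] += 0.3

X = eeg.reshape(n_trials, -1)                 # flatten channels x time
scores = cross_val_score(LinearDiscriminantAnalysis(), X, labels, cv=5)
print("cross-validated accuracy:", scores.mean().round(2))
```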

The data from the readings was then analysed using machine learning techniques, producing a model of each person’s individual preferences that guided the generation of new portraits. Both the AI model interpreting the volunteers’ brain responses and the generative neural network modelling the face images played a crucial role in this experiment. Together, they produced an entirely new face image combining the features a particular person finds attractive.
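Mechanically, that combination step can be pictured as arithmetic in the GAN’s latent space: take the latent codes of the portraits whose brain responses were labelled attractive, average them into one personalised vector, and feed it back through the generator. In the hedged sketch below, `generate_face` is a hypothetical stand-in for a pretrained generator, and both the latent codes and the labels are invented.

```python
# Toy 'preference combination' in GAN latent space, on invented data.
import numpy as np

rng = np.random.default_rng(3)
latent_dim, n_shown = 512, 240

def generate_face(z):
    """Hypothetical stand-in for a pretrained GAN generator: maps a
    latent vector to a face image. A real pipeline would call the
    trained celebrity-face network here."""
    return z  # placeholder

# Latent codes of the portraits shown, plus attractive/not labels
# decoded from the participant's brain responses (invented here).
latents = rng.normal(0, 1, (n_shown, latent_dim))
attractive = rng.random(n_shown) < 0.3

# Move toward the region of latent space this person responded to:
# average the 'attractive' codes into one personalised vector.
preferred_z = latents[attractive].mean(axis=0)
new_portrait = generate_face(preferred_z)
print("personalised latent vector, first 5 dims:", preferred_z[:5].round(2))
```

The published method is presumably more refined than a plain average, but steering through latent space toward a person’s decoded preferences is the core of the trick.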

The new images were found to match the subjects’ preferences with an accuracy of 80 per cent. Now, before you get too excited about what this could mean for the future of dating apps, I should note that what this highlights could turn sour, real quick.

After all, the study could easily expose unconscious biases and stereotypes most people hold (be that knowingly or not). And we’ve come to learn the hard way—attractiveness is a far more challenging subject than simply defining concrete physical traits. As Spapé explains, attractiveness is “associated with cultural and psychological factors that likely play unconscious roles in our individual preferences. Indeed, we often find it very hard to explain what it is exactly that makes something, or someone, beautiful: beauty is in the eye of the beholder.”

Succeeding in assessing attractiveness is especially significant because it is such a personal, psychological property of the stimuli. By bringing brain responses into the mix, the study shows it is possible to detect and generate images based on psychological properties, like personal taste.

What does it say about us and the way we perceive attractiveness? Well, although we can all instantaneously recognise a face as ‘attractive’, it is much harder for us to explain what exactly defines our personal attraction. This suggests that attraction depends on a complex mix of culturally and individually defined features. And now, GANs have learnt to mimic these complex data distributions without even having to ask the question we all ask ourselves: why is my type my type? In other words, this little study just proved that AI is not to be slept on.