You might soon be able to communicate with machines without talking

By Alma Fabiani

Published May 23, 2019 at 12:32 PM

Reading time: 2 minutes


Last month, a team of scientists led by Edward Chang from the University of California, San Francisco found a way to decode the specific brain signals that convert our thoughts into words. By understanding these signals, they’re making tentative steps towards merging our minds with machines. Just like with anything else, change is exciting, especially technological advances transforming the medical field, but when it comes to the human brain, a part of me can’t help but fear what’s to come.

The outcome of this research could lead to endless opportunities, such as creating a system that could help severely paralysed people speak. This same technology could equally be used for more trivial purposes, like mentally dictating texts and sending them from your phone—drunk texting is about to get way worse. Using this technology to help paralysed people should be more important than creating another way for us to be texting non-stop. Although I must admit, sending text messages without picking up my phone sounds intriguing.

Talking is not something we usually stop and think about; it just comes out. What really happens is that your brain sends signals to your lips, tongue, jaw, and larynx to produce whatever you’re about to say. Edward Chang and his team used electrodes, placed on the brain’s surface, to record those signals. They then fed the signals into a computer model of the human vocal system, which used them to generate synthesised speech.

At the moment, the technology is far from becoming widely used, or even accepted. According to Chang, it does not work with electrodes placed outside the brain, meaning that for now, people would have to undergo neurosurgery in order to have electrodes placed on the surface of the brain.


Still in the U.S., but on the East Coast this time, researchers at Columbia University in New York City have started testing a potentially transformative system that could help people using hearing aids by amplifying the voice they want to listen to and blocking out background noise. To understand exactly what the listener wants to hear, it uses electrodes placed on the section of the brain that processes sound (the auditory cortex). As the brain focuses on each voice, it generates a specific electrical signature for each speaker.

The researchers at Columbia University then created a deep-learning algorithm trained to differentiate voices and look for the closest match between this electrical signature and those of the many speakers in the room. Once the algorithm has made its choice, the hearing aid simply amplifies the voice that matches best—doing so with an average accuracy of 91 percent.

Although this system is only a trial for now, one obvious stumbling block could already be a worry for potential users: just like with the first study mentioned, implanting the electrodes would require patients to undergo brain surgery—a very risky operation. Additionally, this technology could end up being used for the wrong purposes. What if people without hearing impairment used it to boost their ability to focus on one voice; to listen in on private conversations? Call me a pessimist, but in this day and age, trusting technology with our privacy has proven hazardous at best, so going ahead and getting it implanted inside our brains seems extremely dystopian.

The concerns surrounding these two studies can similarly be raised about any upcoming invasive technology that would require researchers to come anywhere near our brains. With further development, this technology could be perfected and the chances of anything going wrong reduced, but what if, sometimes, pushing scientific boundaries goes too far? We’ve reached a stage where almost nothing seems impossible. Surely this level of invasion should come with ethical boundaries—let’s learn from our past mistakes.
