Some of you might not remember, but August 2017 was supposed to be microchipping’s big ‘breakthrough’. Wisconsin-based company Three Square Market lined up its employees in its cafeteria to be implanted with microchips. The press dubbed the event the “chipping party” and bombarded us with futuristic tales of cashless payments and phoneless calls. We were all going to be transhuman. Fast forward to 2019, and most of us are still microchip-free. So why are microchips still not the norm?
Let’s start with the obvious—the idea of living with a chip under your skin, potentially monitoring your every move, is not an easy one to accept. After the famous chipping party, religious groups panicked, convinced that the small implants were “the mark of the beast”, and accused Three Square Market of being the antichrist. While this sounded highly improbable, more rational concerns have since been raised, and some have proved right.
Microchip implants are similar to the ones we’ve been putting under the skin of livestock and pets for years. In other words, the tiny tags have been around for a while. Kevin Warwick, a professor of cybernetics at the University of Reading, had already implanted a chip in his hand by 1998. Warwick wanted to show the world not only that it was doable, but that microchipping was the future. To him, fusing technology with our bodies had to be the next big step. He obviously didn’t take data breaches and the exploitation of labour into consideration.
So far, the few people who have been microchipped are tech-savvy early adopters from wealthy countries. Well-educated Swedes—“chips and beer” evenings were quite big in Stockholm—do not represent the world’s population. Talking to The Guardian, Urs Gasser, executive director at Harvard’s Berkman Klein Center for Internet and Society, made a very valid point about the way people reacted to the chipping party, stating that “[s]eeing employees get implanted at the workplace made people question what it means to be an employee. Are you a person being paid for your work, or are you the property of the company you work for?” These implants probably weren’t the “mark of the beast”, but they were definitely reminiscent of the dystopian novel 1984.
The implications of microchipping technology are exactly what stop most of us from getting chipped. For many, having a chip under our skin equals increased worker surveillance. More recently, it could also mean that data collection goes up a notch—something we clearly don’t need while Facebook is still around. After all, microchips could surveil us non-stop, monitoring us wherever we go. Remember the nightmare that was Snapchat’s Snap Map, when all your Snapchat contacts could see where you were at any time of day? Imagine that with microchips, only this time you wouldn’t be able to turn the signal off.
The same kinds of concerns have recently been raised by lawmakers in the US. The states of Arkansas, New Jersey and Tennessee have started drafting bills to make involuntary microchipping illegal, while Nevada has already passed its own version. The message is: feel free to microchip yourself if you want, but don’t force anyone else to do the same, especially not your employees. And yet, surveillance tech is still a big issue in the US, with companies forcing employees to wear an Apple Watch at all times to monitor their health. There doesn’t seem to be much difference between this and microchips.
To look at microchipping in a different light than this ethically problematic one, Screen Shot spoke to Jowan Österlund, founder and CEO of the microchipping company Biohax and the body piercing professional who microchipped the lucky employees at 2017’s chipping party. Österlund, like many others, can’t ignore the risks that microchips entail, but to him, the fear is part of a healthy process: “People say that [they’re scared of microchips] while they’re on Facebook, on their smartphones logged into their Google accounts, so I can’t really take that seriously. I wouldn’t want people to blindly accept any kind of new technologies, because that would be repeating the same mistakes again. If people are scared, it’s good—they get informed, and they’re not afraid anymore.”
Österlund furthered his point by pinpointing the many benefits that microchips could bring to early adopters, such as health tracking: “you could get all the information you need about your heart rate patterns, your sleeping patterns, your blood oxygenation, and breathing patterns.” According to him, most of us react that way to microchips because of the small body modification they require. What needs to change for them to be 100 per cent safe are the regulations, which have already proved problematic even before most of us have a chip implant.
Instead of panicking about microchips, our attention should be focused on demanding stronger labour regulations and data protection laws, such as Europe’s General Data Protection Regulation (GDPR). In other words, don’t shoot the messenger. Microchips could be a game changer for our society—the only question is: are we ready for that kind of change?
Imagine if I told you that a few years down the line humans will have the ability to merge themselves with computers. Well, soon you won’t have to imagine it anymore. Recent studies show that we are closer to a transhumanist era than we thought. But what exactly is transhumanism?
Transhumanism is the philosophical belief that natural human evolution has run its course and should from here on continue through human intervention, essentially merging biology with technology. In the past two decades, we’ve seen technology transform and improve radically—even I, despite being born in 1998, remember a time before WiFi and 4G, when we couldn’t even use the landline while the internet was switched on. Fast forward to today, and we have people replacing human partners with sex robots, self-driving cars in development, and employee-less stores. Transhumanism is the next step.
A recent study conducted by scientists at Harvard and the University of Surrey suggests that we are on the verge of entering the transhumanist era: the researchers have manufactured nanoscale probes that can read intracellular electrical activity from neurons. Such probes could measure the electric currents that run within our cells and push progress on human-machine interfaces. In other words, in the near future, science may have the ability to quite literally merge us with machines.
And this should come as no surprise, especially given Elon Musk’s recent developments with Neuralink. The company recently unveiled some of the technology it has been working on, including a device implanted in paralysed people that would allow them to control phones and computers using signals recorded directly from their brains. This would mean that people who previously couldn’t communicate, or struggled to, could now do so through technology—a computer would quite literally be able to read their thoughts.
Similarly, Facebook also recently released an update on its work on brain-reading computer interfaces. The company funded research at the University of California, San Francisco (UCSF), which published the results of an experiment that decoded people’s speech, using implanted electrodes linked to a computer to read words and phrases from the brain.
Such topics inspire heated debates, and it’s understandable why. While Neuralink and Facebook’s developments may improve the lives of many, the merging of humans with technology remains controversial. Many argue that we are already transhuman, given our growing screen addiction and reliance on technology. This is where the increased discussion of transhumanism comes from.
Elon Musk himself believes in the need for merging people with AI in order to avoid losing control of superintelligent technology and prevent technological unemployment. Humanity+ is a non-profit organisation that advocates for transhumanism, using AI to expand human capacities and “increase human performance”, as it puts it, “outside the realms of what is considered to be normal for humans”. The organisation already has over 6,000 members. There are threads on Quora where people discuss their desire to become transhuman, and one of HBO’s most recent shows, Years and Years, presents a transhuman teenager in a near-future setting. The point is, there is a growing demand for transhumanism, and we need to talk about it.
While it is evident that merging biology with technology could bring major improvements to healthcare and medicine, it is still uncertain what other capabilities transhumanism will let us implement in ourselves. It could develop into anything from having a Google search engine inside our brains to taking photographs with our eyes. PhD student Anqi Zhang, who was part of the research team at Harvard, says that “the area of brain-machine interfaces will see significant advancement in the next 10 to 15 years”, meaning we could see various applications very soon. Of course, this all sounds far-fetched and bizarre, and rightfully so.
There are a number of things that could go wrong, one being that devices implanted in our brains could slowly take over the functions of specific parts of them. If we ever reach the ‘perfection’ that transhumanism depicts, it would be difficult to know where to draw the line and stop. Nevertheless, we are certainly shifting towards a transhuman era, and all we can do is sit back and stay hopeful that, if adequately regulated, it will improve the lives of many.