There’s a political storm swirling around big tech companies right now, and TikTok is in the eye of it. In fact, as the world’s most downloaded app in 2022, you might say TikTok is the storm itself.
The US government wants to dictate the algorithm that determines which Pedro Pascal clips and Negroni Sbagliato recipes wash up on your FYP. Why? They’re (legitimately) worried about a couple of (actually very important) things: keeping young people safe, and data being used for international espionage.
Their problem? TikTok’s parent company, ByteDance, is in China, not America, where US lawmakers would have more power over the platform (though the internet technology company does have headquarters in Singapore and Los Angeles).
President Biden’s team even threatened a nationwide ban if ByteDance refuses to sell the infamous app. TikTok says the worries are unfounded and unsupported by evidence, and the uproar is causing uncomfortable political ripples.
There’s another issue, too. The people who want more control over the internet seem to know nothing about the internet. On 24 March 2023, TikTok’s CEO Shou Zi Chew got a grilling from US lawmakers who are terrified that China’s government will manipulate data from American users. Trouble was, some of those lawmakers didn’t seem to have a basic understanding of WiFi, let alone in-depth knowledge of some of the most powerful artificial intelligence technology in existence.
This was gleefully pointed out in clips from Friday’s televised session in US Congress, which went viral over the weekend. Where else but on TikTok?
One user rounded up some of the most, uh, interesting moments from Chew’s five and a half hours on the stand:
It’s not just the US that’s panicking. The UK’s Houses of Parliament and the EU’s governing bodies are feeling threatened, too. All of them have banned TikTok from staff devices and networks over fears of security breaches.
But TikTok’s take is that the bans are based on “fundamental misconceptions.” Chew insists that the app has never, and would never, honour a request from the Chinese government to share US data. That is, if one was ever made. The CEO says it’s unreasonable for America to crack down on the app when no threat to national security currently exists.
If you’d like to hear it from Chew’s mouth, here’s what TikTok’s CEO had to say about the whole thing:
To show it’s serious, ByteDance spent $2 billion on something called Project Texas—a partnership with cloud software group Oracle that put up a firewall to make sure US user data is protected from Chinese influence.
Project Clover will then follow in Europe, handing some approvals over to third parties. TikTok reminded US officials that, under the arrangement, American authorisation is needed every time data passes through its servers. But the US doesn’t think that’s enough.
Chew named a big competitor to take the heat off TikTok, reminding the world of instances like Cambridge Analytica, where millions of Facebook users had personal data collected by the British consulting firm without their permission.
“With a lot of respect, American social companies don’t have a good track record with data privacy and user security. I mean, look at Facebook and Cambridge Analytica,” Chew rebutted.
In the run-up to 2017, Myanmar military personnel used Facebook to spread misinformation and hate speech until they were eventually banned by the platform. Thousands of Rohingya Muslims were killed, and thousands of refugees eventually sued Facebook for £150 billion over its failure to prevent the incitement of violence. In the UK, the Online Safety Bill might be amended to include possible prison sentences for social media bosses who don’t do enough to protect kids.
So there are plenty of good arguments for why moderation of content should be cranked up—but who or what technology should have that responsibility? Can you sue AI, if the tool moderates content too much or not enough?
Currently, tech platforms can (and do) moderate content however they want to, and are protected by a decades-old US law: Section 230, part of the 1996 Communications Decency Act. This was set up way back when the internet first exploded, to help platforms get off the ground without endless litigation. Changes to this could impact how the internet looks, and the way companies operate, forever.
At the moment, apps can hold their hands up and say ‘it wasn’t me’ if their users post dangerous content, but if Section 230’s protection dwindles, tech companies will face big problems.
‘How does this affect me?’ you’re probably wondering. Well, if the algorithm spits dangerous content onto your feed and you share it, you are currently the ‘creator’ of that content. And you’re liable.
Then there’s the safety side of things. Whether apps are using our data or not, research has repeatedly shown a link between social media usage and mental health problems. It’s easier than ever to buy drugs on TikTok. Young users can come across sexually explicit content and discussions. Google a salad recipe, and you might get served an ad for weight loss injections. Open Instagram, and within minutes it can feel like everyone in your life is on holiday or going out, having more fun than you. The internet is inherently depressing as much as it is useful and wonderful.
All that is taking its toll. Between 2010 and 2020, suicide among young people aged 10 to 19 went up by 45.5 per cent in the US. The same government agency that carried out this study found that one in three teenage girls had seriously considered taking their own life. Social media addiction gets much of the blame, because tech companies build their platforms in ways that tempt us (and condition us) to use them constantly. Features like video autoplay are under particular scrutiny, and many argue they should be scrapped.
Would getting rid of TikTok actually help, at this point? Plenty of young people would find ways around it, and new apps would spring up. Perhaps a more sensible approach is to limit time spent on apps and increase time spent in the real world, interacting with other people.
Celine Bernhardt-Lanier and Emma Lembke agree with this notion, which is why they founded the LOG OFF movement in the summer of 2022. The group isn’t anti-social media, but instead wants to get people talking about how much they’re using it, how it makes them feel, and what we can do about its negative effects.
SCREENSHOT recently spoke with Claire Rowden, movies and social producer at MTV UK. Joining a movement like LOG OFF would be a tall order for her, since being chronically online is part of her job. So, what does she think about a potential ban?
Rowden explained: “I’ve been on TikTok for three years, and so much of my experience has been positive, because I’m putting out positive content and I’m receiving positive responses.”
One of the aspects of the app that Rowden loves the most is its ability to platform people’s passions, and even provide the opportunity for anyone to go viral. “There are so many different sides to it—everyone has an equal opportunity for their content to be seen, which is why everyone loves it so much,” the producer stated.
Even more than this, Rowden values the community aspect of the video-sharing app: “TikTok has built so many communities through the app—people would be at a loss if they were taken away from them. If people want to be on social media, they’re going to be on social media. It’s that need for searching, that desire to find people who like what you like.”
She went on to add: “I personally learn so much from TikTok, there are so many different sides to it. [About] being neurodivergent for example, and particularly working in film. Being on #filmtok and having movies recommended to me. Even something as simple as #booktok has shown me so many novels I’d otherwise never have read. I have gained so much from the app.”
The producer of course also recognises the impact being fixated on social media can have on one’s mental health and general well-being. Rowden explained how she feels “50/50” on this topic. “I have found community, friends, and it’s helped me meet people in my industry that I wouldn’t have otherwise. It’s opened so many doors for me and it’s integral for working in the media. But it can negatively affect my mood.”
“A lot of the internet and social media is what you make it. If you’re not going online and spreading hate, then it is possible to avoid it. But like in all areas of life, there’s always going to be toxicity to watch out for. All in all, it’s a much-needed escape for a lot of people,” Rowden continued.
One thing the media expert definitely stands by is that the US government has far more important issues to be tackling right now. “I think the people running the country should be focusing on stricter gun laws and giving women autonomy over their own bodies, rather than trying to force kids to stay off an app that (as much as any place on social media) can be toxic, but can also be life-changing.”
There’s no denying that some US lawmakers want to see social platforms gone altogether (one likened TikTok to “a cancer, like Fentanyl, another China export, that causes addiction and death”). They want more protection, and they want it now. How they’re going to get it, of course, remains to be seen.
But what can be said is that politicians should start working more closely with industry experts (that includes gen Z) who know what they’re dealing with, and can at least explain how WiFi works.