The new age factory workers get their hands dirty with data – Screen Shot


When we talk about the average factory worker, the persona that comes to mind is someone consistently performing physically strenuous, extremely repetitive tasks. It is also well understood that many of these classically defined factory jobs are increasingly being automated. The machine has been slowly making our lives easier, though it has caused much distress in the form of job losses. The world of the factory worker may seem outdated and long gone, but a new era of labour is upon us, and it’s beginning in China.

In a semi-ironic, almost romantic ode to older generations, old factory buildings in China are being filled with a new kind of workplace: hundreds of AI data-tagging start-ups. With China pledging to be the global leader in AI by 2030, making its AI competitive and superior has become an immense focus for the country. In 2017, China’s start-ups alone made up more than one-third of the global computer vision market. China’s competitive advantage in the AI race against the U.S. may rest largely on its ability to tag data.

These factories are filled with workers who quite literally sit in front of a screen, tagging everything they see, picture after picture, to help various AI projects (whether for search engines, autonomous cars, or facial recognition) learn to identify and differentiate objects. The interesting part about such factories is that many of these start-ups are setting up in rural areas, where overhead costs tend to be cheap, providing those without education, as well as those in remote areas, with new job opportunities.


This new industry is growing fast, employing hundreds of young citizens around the country. The tasks fit job descriptions similar to those of factory work in the past, just in a completely different context: plenty of manual work, lots of repetition and a focus on quantity.

With AI currently utterly incapable of learning such things on its own, an entire industry has been born to aid it. Artificial intelligence, while hailed as the peak of machine automation, is quite literally built on manual human labour, and will continue to be so. With work environments described as reasonable and the average income ranging between $400 and $500 (around the nationwide average salary), workers reportedly prefer AI tagging jobs to the alternatives, such as the typical “old-fashioned labour” that would otherwise be the only option for people living in remote areas. The benefit, many say, is that typically uneducated rural citizens can at least become part of a new, up-and-coming and important industry.

So even as automation replaces the human touch in industries that have traditionally been human-friendly, it also continues to create new industries and new job opportunities, and to ease the strenuous or unwanted work that needs to be done. This industry won’t need human workers forever, but it is comforting to know (as has consistently been proven over the last 200 years) that even though we might not be able to fathom the new industries and jobs that technological advancement will create, the ever-changing job landscape will always have room for humans.


In Data We Trust: China’s Social Credit system and the quantification of morality

By Maya Raphael

Human rights

Mar 15, 2018

If Black Mirror and Orwell’s 1984 had a child, that child would probably bear an uncanny resemblance to the subject of this article—China’s ‘Social-Credit system’. Unfortunately, this system is not a figment of the imagination created for a sci-fi anthology or dystopian fiction, but rather an evolving reality. So take your seats, get comfy, and have some popcorn ready at hand, while I present you with the next episode in the series, This is actually happening, society is fucked.

The Social Credit System is a government initiative proposed by none other than (I hate for the protagonist to be so predictable…) China’s Communist Party. The initiative, which the Chinese government intends to launch nationally by 2020, will essentially be one of the biggest mass-surveillance projects in history. It proposes to use big-data analysis to track the economic, social and political behaviour of the country’s 1.4 billion citizens, assigning them three-digit “citizen scores” (currently ranging from 350 to 950) based on algorithms that use numerous factors to determine how “trustworthy” a citizen is in the eyes of the Chinese government. While there is little information as to what the official government standards for citizen scores are, those being used in the pilot schemes make for pretty ominous indicators of what they might be. Oh, and by the way, these pilot schemes are being run by billion-dollar private corporations (such as China’s retail giant Alibaba and its social-credit pilot Sesame Credit). Don’t capitalism and communism make strange bedfellows when it comes to profiting from citizen data?

The factors that make up the citizen score start off pretty straightforward: your individual “credit history”. Do you pay your bills on time? Water? Electricity? Phone? Rent? Next, it’s down to “verification of your personal details”. Your phone number, home address, email: are you sure it’s all updated so that they can tie your digital identity to your real-life identity? And here’s where it starts getting a little more interesting: “personal behaviour and preference”. Your consumer behaviour is an indicator of how “trustworthy” you really are. For example, if you often purchase video games, expect your score to start plummeting; you are clearly an “idle” citizen. Ever make a reservation at a restaurant and just not turn up? Well, that’s no longer a good idea: call the restaurant and cancel immediately if you want to secure yourself a decent citizen score. And as if that’s not enough, another factor currently being taken into consideration is “interpersonal relationships”. Do you have a friend who is only a 400? Not a good idea: your score might go down just for associating with them. And did I mention that Sesame Credit has teamed up in a corporate data-tastic partnership with China’s largest online dating platform, Baihe? Ouch. Good luck getting someone to swipe right with a score lower than 600.

So what will the consequences of these scores be? According to Rachel Botsman, author of Who Can You Trust?, they will be far worse than just losing out on dates. People with low scores will have slower internet speeds, restricted access to restaurants and longer wait times in hospitals; their right to travel or move freely could be taken away; and they will bear the burden of paying higher fees for services such as phone-charging stations and city bike rentals, a consequence that has already taken effect.

While mass surveillance, the omnipotent marriage of capitalist corporations and China’s communist government, the use of data tracking to monitor and shape the behaviour of citizens, and the draconian sanctions proposed for the “untrustworthy” are all terrifying, there is something else that scares me more: the way in which an economic discourse is being tacitly underpinned by a moral one. And unlike the points listed above, it scares me not because I’m disposed to seeing it as a warning sign, but precisely because I’m not.

“Allow the trustworthy to roam everywhere under heaven while making it hard for the discredited to take a single step.” This sentence, pious as it may seem, is not taken from a religious manuscript, but is stated in the Chinese government’s documents outlining the plans for the credit system. Still, tempted as I am to use this as yet another example of just how extreme and despotic the Chinese Communist Party and this particular initiative are, the more I think about the rhetoric used by governments and financial entities in the west, the more it, too, seems to be implicitly packaged in moral discourse. “Together we are one”, “Open. Honest. Hardworking”, “The better we know you, the more we can do”, “Honesty first”, “Trust. The feeling is mutual”. These are just a few of the slogans belonging to banks in Europe and the US, the same institutions that have ingrained in us the belief that our financial decisions are not purely economic, but also moral. Take, for example, the notion that debt must be paid back, and that houses, jobs and people’s life savings can all be claimed by those banks because not paying back debt is such a serious offence. However, as we so clearly witnessed in 2008, when these same institutions accumulated too much debt of their own, they didn’t have to abide by the moral rules and pay it back: the government bailed them out.

But still, in the western world we have not reached the point where who we are friends with on Facebook affects our credit ratings, right? Wrong. Affirm, a US online money-lending company founded by PayPal co-founder Max Levchin, is already digging through its customers’ social networks to evaluate what their default risk might be. And Singapore-based Lenddo, a company which states that it uses “non-traditional data” to provide credit scores in order to fulfil its mission of “empowering the emerging middle class around the world” (how Miss America of them), is so thoughtful that it notifies a debtor’s friends on Facebook when they don’t pay their instalments back on time.

Perhaps what we in the western world find so disturbing about China’s social-credit system is the bold and unmistakable government use of personal data to create a moral index of what a good, trustworthy citizen is. Yet maybe what should worry us instead is that corporations and governments in the west are doing much the same thing, only without the transparency and visibility with which China is approaching this new venture. For all we know, we may already have been assigned a ranking in the trust index; unlike the Chinese, we just haven’t been told what our score is.