
Opinion

Don’t hate the data, hate the game

By Daniel Finegold


Human rights

May 25, 2018

“Alexa, do you protect my personal data?” This is the question at the heart of the GDPR, the new EU regulations that come into force today, 25 May. It has been over 20 years since the last data protection laws were passed in the U.K., and since then the information battlefield has changed considerably.

Web 2.0 created a way for users to collaborate, participate, and interact through the sharing of information, giving rise to a new breed of personalised services: Facebook for social networks, Twitter for news, Amazon for purchases. The tracking and sharing of information between platforms, using tools such as single sign-ons (e.g. Facebook Connect), seamlessly expanded our digital playgrounds, mapping data points on our love lives, music tastes and political leanings, and then reflecting those choices in the shaping of the future content that is in constant orbit around us.

The unquestionable value of insights into personal choices sparked a data gold rush, prompting industries to experiment with how personal data could be collected and analysed in order to improve their products. The rules of engagement for free services were simple: your data in return for targeted advertising. As a result, a culture of entitlement to our data emerged and then spread, encouraged by the sophistication of data mining tools that could be hidden from view.

The purpose of the new data protection laws is to move away from this lack of transparency epitomised by the ‘tick box’ culture, by extending the obligations on businesses to secure the integrity of their data collection, chiefly by giving their customers the necessary information to understand and control the use of their personal data. A move that is an “evolution”, not a “revolution”, according to the Information Commissioner, the regulator charged with responsibility for enforcing the new laws.

Fundamentally, the laws clarify the scope of ‘personal data’: it is ultimately about the monitoring and tracking of your activities, which can happen without your name or address ever being stored, thanks to clever technological quirks. With greater fines for those that fall foul of these regulations, and businesses being held liable for the mishandling of information by their partners, the result should be greater due diligence around the collection and sharing of our data. However, bad practices are likely to return if regulators are under-resourced or seen to be weak in enforcing the regulations. To gauge the likely consequences of breaching these new laws, many businesses will be watching closely to see how the regulators treat Facebook over its recent data protection breaches.

The responsibilities and burden of the GDPR do not rest solely with businesses and law enforcement; the onus is on data subjects (you) to enforce their rights. The laws simply provide the armoury to do so. Individuals can request a copy of their data and require its deletion should they wish. A key argument raised by Zuckerberg in defence of Facebook at his recent congressional testimony on Cambridge Analytica has been that “users have complete control over their data.” Clearly this is not, and has not always been, the case. The flurry of updates to Facebook’s services since the breach was publicised speaks for itself. Users can now view in greater depth the plethora of information behind the profile that is being monetised by Facebook. Zuckerberg’s reference to “complete control” should therefore be understood within the context of the deal on the table for its users: free access to Facebook in return for targeted advertising. For those willing to put in the effort and constant upkeep, this intricate data-DNA behind our identities can be monitored, tweaked even—but its continual evolution cannot be stopped.

If there is a broader lesson to be taken from Cambridge Analytica—beyond finger-pointing at Facebook for its complicity in not recognising the inherent risks of amassing a data vault and commodifying it on an open marketplace—it is that we must acknowledge our own naivety in participating in and trusting this technology without further thought for the broader consequences. The safeguards of the physical world do not easily extend to the digital, and the burden is therefore on individuals to investigate the boundaries within which their personal identity is being constructed and exploited, so that they may protect themselves accordingly.

It is questionable whether these regulations will significantly alter the attitude of businesses and their users to the sharing of data. While larger businesses have clearly invested in improving the security and integrity of their data collection and giving their data systems a ‘spring clean’, the updated privacy policies and the emails asking customers to re-consent to newsletters have resulted in yet more clicking and ticking. The response from the majority of users has not been to ask whether the updated legal terms are useful, but to complain about the clutter they have brought to their inboxes and to the seamless experience of these digital services.

Perhaps there is a psychological hurdle that remains, with too many degrees of separation between our physical and machine selves, such that we do not feel the same guardianship for our digital anatomy that we assume for our physical one. Alternatively, it’s possible that we, rightly or wrongly, believe we have the requisite control; that we can consume the features and fads that enhance our digital lives, and then always unsubscribe from these services if they no longer serve our interests.

Clearly businesses still desire our data, and we are still willing to share it with them. Neither side is inclined to slow down this fast-moving train; after all, the exchange of information has been at the epicentre of human evolution. The question is: where is its ultimate destination? Facial recognition software in the latest mobile devices can store over 83 data points on facial features alone. On the horizon, artificial intelligence and machine learning will be able to gather enough data points to produce digital replications indistinguishable from their masters. What happens when these digital replications are violated, or when we become so dependent on our digital assistants that we no longer have the agency to unsubscribe? “Alexa, do you want all my personal data?” “I’m not sure about that,” it responds, seeking an answer to add to its neural network of information. It’s up to us to decide, for now at least.

Airbnb and 23andMe want to use your DNA for holiday recommendations

Can’t decide where to go for your next summer vacay? Why not let your DNA dictate your next destination? From learning which strain of weed to smoke to what music you should listen to, DNA analysis has spawned some strange collabs, and Airbnb and 23andMe are the latest. On May 21, the global hospitality and homestay service and the world’s first at-home DNA testing kit announced a foray into the ever-rising demand for heritage travel. They’re now offering users the chance to go from a curious trip down ancestry lane online to a literal trip down ancestry lane.

Heritage travel has been hotter than ever thanks to the ease of technological access to our past and the opening of the genetic market. More enriching than your regular escape to Ibiza, heritage holidays provide an emotional experience: the chance to explore your ancestral roots as well as take a week’s worth of Instagram-worthy photos. Previously, such excursions were mostly made for religious purposes, such as pilgrimages to Mecca for Muslim travellers or Birthright trips to Israel for Jewish travellers, but according to Airbnb’s consumer trends spokesperson Ali Killam, heritage travel has been a top travel trend since 2014.

After sending in your 23andMe sample and receiving your genealogy report, you will be able to click through Airbnb homes and experiences in your ancestral homelands. According to 23andMe, users have at least five (out of eight) different origins within their report, which makes for ample options on your travel itinerary. Compared to sifting through dusty records and studying family trees, it’s a unique and easy way to learn about your heritage, not to mention an offbeat way to plan a vacation inspired by your genetic code.

According to press releases from Airbnb and 23andMe, their aim is to provide “an exciting opportunity for customers to connect with their heritage through deeply personal cultural and travel experiences.” Although some see it as a unique escapade, others are uneasy about the partnership. Using DNA to capitalise on emotional experiences can feel demeaning and diminish the significance heritage trips could actually hold, not to mention the questions it raises about the legitimacy of your genetic report and the authenticity of these travel options.

As we become increasingly wary of how our data is used, the thought of our genetic data being used for ulterior motives comes to mind. It faintly echoes U.S. President Roosevelt’s 1942 order for Japanese Americans to register their identity, and begs the question: could this be a new form of racial registration, just with a different face? The ethical implications of having your cultural and racial identity monetised should also be questioned. If it is used as a cultural or racial registry, this affects not only the people who have taken the test but also their relatives—as demonstrated by the capture of the Golden State Killer, whom police identified through the genetic profile of a relative uploaded to a consumer DNA database. Although both parties have stated they won’t be sharing personal information with each other, being sceptical seems like a safe option.

Or maybe we should just take this collaboration at face value. As one Reddit user asks in the 23andMe thread discussing the topic, “Why are you complaining?” There are also some positive implications that could come from this new way of travelling. People who are adopted, or those from communities disconnected from their ancestral homes, such as the African American community, could learn more about their geographic history. For those who have no idea where they came from, this may be a stepping stone on a personal journey, but then again, how personal can you get with a pre-packaged holiday?

If both companies communicated this collaboration simply as a way to explore your roots in person, it might not seem so contrived. Instead, Airbnb and 23andMe are trying too hard to pull at your heartstrings rather than telling it like it is: a fun new way to pick your next travel destination.

Your approach to this new way of travelling now depends on how you perceive society. Some may think that, as we have already given up so much of ourselves to digital data, what’s one more thing? Others may see it as yet another way for companies to use and sell our data. So where exactly is the limit? Like many other questions, there’s no right answer just yet; we’ll find out in the future. In the meantime, hopefully you’ll give your long-lost cousin a good Airbnb review because, you know, ‘family’.