Facebook-owned WhatsApp presented users with a simple choice this January: accept the messaging app’s new privacy policy or go and talk to friends somewhere else. To the company’s surprise, users fearing the abuse of their personal data fled the platform so quickly that rivals Signal and Telegram together scooped up over 10 million of them within a single week, according to data from Sensor Tower.
The worry over the changes was ignited by an apparent misunderstanding: users thought WhatsApp could now see the content of their calls and messages and freely share this information with parent company Facebook. The former is impossible given the messaging service’s end-to-end encryption, while the latter has already been the case since 2016 for certain metadata, including our phone number and the make of our device. Still, the miscommunication left substantial chaos in its wake, and WhatsApp was eventually forced to delay the policy change by three months.
But what really changed?
While discussing the botched privacy policy overhaul with Gizmodo’s Shoshana Wodinsky, it occurred to me that WhatsApp achieved something remarkable: it changed nothing and quite a lot at the same time. The messaging app will continue to send metadata to Facebook, so nothing to see here. But WhatsApp is no longer just for exchanging opinions on the latest memes: the service has expanded into mobile payments, online shopping and customer service, with a focus on fast-growing markets such as Brazil and India. In theory, our messages with businesses are encrypted in the same way as the ones we send to fellow human beings: Facebook will have no clue about our symptoms when we order medicine online.
Unfortunately, this is not the full picture.
As soon as the business we communicate with outsources the management of the servers behind its Business API – the software that gives businesses access to WhatsApp – to a third party called a Business Solution Provider (BSP), end-to-end encryption no longer holds. Facebook offers no encryption guarantee once our data is in the hands of these firms, not even a guarantee that the data will not be used for ad targeting on Facebook itself; and in the future the same will apply when it is Facebook, rather than a third party, that hosts the Business API.
Browsing the list of BSPs is a sure-fire way to stumble upon some hidden gems. My personal favourite is Indonesia-based PT Digital Artha Media, whose website consists of literally a single page, while searching for its address on Google Maps yields a picture of a pedestrian overpass in Jakarta (the actual HQ building is next to the overpass). Sinch, which is active in Hungary, is another interesting beast, touting returns of up to a whopping 275,000 per cent on its clients’ marketing spend.
Social data externality
Back to WhatsApp: some have noticed that any potential new data-sharing with Facebook has no impact on customers in the European Union – as Dutch MEP Paul Tang noted, that is exactly why we have data protection. But as Professor Tommaso Valletti of Imperial College London highlighted, the EU’s GDPR privacy armoury is insufficient here: Facebook does not need to obtain our data directly via WhatsApp, as it can also build statistical profiles based on users outside the EU and apply them to the Facebook data the social media giant already holds on us. In this way, the company can predict a user’s behaviour using WhatsApp data even if the user in question never shared anything via WhatsApp.
This is not only a theoretical worry, as the very same problem was one of the sticking points in Google’s acquisition of wearables-maker Fitbit.
Researchers at Yale University and MIT call this issue the social data externality. In a nutshell, it means that people similar to us (in terms of home address, interests or socio-economic status) can reveal quite a lot about who we are even if we never actually meet. This renders users of services such as Facebook completely helpless, as digital platforms can collect whatever piece of data they feel like. The externality makes the social cost of data-sharing larger than its private cost, but since the social cost does not factor into our decision-making, we are prone to share more information than is socially optimal.
Asymmetric, unfair and opaque
Even more important than oversharing is the fact that the transaction between user and platform rests on an economic relationship that is grossly asymmetric, unfair and opaque. The vulnerable user faces off against a platform with gigantic power: as economists Jonathan Haskel and Stian Westlake explain in their book Capitalism Without Capital, digital platforms can easily consolidate their market power thanks to scalability (the marginal cost of serving a new user is practically zero) and network effects (the value of the platform for any given user rises with each additional user who joins). Such industries can become what is commonly referred to as winner-take-all: the 92% market share of Google’s search engine is merely one of many examples.
This power imbalance can easily tip the market in favour of a few dominant platforms, who in turn can do basically whatever they want.
As Dina Srinivasan – whose research underpins last year’s US antitrust lawsuit against Facebook – explains, it was the elimination of its rivals that allowed the social media giant to systematically deceive both its advertisers (inflating the effectiveness of its ads) and users (with false claims on privacy). Absent competition, neither side of the platform could vote with their feet and move away from it.
Opaqueness is a more pernicious issue still. Researchers at Carnegie Mellon University and the University of Arizona show that, even though we care about privacy, we are prone to sharing too much information when the risks of doing so are hidden from us. And boy, are they hidden: according to research by the Financial Times, our health data from a few seemingly harmless health-related websites could pass through over a hundred webpages and end up in the hands of the leading digital platforms – even without explicit user consent, a practice prohibited by GDPR.
This is a classic case of asymmetric information: we have no idea who is going to use our data and for what purpose.
We have no choice either, as the ubiquity of the business model referred to as surveillance capitalism by Professor Shoshana Zuboff of Harvard University often means we have no place to hide.
The crisis moment of online advertising
We cannot, therefore, treat our personal data like any other good that we can freely put up for sale on the market: invisible surveillance and the social data externality mean we have no control over what data to share and when. We need to be mindful of the potential harms that this lack of control over our personal data might entail.
It is no coincidence if the reader is confused by now. The opaqueness of the ad-tech world resembles the pre-crisis financial sector, when neither banks nor regulators understood their institutions’ exposure to risky financial instruments. The online ad industry may just be reaching the point that 2008 was for exotic financial assets: the chaos that followed WhatsApp’s privacy change showed not only that we do care about privacy but also that we understand it very little. I can only hope that this is where the similarities end, and that regulation will precede the ad-tech market’s meltdown.
Dávid Pákozdi, competition economist; teaching assistant at Queen Mary, University of London
(This article was originally published in Hungarian.)