Sherlock Holmes sent you a friend request

On digital user-tracking

Disclaimer: This blog post was written as an assignment for the DPI-662 ‘Digital Government’ class at the Harvard Kennedy School of Government.

1. On intentionality

Imagine that you live in the nineteenth century. In the streets of London or Vienna, young boys sell the latest newspapers. One follows you to your house and tells The Observer where you live and whether you prefer Pushkin or Shakespeare. With the help of an address book and theatre tickets, zealous journalists could probably identify who you are. So what? Unless the boy reveals your love affairs or spreads damaging misinformation about you, you probably would not care. But if you were to spot Sherlock Holmes’ trench coat at your doorstep, spying on your every move for what you think is an investigation to make the neighborhood safer, you would probably feel uncomfortable with someone analyzing your behaviour. The key difference between the boy and Sherlock Holmes is the intention behind the data collection and whether it can be used to infringe on either our intimacy or our freedom. We generate data as we breathe, but it is in data analysis and interpretation that privacy can be breached, and that requires a human intention to begin with. So to what extent should we accept privacy invasion, and what does it look like in the digital world?

2. The Facebook time bomb

Facebook collects data at several overlapping layers:

  • the data I generate by interacting with the Facebook platform and its related services: connections, messages, likes, time spent reading pages, and transactions on the platform (including my credit card number);
  • the data I generate by interacting with external platforms for which I use the Facebook log-in (such as Spotify);
  • the metadata associated with my Internet usage: dates and times at which I engage in certain activities, geolocation, devices and networks I use, cookies;
  • the data that I do not generate myself but that others generate about me: for example, when they comment on pictures I appear in;
  • the sale of services to external counterparts that leverage my personal data, especially through Facebook Ads, which allows brands to push targeted content to me based on my profile and cookies, and to analyze how I respond. These ads can also be political, and therefore based on how Facebook’s algorithms assess my political leanings. Other products like Facebook Pixel allow retailers to track my behavior on their own websites and associate it with the ads I was targeted by. In particular, developers of external platforms can receive my Facebook data (likes, comments on their pages). In essence, Facebook serves as a giant depository of personal identifiers to which platforms can connect to retarget their services.
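The general mechanism behind this kind of tracking pixel can be sketched in a few lines. Everything below (the endpoint, parameter names, cookie ID, and event labels) is a hypothetical illustration of how tracking pixels work in general, not Facebook’s actual API:

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Hypothetical endpoint: a retailer embeds a 1x1 image whose URL
# carries the on-site event and the visitor's ad-network cookie ID.
PIXEL_ENDPOINT = "https://tracker.example.com/px.gif"

def build_pixel_url(cookie_id: str, event: str, page: str) -> str:
    """Build the URL a browser would request when loading the pixel."""
    params = {"cid": cookie_id, "ev": event, "url": page}
    return f"{PIXEL_ENDPOINT}?{urlencode(params)}"

def parse_pixel_hit(url: str) -> dict:
    """What the tracker's server recovers from a single pixel request."""
    qs = parse_qs(urlparse(url).query)
    return {k: v[0] for k, v in qs.items()}

url = build_pixel_url("abc123", "AddToCart", "https://shop.example.com/item/42")
hit = parse_pixel_hit(url)
# The tracker can now join `cid` against its ad-impression logs,
# linking on-site behavior to ads previously shown to the same cookie.
print(hit["cid"], hit["ev"])
```

Because the same cookie ID appears both in the ad-impression logs and in these pixel hits, the two streams can be joined after the fact, which is what makes cross-site retargeting possible.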

3. So what: discomfort exists, but it is not all black and white

To summarize, Facebook sells my personal data; Apple does not. In both cases, it is a digital contract I have with these firms, one which, under the GDPR’s philosophy of informed consent, I am supposed to be aware of. My discomfort about the loss of privacy stems from:

  • The loss of intimacy — if data is collected all the time, then you no longer have the “right to be let alone” that has been linked to privacy since the aftermath of the Industrial Revolution (see Warren & Brandeis, 1890);
  • The loss of control — if all your data is stored indefinitely (because storage is so cheap), great uncertainty exists about how your data will be used or transferred. The OECD recommends that only necessary data be collected;
  • The loss of autonomy — certain content (political ads, for example) can significantly alter your behaviours and opinions, and you have little control over the algorithmic regulation that decides which content is pushed to you;
  • The loss of secrecy — a machine will not uncover secrets, but a human with intent can, as a Sherlock Holmes could with my Facebook location history;
  • The loss of integrity — as highlighted in section 2, my biggest fear is that a motivated Sherlock Holmes conflates the analysis of my personality traits (targeting, predictive behavior) with my identity. All in all, it is the association of private data analysis with my identity that truly violates my privacy.

4. What can we do about it?

Three actions can be outlined to offset negative feelings about user tracking:

  • Countering the fear of misuse by motivated agents requires strengthening security and partitioning user databases, so as to separate identifying data from behavioral data.
  • Greater transparency should be given to users about how often their data is used, for example how many ads have targeted them.
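The first point, separating identifying data from behavioral data, can be sketched with a keyed pseudonym. The key, table layouts, and helper below are illustrative assumptions, not any platform’s real schema:

```python
import hashlib
import hmac

# Hypothetical secret, held only by the service that guards the identity store.
PSEUDONYM_KEY = b"server-side-secret"

def pseudonymize(user_id: str) -> str:
    """Derive a stable pseudonym so behavioral records never carry the raw ID."""
    return hmac.new(PSEUDONYM_KEY, user_id.encode(), hashlib.sha256).hexdigest()[:16]

# Identity table: locked down, rarely accessed.
identities = {"u-1001": {"name": "Ada Lovelace", "email": "ada@example.com"}}

# Behavioral table: stores only the pseudonym, never the raw identifier.
events = [{"user": pseudonymize("u-1001"), "event": "liked_page", "page": "chess"}]

# An analyst sees behavior without identity; re-identifying a user requires
# both the secret key and access to the separate identity table.
assert events[0]["user"] != "u-1001"
```

The design choice here is that a breach of the behavioral database alone reveals habits but not names, which is exactly the partition the bullet above calls for.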