Freedom and Justice in our Technological Predicament

This is the thesis I wrote for my Master of Arts in philosophy.
It can also be downloaded as a PDF.

Introduction

As the director of an NGO advocating for digital rights, I am acutely aware of the digital trail we leave behind in our daily lives. But I too am occasionally surprised when I am confronted with concrete examples of this trail. For example, when I logged into my Vodafone account (my mobile telephony provider), I found—buried deep down in the privacy settings—a selected option that said: “Ik stel mijn geanonimiseerde netwerkgegevens beschikbaar voor analyse” (“I make my anonymized network data available for analysis”).1 I turned the option off and contacted Vodafone to ask them what was meant by anonymized network data analysis. They cordially hosted me at their Amsterdam offices and showed me how my movement behaviour was turned into a product by one of their joint ventures, Mezuro:

Smartphones communicate continuously with broadcasting masts in the vicinity. The billions of items of data provided by these interactions are anonymized and aggregated by the mobile network operator in its own IT environment and made available to Mezuro for processing and analysis. The result is information about mobility patterns of people, as a snapshot or trend analysis, in the form of a report or in an information system.2

TNO had certified this process, confirming that privacy was assured: Mezuro has no access to the mobility information of individual people. From their website: “While [knowledge] of mobility patterns is of great social value, as far as we’re concerned it is certainly not more valuable than protecting the privacy of the individual.”3

Intuitively, something about Vodafone’s behavior felt wrong to me, but I found it hard to articulate why what Vodafone was doing was problematic. This thesis is an attempt to find reasons and arguments that explain my growing sense of discomfort. It will show that Vodafone’s behavior is symptomatic of our current relationship with technology: it operates at a tremendous scale, it reuses data to turn it into new products, and it feels empowered to do this without first checking with its customers.

The main research question of this thesis is how the most salient aspects of our technological predicament affect both justice as fairness and freedom as non-domination.

The research consists of three parts. In the first part I will look at the current situation to understand what is going on. By taking a closer look at the emerging logic of our digitizing society I will show how mediation, accumulation and centralization shape our technological predicament. This predicament turns out to be one where technology companies have a domineering scale, where they employ a form of data-driven appropriation and where our relationship with the technology is asymmetrical and puts us at the receiving end of arbitrary control. A set of four case studies based on Google’s products and services deepens and concretizes this understanding of our technological predicament.

In the second part of the thesis I will use the normative frameworks of John Rawls’s justice as fairness and Philip Pettit’s freedom as non-domination to problematize this technological predicament. I will show how data-driven appropriation leads to injustice through a lack of equality, the abuse of the commons, and a mistaken utilitarian ethics. And I will show how the domineering scale and our asymmetrical relationship to the technology sector lead to unfreedom through our increased vulnerability to manipulation, through our dependence on philanthropy, and through the arbitrary control that technology companies exert on us.

In the third and final part I will take a short and speculative look at what should be done to get us out of this technological predicament. Is it possible to reduce the scale at which technology operates? Can we reinvigorate the commons? And how should we build equality into our technology relationships?
