Companies like Google and IBM are opening up services through APIs that let you do things like check whether an image contains adult or violent content, detect the mood of a face in a picture, or identify the language a piece of text is written in. Artificial Intelligence as a Service, as it were (or maybe Machine Learning as a Service would be more appropriate).
So imagine building your product on top of these services. What happens if they start asking you to pay? Or if they censor particular types of input? Or if they stop existing? Where are the open alternatives that you can host yourself?
For anyone who likes logical Lego, the availability of these plug-and-play services means that in many cases you don’t have to worry about the underlying technology, at least to get a simple demo running. Instead, the creativity lies in orchestrating the services: putting them together in interesting ways in order to do useful things with them…
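To make that concrete, here is a minimal sketch of what calling one of these services looks like, using Google’s Cloud Vision API as an example. The API key and the image file name are placeholders, and while the endpoint and response fields follow Google’s public documentation, treat the details as an assumption to verify rather than a definitive recipe:

```python
import base64

import requests  # third-party HTTP library: pip install requests

# Ask Google's Cloud Vision API whether an image contains
# adult or violent content. "YOUR_API_KEY" and "photo.jpg"
# are placeholders.
API_URL = "https://vision.googleapis.com/v1/images:annotate"

with open("photo.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("ascii")

payload = {
    "requests": [{
        "image": {"content": image_b64},
        "features": [{"type": "SAFE_SEARCH_DETECTION"}],
    }]
}

response = requests.post(API_URL, params={"key": "YOUR_API_KEY"}, json=payload)
annotation = response.json()["responses"][0]["safeSearchAnnotation"]

# The service answers with likelihood labels, not hard yes/no verdicts.
print("adult:   ", annotation["adult"])     # e.g. "VERY_UNLIKELY"
print("violence:", annotation["violence"])  # e.g. "POSSIBLE"
```

A dozen lines of glue code and you are outsourcing a judgement call to someone else’s model, which is exactly why the questions in the previous paragraph matter.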
For the past few years I have been saying that more and more of our lives will very soon be under the governance of the big Silicon Valley information giants. Living in Google’s jurisdiction as it were.
Morozov is much more articulate than me. He is now convinced that once this transformation is complete there won’t be a way back.
In fact, technology firms are rapidly becoming the default background condition in which our politics itself is conducted. Once Google and Facebook take over the management of essential services, Margaret Thatcher’s famous dictum that “there is no alternative” would no longer be a mere slogan but an accurate description of reality.

The worst is that today’s legitimation crisis could be our last. Any discussion of legitimacy presupposes not just the ability to sense injustice but also to imagine and implement a political alternative. Imagination would never be in short supply but the ability to implement things on a large scale is increasingly limited to technology giants. Once this transfer of power is complete, there won’t be a need to buy time any more – the democratic alternative will simply no longer be a feasible option.
ownCloud has really matured in the last couple of months, both from a technical perspective and as a community-based open source project that just kicks ass.
The free software replacement for Dropbox (and eventually Google Docs) is fast becoming one of the key pieces of infrastructure for a decentralised and open web. I love how absolutely hardcore pro-freedom the community of developers is.
It is great to read that they are now also taking very concrete steps towards making it easier to just have this running in your own home.
[The] goal of this project is to create a product home users can buy to easily and quickly get their ownCloud up and running, based on a Raspberry Pi (or something like that!) and a hard drive (or more). Something they will be able to buy from an online store, receive at home, plug in, configure in some easy steps and – done.
I’ve written before about the use of police bodycams, mainly looking at what the first-person perspective might mean for the way we will see (police) violence in the future.
The ACLU has written a blog post about a video that clearly shows how much room the police have to manipulate their own footage. In this particular case, the third-person viewpoint of a public surveillance camera made that manipulation plain.
Last year I wrote about “acting and directing with police body cameras” — how police officers are likely to increasingly learn to manipulate the photographic record that their cameras create. A stark case study in that kind of manipulation can be found in video of a 2014 arrest in Florida that was released in January and recently came to my attention. It’s the kind of video that everyone should watch in order to become sophisticated and properly skeptical consumers of video evidence.
Update (4 April 2016): The New York Times has put up an interactive site that lets you see the same footage of standard policing situations from different points of view. Do check it out. The main lesson? What we see in police footage tends to be shaped by what we already believe.
How the focus on security and the culture of fear have real negative effects and hurt our social integrity.
The term ‘doublethink’ comes from the book ‘1984’, of course. Big Brother’s ‘Ministry of Truth’ (which specializes in fabricating lies) uses slogans like ‘War is peace’ and ‘Freedom is slavery’.
There is another classic book in which the state creates paradoxical rules to keep its citizens in check. It is one of my favourites: ‘Catch-22’ by Joseph Heller.
Yossarian, the protagonist, is a captain in the American Air Force during World War Two. When his buddy Snowden (yes, you can’t make it up) dies during a mission, something breaks inside of him. He decides he needs to escape. He tells the doctor that the war is making him insane and that he wants to go home. The doctor tells him there is a rule which says that anybody who wants to go home because of the war can’t be insane. Yossarian has to stay because of rule 22, the infamous ‘Catch-22’.
One of the most interesting characters in the book is the profiteer Milo Minderbinder, responsible for the canteen at the army base.
Minderbinder runs a ‘syndicate’, M&M Enterprises, of which everybody (according to him) is a member. I can’t explain precisely how Milo buys fresh eggs for 1 cent in Sicily, sells them for 4-and-a-quarter cents in Malta, buys them back from there for 7 cents and sells them to the base for 5 cents, while still making a profit. Milo himself is clear about where the profit goes:
"Of course, I don’t make the profit, the syndicate makes the profit. And everybody has a share."
As soon as anybody questions his intentions, he literally hands them ‘a share’. Minderbinder sells anything and everything he can find on the base. For example, when their plane has to make an emergency landing on the water, the crew finds out that he has removed the CO2 canisters from the life jackets to make ice cream to sell in the canteen. He has replaced them with a note carrying the following text:
"What’s good for M & M Enterprises is good for the country."
The Minderbinder character is Heller’s razor-sharp critique of the military-industrial complex. "What was good for the country was good for General Motors, and vice versa", said the former CEO of General Motors in 1953 when he became the American Secretary of Defense.
Nowadays companies still use this type of ‘doublespeak’.
Commercial interests are then equated with public interests. We now call the outsourcing of public tasks and risks to the business world (in exchange for a profit, of course) ‘public-private partnerships’.
Proponents of this concept are often allowed to appear in the public eye as ‘independent’ technical experts, giving their opinion on safety and the Internet. To me that feels a bit like asking a locksmith whether she thinks the number of break-ins will increase, or giving the CEO of Durex a platform for his thoughts on the population explosion on the African continent.
The record holder in this rhetoric of (internet) security as a market is ‘The Hague Security Delta’, a group of private companies, governmental organizations and knowledge institutes with a shared goal. I quote: "more business activity, more jobs and a secure world". Let’s take a look at the way in which The Hague Security Delta recruits students for their campus…
It is of course important to be a frontrunner in the cybersecurity domain. However, this bombastic piece of ‘safety-porn’ has a very damaging side to it: it scares us.
At Bits of Freedom we often talk about the ‘chilling effect’: no longer daring to do certain things because you think you might be listened in on or watched. The current focus on ever more security has another negative effect: the false positive. We see dangers that don’t exist.
You’ve probably read about Ahmed Mohamed, the 14-year-old from Texas who was handcuffed and arrested after he had shown his home-made clock to his teacher at school.
Or about the 30 hipsters who had to answer to two police officers after a passer-by got a bit nervous at the sight of their black flag.
It isn’t only Muslims and men with beards who are the victims of our urge to profile.
This shoe is owned by Peter Schaap. The laser helps him walk with his Parkinson’s. Last month, he was sitting in the bus waiting for it to leave. The bus driver refused to get in. Before Peter knew what was happening, he was taken off the bus by two police officers. They had been called by one of his fellow passengers who, rather than asking him why he needed those special shoes, had simply dialed the emergency number.
Although we can probably laugh about this too, it is a very sad story as well. Apparently, deviant behaviour is immediately seen as suspect. It is symptomatic of what I’ll call a ‘Culture of Fear’. And these are only the examples that make the news. How often does this happen to people without us ever hearing about it?
That is why I was so disappointed when the head of the Dutch secret service, Rob Bertholee, told a room full of readers of ‘De Correspondent’ that he wants to flip around the standard question about the so-called balance between privacy and security. "How much security do you want to give up for privacy?", he asked. This shows that he doesn’t see how fear has a corrosive effect on how we relate to each other. The question that has to be asked instead (by him too) is: how much societal integrity do we want to give up for a one-sided and anxious focus on security?
The earlier examples of false positives show human failings. But more and more decisions about us will be taken by computer algorithms that profile us. On the basis of the data collected about us (where we live, what our ‘sentiment’ on social media is, what we have bought recently) the system puts us in pigeonholes.
Last summer, Google’s image recognition algorithm categorized Jacky Alciné’s black friend as a gorilla…
Not only does this say something about the lack of diversity in Google’s team, it also shows the current limitations of the technology. The exact same machine learning techniques (including their preprogrammed biases) will make a guess at whether you should be allowed to order at a web shop, whether you are eligible for a discount on your insurance premium, whether you are cheating on the mileage of your company car, and of course whether you are intending to travel to Syria. If you go looking for that one dangerous exception in massive amounts of data, you will mostly find false positives: the rarer the thing you are looking for, the more wrongly flagged people you get for every real hit. These wrongly profiled people are the victims of our craving for more (false) security and for bigger data.
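To see why rare targets guarantee mostly false positives, here is a minimal back-of-the-envelope sketch. All the numbers in it (population size, number of real threats, accuracy rates) are illustrative assumptions, not figures from any real system:

```python
# Base rate fallacy: screening a large population for a rare threat.
# All numbers below are illustrative assumptions.

population = 17_000_000      # roughly the population of the Netherlands
true_threats = 100           # assumed number of actual threats
sensitivity = 0.99           # P(flagged | threat): system catches 99%
false_positive_rate = 0.01   # P(flagged | innocent): flags 1% of innocents

true_positives = true_threats * sensitivity
false_positives = (population - true_threats) * false_positive_rate

precision = true_positives / (true_positives + false_positives)
print(f"People flagged:            {true_positives + false_positives:,.0f}")
print(f"Actual threats among them: {true_positives:,.0f}")
print(f"P(threat | flagged):       {precision:.4%}")
# ~170,000 people flagged, of whom only 99 are real threats:
# far fewer than 1 in 1,000 of those flagged are actual hits.
```

Even a system that is right 99% of the time would, under these assumptions, wrongly flag well over a hundred thousand innocent people for every hundred real threats it finds.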
We have to keep resisting being constantly reduced to our profiles. In the case of the digital civil rights movement we can truly say that everybody has a share. So let us keep fighting together for an internet on which human rights are truly meaningful and for a society in which we can truly be free.