I’ve always had a weak spot for philosophy. One curious aspect of philosophers is their relationship with language. It is often and fiercely disputed whether a thought can exist without first being put into language. For instance, the philosopher Ludwig Wittgenstein is famous for claiming in his “Tractatus Logico-Philosophicus” from 1922: “The limits of my language mean the limits of my world.” [1]

Indeed, consider the expression “Third World”: no one had ever used it before a certain moment in time. A single person created the concept, put it into language and thereby into our minds. In this specific case, the French demographer Alfred Sauvy “created” the Third World in an article published in 1952.[2] With the term came more interest, and it is hard to judge now whether the idea for the phrase came first or the actual part of the world we refer to when we use it.

In the digital age, humanity has started to use new forms of language to shape the world, often referred to as “code”. Famously, the American scholar Lawrence Lessig has argued that such “code is law”.[3] In other words, code is a language that shapes our world just like the languages humans have used for ages. The limited number of people who can “talk” to machines and control them are so powerful that they also determine the rules for the users of their services and for society. Some have started to call this phenomenon the “rule of algorithms” or “algocracy”.[4]

This technology is also increasingly used to collect, store and process personal data relating to human beings. It is questionable if and when it will be possible to represent all human qualities in the virtual world. Personally, I prefer a position which does not distinguish between an “actual” and a “virtual” world, but suggests that any form of technology can be interpreted as an enhancement or expression of one or more human features.[5] Put in one simple equation, that means: humans shape technologies that shape humans that shape technologies, and so on.[6]

There are two dimensions for the analysis, or “contextualization”, of personal data: the future and the past.[7] Whoever uses a smartphone application to “track” a run can find out how fast he or she was yesterday evening and how that compares to the days before. Whoever tracks sleeping patterns might find out that the average sleeping time over the past week was 8 hours. Whoever tracks what kind of food he or she eats over a year might be able to predict how it influences weight, health, mood, even relationships and social status. After all, aren’t you what you eat and with whom?
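
To make the “past” dimension a little more tangible, consider a minimal sketch in Python. All values and names are invented for illustration; they are not taken from any real tracking application.

```python
# A minimal sketch of the "past" dimension of contextualization:
# computing a weekly average from hypothetical sleep-tracking data.
sleep_log_hours = [7.5, 8.0, 8.5, 7.0, 8.0, 9.0, 8.0]  # one entry per night, last 7 days

average_sleep = sum(sleep_log_hours) / len(sleep_log_hours)
change = sleep_log_hours[-1] - sleep_log_hours[0]  # last night compared to a week ago

print(f"Average sleep last week: {average_sleep:.1f} hours")
print(f"Change over the week: {change:+.1f} hours")
```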

We have started to live in an age where unprecedented volumes of personal data are available. With this phenomenon, new questions arise. As a society, we would like to use this potential for good causes, and we keep discovering tools which might be enormously beneficial for all of us. However, when a technology has great potential to do good, it also has dangerous aspects which must not be ignored. So how can we use these large amounts of personal data safely and sustainably?

We can use the data we have created and stored to analyze our past behavior. More interestingly, we can also project our future success and development. However, this requires more thought and analytical skill.

If someone slept 8 hours each night in the last week, does that mean the same person will do so in the next one as well? Maybe we need to compare the data with similar data from a number of people who also tracked their behavior? Depending on how skillful we are, we can make predictions which will become reality with high probability. However, this activity can have serious implications for the people concerned.
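
What such a comparison could look like in its most naive form is sketched below; the figures and the “peers” are hypothetical, and any serious predictive model would be far more involved.

```python
# A deliberately naive projection: forecast next week's average sleep
# from one's own history and from the most similar tracked "peer".
# All data are hypothetical.
my_weekly_averages = [7.8, 8.1, 8.0, 7.9]          # past four weeks, in hours
peers = {
    "peer_a": [8.0, 8.2, 8.1, 8.0],
    "peer_b": [6.5, 6.8, 6.6, 6.7],
}

def mean(values):
    return sum(values) / len(values)

# Pick the peer whose history is closest to mine (a crude nearest neighbour).
closest = min(peers, key=lambda p: abs(mean(peers[p]) - mean(my_weekly_averages)))

# Blend my own average with the closest peer's as a naive forecast.
forecast = (mean(my_weekly_averages) + mean(peers[closest])) / 2
print(f"Projected average for next week: {forecast:.1f} hours (based on {closest})")
```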

For instance, when software is used to predict whether a person who has committed a crime will do so again in the future, it is necessary to know how the software comes to this conclusion. Angwin and others published a news article on 23 May 2016 which suggested that an analytical tool in use in several states of the USA might be biased against people with black skin.[8] Based on their research they submitted, although this was contested by the software’s for-profit developer, that the calculated risk of committing another crime was systematically higher for people of certain ethnic backgrounds. In other words, it seemed as if the algorithm the company had developed to calculate a risk rate focused on the wrong aspects. It was stated that “[…] [t]he formula was particularly likely to falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate as white defendants. White defendants were mislabeled as low risk more often than black defendants. […].”[9]
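
The kind of disparity the article describes can be illustrated with a small, openly invented example: comparing how often people who did not reoffend were nevertheless flagged as high risk in each group. The records below are made up; the actual analysis rested on thousands of real cases and a specific commercial tool.

```python
# Hypothetical records: (group, flagged_high_risk, actually_reoffended)
records = [
    ("A", True,  False), ("A", True,  False), ("A", True,  True), ("A", False, False),
    ("B", True,  False), ("B", False, False), ("B", False, True), ("B", False, False),
]

def false_positive_rate(group):
    # Among people who did NOT reoffend, how many were flagged high risk?
    non_reoffenders = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in non_reoffenders if r[1]]
    return len(flagged) / len(non_reoffenders)

for g in ("A", "B"):
    print(f"Group {g}: false positive rate = {false_positive_rate(g):.0%}")
# With these invented numbers, group A's false positive rate is twice group B's,
# which is the pattern described in the quotation above.
```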

The people concerned probably suffered serious consequences from these inflated risk rates, because judges were supposed to use the tools to make decisions on probation conditions. Hence, if a wrong presumption underlies the software, there is a serious problem for the person affected. This might even lead to a significant deterioration of the potential to develop one’s personality and ultimately deprive people of the better life they should be able to live. In other words: if there is a prejudice at the basis of an algorithm, it will become the common and tolerated view of all people who base their decisions on this tool.[10]

For example, let us assume that of all crimes committed in the United States in a given year, 40 percent are carried out by people with black skin, 35 percent by people with white skin and the rest by people with mixed skin color. In such a situation it is a fact that the largest share of crimes was committed by people with dark skin. But does that mean that an individual is more likely to commit a crime because his or her skin happens to be dark? Certainly not. The share of crimes attributed to a group says nothing about an individual without also considering the size of that group and the overall crime rate, and even then a correlation does not establish any causal link to skin color.
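
A short worked calculation makes the missing step explicit. All numbers below, including the population shares and the overall offence rate, are invented assumptions added purely for illustration.

```python
# Base-rate sketch: the share of offences attributed to a group does not,
# by itself, yield the probability that an individual member offends.
population = 1_000_000
offence_rate = 0.01                      # assume 1% of all people offend in a year
offenders = population * offence_rate

groups = {
    "group_x": {"population_share": 0.15, "share_of_offences": 0.40},
    "group_y": {"population_share": 0.60, "share_of_offences": 0.35},
}

for name, g in groups.items():
    members = population * g["population_share"]
    offenders_in_group = offenders * g["share_of_offences"]
    probability = offenders_in_group / members   # P(offender | group membership)
    print(f"{name}: P(offender) = {probability:.2%}")
# Even under these assumptions the probability is small for every group,
# and nothing in the arithmetic establishes skin color as a cause.
```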

More often than not, striving for a better society means turning an unforeseen event into reality. That is hard to achieve, but humans can do it if they have good intentions and sufficient space for personal development.

It is interesting to know if and how one can foresee (or “project”) future behavior with a certain probability. Even more interestingly, we could use these methods to allocate resources based on the likelihood of developments. This may sound a bit complicated, but it can be very rewarding to master this discipline, not least in the monetary sense.[11]

Let us assume a young person has just obtained a driving license. Is there anything greater than getting your hands on your first car and heading out on the open road? In the digital age you might think twice about this, since your insurance company might have a particular interest in installing a monitoring mechanism (“black-box insurance”).[12] By this, the insurance company makes sure that young drivers do not accelerate or brake too abruptly, stick to speed limits and potentially even avoid certain times and areas for driving. From the insurer’s perspective this is sensible, because it makes it easier to ensure that young people drive as they should.
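
How such a black box might translate driving behavior into a score can only be guessed at, since real insurers use proprietary models. The following toy sketch, with entirely hypothetical event types, weights and thresholds, merely illustrates the principle.

```python
# Toy telematics scoring: weight hypothetical driving events into a risk score.
trip_events = {
    "harsh_braking": 3,           # counts recorded by the black box
    "harsh_acceleration": 2,
    "speeding": 1,
    "night_driving_hours": 4,
}

weights = {
    "harsh_braking": 5.0,
    "harsh_acceleration": 4.0,
    "speeding": 8.0,
    "night_driving_hours": 1.5,
}

risk_score = sum(trip_events[event] * weights[event] for event in trip_events)
print(f"Risk score: {risk_score}")

# An insurer might map the score to a premium adjustment.
if risk_score > 30:
    print("Premium surcharge likely under this hypothetical scheme.")
```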

In summary, one can state that with ever increasing amounts of personal data and continuously improving skills to analyze them,[13] it will become easier to foresee people’s behavior and avoid certain risks. We will be able to put the right number of buses and trains at the right time to the right stations. We will have enough police and security personnel at critical times in the right parts of the city, so that citizens do not feel unsafe and insecure. But what about the “side-effects” of such a world, where everything is recorded, monitored and analyzed? On the one hand, we, as a society, will be much more in control. On the other hand, and sadly, we, as individuals, will be much more controlled.

As a consequence of this increased level of control, it can be predicted that individuals in this new world of large amounts of personal data and new analytics (“big data”) will be much more vulnerable: “Shorn of the cloak of privacy that protects him, an individual becomes transparent and therefore manipulable. A manipulable individual is at the mercy of those who control the information held about him, and his freedom, which is often relative at best, shrinks in direct proportion to the extent of the nature of the options and alternatives which are left open to him by those who control the information.”[14]

It is one thing to be likely to take a certain bus on a Monday morning at a certain time from the place one lives to the place one works. It is another to be (a little more than) expected to do so.

However, as the monitoring of each of our behaviors becomes ever more continuous and coherent, we risk no longer being able to take decisions, and instead having to show at all times that we comply with a projected reality.[15] That certainly leads to a more controlled and safe environment, but it will just as certainly have devastating effects on individuality, freedom and personal development.

Hence, we have to be very careful when we use analytical tools to make decisions that will most certainly have heavy implications for a person’s future development. This is a question of human rights, since the rights to privacy and personal development are very closely linked to a life in dignity.[16]

It is useful to employ analytical tools to project somebody’s behavior. But when we base decisions with actual and grave consequences for humans on technology, we have to be aware of the presumptions that were made when the analytical tools in use were developed.[17]

As already mentioned at the start of this piece: if we do not watch closely how we allow language to shape our world, we might end up in a place we never wanted to live in. But it might be the world that was most likely to exist at the point in time when we gathered the data and designed the algorithms and tools that help us make decisions.

While technology certainly does not have the capability to predict our future in full detail, humans relying on this technology do have the power to shape the world in a way that fits the prediction. The only question is whether such a society is worth striving for. It certainly is not if the algorithms on which a projection is based do not take into account basic human capabilities, such as the potential to make an unlikely event a reality.

While big data analysis could potentially be very helpful, it is not only the creators and subjects who shape these tools.[18] The tools will also shape the lives of the creators and subjects to whom they relate.

It is pointless to block or hinder the development of technology as such, but it is necessary to ask what we, as humans, consider to be development in the first place. Wouldn’t it be better to keep the purpose of big data analytics limited and clearly defined while trusting more in the naïve yet ingenious force of natural and unmonitored personal development?[19]

As comfortable as it might be to remain within the known limits of what one can express in code or algorithms, Wittgenstein concluded long ago: “Whereof one cannot speak, thereof one must be silent.”[20]

References

[1] Originally in German: “Die Grenzen meiner Sprache bedeuten die Grenzen meiner Welt.” Wittgenstein, Tractatus Logico-Philosophicus, London, Routledge & Kegan Paul Ltd, Eighth Impression 1960, 5.6.

[2] Sauvy, Trois Mondes, une planète, L’Observateur, 14 août 1952, n°118, p. 14.

[3] Lessig, Code: Version 2.0, full book available via URL — accessed on 05.07.2016.

[4] Crichton, Algocracy, TechCrunch, 23.05.2015, via URL — accessed on 03.10.2016.

[5] Similar to what Marshall McLuhan laid down in connection with the use of television and media as “extensions of man”; McLuhan, Understanding Media: The Extensions of Man, McGraw-Hill, 1964.

[6] Cf. Kucklick, Die granulare Gesellschaft, Ullstein, p. 77 – 84.

[7] Gstrein, Das Recht auf Vergessenwerden als Menschenrecht, Baden-Baden, Nomos, 2016, p. 36 – 60.

[8] Angwin et al., Machine Bias, ProPublica, 23.05.2016, via URL — accessed on 24.05.2016.

[9] Ibidem.

[10] Cf. a lawsuit of the American Civil Liberties Union which aims at making it possible to look more closely at the presumptions made when developing an algorithm: Brandom, New ACLU lawsuit takes on the internet’s most hated hacking law, via URL — accessed on 05.07.2016.

[11] Cf. e.g. OECD, OECD Digital Economy Outlook 2015, OECD Publishing, Paris, via URL — accessed on 30.05.2016, p. 245 – 247.

[12] Stott, Black box car insurance: a young driver’s new best friend behind the dashboard, The Guardian, 26.03.2016, via URL — accessed on 03.10.2016.

[13] Cf. Mayer, Mutchler, Mitchell, Evaluating the privacy properties of telephone metadata, PNAS, Vol. 113, No. 20, p. 5536 – 5541.

[14] Cannataci, Privacy and Data Protection Law: international development and Maltese perspectives, Norwegian University Press, 1987, p. 60.

[15] Cf. Gstrein, Privacy in Germany 2015, 01, p. 9 – 17, p. 11 ff.; Horne, Maly, The Inspection House, Toronto, Coach House Books, 2014; a study in the USA has come to the conclusion that half of all American adults are already in some sort of facial recognition network accessible to law enforcement: Collier, Study: 1 in 2 American Adults Already In Facial Recognition Network via URL — accessed on 20.10.2016.

[16] Cf. The right to personal self-determination as laid out by the German Constitutional Court, Urteil des Ersten Senats vom 15.12.1983, 1 BvR 209/83 u. a. — Volkszählung –, BVerfGE 65, 1, Mn. 152.

[17] Smith, Algorithmic transparency: Examining from within and without via URL — accessed on 03.10.2016.

[18] Mayer-Schönberger, Padova, Regime Change? Enabling Big Data through Europe’s New Data Protection Regulation, The Columbia Science & Technology Law Review, Vol. XVII, Spring 2016, p. 332.

[19] Or in the words of Antoinette Rouvroy: “Therefore what the foregoing considerations invite us to do is to relocate the gravitational centre of legal subjectivity and protect it by law, focusing not so much on people’s capacities of understanding and volition, or their control over their intentions, but on a certain inclination towards spontaneity and unpredictability, an open-minded approach to events and an ability to express ideas, however far-fetched.” Rouvroy, “Of Data and Men” — Fundamental Rights and Freedoms in a World of Big Data, Council of Europe, T-PD-BUR(2015)09REV via URL — accessed on 03.10.2016.

[20] Originally in German: “Wovon man nicht sprechen kann, darüber muss man schweigen.” Wittgenstein, Tractatus Logico-Philosophicus, London, Routledge & Kegan Paul Ltd, Eighth Impression 1960, 7.