I.
Onlife – this is how the philosopher Luciano Floridi, who teaches in Oxford, describes our present. He wants to express that we can no longer distinguish between online and offline. Of course, we can switch off the smartphone or even – as Hans Magnus Enzensberger has demanded, in a kind of Swing Riots attitude – dispose of it, but we cannot escape the online world; we remain in the onlife world: permanently monitored and networked. Two examples may bring this onlife existence into sharp focus:
Facebook has installed a tool which, on the basis of one's communication, posts and likes, estimates the probability that one has depressive tendencies or is even suicidal. Facebook can do this only because users have been "informed" about the procedure somewhere in the general terms and conditions – and we all know how well informed such consent really is. Of course, the system does not make a proper psychiatric diagnosis; it draws its conclusion from pattern comparisons with numerous other posts and likes. I call this a deep intervention, because it is always very irritating, not to say shattering, for people to be confronted with such a hint, presumably unexpectedly. Facebook then offers the user three options: 1.) Should we inform friends? 2.) Here are the phone numbers of hotlines that offer help! 3.) Here are the best quick hints on how to prevent suicide.
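To make concrete what "pattern comparison" means here, the following toy sketch may help. It is entirely my own illustration – the pattern list, weights, threshold and function names are invented for exposition and have nothing to do with Facebook's actual, far more complex system:

```python
# Hypothetical sketch of a pattern-based risk flag – NOT Facebook's real system.
# A post is scored against patterns that, in a real system, would be learned
# from many other posts and likes; above a threshold, the three options appear.

RISK_PATTERNS = {"hopeless": 0.4, "alone": 0.2, "goodbye": 0.5}  # toy weights
THRESHOLD = 0.6

def risk_score(post: str) -> float:
    """Sum the toy pattern weights found in the post, capped at 1.0."""
    words = post.lower().split()
    return min(1.0, sum(w for token, w in RISK_PATTERNS.items() if token in words))

def respond(post: str) -> list[str]:
    """If the score crosses the threshold, return the three offered options."""
    if risk_score(post) < THRESHOLD:
        return []
    return ["Inform friends?", "Hotline numbers", "Quick hints to prevent suicide"]
```

Even this caricature shows the ethical point: the output is a statistical flag, not a diagnosis, yet it triggers a deep intervention in a person's life.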
Is such a procedure morally and/or legally admissible? Are lives saved here – at least in acute situations? Or do people feel traumatized, stigmatized or, in the worst case, even "encouraged", in the sense of a self-fulfilling prophecy, to contemplate the terrible option of suicide through this scientifically questionable procedure? But what alternatives are there under the real conditions of the onlife world? Isn't the advice provided here "thanks to" algorithmic "help" cheap – probably not the best, but, after weighing advantages and disadvantages, not the worst of all conceivable possibilities?
Second example: Every week one can read – with, I confess, the greatest horror – how the social credit system to be introduced in China by 2020 involves partly abstruse behavioral regulations. Not only can one's own image be denounced on a large advertising screen, as is now widely known, if one has crossed at a red light at a street corner; even entitlements that seem as basic to us as freedom are to be regulated by standardized, fixed norms and credits for good behavior towards the party, the company, family members or society. Accordingly, misconduct leads to malus points, stigmatisation and exclusion. But who determines what is good and bad, right and wrong, socially desirable and undesirable? Are these categories congruent – or does the dynamism and evolution of life in society not result precisely from the fact that perceptions of the good, the right and the socially desirable are not congruent? And how can it be checked, questioned or challenged whether certain purely statistically determined, realistically unverified subsumptions are right and just?
This is how life in the onlife world can be, this is how it already is: permanently monitored, crushed into data points and then – not only, but above all by the large Internet companies, the so-called platform companies – reassembled by means of pattern recognition into behaviour forecasts used for advertising purposes.
II.
Against the background just sketched, I would like to point out that digitalisation carries the potential to constrict the real exercise of freedom and self-determination. In my view, it is most worrying that this perceptible process is taking place in small steps and that a point of no return might be reached in the not too distant future. To express this double concern, I will briefly outline three developments of digitalisation that can mutually reinforce each other and conjure up the feared tipping point: trends in the economy, in civil society and in the understanding of self-determination. In view of this overarching development, I no longer plead for optimism in dealing with Big Data and AI, but follow Terry Eagleton's formulation: "Hope without Optimism". I will therefore conclude with an outlook on why I believe we (still) have the means to defend freedom as well as the other constitutional principles of our civil society. But first, the three trends:
1.) In the field of economics, I see two major challenges that we are facing, or have already faced, as a result of digitalisation – whether we want to admit it or not: On the one hand, the future of work seems more uncertain than ever. The figures on how many jobs the so-called Fourth Labour Revolution will cost fluctuate considerably: from the initial dramatic forecasts presented by the two Oxford economists Frey and Osborne, who regard almost two-thirds of all occupations in developed countries such as the USA and Germany as endangered by digitalisation, to the much more moderate estimates in the White Paper "Work 4.0" of the last Federal Government in Germany or the current OECD forecasts. However, these forecasts share three assumptions: First, most job losses are to be expected in the white-collar milieu. They will affect not only truck drivers, engine drivers and office workers, but also bank and insurance clerks, administrative employees, engineers, sales managers, controllers, some physicians and lawyers, designers, stock exchange and real estate agents, and so on and so forth. Secondly, creative, productive and education-intensive occupations will be paid better, but thirdly, the job losses among well-educated people will not be compensated by comparable alternatives. In the end, however, we will probably not only see a net loss of jobs for well-trained people. Rather, the broad middle class, over which the verdict of the "levelled middle-class society" was once passed, partly critically, partly ironically, threatens to crumble if this trend is not counteracted. What is dramatic about this development is that it is precisely the lifestyle of this (still) broad milieu that has effectively and continuously shaped, and still shapes, the culture, motivation and reproduction of the interlinkage of democracy, the rule of law and civil society in many countries.
Andreas Reckwitz describes this dangerous drifting apart in his award-winning contemporary diagnosis "The Society of Singularities". What is remarkable and disturbing about his interpretation is that many people from the middle class share the increasing feeling that they no longer belong to the cultural and economic mainstream and are no longer sufficiently recognized in either sphere. This leads to a distance from the state, the media and any notion of the common good that transcends the respective milieus. With all this, the regulative idea of a single public sphere also threatens to be lost.
A second shift in the economic axis, probably even more drastic from a global perspective, must also be viewed with concern for the lively interplay between the rule of law, democracy and civil society: I am talking about the highly dynamic platform economy, which is increasingly determining the global economy and which we in the West still associate too one-sidedly with the so-called GAFA – Google, Amazon, Facebook and Apple – even though the two Chinese Internet giants Alibaba and Tencent have successfully established themselves in the global economy.
Their logic is this: because pattern recognition and prediction work better under the conditions of artificial intelligence the larger the data sets one can combine, big data collectors, the so-called platforms, have an evident advantage over smaller companies. In technological terms this is called the network effect; in business terms: the "winner takes it all" logic. Because this logic rewards sheer size, we are experiencing a monopolisation trend unprecedented in economic history, which the American journalist and economist Scott Galloway sums up succinctly with a view to GAFA: there are the four "horsemen" – in allusion to the Horsemen of the Apocalypse – who, like Google, claim divine attributes such as omniscience, like Facebook, steer our emotions, like Apple, determine our attractiveness economy, and, like Amazon, steer our consumption. According to him, the anthropological constants of religion, love, sex and consumption are shaped by these Internet giants, which have also attained a market power in their areas that can hardly be tamed any more by economic means – power they use to disadvantage or destroy competitors, and thus to suppress thrusts of innovation in the long run. This implosion of the innovation economy under financial power is likely to continue if one or two of these four American and two Chinese giants get into economic difficulties and are bought up by one of the others.
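The "winner takes it all" logic can be illustrated with a toy simulation – my own sketch, not Galloway's, under the simplifying assumption that each new user joins a platform with probability proportional to its current size (preferential attachment):

```python
# Toy model of the network effect: new users join platforms with probability
# proportional to current platform size, so early leaders pull away.
import random

def simulate(platform_sizes, new_users, seed=42):
    random.seed(seed)  # fixed seed for a reproducible illustration
    sizes = list(platform_sizes)
    for _ in range(new_users):
        r = random.uniform(0, sum(sizes))
        cum = 0
        for i, s in enumerate(sizes):
            cum += s
            if r <= cum:
                sizes[i] += 1  # this platform wins the new user
                break
    return sizes

final = simulate([100, 90, 10], 10_000)
# The two initially large platforms capture almost all of the growth,
# while the small competitor's share stays marginal.
```

The point of the sketch is structural, not predictive: size itself is the advantage, which is why the trend towards monopoly is built into the platform economy rather than being an accident of bad behaviour.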
With these brief references to the possible developments of the digital and world economy, I would like to draw your attention to exactly one point, which I frame as a question: How do we want to defend freedom and self-determination in a meaningful sense if freedom and self-determination come to be understood permanently, essentially and by the majority as the guided decision options of consumers, users and video gamers – if Chinese companies, with their state-capitalist background and their comprehensive surveillance practice, begin to dominate the global competition for the hearts, minds and hands of people? Anyone who sees Europe's role in the world economy today, trapped between the USA and China, must keep an eye on this line of development. In order not to despair, it should be remembered that Europe can take action against the threat of digital incapacitation: from the rulings of the higher courts to the General Data Protection Regulation, from the hopefully similarly vigorous wrangling over the ePrivacy Regulation to the EU Commission's competition monitoring activities.
2.) I come to the social tendencies of digitalisation: the debates raging everywhere about identity and integration, about enlightened or dull patriotism, about interpretations of migration and populism are indications that the social cement binding people together has diminished. It can hardly be denied that the social media, which are likewise controlled by AI technologies, have a catalytic effect on these centrifugal social processes. The usual "narrative", as it is called today, is that the social media trap us in filter bubbles and echo chambers that make an exchange of information beyond these bubbles impossible, that we thereby become more and more susceptible to fake news, and that the basic idea of a general public and of a generally shared understanding of truth is fading away.
But the situation is more complicated, not so one-directional: there is no such simple cause-effect relationship. It is not only social media that cause traditional authorities to lose their credibility, the general public to lose its attractiveness as the regulative idea of a plural civil society, and truth to lose its role as the corrective of opinion. If this were simply the case, everything would be quite easy: one would have to abolish social media in their usual form. Facebook, Twitter and Co. would have to be smashed, and the described dangers would be gone.
Not only do social and media studies show that filter bubbles do not exist in this stereotypical form. For example: the voters of the AfD – a right-wing party in Germany – do indeed perceive what they regard as the mainstream or so-called system media in the press, but they do not acknowledge them; their reports and comments are seen not as a challenge, but as confirmation of their own structure of judgement: "We have always known that the 'old parties' and the 'press of lies' confirm each other here." Beyond the simple idea of filter bubbles, however, the logic of the so-called social media amplifies the simplifiers and radicalizers. The rationale, especially of Facebook, Twitter and others, is not only – as in the old media – to distinguish between attention and non-attention; the currency of the social media is much stronger and, above all, more interventionist than in the old media: emotionalisation. The purpose of Facebook and comparable social media, with their still inscrutable algorithms, is to keep users on their platforms as long as possible in order to place personalised advertising there by means of microtargeting – in Germany, incidentally, a legitimate business model, at least up to the point at which manipulation becomes the rule.
There are two spheres of emotion that particularly bind attention and are therefore stirred up by the social media: on the one hand, emotions addressed through proximity – sympathy, empathy, compassion – which is why Facebook has increased the share of private communications since the beginning of the year; on the other hand, outrage – which is why the "anger citizen" (Wutbürger) and his particular communication habits are addressed in social media. When users bury themselves in the social media in this way, and with the added temptation of being able to stage themselves in idealized style (on Instagram), the already considerably diversified media structure is broken down further. Consequently, costly quality journalism, this pillar of civil society and democracy under the rule of law, comes under considerable permanent pressure as the number of those willing to pay for it falls.
The trend towards privatisation, simplification and polarization is inherent in social media because the logic of emotionalisation undermines basic prerequisites and decisive foundations with which we must try to shape plurality responsibly in a democratic civil society: the regulative idea of the public sphere and the professional quality media that foster it, as well as the idea of standards for the search for truth that are respected beyond the opinions of individuals and closed groups.
3.) This brings me to the last of the dynamics associated with Big Data and AI, which together can dry up the sources of a living ethos of human dignity and human rights. It concerns the enabling and shaping of one's own self-image – what some call autonomy and others self-determination. Big Data, AI and machine learning now achieve such an uncanny depth of intervention, efficiently and unnoticed, that there is reason to fear that the ability to determine oneself, however demanding, will diminish for many, if not be lost altogether.
Granted, people have always been influenced, even manipulated, by "higher" powers. But the comprehensive pattern recognition and prediction logic in the style of Silicon Valley or Chinese state capitalism forces users bit by bit into ever tighter corsets. At some point – so I fear – one notices, perhaps too late, that the room to breathe needed to develop in a self-determined, free way is lacking. It seems obvious that the development of the Chinese social credit system is accompanied by tendencies that, in my view, pose an extreme threat to freedom, even if in return it promises to guarantee safety, security and order.
"Such dynamics of intimate control will not spread to us after all", you might think. But Google magician Eric Schmidt said years ago: "We know where you are. We know where you've been. We can more or less know what you're thinking about", and: "If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place." This if-then conclusion must be explicitly contradicted from an ethical point of view. How boring would our visual arts, our music and, beyond that, our life courses be if everyone abandoned his or her dark, uncontrolled and wild sides in anticipatory obedience? It would be the end of creativity and innovation in the long run. If someone threatens us that he knows everything about us, and we have to assume that he knows more than we would like, then not even our thoughts are free anymore; then a line from a German folk song is no longer true: "no one can know them, no hunter can shoot them".
III.
So what can one do? I retain hope if we succeed in defending the self-determination of the individual under the conditions of digital transformation in a concerted action that includes all the forces of society – or, where it already seems lost, in reconquering it. To this end, I advocate making the right to informational self-determination behind traditional data protection weatherproof for the age of Big Data, AI and machine learning. This can happen – and here I would like to present the approach of the German Ethics Council, published at the end of November 2017 in the opinion "Big Data and Health", which I was allowed to co-develop. The paradigm shift presented there consists in switching from the traditional input orientation of data protection (consent, data minimisation and purpose limitation) to a more output-oriented approach to data processing. As the goal of such an approach, which integrates many dimensions and actors, the Ethics Council identifies data sovereignty, which it interprets as "the shaping of informational freedom" – a deliberate departure from the traditional nomenclature, chosen to signal the shift to output orientation terminologically as well. The multi-actor and multi-dimensional governance approach intended to secure data sovereignty must in turn be oriented towards ethical criteria, which the Ethics Council identifies as follows: 1.) use opportunities and potential – we would not conduct all the debates we do if Big Data (and AI) did not also bring recognizable advantages; 2.) protect individual freedom and privacy; 3.) secure justice and solidarity; and 4.) promote responsibility and trust.
Since, despite the 50 to 60 further detailed and differentiated recommendations, the question keeps coming up of how the whole thing can be implemented technically, I would like to answer it briefly: at the technical level, data sovereignty can be effectively established, protected and reconquered, for example, by means of data agents and data trustees. Data agents act as information-technology representatives of the data subject by automatically implementing her preferences for handling her data in the endless data stream; data trustees manage this process. In concrete terms: the data agent is installed at the data interfaces that digital companies normally use to process the data of the data subject – no technical witchcraft, but the normal way in which all data users obtain their data. The data agent then tracks the transmission and further processing of the information extracted from the data subject and notifies the data trustee if a use is made that the data subject dislikes. The data trustee "knows" the preferences of the data subject because the latter has entered them in an app and can change them there at any time. This control should not be imagined as if armies of employees were monitoring the data stream; the whole thing takes place mechanically. Possible objections on behalf of the data subject are likewise raised automatically, as are their possible first rejection and the then conceivable raising of a new objection. Only when the machines cannot "agree" is the conflict "reported" to humans, and people become involved if legal steps are to be considered. Everything before that can take place in milliseconds, imperceptible to humans and thus in quasi-real time, as is already the case in high-frequency trading in the financial economy.
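The division of labour between data agent and data trustee just described can be sketched in a few lines of code. This is only my illustration of the idea; all class names, method names and the preference vocabulary ("allow", "deny", escalation) are invented for exposition, not a standard or the Ethics Council's specification:

```python
# Minimal sketch of the data agent / data trustee model (my own illustration).

class DataTrustee:
    """Holds the data subject's preferences, entered and updated via an app."""
    def __init__(self, preferences):
        self.preferences = preferences  # e.g. {"advertising": "deny"}

    def evaluate(self, purpose):
        # Unknown purposes are not silently allowed; they go to a human.
        return self.preferences.get(purpose, "escalate-to-human")

class DataAgent:
    """Sits at the company's data interface and checks each processing event."""
    def __init__(self, trustee):
        self.trustee = trustee
        self.escalations = []  # conflicts "reported" to humans

    def on_processing(self, purpose):
        decision = self.trustee.evaluate(purpose)
        if decision == "escalate-to-human":
            self.escalations.append(purpose)
        return decision

trustee = DataTrustee({"research": "allow", "advertising": "deny"})
agent = DataAgent(trustee)
agent.on_processing("advertising")      # denied automatically, in quasi-real time
agent.on_processing("genetic-scoring")  # unknown purpose: escalated to humans
```

The design choice mirrors the text: routine decisions are resolved machine-to-machine in milliseconds, and only unresolved conflicts surface to people.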
With such a technical "solution", questions undoubtedly arise as to how data sovereignty can be implemented as controllability: Does such a model not lead to new injustices, because some can afford a highly competent and effective data trustee, while others have to content themselves with a mediocre provider that capitulates in the first round of inter-machine dispute? Such constellations are conceivable. Therefore, the market development has to be carefully observed when introducing this new business model and, if necessary, measures have to be taken to limit considerable and unjust asymmetries of power. One may also ask sceptically: Can politicians, for example, not use the model of securing data sovereignty through data agents and data trustees to carry out effective censorship by suppressing reports or comments on their activities? This objection should be taken seriously: of course, the data trustee model must adhere to the existing legal framework. Freedom of expression and freedom of the press must not be undermined, either theoretically or practically, by this model. Since, however, it is a truism of the social and political sciences that general claims such as transparency or participation, or formal legal claims such as freedom of expression or freedom of the press, cannot simply be implemented one-to-one in practice, careful observation is needed and, if necessary, countermeasures must be taken if this model de facto leads to an unintended restriction of freedom of expression. In short, the approach of securing data sovereignty through data agents and data trustees also requires legal and political design. But it can already be realized today with limited technical effort, and it would not burden the extensive flow of Big Data and AI with the de facto dysfunctional old data protection principles, yet would still redefine privacy, output-oriented and in quasi-real time, as sovereignty over and controllability of one's data.
Nevertheless, under the conditions of Big Data, AI and machine learning, data sovereignty as an expression of informational freedom – and thus in the flight line of human dignity – can only be guaranteed and protected if not only technical procedures, legal regulations and economic incentives are created for this purpose, but also if a culture is kept alive and promoted in which 1.) economic competition is maintained at all, 2.) the basic idea of a civil-society public beyond filter bubbles and echo chambers is appreciated and made possible, and 3.) the extraordinary, the deviant and the vulnerable are promoted as central moments of individuality and held in high social esteem, and we do not allow ourselves to be lulled by notions of normality imposed on us by large Internet platforms. Only with the necessary sensitivity to difference and a self-critical tolerance of ambiguity will we remain data-sovereign and free. Therefore, in order to survive well as individuals and as a society seeking plurality and social cohesion under the complex conditions of the age of Big Data, AI and machine learning, we not only have to teach skills such as programming or media literacy; we also need to promote general judgement more than ever, especially in order to foster what is called difference competence and ambiguity tolerance. In short: classical education. I recommend: the Bible, Faust, mathematics and one or two foreign languages – oh yes, and a friend called out to me: music, Peter, which appeals to cognition and emotion.
And I would not be a theologian if I were not deeply convinced that the religious culture of Christianity, of the Churches and of Christian theology could be an important inspiration for cultivating – constructively, critically and sustainably – the foundations of our coexistence beyond technology, law and economy under the conditions of Big Data, AI and machine learning. Finally, I would like to mention briefly three points in which I see public church, public theology, public Protestantism as well as public Catholicism on the agenda:
1.) Beyond the platform economy, Churches should remember that they themselves offer a unique platform not only to celebrate faith, but also to participate actively in the search for public reason and the public good: two thousand years in the unique combination of a global-universal message and local testimony, which is not limited to cognitive, emotional, financial or political tribalisms.
2.) Churches are one stakeholder among others in the shaping of public discourse in the onlife world. From this shaping tradition and shaping power, however, no entitlement to privilege arises, but at best a prerogative of responsibility. This can be taken up in the idea that, contrary to the tendency inherent in social media, in the (my understanding of the) Protestant tradition walls can be broken down and emotions dialed back, and, for example, other religious cultures may be supported which so far cannot draw on a comparable wealth of experience in dealing with a complex and diversified society.
3.) If coping with the onlife world is not only a matter of competences but above all of education, then Christian religious culture carries a treasure of resources for interpreting lifestyle and life course, which under these conditions must be spelled out anew and which in turn will also change the Churches. I will only mention a few:
- From the promise that man is ennobled as God's image, while God himself – to speak with Eberhard Jüngel – "may be recognized and witnessed as the mystery of the world", follows the encouragement to understand, analogously, the mystery of every human being as a limit to calculability, and to oppose all attempts to steer human communication solely under the condition of profit maximization driven by microtargeting.
- From the sober anthropology that man cannot finally complete his life out of himself – theologically called sin – arises a high sensitivity for the limitation, vulnerability and weakness of every human being (even when he or she celebrates him- or herself as hero or doer).
- From the promise that precisely this "crooked timber" is promised reconciliation and salvation from outside, the insight is motivated that freedom must be realized and defended in relationship.
- From the belief in God's faithfulness, which is greater than the unfaithfulness of man, the commitment to inclusion is strengthened – a commitment that does not exclude plurality but allows for it within the expanding limits of solidarity and justice, and which is thus inspired by the word Jeremiah addressed to the exile community in the foreign, pluralistic metropolis of Babylon: "Seek the best for the city!" (Jer 29:7)
Asia Times Staff. "Alibaba, Tencent among World's Top 10 Brands." Asia Times, 13.06.2019. https://www.asiatimes.com/2019/06/article/alibaba-tencent-among-worlds-top-10-brands/.
Barnett, Ian, and John Torous. "Ethics, Transparency, and Public Health at the Intersection of Innovation and Facebook's Suicide Prevention Efforts." Annals of Internal Medicine 170, no. 8 (2019): 565–66. https://doi.org/10.7326/M19-0366.
Boutin, Paul. "Your Results May Vary. Will the Information Superhighway Turn into a Cul-De-Sac Because of Automated Filters?" The Wall Street Journal, 20.05.2011. https://www.wsj.com/articles/SB10001424052748703421204576327414266287254.
Burden, David, and Maggi Savin-Baden. Virtual Humans. Today and Tomorrow. Boca Raton: Taylor & Francis, 2019.
Card, Catherine. "How Facebook AI Helps Suicide Prevention." Facebook Newsroom, 10.09.2018. https://newsroom.fb.com/news/2018/09/inside-feed-suicide-prevention-and-ai/.
Chase, Jefferson. "AfD: What You Need to Know About Germany's Far-Right Party." DW, 24.09.2017. https://www.dw.com/en/afd-what-you-need-to-know-about-germanys-far-right-party/a-37208199.
Deutscher Ethikrat. "Big Data und Gesundheit – Datensouveränität als informationelle Freiheitsgestaltung. Stellungnahme." 30.11.2017. Accessed 18.03.2018. http://www.ethikrat.org/dateien/pdf/stellungnahme-big-data-und-gesundheit.pdf.
DiGiovanna, James. "Artificial Identity." In Robot Ethics 2.0. From Autonomous Cars to Artificial Intelligence, edited by Patrick Lin, Ryan Jenkins and Keith Abney, 307–21. Oxford: Oxford University Press, 2017.
Eagleton, Terry. Hope without Optimism. Charlottesville: University of Virginia Press, 2015.
Esguerra, Richard. "Google CEO Eric Schmidt Dismisses the Importance of Privacy." EFF, 10.12.2009. https://www.eff.org/de/deeplinks/2009/12/google-ceo-eric-schmidt-dismisses-privacy.
European Commission. "Competition." European Commission, 2019. https://ec.europa.eu/competition/index_en.html.
———. "Proposal for an ePrivacy Regulation." European Commission, 19.07.2019. https://ec.europa.eu/digital-single-market/en/proposal-eprivacy-regulation.
European Parliament, and Council of the European Union. "Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC (General Data Protection Regulation)." EUR-Lex, 27.04.2016. https://eur-lex.europa.eu/legal-content/DE/TXT/?uri=uriserv:OJ.L_.2016.119.01.0001.01.ENG&toc=OJ:L:2016:119:TOC.
Facebook. "Suicide Prevention." Facebook Safety Center, 2019. https://www.facebook.com/safety/wellbeing/suicideprevention.
Federal Ministry of Labour and Social Affairs. White Paper Work 4.0. Re-Imagining Work. Berlin: Federal Ministry of Labour and Social Affairs, 2017. https://www.bmas.de/SharedDocs/Downloads/EN/PDF-Publikationen/a883-white-paper.pdf?__blob=publicationFile&v=3.
Fitzi, Gregor, Juergen Mackert, and Bryan S. Turner, eds. Populism and the Crisis of Democracy. Volume 1: Concepts and Theory. London: Routledge, 2018.
———, eds. Populism and the Crisis of Democracy. Volume 2: Politics, Social Movements and Extremism. London: Routledge, 2018.
———, eds. Populism and the Crisis of Democracy. Volume 3: Migration, Gender and Religion. London: Routledge, 2018.
Floridi, Luciano. The 4th Revolution. How the Infosphere Is Reshaping Human Reality. Oxford: Oxford University Press, 2014.
Frey, Carl Benedikt, and Michael A. Osborne. "The Future of Employment. How Susceptible Are Jobs to Computerisation?" 17.09.2013. https://www.oxfordmartin.ox.ac.uk/downloads/academic/The_Future_of_Employment.pdf.
Galloway, Scott. The Four. The Hidden DNA of Amazon, Apple, Facebook, and Google. New York: Portfolio, Penguin, 2017.
German Ethics Council. Big Data and Health – Data Sovereignty as the Shaping of Informational Freedom. Opinion. Executive Summary & Recommendations. Berlin: German Ethics Council, 30.11.2017, 2018. https://www.ethikrat.org/fileadmin/Publikationen/Stellungnahmen/englisch/opinion-big-data-and-health-summary.pdf.
Habermas, Jürgen. The Theory of Communicative Action. 2 Volumes. Translated by Thomas McCarthy. Boston: Beacon Press, 1984.
Jüngel, Eberhard. God as the Mystery of the World. On the Foundation of the Theology of the Crucified One in the Dispute between Theism and Atheism. Translated by R. David Nelson and Darrell L. Guder. London: Bloomsbury, 2014.
Kenney, Martin. "The Rise of the Platform Economy." Issues 32, no. 3 (2016). https://issues.org/the-rise-of-the-platform-economy/.
Klenner, Manfred. "What Does It Mean to Be a Wutbürger? A First Exploration." Zurich Open Repository and Archive, 05.12.2018. https://www.zora.uzh.ch/id/eprint/159171/1/paper4.pdf.
Kobie, Nicole. "The Complicated Truth About China's Social Credit System." Wired, 07.06.2019. https://www.wired.co.uk/article/china-social-credit-system-explained.
Kušen, Ema, Mark Strembeck, and Mauro Conti. "Emotional Valence Shifts and User Behavior on Twitter, Facebook, and Youtube." In Influence and Behavior Analysis in Social Networks and Social Media, edited by Mehmet Kaya and Reda Alhajj, 63–83. Cham: Springer, 2018.
Lawless, W. F., Ranjeev Mittu, Donald Sofge, and Stephen Russell, eds. Autonomy and Artificial Intelligence: A Threat or Saviour? Cham: Springer, 2017.
Lepore, Jill. "Does Journalism Have a Future?" The New Yorker, 28.01.2019. https://www.newyorker.com/magazine/2019/01/28/does-journalism-have-a-future.
Martin, David. "Germany's AfD to Open Its Own Newsroom in Preparation for Mainstream Media Offensive." DW, 09.02.2018. https://www.dw.com/en/germanys-afd-to-open-its-own-newsroom-in-preparation-for-mainstream-media-offensive/a-42517544.
Matsakis, Louise. "How the West Got China's Social Credit System Wrong." Wired, 29.07.2019. https://www.wired.com/story/china-social-credit-score-system/.
Muller, Denis. Journalism Ethics for the Digital Age. Brunswick: Scribe, 2014.
OECD. OECD Employment Outlook 2019: The Future of Work. Paris: OECD Publishing, 2019. https://doi.org/10.1787/9ee00155-en.
Pariser, Eli. The Filter Bubble. What the Internet Is Hiding from You. London: Viking, 2011.
Reckwitz, Andreas. Die Gesellschaft Der Singularitäten. Zum Strukturwandel Der Moderne. Berlin: Suhrkamp, 2017.
Rodgers, Shelly, and Esther Thorson, eds. Digital Advertising. Theory and Research. 3rd ed. New York, London: Routledge, 2017.
Thompson, Derek. "Google's CEO: 'The Laws Are Written by Lobbyists'." The Atlantic, 01.10.2010. https://www.theatlantic.com/technology/archive/2010/10/googles-ceo-the-laws-are-written-by-lobbyists/63908/.
Weisberg, Jacob. "Bubble Trouble. Is Web Personalization Turning Us into Solipsistic Twits?" Slate, 10.06.2011. https://slate.com/news-and-politics/2011/06/eli-pariser-s-the-filter-bubble-is-web-personalization-turning-us-into-solipsistic-twits.html.