
Part I: The objectivity fallacy and the neutrality fallacy
Like all looming yet unknown developments, the “digital age,” ushered in by the rise of information and communication technologies as well as momentous advances in computational power, inspires both utopian hopes and dystopian fears. Tech-pessimists paint apocalyptic scenarios of the dependencies, alienations, and uncontainable dynamics associated with technological determinism, while tech-optimists herald the salvation of humankind, which they see dawning in technological progress. Writers on different sides often invoke “omniscience” or attribute God-like qualities when referring to data-driven technologies.1 Usually such invocations are rhetorical: dramatizing hyperboles that critique frightening powers in need of containment.
If we leave religious forms of “dataism” aside (which do exist, but that is a topic for a different day), we will presumably neither regard Big Data-enabled “knowledge” as divine nor assume that God operates like a super-computer.2 However, it is not far-fetched to postulate that the digital age is an age of superhuman knowledge. While the relationship between data, information, and knowledge is a tricky and contentious one, machine-learning-empowered “big data” analytics allows for both “more” and a different kind of knowledge than could ever be accumulated (or understood) by human agents, whether individual or collective.3 In this sense, such “knowledge” may be seen as superhuman – “beyond the human,” even if not divine by any means – and even more so if we consider the powerful ways in which the application of such knowledge might augment and threaten human agency and amplify, limit, or transform what we conceive of as human freedom.
It is striking how many of the questions raised around data-based surveillance seem to be variations on themes that Christian theologians have wrestled with for centuries. Today we may ask, To what extent does data-based targeted advertising manipulate our purchasing behavior, desires, even our political choices? Calvin used to ask, How does divine providence guide and steer our actions and fate in mysterious ways according to a divine plan?4 Today we may ask, Can algorithms read our minds and predict our behavior? Boethius would have asked, If God knows everything, can my choices be considered free?5 Today we may ask, Do we want intelligent machines to track all of our movements, purchases, conversations, and behavior? And the Psalmist would have wondered, “You know when I sit down and when I rise up; you discern my thoughts from afar. You search out my path and my lying down and are acquainted with all my ways. Even before a word is on my tongue, behold, O Lord, you know it altogether. […] Where shall I go from your Spirit? Or where shall I flee from your presence?” (Ps 139:2–7).
That digitization is in some (some!) ways comparable to divine omniscience is my working hypothesis, and the point of comparison, I will argue, is its world-duplicating character. In one of the most recent sociological analyses of digitization, Armin Nassehi defines the digital as “simply the duplication of the world in the form of data with the technical possibility of connecting data with each other, in order to re-translate them to particular issues.”6 Nassehi paradoxically sees this unspecificity, or universal applicability, as the very particularity of the digital – a characteristic which, as he states, “to date had been reserved for the presence of God and the use of writing.”7 Nassehi’s comparison may surprise, but the point here is that the digital is less like particular, specific technological innovations (think: steam engine, airplane, or telephone), and not even like a technology underlying the widespread development of more technology (think: electricity). Instead, it is more like other translations or duplications of the world into discrete discourses: like money, like language, like the mind of God.
Language, already in its spoken form, has the same property of being ubiquitously applicable and effectively translating the world into text – even more so through writing, which creates a world of its own, an archive in which different independent items “have the properties of being mobile but also immutable, presentable, readable and combinable with one another”8. In examining the world as text, writing refers to writing, establishes connections between writing and writing in the form of more writing, and generates new textual output which can be re-ascribed to the world. New insights about the world emerge not only through interaction with the world, but in the interaction between writing and writing. In some ways, digitization is but a radicalized form of writing – writing in a rigorously simplified and standardized language.9
Money is a similar medium: a formalized language which translates everything (everything!) into values that are commensurable and which therefore allow someone to calculate, aggregate, analyze, and cross-reference things which previously could not be put into a relationship. Money, just like writing, is a rendering technology that is universally applicable to anything in the world, creating a particular kind of shadow text of the world onto the world, on which operations can be performed that in turn are non-neutral to the world itself. It duplicates the world without containing it while having real repercussions in it.
And God? In light of the parallels between digitization, writing, and money, it should be clear that the reference to God is not just a shallow allusion to the often invoked or even aspired-to ubiquity of digital technology. In traditional Christian thought, God’s omnipresence and omniscience create a similar “film” on all of reality, an accompanying presence that pervades all contexts and adds an interpretive layer. In many more analytically inclined theologies, the mind of God is even understood as the perfect representation of all that is, all possible data in all meaningful relationships. It is the very definition of a data double of the world, to which digitization can only aspire. More than money or language, divine omniscience is therefore a strong conceptual parallel for the digital.
That is not to say that theology could give a comprehensive account of emergent technologies and the societal transformations in their wake – that would be absurd. But in the centuries of conversation about divine omniscience, theology may have developed conceptual frameworks that can provide helpful guidance in the interrogation of “the digital” today. On the other hand, examinations of “digital” issues may contribute important corrections to theological reflection. In what follows, I want to offer some specific ways in which drawing on theological discursive formations allows us to discern and hone important questions and contentions vis-à-vis digitization. Even if I can only treat them cursorily here, I hope these suggestions – tentative in nature and presumably in need of correction from experts in technology, philosophy of science, and sociology – open routes of conversation.
In the first part, I will sketch how parallels in the discussion of divine omniscience call into question two widespread (if not uncontested) assumptions about data-based knowledge: its objectivity and its neutrality. In the second part, I will build on these theoretical foundations and proceed to demonstrate how thought developed in the discussion of divine omniscience can illuminate why the contemporary focus on privacy is not enough: privacy is incapable of accounting for the deeper structural transformations wrought by digitization and therefore fails to address the issues that emerge from them.
Contemporary treatments of divine omniscience almost invariably start something like this: “Since omniscience is maximal or complete knowledge, it is typically defined in terms of knowledge of all true propositions”10. And the propositional model is very powerful, since it devises a universally applicable, abstract, and formalized structure into which truths and truth claims can be cast, distilling them to the point of almost being able to calculate truth through all possible combinations of true propositions. The propositional approach, however, leads into unsolvable dilemmas when applied to divine knowledge.11
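To make the formal appeal of this model concrete, here is a minimal, purely illustrative sketch of what “calculating truth through all possible combinations” amounts to; the atomic propositions and the compound claim are invented toy examples, not drawn from any of the accounts cited here.

```python
from itertools import product

# Purely illustrative: the propositional model reduces claims to atomic
# propositions whose truth-value combinations can be enumerated exhaustively,
# and over which any compound claim can then be evaluated mechanically.
atoms = ["p", "q"]  # e.g. "it rains", "the street is wet" (invented examples)

def implies(values):
    """The compound claim 'p implies q', evaluated over one assignment."""
    return (not values["p"]) or values["q"]

for combination in product([True, False], repeat=len(atoms)):
    assignment = dict(zip(atoms, combination))
    print(assignment, "->", implies(assignment))
```

The appeal lies precisely in this mechanical exhaustiveness: once everything is cast in propositional form, nothing seems left over that could not, in principle, be enumerated and evaluated.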
Most importantly, it is typically understood to engender a difference between the thing that is known and the knowledge of it. A proposition is a formal entity derived by abstracting a specific property of some thing, rendering it into a specific form which is not the thing itself. The set of true propositions would thus be seen to create a kind of discursive shadow layer of the things it describes. Reality then exists twice: once as it is, and once in the form of true propositions about reality in the mind of God.
This creates a further and – for the theologian – even more problematic difference: a difference within God, between the essence of God and God’s knowledge. The essence of God, according to classic theistic conceptions,12 is simple, unchangeable, and eternal – but God’s knowledge, if made up of propositions, would be composite. It would also be either temporal or at least temporally indexed, since propositions about future events only acquire a truth status, and therefore only enter the realm of God’s knowledge, with the passing of time.
These issues illustrate why classical theologians have typically not understood God’s knowledge to be propositional in the first place. If God is thought of as absolute simplicity, then there can be no distinction between God, God’s knowledge, and the objects of God’s knowledge. God’s knowledge has to be immediate and intuitional rather than propositional and indirect; there can be no “detour” of propositions or other medial translations/duplications. Brought to its logical conclusion, as in Thomas Aquinas, this means that God’s knowledge can only be God’s own essence, and knowledge of the world simply has to be inscribed into God’s knowledge of God’s own will.13
From the tensions created in the doctrine of God, theologians have inferred more generally: “It seems plausible to suppose that the propositional character of human knowledge stems from our limitations. Why is our knowledge parcelled out in separate facts? […] First, we cannot grasp any concrete whole in its full concreteness, […] Second, we need to isolate separate propositions in order to relate them logically, so as to be able to extend our knowledge inferentially.”14 Propositional knowledge can never be perfect knowledge and has therefore not traditionally been adopted to conceptualize divine omniscience – it is too indirect, too mediated, and too reliant on a logical or proto-linguistic structure, and it therefore fails to be comprehensive, unbiased, and objective.
What does this insight from the doctrine of God yield for assessing “the digital”? Well, the digital is the epitome and radicalization of the propositional form – with all its limitations. Working off Nassehi’s definition cited above, the digital is not so much a new technology as a formalized mediation of the world, a mode of reading the world. It renders the world into data, duplicating it, producing a discursive world of its own. This duplication entails both a simplification and a complexification. It is clearly a simplification because in order to produce data, a reduction is necessary, a concentration on certain aspects which are then (re)presented in the form of data. It is this divestment of information about the world that makes the incommensurable commensurable, allowing for the computability of the world15. The digital form is in fact ingenious in maximally reducing the complexity of information to a binary signal – 0 or 1, off or on – or any combination of such binary signals, which may be long, but remain simple, and are therefore easy to store, transmit, and read. This is the promise of the digital: that because of its reduced and computable form, it is both universally applicable and highly efficient.
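A minimal sketch may illustrate this reduction; nothing here is specific to any actual system, it simply shows what the binary form looks like: any text can be rendered as a long but maximally simple string of 0/1 signals and recovered from it without loss.

```python
# Illustrative sketch only: reducing a piece of text to binary signals.
message = "In the beginning"

# Encode the string as bytes, then render each byte as eight 0/1 signals.
bits = "".join(f"{byte:08b}" for byte in message.encode("utf-8"))
print(bits[:32], "...")                  # long, but nothing more complex than 0s and 1s
print(len(bits), "binary signals for", len(message), "characters")

# The same bits decode back into the same text: easy to store, transmit, read.
decoded = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)).decode("utf-8")
assert decoded == message
```

The efficiency of the digital rests on exactly this trade: everything about the message that does not fit the binary rendering has already been left behind before any processing begins.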
This simplification, however, is an operation that generates complexities of its own. Data – despite what the name might imply – is of course never simply “given” but has to be generated through a process that involves complex hermeneutic operations: “Raw data is an oxymoron”16. The process of abstraction and reduction that “gives” the world the form of data rests on interpretive processes: what is established as the object of measurement, what any given instance “counts” as, when it starts to count, and so forth. Categories and types have to be imagined according to which things are then counted. Seemingly objective data has to be produced through highly subjective processes of observation – regardless of whether the observer is a human being or a sensor: “the perception of the world and the processing of information is primarily discernment of patterns, where the patterns are less inherent in the object itself, and more in the object-ivity (Gegenständlichkeit) generated through perception.”17
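A small, entirely hypothetical sketch can make this concrete: the “same” observations, counted under two different category definitions, yield two different data sets. All names, events, and thresholds below are invented for illustration.

```python
# Hypothetical sketch: identical raw observations, "counted" under two
# different category definitions, produce two different data sets.
events = [
    {"user": "a", "action": "view",  "seconds": 2},
    {"user": "a", "action": "view",  "seconds": 45},
    {"user": "b", "action": "click", "seconds": 1},
    {"user": "c", "action": "view",  "seconds": 10},
]

# Definition 1: every view or click counts as "engagement".
engagement_broad = sum(1 for e in events if e["action"] in {"view", "click"})

# Definition 2: only views longer than 30 seconds count as "engagement".
engagement_narrow = sum(1 for e in events if e["action"] == "view" and e["seconds"] > 30)

print(engagement_broad, engagement_narrow)  # 4 vs. 1 – the "data" depends on the category
```

Which of the two numbers becomes “the data” is decided before any analysis begins.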
The resulting data is a construction, a creation: new entities which exist as supposed duplications of reality – the world in the form of data. Information is translated into a homogeneous medium of signals which allows relations to be drawn between hitherto incommensurable things. In order to derive information from such data, an active process of generating information out of signals takes place, not a mere passive reception.18 As is well established in information theory, interpretation is irreducibly involved not only at the sending end of communication, but also at the receiving end. Contrary to naive (or programmatic) tech-optimist beliefs, data can never “speak for itself”19: “working with Big Data is still subjective, and what it quantifies does not necessarily have a closer claim on objective truth”20.
In this process, belief plays a decisive role. Scholarly definitions see Big Data not only as a technological phenomenon, but as a complex “cultural, technological and scholarly phenomenon that rests on the interplay of 1) Technology: maximizing computation power and algorithmic accuracy to gather, analyze, link, and compare large data sets. 2) Analysis: drawing on large data sets to identify patterns in order to make economic, social, technical, and legal claims. 3) Mythology: the widespread belief that large data sets offer a higher form of intelligence and knowledge that can generate insights that were previously impossible, with the aura of truth, objectivity, and accuracy.”21 The mythology is actually instrumental in making the promise come true, since it drives a self-reinforcing cyclical process: the belief in the superior insight of large data sets (3) facilitates the spread of the technology (1), which enhances pattern detection in the analysis (2), which in turn further strengthens the conviction that large data sets generate superior insight.
The theological tradition can prompt us to regard any claims of objectivity for data-driven approaches with a healthy dose of suspicion. To be clear: the issue with the digital here is not (primarily) that it is quantifying or reductionist, but that it is invariably epistemologically closed: “Data duplicates the world, but doesn’t contain it.”22 The world outside of data only comes into view in and through its representation by data. Data science can only find patterns in the data it recombines, aggregates, and cross-references, not in the world itself. Only data-oid things enter the calculation, and the patterns produced in this process are properties of the data, not of the world. The digital shares the paradox of all signals, which come to stand at the same time for themselves and for that which they signify. Just like perception, data is not the world, nor does it objectively represent the world; its function is rather the “testing of hypotheses about the world”23. The propositional form is not the only way to think about knowledge, and it is in fact one that is interpretively quite “productive” – which leads to the second fallacy, discussed in the next section, the assumed neutrality of digitization: it re-makes the world in the particular form of propositional statements.
Before launching into the next section, which explicates this “productivity” of technology and why the neutrality view of technology is therefore a fallacy, I want to earmark one point for further theological discussion: when we compare recent propositional accounts of divine omniscience with the classical conceptions, we might actually see the beliefs driving the digital age reinfiltrating theology. Defining divine omniscience as knowledge of all true propositions, which effectively duplicates the world into a “mind of God,” is quite a recent invention, and it may not be a coincidence that its spread goes hand in hand with the rise of “the digital age”24. What presents itself as an objective, general model of knowledge may in fact be quite specific to the particular branch of modernity we live on.
Not only does digital propositionality fail to achieve perfect knowledge, it is also not neutral – which I want to distinguish explicitly from “not objective.” Digitization does not just add an interpretive layer that remains external to reality while leaving it untouched – as the propositional form did. Digitization also alters the reality it only pretends to represent. Whereas non-objectivity points to the inevitably interpretive nature of digitization, non-neutrality points to its real-world effects: what Foucault would have called its productivity.
Of course most people are aware that digitization has real-world effects, and that such effects could be judged to be positive or negative. There are those who think that digital technologies hold the key to everything good: progress, economic growth, personal enjoyment, convenience, self-perfection and enhancement, and that they will usher in a new age with unprecedented possibilities through more precise knowledge, increased efficiency, and better tailoring of technological solutions to cure all of society’s ills. There are also those who call out the way in which digital technologies generate social alienation, replace whole employment sectors, amplify bias, or facilitate oppression or even totalitarianism through corporate power, political manipulation and control of individual behavior as well as societal processes.
However, most people will lean towards the seemingly more balanced assumption that technology as such is neither good nor bad in itself, but has to be judged according to the uses it is put to – in short, that it is, in and of itself, neutral. Knowledge is power: it enhances the wielder’s possibilities to achieve their aims, but whether it is good or bad depends on the use it is put to. Such an assumption rests on an instrumental view of technology. The instrumental understanding views technology – from its simplest to its most sophisticated forms – as a tool. A tool, like a hammer or a knife, is not good or bad in and of itself, but has to be judged according to the end to which it is put, the intentions with which it is applied, the outcome, and the consequences its application engenders. A hammer can be used to build a shelter or to break a person’s skull. Data analysis can be used for racial profiling as well as for life-saving medical diagnostics. Knowledge derived from social media data can be used to manipulate elections as well as to facilitate grassroots organizing. And so on.
This view of technology is not completely wrong, of course (I don’t deny moral accountability for the way individuals, institutions, or corporations use either hammers or data analysis), but it is incomplete. In the 1980s, Melvin Kranzberg, one of the 20th century’s most important historians of technology, formulated what has since become well known as Kranzberg’s first law of technology: “Technology is neither good nor bad; nor is it neutral.”25 Kranzberg saw the need to take into account “the utopian hopes versus the spotted actuality, the what-might-have-been against what actually happened, and the trade-offs among various ‘goods’ and possible ‘bads’” as well as “how technology interacts in different ways with different values and institutions, indeed, with the entire sociocultural milieu.”26 These broader factors make any judgment more ambivalent – differing effects come together without cancelling each other out, which yields an uneasy “it’s complicated.”
In practice, this version of non-neutrality usually evolves into a view of technology as “benign, if regulated”: the technology as such will continue to be seen as ambivalent in its effects but in itself morally neutral. This means it could potentially be used for good; the issue becomes discerning where to draw the line between good applications and problematic applications. This is an important task, and it will legitimately take up the bulk of ethical and legal reflection on emergent technologies.
I am not an ethicist or a politician. Others are better qualified to assess the moral quality of potential effects and to develop regulatory frameworks. As a systematic theologian, my relevant expertise may instead lie in assessing the more general differences that a difference in the structural architecture of any “system” makes. I am therefore interested in the non-neutrality of technology even “before” any of its applications.27 I want to examine the specific ways in which technology is non-neutral, i.e., the broader transformative power of the technology in question, or what Foucault would have called its productivity. How digitization changes the nature of the problem itself – this non-neutrality has to be distinguished from the moral neutrality or non-neutrality of its uses.
In the doctrine of divine omniscience, we do find views corresponding to the more instrumental understanding, in which God uses God’s knowledge to influence the course of events. In these accounts, God is an agent who interacts with history like other agents, and God’s knowledge enhances God’s power in the same way that technological tools enhance human abilities to achieve their intended aims. As in a game of chess, God’s intricate knowledge of the game and the other players gives God a unique and decisive advantage.28 If knowledge is power, then more knowledge is more power, and omniscience evokes omnicompetence (if not outright omnipotence). So far so good, so unspectacular.
What should give us pause is that the instrumental view of knowledge is not the primary angle on divine omniscience in the tradition. Although abuse of power is not a common worry raised with regard to God, theologians have nevertheless seen human freedom as seriously threatened by omniscience, raising objections based on the control it exerts – or might be thought to exert – up to the verge of determinism. What we could learn from theology is that the moral non-neutrality of the effects of superhuman knowledge might not be the only, or indeed the most fundamental, non-neutrality involved. We need to think about the “productivity” of technology, beyond – or before – the question of its right use.
From divine omniscience we learn not only that knowledge is power but, more importantly, that power is knowledge. Divine omniscience is not only a tool with which God intervenes in the world in this way or that; rather, it forms the world itself according to its image. Not only is there nothing that exists that God does not know; without divine knowledge of it, there would not even be a world.
Theologians have argued over the centuries whether God’s eternal decree to create the world precedes God’s knowledge of the world, or the other way around. In the first case, God knows the world infallibly because God willed all of reality into being. What is true is then true because God willed it to be, and God knows God’s will. To stay with the game metaphor: God invented the game, laid down the rules, and designed the characters playing it. Since God is in control of the game as its creator from eternity, God already knows the outcome – no wonder that under these assumptions theologians have invariably run into dilemmas between divine foreknowledge and human freedom.29 Even if this conception sees God’s knowledge as reflective of being, not causative thereof, the fact that God knows things to be true infallibly from eternity effectively precludes their being otherwise.
In the second case – as prominent thinkers have stipulated – God’s knowledge actually causes the world to be. This even more clearly “productive” understanding of divine knowledge can be summed up as follows: “God’s power is His knowledge. He creates by thinking. Whatever is is sustained immediately by the knowledge of God. […] The mirror passively reflects the objects present. God’s knowledge produces them.”30 In that case, there is no difference, no double text, because the world that exists is the world in the mind of God.31 “Esse is percipi”32 – to be is to be perceived, or: it is God’s knowledge that sustains reality in being.
Whether God’s knowledge is seen as causative of the world, or whether it is understood to reflect God’s will that brought forth creation, theologians have usually agreed that God’s knowledge of the world ontologically precedes its existence, and that divine knowledge and power are co-constitutive, co-extensive, and identical with God’s essence33. In other words, we do not need to learn from Foucault34 that knowledge is not just an instrument which confers power over a world, but that power is what generates knowledge and gives it its particular shape.
Doctrine can teach us that, at the intersection of power and knowledge, manipulation or abuse is not the only issue. With the “mind of God,” the productivity of the data double is immediately apparent; in the case of technology, the productivity may not be quite as crass. But even if the technological knowledge of the world does not create the (whole) world itself, it is still clearly non-neutral to it.
These insights therefore apply even where most data is actually not collected in order to manipulate35 anyone (in the sense of moving them towards doing something specific against their will or natural inclination), but to control behavior, i.e., to make it readable and predictable, to account for every variable in it, and to expand the duplicate data world. The latter may even be the most decisive factor, because it draws on a self-reinforcing loop: more data generates more power because it generates more reality. Firstly, it expands the shadow universe – not only by adding the respective individual items of data to its archive, but also by thereby expanding it with a virtually infinite number of additional possible combinations, correlations, predictions, and inferences, which in turn yield yet more data, further augmenting the duplicate text.
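To give a rough sense of the scale of this first, combinatorial expansion, a back-of-the-envelope sketch: counting only pairwise relations and ignoring all higher-order combinations, the number of possible correlations among n items of data grows quadratically, and each derived relation can itself be stored as new data.

```python
from math import comb

# Rough illustration: with n data points, the number of possible pairwise
# relations alone is n(n-1)/2 – before any higher-order combinations,
# predictions, or inferences are added to the archive.
for n in (10, 1_000, 1_000_000):
    print(f"{n:>9} data points -> {comb(n, 2):,} possible pairs")
```

The duplicate text thus grows much faster than the collection of individual data points that feeds it.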
Secondly, it expands the real world: digital technologies do not just generate a shadow text that is external to the world. As with writing, the generated text exists in the world as concrete objects and artifacts – data, code, algorithms… – which are not just an interpretive layer on reality, but objects with which the “original” world itself then interacts. While the world is duplicated into the digital without being contained in it, the digital itself is in fact contained in the world, populates it, becomes a part of the world itself, and establishes its own materially, socially, and culturally relevant relations within it.36
The duplication into data generates a version of the world in which both problems and their solutions can be precisely described. This is in fact the appeal and the promise of the digital, what makes it so efficient: its reduced and computable form allows it to discern relationships in the data of the duplicated world and to perform operations on it in the form of aggregation, cross-referencing, and analysis – at the end of the day, in the hope of managing the world it describes. “The paradoxical situation ensues that the border between them cannot be overcome, but in practice always is overcome.”37 Technologies of knowledge are not neutral to the world they describe – they are involved in “the reality business”38.
Continuing in the game metaphor,39 we can describe the non-neutrality of technology as follows: technology is non-neutral not because it produces good or bad game moves, or because it makes good or bad people win the game, but insofar as it puts new pieces on the board within the game, manufactures the board on which the game is played, and fundamentally alters the rules according to which it is played.
Tech-optimists and tech-pessimists alike point to the deeply transformative effects of technology, effects that extend beyond the good or bad intentions of those who apply it: “Change the instruments, and you will change the entire social theory that goes with them”40. Part of this game-changing nature pertains to the change of the very criteria for what can become an object of knowledge: technologies change “the standards governing permissible problems, concepts, and explanations” as well as “the institutional and conceptual conditions of possibility for the generation, transmission, accessibility, and preservation of knowledge”41. Technologies of knowledge do not just expand the range of possibilities for whoever is in control of these knowledges; extant power structures shape the processes and technologies of data extraction and determine what becomes knowledge – an observation following from the non-neutrality of technology which adds another aspect to the non-objectivity discussed earlier. Technologies of knowledge engender certain kinds of power relations and certain kinds of subjectivities through the way they mediate reality.42
As technologies change the ways we view the world, the way we interact with it, and the ways we make decisions, they engender and shape epistemic possibilities as well as conditions of freedom. “As the advantages of the computational approach to research […] become persuasive […] the ontological notion of the entities they study begins to be transformed. These disciplines thus become focused on the computationality of the entities in their work.” Berry even goes so far as to stipulate: “Computationality might then be understood as an ontotheology, creating a new ontological ‘epoch’ as a new historical constellation of intelligibility.”43
The doctrine of omniscience can direct our attention to the fact that technologies of knowledge production are non-neutral to the world because they change the rules of the game. In the next part, I will address more concretely some of the particular ways in which digital technology is non-neutral, and what different kinds of issues come into view once we take this non-neutrality seriously. In particular, I will argue that the contemporary focus on issues of privacy fails to take into account the non-neutrality of digital technologies and is therefore unable to track and account for crucial emergent issues.
This second part of my contribution — why privacy is not the central issue of the digital, and how Luis de Molina’s concept of “middle knowledge” can help us understand that better — can be found here: https://cursor.pubpub.org/pub/reichel-omniscience-ii
Alston, William P. “Does God Have Beliefs?” Religious Studies 22 (1987): 287–306.
______. Divine Nature and Human Language: Essays in Philosophical Theology. Ithaca, NY: Cornell University Press, 1989.
Anderson, Chris. “The End of Theory: The Data Deluge Makes the Scientific Method Obsolete.” Wired 16.07 (2008). http://archive.wired.com/science/discoveries/magazine/16-07/pb_theory.
Anselm of Canterbury. Proslogion. St. Anselm: Basic Writings. Chicago: Open Court, 1962.
Aquinas, Thomas. Summa Theologica: Blackfriars Edition: 61 Vols., Latin and English with Notes and Introductions. London / New York: Eyre & Spottiswoode / McGraw-Hill, 1964.
Asimov, Isaac. The Complete Stories. Vol. 1, 1993.
Beilby, James K., and Paul R. Eddy. Divine Foreknowledge: Four Views. Carlisle: Paternoster Press, 2001.
Berry, David M. “The Computational Turn: Thinking about the Digital Humanities.” Culture Machine 12 (2011): 1–22.
Boethius. The Consolation of Philosophy. Oxford: Oxford University Press, 1999.
Bollier, David. The Promise and Peril of Big Data, 2011. http://www.aspeninstitute.org/sites/default/files/content/docs/pubs/The_Promise_and_Peril_of_Big_Data.pdf.
Boyd, Danah, and Kate Crawford. “Critical Questions for Big Data: Provocations for a Cultural, Technological and Scholarly Phenomenon.” Information, Communication & Society 15, no. 5 (2012): 662–679.
Foucault, Michel. “The Ethics of the Concern for Self as Practice of Freedom.” In Ethics, edited by Paul Rabinow, 281–302. The Essential Works of Michel Foucault, 1954-1984. London / New York: Allen Lane / The New Press, 1997.
______. Discipline and Punish: The Birth of the Prison. New York: Vintage, 1995.
______. Power/Knowledge: Selected Interviews and Other Writings, 1972-1977. New York: Pantheon, 1980.
Gitelman, Lisa, ed. “Raw Data” Is an Oxymoron. Infrastructures Series. Cambridge, MA: MIT Press, 2013.
Halavais, Alexander M. Campbell. Search Engine Society. Second edition. Digital Media and Society Series. Cambridge: Polity Press, 2018.
Calvin, John. Institutes of the Christian Religion (1559). Louisville, KY: Westminster John Knox.
Kranzberg, Melvin. “Technology and History: ‘Kranzberg’s Laws.’” Technology and Culture 27, no. 3 (1986): 544–560.
Latour, Bruno. “Tarde’s Idea of Quantification.” In The Social after Gabriel Tarde, edited by Matei Candea, 145–162. Culture, Economy and the Social. London: Routledge, 2010.
Latour, Bruno. “Visualization and Cognition: Thinking with Eyes and Hands.” Knowledge and Society 6 (1986): 1–40.
Nassehi, Armin. Muster: Theorie der digitalen Gesellschaft. 1st ed. München: C.H. Beck, 2019.
O’Neil, Cathy. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. First edition. New York: Crown, 2016.
Presner, Todd. Digital Humanities 2.0: A Report on Knowledge, 2010. http://cnx.org/content/m34246/1.6/?format=pdf.
Rogers, Katherin A. Perfect Being Theology. Reason and Religion. Edinburgh: Edinburgh University Press, 2000.
Sanders, John. The God Who Risks: A Theology of Providence. Downers Grove, Ill.: InterVarsity Press, 1998.
Schleiermacher, Friedrich. The Christian Faith. Third edition. Cornerstones. London: Bloomsbury Academic, 2016.
Shannon, Claude Elwood. The Mathematical Theory of Communication. Urbana: The University of Illinois Press, 1949.
Stump, Eleonore. Aquinas. Arguments of the Philosophers. London: Routledge, 2005.
Taureck, Bernhard H. F. Überwachungsdemokratie: Die NSA als Religion. Paderborn: Wilhelm Fink, 2014.
Wierenga, Edward R. “Omniscience.” The Stanford Encyclopedia of Philosophy Spring 2018 Edition (2018). https://plato.stanford.edu/archives/spr2018/entries/omniscience/.
Zagzebski, Linda T. The Dilemma of Freedom and Foreknowledge. New York: Oxford University Press, 1991.
Zuboff, Shoshana. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. New York, NY: PublicAffairs, 2019.