
Worldmaking knowledge: What the doctrine of omniscience can help us understand about digitization

Part I: The objectivity fallacy and the neutrality fallacy
Published
Nov 11, 2019

1 Superhuman knowledges – convergences between divine omniscience and "the digital"

Like all looming yet unknown developments, the "digital age" ushered in by the rise of information and communication technologies and by momentous advances in computational power inspires both utopian hopes and dystopian fears. Tech pessimists paint apocalyptic scenarios of the dependencies, alienations, and uncontainable dynamics associated with technological determinism, while tech optimists herald the salvation of humankind which they see dawning in technological progress. Writers on both sides often invoke "omniscience" or attribute God-like qualities to data-driven technologies.1 Usually such invocations are rhetorical, dramatizing hyperboles that critique frightening powers in need of containment.

If we leave religious forms of "dataism" aside (which do exist, but that is a topic for a different day), we will presumably neither see Big-Data-enabled "knowledge" as divine nor assume that God operates like a supercomputer2. However, it is not far-fetched to postulate that the digital age is an age of superhuman knowledge. While the relationship between data, information, and knowledge is a tricky and contentious one, machine-learning-empowered "big data" analytics allows for both "more" knowledge and a different kind of knowledge than could ever be accumulated (or understood) by human agents, whether individual or collective3. In this sense, such "knowledge" may be seen as superhuman – "beyond the human," even if not divine by any means – and even more so if we consider the powerful ways in which the application of such knowledge might augment and threaten human agency, and amplify, limit, or transform what we conceive of as human freedom.

It is striking how many of the questions raised around data-based surveillance seem to be variations on themes Christian theologians have wrestled with for centuries. Today we may ask: To what extent does data-based targeted advertising manipulate our purchasing behavior, our desires, even our political choices? Calvin used to ask: How does divine providence guide and steer our actions and fate in mysterious ways according to a divine plan?4 Today we may ask: Can algorithms read our minds and predict our behavior? Boethius would have asked: If God knows everything, can my choices be considered free?5 Today we may ask: Do we want intelligent machines to track all of our movements, purchases, conversations, behavior? And the Psalmist would have wondered: "You know when I sit down and when I rise up; you discern my thoughts from afar. You search out my path and my lying down and are acquainted with all my ways. Even before a word is on my tongue, behold, O Lord, you know it altogether. […] Where shall I go from your Spirit? Or where shall I flee from your presence?" (Ps 139:2–7).

That digitization is in some (some!) ways comparable to divine omniscience is my working hypothesis, and the point of comparison, I will argue, is its world-duplicating character. In one of the most recent sociological analyses of digitization, Armin Nassehi defines the digital as "simply the duplication of the world in the form of data with the technical possibility of connecting data with each other, in order to re-translate them to particular issues."6 Nassehi sees the digital's very unspecificity, its universal applicability, as paradoxically constituting its particularity – a characteristic which, as he states, "until now had been reserved for the presence of God and the use of writing."7 Nassehi's comparison may surprise, but the point here is that the digital is not like particular, specific technological innovations (think: steam engine, airplane, or telephone), nor even like a technology underlying the widespread development of more technology (think: electricity). Instead, it is more like other translations or duplications of the world into discrete discourses: like money, like language, like the mind of God.

Language, already in its spoken form, has the same property of being ubiquitously applicable and of effectively translating the world into text – even more so through writing, which creates a world of its own, an archive in which different independent items "have the properties of being mobile but also immutable, presentable, readable and combinable with one another"8. In examining the world as text, writing refers to writing, establishes connections between writing and writing in the form of more writing, and generates new textual output which can be re-ascribed to the world. New insights about the world emerge not only in interaction with the world, but in the interaction between writing and writing. In some ways, digitization is but a radicalized form of writing – writing in a rigorously simplified and standardized language.9

Money is a similar medium: a formalized language which translates everything (everything!) into values that are commensurable and which therefore makes it possible to calculate, aggregate, analyze, and cross-reference things which previously could not be put into relation. Money, just like writing, is a rendering technology that is universally applicable to anything in the world, creating a particular kind of shadow text of the world laid onto the world, on which operations can be performed that in turn are non-neutral to the world itself. It duplicates the world without containing it while having real repercussions in it.

And God? In light of the parallels between digitization, writing, and money, it should be clear that the reference to God is not just a shallow allusion to the often invoked or even aspired-to ubiquity of digital technology. In traditional Christian thought, God's omnipresence and omniscience create a similar "film" over all of reality, an accompanying presence that pervades all contexts and adds an interpretive layer. In many more analytically inclined theologies, the mind of God is even understood as the perfect representation of all that is, all possible data in all meaningful relationships. It is the very definition of a data double of the world, one to which digitization can only aspire. More than money or language, divine omniscience is therefore a strong conceptual parallel for the digital.

That is not to say that theology could give a comprehensive account of emergent technologies and the societal transformations in their wake – that would be absurd. But in the centuries of conversations about divine omniscience, theology may have developed conceptual frameworks which can provide helpful guidance in the interrogation of "the digital" today. On the other hand, examinations of "digital" issues may contribute important corrections to theological reflection. In what follows, I want to offer some specific ways in which drawing on theological discursive formations allows us to discern and hone important questions and contentions vis-à-vis digitization. Even if I can only treat them cursorily here, I hope these suggestions – tentative in nature and presumably in need of correction from experts in technology, philosophy of science, and sociology – open routes of conversation.

In a first part, I will sketch how parallels in the discussion of divine omniscience call into question two widespread (if not uncontested) assumptions about data-based knowledge: its objectivity and its neutrality. In a second part, I will build on these theoretical foundations and proceed to demonstrate how thought developed in the discussion of divine omniscience can illuminate why the contemporary focus on privacy is not enough: privacy is incapable of accounting for deeper structural transformations through digitization and therefore fails to address issues that emerge from them.

2 The imperfections of propositionality: The objectivity fallacy

2.1 Divine perfections and propositional knowledge

Contemporary treatments of divine omniscience almost invariably begin something like this: "Since omniscience is maximal or complete knowledge, it is typically defined in terms of knowledge of all true propositions"10. And the propositional model is very powerful, since it devises a universally applicable, abstract, and formalized structure which can be used to formalize truths and truth claims, distilling them to the point where truth could almost be calculated through all possible combinations of true propositions. The propositional approach, however, leads into unsolvable dilemmas when applied to divine knowledge.11

Most importantly, it creates a difference between the thing that is known and the knowledge of it. A concrete thing is not the same as a proposition about the thing; the proposition is an abstract entity derived by abstracting a specific property of the thing, rendering it into a specific form which is not the thing itself. The world of true propositions thus creates a kind of discursive shadow layer over the world. Reality then exists twice: once as it is, and once in the form of true propositions about reality in the mind of God.

This creates a further, and – for the theologian – even more problematic difference: a difference in God, between the essence of God and God's knowledge. The essence of God, according to classic12 theistic conceptions, is simple, unchangeable, and eternal – but God's knowledge, if made up of propositions, would be composite. It would also be either temporal or at least temporally indexed, since propositions about future events only acquire a truth status, and therefore only enter into the realm of God's knowledge, with the passing of time.

These issues illustrate why classical theologians have typically not understood God's knowledge to be propositional. If God is thought of as absolute simplicity, then there can be no distinction between God, God's knowledge, and the objects of God's knowledge. God's knowledge has to be immediate and intuitive rather than propositional and indirect; there can be no "detour" through propositions or other medial translations/duplications. Brought to its logical conclusion, as in Thomas Aquinas, this means that God's knowledge can only be God's own essence, and knowledge of the world simply has to be inscribed into God's knowledge of God's own will.13

From the tensions created in the doctrine of God, theologians have inferred more generally: "It seems plausible to suppose that the propositional character of human knowledge stems from our limitations. Why is our knowledge parcelled out in separate facts? […] First, we cannot grasp any concrete whole in its full concreteness, […] Second, we need to isolate separate propositions in order to relate them logically, so as to be able to extend our knowledge inferentially."14 Propositional knowledge can never be perfect knowledge and therefore is not the right category for divine omniscience – it is too indirect, too mediated, too reliant on a logical or proto-linguistic structure, and it therefore fails to be comprehensive, unbiased, and objective.

2.2 The interpretive character and epistemological closure of digitization

What does this insight from the doctrine of God yield for assessing "the digital"? Well, the digital is the epitome and radicalization of propositional form – with all its limitations. Working off Nassehi's above-mentioned definition, the digital is not so much a new technology as a formalized mediation of the world, a mode of reading the world. It renders the world into data, duplicating it, producing a discursive world of its own. This duplication entails both a simplification and a complexification. It is clearly a simplification because in order to produce data, a reduction is necessary, a concentration on certain aspects which are then (re)presented in the form of data. It is this divestment of the world's information that makes the incommensurable commensurable and allows for the computability of the world15. The digital form is in fact ingenious in maximally reducing the complexity of information to a binary signal – 0 or 1, off or on – or any combination of such binary signals, which may be long but remains simple, and is therefore easy to store, transmit, and read. This is the promise of the digital: that because of its reduced and computable form, it is both universally applicable and highly efficient.
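The reduction to binary signals described above can be made concrete in a toy sketch (mine, not the article's): any string, once digitized, becomes the same homogeneous stream of two symbols – long, but maximally simple, and trivially re-readable.

```python
def to_bits(text: str) -> str:
    """Duplicate a string into a sequence of 0s and 1s (its UTF-8 bytes)."""
    return "".join(f"{byte:08b}" for byte in text.encode("utf-8"))

def from_bits(bits: str) -> str:
    """Recover the string from its binary duplicate."""
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

signal = to_bits("omniscience")
# The duplicate uses only two symbols, so it is easy to store and transmit...
assert set(signal) <= {"0", "1"}
# ...and it is lossless in form: the text can be read back out of the signal.
assert from_bits(signal) == "omniscience"
```

The point of the sketch is purely formal: whatever interpretive work produced the text in the first place disappears into an undifferentiated signal on which computation can then operate.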

This simplification is, however, an operation which generates complexities. Data – despite what the name might imply – is of course never simply "given," but has to be generated through a process that involves complex hermeneutic operations: "Raw data is an oxymoron"16. The process of abstraction and reduction that "gives" the world in the form of data rests on interpretive processes: what is established as the object of measurement, what any given instance "counts" as, when it starts counting, and so forth. Categories and types have to be imagined according to which things are then counted. Seemingly objective data has to be produced through highly subjective processes of observation – regardless of whether the observer is a human being or a sensor – "the perception of the world and the processing of information is primarily discernment of patterns, where the patterns are less inherent in the object itself, and more in the object-ivity (Gegenständlichkeit) generated through perception."17
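A hypothetical miniature can illustrate the claim that categories precede counting (the events and category schemes below are my invention, not drawn from the article): the "same" observed events yield entirely different data depending on the types an observer has imagined in advance.

```python
# Six observed events, prior to any categorization.
events = ["walk", "run", "run", "cycle", "walk", "run"]

# Observer A has imagined the category pair "exercise" / "not exercise".
scheme_a = {"walk": "not exercise", "run": "exercise", "cycle": "exercise"}
# Observer B has imagined categories of locomotion instead.
scheme_b = {"walk": "on foot", "run": "on foot", "cycle": "on wheels"}

def count_by(scheme: dict, events: list) -> dict:
    """Produce 'data' by counting events under a chosen category scheme."""
    counts: dict = {}
    for event in events:
        label = scheme[event]
        counts[label] = counts.get(label, 0) + 1
    return counts

# The same world, two different data doubles:
print(count_by(scheme_a, events))  # {'not exercise': 2, 'exercise': 4}
print(count_by(scheme_b, events))  # {'on foot': 5, 'on wheels': 1}
```

Neither result is "rawer" than the other; each reflects an interpretive decision made before a single datum existed.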

The resulting data is a construction, a creation of something new which only exists in its duplication – the world in the form of data. Information is translated into a homogeneous medium of signals which makes it possible to draw relations between incommensurable things. In order to derive information from such data, an active process of generating information out of signals takes place, not a mere passive reception.18 As is well established in information theory, interpretation is irreducibly involved not only at the sending but also at the receiving end of communication. Contrary to naive (or programmatic) tech-optimist beliefs, data can never "speak for itself"19: "working with Big Data is still subjective, and what it quantifies does not necessarily have a closer claim on objective truth"20.

In this process, belief plays a decisive role. Scholarly definitions see Big Data not only as a technological phenomenon, but as a complex "cultural, technological and scholarly phenomenon that rests on the interplay of 1) Technology: maximizing computation power and algorithmic accuracy to gather, analyze, link, and compare large data sets. 2) Analysis: drawing on large data sets to identify patterns in order to make economic, social, technical, and legal claims. 3) Mythology: the widespread belief that large data sets offer a higher form of intelligence and knowledge that can generate insights that were previously impossible, with the aura of truth, objectivity, and accuracy."21 The mythology is actually instrumental in making the promise true, as it drives a self-reinforcing cyclical process: the belief in bigger data sets facilitates the spread of the technology (1), which enhances the pattern detection of the analysis (2), thus further strengthening the conviction that large data sets generate superior insight (3).

The theological tradition can prompt us to treat any claims to the objectivity of data-driven approaches with suspicion. To be clear: the issue with the digital here is not (primarily) its quantification or reductionism, but that it is invariably epistemologically closed: "Data duplicates the world, but doesn't contain it."22 The world outside of data only comes into view in and through its representation by data. Data science can only find patterns in the data it recombines, aggregates, and cross-references, not in the world itself. Only data-oid things enter the calculation, and the patterns that are produced in this process are properties of the data, not of the world. The digital shares the paradox of all signals, which come to stand at the same time for themselves and for that which they signify. Just like perception, data is not the world, nor does it objectively represent the world; its function is rather the "testing of hypotheses about the world"23. The propositional form is not the only way to think about knowledge, and it is in fact one that is interpretively quite "productive" – which leads to the fallacy discussed in the next section, the "non-neutrality" of digitization: it re-makes the world in the particular form of propositional statements.

Before launching into the next section to explicate this "productivity" of technology, and why the neutrality view of technology is therefore a fallacy, I want to earmark for further theological discussion that when we compare recent propositional accounts of divine omniscience with the classical conceptions, we might actually see the beliefs driving the digital age re-infiltrating theology. Defining divine omniscience as knowledge of all true propositions, which effectively duplicates the world into a "mind of God," is quite a recent invention, and it may not be a coincidence that its spread goes hand in hand with the rise of "the digital age"24. What presents itself as an objective, general model of knowledge may in fact be quite contextual to the specific branch of modernity we live on.

3 Worldmaking beyond manipulation: The neutrality fallacy

3.1 Real-world effects under the neutrality assumption: Knowledge is power

Not only does digital propositionality fail to achieve perfect knowledge, it is also not neutral – which I want to distinguish explicitly from "not objective." Digitization does not just add an interpretive layer to reality which remains external to reality while leaving it untouched – as the propositional form did. Digitization also alters the reality which it only pretends to represent. Whereas non-objectivity points to the inevitably interpretive nature of digitization, non-neutrality points to its real-world effects: what Foucault would have called its productivity.

Of course most people are aware that digitization has real-world effects, and that such effects could be judged to be positive or negative. There are those who think that digital technologies hold the key to everything good: progress, economic growth, personal enjoyment, convenience, transhumanism, and that they will usher in a new age with unprecedented possibilities through more precise knowledge, increased efficiency, and better tailoring of technological solutions to cure all of society's ills. There are also those who call out the way in which digital technologies generate social alienation, replace whole employment sectors, amplify bias, or facilitate oppression or even totalitarianism through corporate power, political manipulation and control of individual behavior as well as societal processes.

However, most people will lean towards the seemingly more balanced assumption that technology as such is neither good nor bad in itself, but that it has to be judged according to the uses it is put to – in short, that it is, in and of itself, neutral. Knowledge is power; it enhances the possibilities of the wielder to achieve their aims, but whether it is good or bad depends on the use it is being put to. Such an assumption rests on an instrumental view of technology. The instrumental understanding views technology – from its simplest to its most sophisticated forms – as a tool. A tool, like a hammer or a knife, is not good or bad in and of itself, but has to be judged according to the end to which it is put, the intentions with which it is applied, the outcome, and the consequences its application engenders. A hammer can be used to build a shelter or to break a person's skull. Data analysis can be used for racial profiling as well as for life-saving medical diagnostics. Knowledge derived from social media data can be used to manipulate elections as well as to facilitate grassroots organizing. And so on.

This view of technology is not completely wrong, of course (I don't deny moral accountability for the way individuals, institutions, or corporations use either hammers or data analysis), but it is incomplete. In the 1980s, Melvin Kranzberg, one of the 20th century's most important historians of technology, formulated what has since become well known as Kranzberg's first law of technology: "Technology is neither good nor bad; nor is it neutral."25 Kranzberg saw the need to take into account "the utopian hopes versus the spotted actuality, the what-might-have-been against what actually happened, and the trade-offs among various 'goods' and possible 'bads'" as well as "how technology interacts in different ways with different values and institutions, indeed, with the entire sociocultural milieu."26 These broader factors would make any judgment more ambivalent – differing effects come together without cancelling each other out, which yields an uneasy "it's complicated."

In practice, this version of non-neutrality usually evolves into a view of technology as "benign, if regulated": the technology as such will continue to be seen as ambivalent in its effects but in itself morally neutral. This means it could potentially be used for good; the issue becomes discerning where to draw the line between good applications and problematic applications. This is an important task, and it will legitimately take up the bulk of ethical and legal reflection on emergent technologies.

I am not an ethicist or a politician. Others are better qualified to assess the moral quality of potential effects and to develop regulatory frameworks. As a systematic theologian, my relevant expertise may instead lie in assessing the more general differences that a difference in the structural architecture of any "system" makes. I am therefore interested in the non-neutrality of technology even "before" any of its applications.27 I want to examine the specific ways in which technology is non-neutral, i.e., the broader transformative power of the technology in question, or what Foucault would have called its productivity. How digitization changes the nature of the problem – this non-neutrality has to be distinguished from the moral neutrality or non-neutrality of its uses.

3.2 Divine omniscience: Power is knowledge

In the doctrine of divine omniscience, we do find views corresponding to the more instrumental understanding, such that God uses God's knowledge to influence the course of events. In these accounts, God is an agent who interacts with history like other agents, and God's knowledge enhances God's power in the same way that technological tools enhance human abilities to achieve intended aims. As in a game of chess, God's intricate knowledge of the game and of the other players gives God a unique and decisive advantage.28 If knowledge is power, then more knowledge is more power, and omniscience evokes omnicompetence (if not outright omnipotence). So far, so good, so unspectacular.

What should give us pause is that the instrumental view of knowledge is not the primary angle on divine omniscience in the tradition, and that theologians have seen human freedom as seriously threatened by omniscience even though abuse of power is not a worry commonly raised with regard to God. Nevertheless, theologians have raised contentions against divine omniscience based on the control it exerts, or might be thought to exert, to the verge of determinism. What we can learn from theology is that the moral non-neutrality of the effects of superhuman knowledge might not be the only or indeed the most fundamental non-neutrality involved. We need to think about the "productivity" of technology, beyond – or before – the question of its right use.

From divine omniscience we learn that not only is knowledge power, but more importantly: power is knowledge. Divine omniscience is not only a tool that would intervene in the world in this way or that; instead, it forms the world itself according to its image. Not only is there nothing that exists that God doesn't know – without divine knowledge of it, there wouldn't even be a world. Theologians have argued over the centuries whether God's knowledge of the world precedes God's eternal decree to create the world or the other way around.

Either God knows the world infallibly because God willed all of reality into being. What is true is then true because God willed it to be, and God knows God's will. To stay in the metaphor: God invented the game, laid down the rules, and designed the characters playing it. Since God is in control of the game as its creator from eternity, God already knows the outcome – no wonder that under these assumptions theologians have invariably run into dilemmas between divine foreknowledge and human freedom.29 Even if this conception sees God's knowledge as reflective of being rather than causative, the fact that God knows things to be true infallibly from eternity basically precludes their ability to be otherwise.

Or – as prominent thinkers have stipulated – God's knowledge of the world actually causes the world to be. This even more clearly "productive" understanding of knowledge can be summed up as follows: "God's power is His knowledge. He creates by thinking. Whatever is is sustained immediately by the knowledge of God. […] The mirror passively reflects the objects present. God's knowledge produces them."30 In that case, there is no difference, no double text, because the world that exists is the world in the mind of God.31 "Esse est percipi"32 – to be is to be perceived, or: it is God's knowledge that sustains reality in being.

Whether God's knowledge is seen as causative of the world, or whether it is understood to reflect God's will that brought forth creation, theologians have usually agreed that God's knowledge of the world ontologically precedes its existence, and that divine knowledge and power are co-constitutive, co-extensive, and identical with God's essence33. In other words, we do not need to learn from Foucault34 that knowledge is not just an instrument which confers power over a world, but that power is what generates knowledge and gives it its particular shape.

3.3 Digital game-changing, or: Towards a computational ontotheology?

Doctrine can teach us that at the intersection of power and knowledge, manipulation or abuse is not the only issue. With the "mind of God," the productivity of the data double is immediately apparent; in the case of technology, the productivity may not be quite as stark. But even if the technological knowledge of the world does not create the (whole) world itself, it is still clearly non-neutral to it.

These insights apply, therefore, even when most data is actually not collected in order to manipulate35 anyone (in the sense of moving them towards doing something specific against their will or natural inclination), but to control behavior, i.e., to make it readable and predictable, to account for every variable in it, and to expand the duplicate data world. The latter may even be the most decisive factor because it draws on a self-reinforcing loop: more data generates more power because it generates more reality. Firstly, it expands the shadow universe, not only by adding the respective individual items of data to its archive, but also by thereby expanding it with an infinite number of additional possible combinations, correlations, predictions, and inferences, which in turn yield a lot of additional data, further augmenting the duplicate text.

Secondly, it expands the real world: digital technologies do not just generate a shadow text that is external to the world. As with writing, the generated text is in the world as more concrete objects and artifacts – data, code, algorithms… – which are not just an interpretive layer on reality, but objects with which the "original" world itself then interacts. While the world is duplicated into the digital without being contained in it, the digital itself is in fact contained in the world, populates it, and becomes a part of the world itself, with which the world entertains materially, socially, and culturally relevant relations.36

The duplication into data generates a version of the world in which both problems and their solutions can be precisely described. This is in fact the appeal and the promise of the digital, what makes it so efficient – that its reduced and computable form makes it possible to discern relationships in the data of the duplicated world, to perform operations on it in the form of aggregation, cross-referencing, and analysis, at the end of the day in the hope of managing the world which the data describes. "The paradoxical situation ensues that the border between them cannot be overcome, but in practice always is overcome."37 Technologies of knowledge are not neutral to the world they describe – they are involved in "the reality business"38.

Continuing in the game metaphor39, we can describe the non-neutrality of technology as follows: technology is non-neutral not because it produces good or bad game moves or because it makes good or bad people win the game, but insofar as it puts new pieces on the board within the game, manufactures the board on which the game is played, and fundamentally alters the rules according to which the game is played.

Tech optimists and pessimists alike point to the deeply transformative effects of technology, effects that extend beyond the good or bad intentions of those who apply them: "Change the instruments, and you will change the entire social theory that goes with them"40. Part of this game-changing nature pertains to the change of the very criteria for what can become an object of knowledge: technologies change "the standards governing permissible problems, concepts, and explanations" as well as "the institutional and conceptual conditions of possibility for the generation, transmission, accessibility, and preservation of knowledge"41. Technologies of knowledge do not just expand the range of possibilities for whoever is in control of these knowledges. Extant power structures of course shape the processes and technologies of data extraction and determine what becomes knowledge – an observation from the non-neutrality of technology which adds another aspect to the non-objectivity discussed earlier. Technologies of knowledge engender certain kinds of power relations and certain kinds of subjectivities through the way they mediate reality.42

As technologies change the way we view the world, the way we interact with it, and the ways we make decisions, they engender and shape epistemic possibilities as well as conditions of freedom. "As the advantages of the computational approach to research […] become persuasive […] the ontological notion of the entities they study begins to be transformed. These disciplines thus become focused on the computationality of the entities in their work." Berry even goes so far as to stipulate: "Computationality might then be understood as an ontotheology, creating a new ontological 'epoch' as a new historical constellation of intelligibility."43

The doctrine of omniscience can direct our attention to the fact that technologies of knowledge production are non-neutral to the world because they change the rules of the game. In the next part, I will address more concretely some of the particular ways in which digital technology is non-neutral, and what different kinds of issues come into view once we take this non-neutrality seriously. In particular, I will argue that the contemporary focus on issues of privacy fails to take into account the non-neutrality of digital technologies and is therefore unable to track and account for crucial emergent issues.

This second part of my contribution — why privacy is not the central issue of the digital, and how Luis de Molina’s concept of “middle knowledge” can help us understand that better — can be found here: https://cursor.pubpub.org/pub/reichel-omniscience-ii

Comments
Michael Hemenway: How is the predictive capacity of AI, and its impact on our identity, similar to and different from the social predictions we enact on each other based on patterns/stereotypes/dispositions?
Michael Hemenway: Middle knowledge sounds a little bit like inference in a machine learning system. A selection based on probabilities from a large set of possibilities.
Kate Ott: Agreed! Great point.
Michael Hemenway: If knowledge is always probabilistic, then is there ever such a thing as omniscience as traditionally imagined?
Kate Ott: I wonder how this overlaps with the last of my intro stories related to technological value of unlimited abundance.
Michael Hemenway: Brilliantly articulated!
Michael Hemenway: Technology is not neutral because it is always a participant in the game, no? This is what affordances remind us of.
Michael Hemenway: What about neural networks and the debate around black box models in machine learning? Perhaps the digital is not all duplication?
Michael Hemenway: Does Nassehi differentiate between digitization and digital? Even if so, Is there a difference that matters? The reduction of digital to duplication unsettles me a bit, so I will percolate on that and ask more questions.
Thomas Renkert: In that sense, Nietzsche would have called Christianity the first form of digitalisation in the course of history!
Hanna Reichel: can you help me with the Nietzsche reference?
Florian Höhne: Yes, that makes a lot of sense!
Florian Höhne: Ok… and here I am with you.
Florian Höhne: Here, my second question from above is relevant: Data science can only find patterns in the data, not in the world itself. But in a relation between data and the world that is mutual in the first place, the patterns are inscribed into the world observed. I run regularly (well, I don’t) because an app observes that; I write comments differently on Facebook; I take different routes from A to B, because of, or at least influenced by, the patterns that matter in digital observation.
Clifford Anderson: I take Hanna to be making a distinction between the world as we experience it (i.e. what Husserl termed “the lifeworld”) and scientific observation of the world, which abstracts from the world in order to represent it for a particular purpose.
Florian Höhne: I find the approach of taking insights from the discussion of the doctrine of God to better assess “the digital” fascinating, helpful, and very plausible. Reading this first sub-chapter about why the digital is the radicalization of propositional form raised the following two questions for me: 1) In the light of machine learning and dynamic algorithms, wouldn’t it be plausible to understand the digital rather as the radicalization of procedural knowledge: data processing units don’t know that, they only know how to deal with data. 2) Correct me if I am wrong, but the underlying picture here seems to be a predominantly linear one: the world —> a simplified and complexified duplicate of this very world in the form of (always already interpreted) data + the process of working with such data. I am wondering how fitting this difference between the world and its digital form is, and whether the process of digitalization does not work more mutually in the first place… Does that make sense?
Hanna Reichel: I am intrigued - tell me more about the mutuality you are envisioning here!
Benedikt Friedrich: This concludes with the propositionality account of divine knowledge. But isn’t there also an option to recognize the temporal structure of what we call knowledge? It seems to me that you understand divine knowledge always as some kind of foreknowledge, meaning that the “all” that God knows is the “all” of all times, especially of the future. And I know that this would add another provocative challenge to many theologians, but wouldn’t this problem disappear if we could think of God’s knowledge as temporal?
Hanna Reichel: Yes, the bulk of theological discussion on omniscience has centered on foreknowledge, presumably because it is there where a conflict might be seen between creaturely freedom and divine knowledge. And the classically theistic models of divine knowledge have to reject the temporality of God, because of simplicity vs. difference in God: God’s knowledge has to be identical with God’s essence, therefore it cannot change or be subject to time etc. Open Theism has no qualms with thinking about omniscience in temporal terms, and to leave the future out of it: God has perfect knowledge of all that is — i.e., the past and the present, and therefore God’s knowledge basically grows with time in the same way as the ontological universe expands with its history. But what really trips me up is that Open Theism is at the same time the theology most prone to confusing divine knowledge with the propositional, datafied, statistical account: for them, God makes predictions about the future in precisely the same way as your computer model would: by calculating probabilities and filling in the blanks. Somehow this makes me think they might be more under the spell of contemporary developments than a helpful resource to get theology beyond them. What do you think? Or were you thinking of other models altogether?
Benedikt Friedrich: I think this raises the question of how incommensurable things actually are. There was a rather polemical review of Nassehi’s book problematizing this paradox.
Hanna Reichel: link?/reference?
Benedikt Friedrich: One might also conclude the exact opposite — for example by saying that the world is all an imagination of God. Then you could think of the proposition and the thing unified within God’s mind.
Florian Höhne: …and that would be a way of understanding God’s knowledge as not to be propositional, right?
Benedikt Friedrich: This observation sounds absolutely convincing to me! And it’s both fascinating and scary. My question would be: What could a robust account of God’s alterity mean in this context? Would it mean to dispense with this model of world-duplication or…?
Hanna Reichel: So the theological accounts that point to the ways in which divine knowledge has to be different from human knowledge (e.g. because of the conflicts with simplicity etc.) become somewhat apophatic at that point. They will usually indicate that God’s knowledge has to be simple, non-propositional, and just somehow immediate and intuitive, without being able to describe how so (and probably it’s not really possible to specify it this way)… But maybe the alterity you envision would be more “qualitative”/“particular” than that? I would be intrigued, e.g., to explore what a Christocentric account of omniscience might look like. There are several interesting inroads here, e.g. between “logos” theology and information theory (Niels Gregersen), but I haven’t found quite the right angle yet. WWBS?
Benedikt Friedrich: Maybe this is also a direct result of theologies that are confusing “God” with what is in any way transcendent to me or what is beyond my capabilities to comprehend.
Hanna Reichel: sure. And theologies have been contributing actively to this confusion for centuries…
Frederike van Oorschot: very compelling!
Frederike van Oorschot: I would love to bring your thoughts, Hanna, together with Kate Ott’s description of technology as tool and culture, which offers a slightly different aspect, as I understood it.
Hanna Reichel: if I understand Kate correctly, she also rejects a purely instrumental account. The tools we use MAKE cultures. Yes, I’m interested in that discussion, too
Frederike van Oorschot: Michael, I would be very interested to hear your comment on this description related to your project! I am looking forward to our discussion.
Michael Hemenway: It will be fun to discuss this! I absolutely agree that mythology is part of what drives the investment in big data approaches to machine learning and such. I am grateful that Hanna is challenging the objectivity fallacy that runs rampant in the machine learning discourse. As to the two items you highlighted, Frederike, I cannot argue much with point (1): more data will likely enhance pattern detection. Yet having enhanced pattern detection on data sets that are fundamentally biased will not generate anything useful at all. On (2), I would not argue that large data sets give us superior insights. I would argue that reading alongside machines and large data sets can generate DIFFERENT insights, and that difference can be valuable.
Frederike van Oorschot: Very compelling, I agree completely, Hanna.
Frederike van Oorschot: and thereby constructing the world?