"To think a world without thought": Negotiating Speculative Realism in a Digital Humanities Practice

Abstract

In his classic study of the structure, function, and behavior of the human hand, John Napier states that "[i]magination is basic to tool-making. All human-made tools start off as chunks of undifferentiated material, which are then shaped according to some cerebral blueprint of what is required" (1993, 99). These words can be productively repurposed to describe the digital humanities practice of tool making, tool use, and modelling. After all, making, shaping, and using tools for analysis, archiving, and visualisation has become central to humanistic research methods. Napier's "cerebral blueprint" blends imaginative and technical ways of making in a manner that emulates how digital humanists seek out new insights alongside technological development. As increasing numbers of digital humanists hone their literacy in programming languages to build and remake tools, I argue that the means by which we describe this particular philosophy of tool use will become an increasingly thorny issue that may even hinder the usefulness of the knowledge produced by the digital humanities. Therefore, digital humanists will need to find ways to negotiate this role in the humanities and better define their critical agency within the history of epistemology. Additionally, the collaborative development of tools requires a theoretical framework that is critical of the value of data derived from literature in a purely instrumental way and is able to redefine research artifacts in the humanities to include digital tools.

 

Dans son étude classique de la structure, de la fonction et du comportement de la main humaine, John Napier affirme que "l’imagination est à la base de la fabrication des outils. Tous les outils fabriqués par l’humain sont, au départ, des morceaux de matériaux indifférenciés, qui sont ensuite façonnés selon un certain schéma cérébral de ce qui est requis" (1993, 99). Cette affirmation peut être réutilisée de façon productive pour décrire la fabrication, l’utilisation et le modelage des outils, selon la pratique des humanités numériques. Après tout, fabriquer, façonner et utiliser des outils pour l’analyse, l’archivage et la visualisation est maintenant au cœur des méthodes de recherche en sciences humaines. Le "schéma cérébral" de Napier combine des moyens imaginatifs et techniques de fabrication qui imitent la façon dont les humanistes numériques recherchent de nouvelles idées en même temps que le développement technologique. À mesure qu’un nombre de plus en plus élevé d’humanistes numériques perfectionnent leurs connaissances des langages de programmation pour bâtir et refaire des outils, j’argumente que les moyens par lesquels nous décrivons cette philosophie particulière d’utilisation des outils deviendront une question de plus en plus épineuse qui pourrait même gêner l’utilité des connaissances produites par les humanités numériques. Par conséquent, les humanistes numériques devront trouver des moyens de négocier ce rôle dans les humanités et de mieux définir leur mandat critique au sein de l’histoire épistémologique. De plus, un développement collaboratif d’outils nécessite un cadre d’applications théorique qui est essentiel à la valeur des données dérivées de la littérature de façon purement instrumentale, et qui est capable de redéfinir les artéfacts de recherche dans les humanités afin d’y inclure les outils numériques.

Keywords

philosophy, distant reading, materiality, speculative realism, instrumentality, empiricism

How to Cite

Mauro, A. (2014). "To think a world without thought": Negotiating Speculative Realism in a Digital Humanities Practice. Digital Studies/le Champ Numérique, 5(1). DOI: http://doi.org/10.16995/dscn.52


Introduction

In his classic study of the structure, function, and behavior of the human hand, John Napier states that "[i]magination is basic to tool-making. All human-made tools start off as chunks of undifferentiated material, which are then shaped according to some cerebral blueprint of what is required" (1993, 99). These words, I wish to argue, can be productively repurposed to describe the digital humanities (DH) practice of tool making, tool use, and modelling. After all, making, shaping, and using tools for analysis, archiving, and visualisation has become central to humanistic research methods. Napier's "cerebral blueprint" blends imaginative and technical ways of making in a manner that emulates how digital humanists seek out new insights alongside technological development. As increasing numbers of digital humanists hone their literacy in programming languages to build and remake tools, I argue that the means by which we describe this particular philosophy of tool use will become an increasingly thorny issue that may even hinder the usefulness of the knowledge produced by the digital humanities. Therefore, digital humanists will need to find ways to negotiate this role in the humanities and better define their critical agency within the history of epistemology. The collaborative development of tools requires a theoretical framework that is critical of the value of data derived from literature in a purely instrumental way and redefines research artifacts in the humanities to include digital tools.

There is currently a division in DH regarding the assumed value of quantitative analysis, which has been criticised for eliding cultural or political arguments and sidestepping traditional close reading techniques. The term "distant reading," first articulated by Franco Moretti, has come to describe the process of computational analysis of literary texts (2000, 56), but it has also become emblematic of a perceived distance from the cultural, social, and political origins of literary art. Matthew Jockers’s Macroanalysis (2013) has recently added to Moretti’s original description with a greater emphasis on a larger scale of analysis that questions the value of literary interpretation predicated on close reading a small selection of exemplary texts. As a further critique of canons and dominant high culture, distant reading holds the promise of an objective view of cultural thought over time. Meanwhile, Alan Liu has predicted, in his Debates in the Digital Humanities essay "Where Is Cultural Criticism in the Digital Humanities?," that an emphasis on quantitative analysis at the expense of cultural criticism will "block the digital humanities from becoming a full partner of the humanities" (2012, 492). With a view to breaking down any false dichotomy between pure instrumentalism and subjective speculative thought, Liu follows Willard McCarty by suggesting that understanding how the computer is more than "just a tool" will open DH to the kind of cultural criticism that is needed in the field (2012, 498). The cultural importance of computing in the humanities stems from both the function and cultural value of tools more generally, which suggests that computing tools have a historical and cultural significance bound up with the analytical tasks they are called to perform. In other words, the critical infrastructure of the digital humanities must be predicated on a reflexive critical awareness of the technical limitations and assumptions of the tools at hand. However, the rise of the digital humanities has also gone hand in hand with a decline in the stylistic excesses of the theoretical paradigm that preceded it. With an awareness that methodological experimentation is supplanting theoretical speculation, this article will describe the most recent theoretical and philosophical contexts of two competing philosophies that inform so much DH practice. But, rather than reworking any philosophical shorthand of terminology that is so often derided as mere jargon, the purpose of this article is to translate this recent theoretical discourse on handedness and tool use into a DH context.
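
At its most elementary, the computational analysis that "distant reading" names can be sketched as counting rather than reading. The following is a minimal, illustrative sketch in Python only, not a description of Moretti's or Jockers's actual methods; the corpus directory, file pattern, and output are hypothetical.

    # A hypothetical sketch of distant reading as aggregation: count word
    # frequencies across a corpus without ever "reading" a single text.
    from collections import Counter
    from pathlib import Path
    import re

    corpus_dir = Path("corpus")   # hypothetical folder of plain-text novels
    counts = Counter()

    for text_file in corpus_dir.glob("*.txt"):
        words = re.findall(r"[a-z']+", text_file.read_text(encoding="utf-8").lower())
        counts.update(words)

    # The "reading" happens at a scale no individual reader could manage.
    print(counts.most_common(20))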

Speculative materialism

While the ranks of philosophers in North America and Europe have yet to engage fully with DH tools, digital humanists are situated at the edge of a philosophical debate regarding tool use. On the one hand, the speculative realists—led by Alain Badiou, Quentin Meillassoux, and Graham Harman—have found that the absolute meaning and clarity afforded by the analytic tools of mathematics and computer science have also produced a viable alternative to a metaphysics predicated on a correlation between language, thinking, and being.[1] In other words, computation is undermining the phenomenological basis of ontology; or, in still other words, proving one’s existence no longer requires human perception because complex tools can verify these facts independently of our experience. The absolute truths made possible by, for example, radiocarbon dating or interplanetary sciences—scientific truths that Meillassoux has termed "the ancestral" (2008, 10)—do not require direct human experience and are therefore outside the domain of the metaphysical correlation between language and being. The "ancestrality" of geological or cosmic timescales requires us "to think a world without thought," says Meillassoux, "a world without the givenness of the world" (2008, 28). By grasping science and philosophy simultaneously, Meillassoux imagines a "speculative materialism" (2008, 121) that does away with the subjective agency in the speculative understanding of ontology. On the other hand, an early defence of the poststructuralist tradition has been mounted by Alexander Galloway in a recent issue of Critical Inquiry, where he describes how "[s]oftware is the thorn in the side of contemporary philosophy" (2013, 359) because so much of our real life experience is now bound up with "the logic of mathematical disciplines" through technology (2013, 359). Galloway then argues that speculative realist philosophy, which is predicated on natural laws, scientific proofs, and mathematical certainty, is "dangerous" because it "abdicates the political decision" (2013, 365).[2] For this reason, Galloway also argues in his short monograph, The Interface Effect (2012), for a theory of new media that demands that "the medium of the computer is being" (Galloway 2012, 21). Attending to the nested relationship between computer programs and hardware that makes the computer so much more than a singular medium, Galloway argues that the current headway made in the humanities with computational tools represents, first and foremost, a further metaphysical exploration of the human ontological condition. So, the debate unfolds like this: either the digital tools that allow us to think through humanist questions are a correlate of human thought and are therefore ontological in nature, or DH tools are radically independent of us and offer a more objective view of our research and reveal real truths unhindered by human limitations and biases. While I acknowledge that this dichotomy is crude, I would like to argue that this rough dialectic dissolves through the diminution of the physical manifestations of computing and the textual basis of code as language that finds expression in the real world. The UNIX "whoami" command may superficially confirm Galloway’s ontological metaphors, but Moretti’s observation on the "sheer enormity of the task" of reading world literature completely is a perfect analogue of Meillassoux’s ancestral timescale (2000, 55).
Between hardware systems and the layers of software required to operate a DH tool or codebase, there is a supplementarity of tools upon tools that obfuscates the agents and actions of computation. I wish to situate this interplay between human and instrumental agents in a digital humanities methodology within a theoretical continuum of thought originating in the twentieth century.
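
That supplementarity can be made concrete with a minimal sketch, assuming a Unix-like system where the whoami binary is available: even the simplest identity query passes through an external binary, a standard library, an interpreter, and an operating system before it can answer.

    import getpass      # standard library call that defers to the OS user database
    import platform     # reports the interpreter and kernel this script depends on
    import subprocess   # lets Python invoke the external `whoami` binary itself

    # The operating system's answer, via the UNIX binary named in the text.
    print(subprocess.run(["whoami"], capture_output=True, text=True).stdout.strip())
    # The same answer, mediated by a library, an interpreter, and an operating system.
    print(getpass.getuser())
    # Some of the layers that must already exist before either line can "know" anything.
    print(platform.python_implementation(), platform.python_version(), "on", platform.system())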

The commonly cited inception of this debate on the value of digital prototypes originates from a comment made by Lev Manovich at the 2007 Digital Humanities conference. Manovich reportedly claimed that "a prototype is a theory," and that digital humanities practitioners need to "stop apologizing" for these prototypes. A short time later, Alan Galey and Stan Ruecker described how this perspective on designing and making things—whether they are instrumental tools or contingent prototypes—is an "ethos of thinking through making" (2010, 407). Through the profound ability of computer technology to function as a modelling machine, this rather pragmatic hermeneutical approach allows for a prototype to embody or enact an argument about a text or, in the case of Galey and Ruecker, a simultaneous future and history of the book. In other words, digitizing the past also invents the future. In another essay from the recent Debates in the Digital Humanities collection, Stephen Ramsay and Geoffrey Rockwell offer a departure from this correlation between thinking and making by describing their "materialist epistemology" in defence of "building" as a "scholarly endeavor" (Ramsay and Rockwell 2012, 77). They go on to warn against an overly simplistic pragmatic philosophy of building as a Heideggerean "ready-to-hand" theory (2012, 78). For example, the purity of purpose of Heidegger’s hammer does not easily transfer to more complex technologies. When Heidegger explained in Being and Time (1926) that "[t]he hammering itself uncovers the specific ‘manipulability’ [‘Handlichkeit’] of the hammer" (1926, 98), he was speaking of very simple hand tools used in carpentry. Heidegger was fascinated by dovetailed cabinet drawers and not complex networked computing systems. Because the transparency between hand tools and intention does not easily translate to even twentieth-century technological systems, he would later refer to these systems generally as "mechanisms." For this reason, a tool's lack of transparency is most often noticed when it fails, or, as Ramsay and Rockwell explain, "[w]here there is argument, the artifact has ceased to be a tool and has become something else" (2012, 78). This "something else" is discourse and argument that operates like so much scholarship. Since digital humanists have so many readily available tools, and since so many digital humanists have made passing reference to Heidegger’s theory of tools, I believe it is important to unpack just what this readiness-to-hand means for our sense of building or making as humanistic research.[3] I will then situate materialist thought within this new speculative realism, offer an account of Heidegger’s theory, and compare it to current DH theories of tool use.

Outside the (operating) system

Materialist epistemology and speculative materialism appear to be parallel terms in this trend towards using and making things or tools to study text. "Materiality" has become a perennial concern for digital humanists eager to rebuff the assumed transcendence of data and digital technologies, but this materiality also represents an objective stance outside the text and an attempt to access the real beyond sheer discourse. As Lisa Gitelman describes in Always Already New, "[t]he quotation marks around material serve obliquely to interrogate the claims being made. Both critics imply that there is no putting a finger exactly on the matter at hand, if logic is logic, but material is ‘material’" (2006, 91). To continue with Ramsay and Rockwell’s term, this materialist epistemology contains a duplicity that demands that digital tools are at once material in the realist sense and also objects of speculation. Matthew Kirschenbaum has long worked in this hand-in-hand movement between materialism and discourse: firstly, the false assumption that such digital technologies are immaterial, as Kirschenbaum described in his "Editing the Interface" essay, has long been identified as a "tactile fallacy" (2002, 43); secondly, he also writes in the Agrippa chapter of Mechanisms (2008) that "[t]here is always something outside the (operating) system" (2008, 242). While Kirschenbaum uses the example of a rodent chewing on computer cables, it is possible to extend his thinking into the software systems that bother Galloway so much and are inherently reliant on hardware systems that are reliant, in turn, on a vast web of infrastructure and human relationships of work and collaboration. By moving beyond the operating system, as Kirschenbaum insists, this network of reliances gestures towards the cultural, social, and political connections between tools, humanistic inquiry, and lived experience. The manifold materiality of technology has been a constant consideration in DH theory: for example, Katherine Hayles describes in How We Became Posthuman (1999) how a new subjectivity that acknowledges embodiment and discursive manifestations is "constituted by the crossing of the materiality of informatics with the immateriality of information" (1999, 193); Lev Manovich’s The Language of New Media (2001) undermines the "myth of the digital" as mere multimedia and emphasises the computational dimension of new media (2001, 68); Marlene Manoff describes the implications of such materiality for archives in 2004 and libraries in 2006; and Gitelman’s book continues this exploration of new media as a historical object by charting the development of sound recording, computational record keeping, and the web as following congruent development trajectories.

This short history of materiality in DH theory follows the emergence of material culture and new media studies and the endurance of the material study of history. Stephen Greenblatt’s New Historicism was perhaps the first attempt to outstrip the post-modern notion, not without some reductionistic thinking about deconstruction, that culture is merely text. As Greenblatt and Catherine Gallagher explain in Practicing New Historicism (2000), New Historical methods encourage researchers to "identify new objects for study, bring those objects into light of critical attention, and insist upon their legitimate place in the curriculum" (2000, 11). Tools, quite simply, are the new objects of study whose validity DH is now arguing for. Additionally, Bill Brown’s "Thing Theory" takes up this broader trend in both academic and cultural discourses to think through things. "If thing theory sounds like an oxymoron," explains Brown regarding his complicated negotiation between materialism and theory, "then, it may not be because things reside in some balmy elsewhere beyond theory but because they lie both at hand and somewhere outside the theoretical field, beyond a certain limit, as a recognizable yet illegible remainder or as the entifiable that is unspecifiable" (2013, 5). The beyond for Brown and Kirschenbaum is not some gauzy metaphysical realm of the unknown; instead, the beyond returns to the real world and lived experience by considering things as objects of speculative inquiry. It is precisely because the real is perceptible, by human senses or machine sensors, that digital humanities practice must study its effects on both technological and cultural systems. However, the issue for Brown is not the material effects of ideas and ideology. Rather, Brown describes the "ideological effects of the material world and of transformations of it" (2013, 7). This materialism would do more than fetishize things—or tools in the case of DH practice—at the expense of the human subject and the political imperative of humanities research. In Brown’s formulation, this kind of materialism retains the politics of class and economic disenfranchisement from its Marxian roots with Georg Lukács, Theodor Adorno, Herbert Marcuse, and Raymond Williams. Unlike those practitioners of cultural materialism who see material conditions as a symptom of the subject’s political existence, Kirschenbaum’s and Gitelman’s studies of material objects and tools speculate on an escape from the supposedly commonsense dialectic between things and culture. Without swinging the pendulum from economic determinacy to technological determinacy, tools become the imaginative objects that blend speculative and material ways of knowing. Furthermore, the politics of the open source movement, which animates research dissemination models in DH, allow anyone with an internet connection to learn, participate, and create. Rather than languishing in the "rhetoric of seamlessness," as Matt Ratto describes the surface or consumer-level experience of technology, making or prototyping digital objects or tools is a profoundly liberating humanistic enterprise.

The technology of enchantment

The theoretical tradition that has been used to anchor a humanities methodology of tool use spans disciplinary boundaries, including history, anthropology, new media studies, cultural studies, and Marxist traditions of theoretical thought. If DH is to be accepted as a full partner in the humanities as Liu hopes, it is necessary to acknowledge the theoretical lineage of DH method within a longer tradition of humanities practice. Hayles recounts a formulation coined by Liu in conversation as early as 2008, when Liu aptly observed that "[t]hese are not just tools but tools that we think through" (Hayles 2012, 48). By succumbing to a certain rhetoric of seamlessness between tool use and thought, DH risks alienating philosophy and reducing the imaginative and creative acts needed to solve technological challenges. However, there is a remaining schism within the continental tradition of philosophy that I described at the outset of this article. Heidegger has been a central concern for the speculative realist and postmodern perspectives, but Heidegger’s early thought on tools and things is also a common reference point in DH scholarship.

Far from any simplistic pun on the etymology of the digital, digits, and hands, Heidegger’s philosophy of the hand and tool use originates within the German language and the historical context of the Second World War. The purity of expression found within Heidegger’s tool-making philosophy—or what he calls a "readiness-to-hand [Zuhandenheit]" in Being and Time (1926, 103)—describes the relationship between thinking, language, action, and tool use as a primordial expression of Being. For Heidegger, there is a cyclical feedback between these different manifestations of being and the event. For this reason, he appeals to the very beginning of western thought through Parmenides’ didactic poem On Nature, which divides knowing into "the way of truth" in things and "the way of opinion" in discourse. With a ranging analysis of Greek, Roman, and German etymology, he positions German philosophy and language as the direct descendants of this primordial classical origin. If "Man is the animale rationale" (Heidegger 1992, 68), as Heidegger puts it in an albeit sullied Latinate formulation, it is the hand that marks the difference between man and animal: "Man himself acts [handelt] through the hand [Hand]; for the hand is, together with the word, the essential distinction of man. Only a being which, like man, ‘has’ the word, can and must ‘have’ ‘the hand.’ [...] No animal has a hand and a hand never originates from a paw or a claw or a talon" (1992, 80). From this basic ligament between handedness and language—between acting upon things and understanding—Heidegger presumes that "writing, from its originating essence, is hand-writing" and therefore describes the mechanism of printing and the typewriter as an inferior, less essential, form of communication technology: "the typewriter is not really a machine in the strict sense of machine technology, but is an ‘intermediate’ thing, between a tool and a machine, a mechanism" (1992, 85-6). These tools and mechanisms have revealed, according to Heidegger, that "the transformed relation of Being to man, appearing in technology, is of such a kind that Being has withdrawn itself from man and modern man has been plunged into an eminent oblivion of Being" (1992, 86). In particular, the mechanisms for writing threaten the purity of tool making and tool use as the pure expression of mind.

When taken to its logical conclusion, Heidegger's philosophy represents the etymological justification for German supremacy. Delivered during the winter semester of 1942-43 at the University of Freiburg, where Heidegger had earlier served as the Nazi-appointed Rector, the Parmenides lectures are a full expression of how readily the deterministic logic of the presumed clarity of tools lends itself to a history and tradition of essentialist and discriminatory thought. At a time when the Third Reich was murdering those deemed to be lesser humans with the pesticide Zyklon B, the logic used to determine which humans can master technology becomes one of the most important questions of the twentieth century. By treating the wrong technology or the wrong language as a mark of exclusion, Heidegger sets up a hierarchy of access to the privileged origin of western thought that places the German people and language nearest that origin. Following closely on Jacques Derrida’s second Geschlecht essay, "Heidegger’s Hand," which surely emphasizes the political decision in developing an ethics of the treatment of animals, the ethical imperative of humanistic work—or Galloway’s "political decision"—has long entailed a wariness of technology as an agent of inhumanity. Through exploring the "relation of the hand to speech and to thought," Derrida concludes that he cannot "talk about the hand without talking about technology" (2008, 35, 36). By ignoring Derrida and all his linguistic proclivities, Graham Harman attempts to ameliorate Heidegger’s dialectic between man and animal with a casual addendum to the corpus by saying that "even animals encounter entities as entities, and thereby engage in dealings with meaningful signs, however blurrily or inadequately" (2002, 76). After all, Harman’s monograph, Tool-Being (2002), works to wrest philosophy away from concerns with the ontology of language and experience and to implement an Object-Oriented Philosophy that allows material reality to serve as the basis for what is knowable. In other words, Harman claims that human and animal experiences of an object are equivalent, though animals lack the tools necessary to grasp it. It is from this context that we must understand "The Question Concerning Technology" lecture, which Heidegger revised in various iterations from 1945 to 1953. It is a radical departure, I argue, from his philosophy during the war. The essentialism of German linguistic genealogy and Aryan genetics has now been dulled into a return to the more general goals of Being and Time in 1926. The mastery of technology and the gathering powers of the hand become merely a concern with the mastery of tools. However, there remains something mysterious about technology. It is "no mere means," Heidegger reminds us; rather, "technology is a way of revealing." Yet the ability to grasp the world only continues the earlier hierarchy when, in "The Question Concerning Technology" (1954), Heidegger describes this mastery of technology as getting it "intelligently in hand" (1977, 313). The ability to grasp technology intelligently goes, as it were, hand in hand with sentient human experience. Tools are transformative to our understanding of being because tools are capable of doing something or of acting in the real world. The expression of that effect is the expression of their being, according to Harman: "Then the tool is reference; for the tool, to be is to mean" (2002, 25-6).
However, Harman's apologia for Heidegger, when carried into a DH context, risks reproducing a hierarchy of technological literacy or, perhaps, technological dexterity. If doing DH requires us to use or make digital tools, we must be careful not to build a hierarchy that defines those with knowledge of more languages and more systems as better humanists or better humans.

This philosophical tradition drawn from Heidegger contains a central debate within DH theory, but it is also a deeply humanistic question. The digital humanities contain a fundamental tension between the digital object and the humanist subject of inquiry and speculation. Rather than merely exploring the cultural dimension of technology or the material determinants on culture, which is the dialectic proposed by the Heideggerean theory of technology, DH proposes a method of making that enacts the cultural dimension of technology within material constraints. In the words of Bruno Latour, "Humanists are concerned only about humans; the rest, for them, is mere materiality or cold objectivity" (1999, 10). DH is critical of this traditional humanist reflexive speculative process as a self-actualising system. However, software systems also hold the potential to be uncritically self-actualizing. To think of cultural criticism and cultural production as a larger and coherent system, it is helpful to turn to the famed British anthropologist Alfred Gell. Gell, who is responsible for expanding the scope of his field into art and technology, describes how seamlessly beautiful technology poses a two-fold hazard: "the technology of enchantment is founded on the enchantment of technology" (1998, 44). In other words, beautifully made technologies have the tendency to seem like "enchanted vessels of magical power" (1998, 46). This enchantment was most recently on display in the release of Apple’s iPad as a "magical and revolutionary" device, a rhetorical strategy used to forestall hacking, manipulation, or modification of proprietary technology. This kind of marketing of superlatives is possible in a broadening culture that reifies design as the mode and method of creation. In his keynote lecture for the Networks of Design conference in 2008, entitled "A Cautious Prometheus? A Few Steps Toward a Philosophy of Design," Latour describes "a post-Promethean theory of action" that alters the meaning of making and creation by casting creative agency in terms of what he borrows from the sciences as "the precautionary principle" (2008, 4). By embracing the term design, Latour describes how digitization is upending the distinction between form and function, which explains how "hermeneutics have seeped deeper and deeper into the very definition of materiality" (2008, 4). However, design is only ever a process of remaking what has already been made: "In other words, there is always something remedial in design" (2008, 5). There is no ex nihilo or tabula rasa in the beginning of design; there is no Adamic origin for consumer electronics or software systems. Reflexivity, then, is a by-product of collaborative research and development. Thus echoing a postmodern critique of origins together with a technological ethos of collaborative development, Latour claims that Peter Sloterdijk is among the first to present a philosophy that walks this divide at the heart of the digital humanities.

Our tools shape us

As a final addendum to this history of materialism, Sloterdijk shifts the meaning of materialism to include the human, thereby casting humanism within the same constructed or designed artificiality as the tools and things made for speculative thought. During the course of a heated public debate between Jürgen Habermas and Sloterdijk over eugenics, genetic engineering and tool use occupied a central place in German public discourse. The tools used to make humans and material objects were set side-by-side at this time and greatly challenged the ethical limits of prototyping, modelling, and making. Sloterdijk’s provocation also links his philosophy to the more problematic aspects of Heidegger’s fascist vision. "Yes," Latour insists, "humans have to be artificially made and remade, but everything depends on what you mean by artificial and even more deeply by what you mean by ‘making’" (2008, 10). Within the history of the twentieth century, and the terms of the eugenics debate in Germany in particular, the cautiousness of this process of making and remaking is paramount. The unifying thread that runs throughout the speculative realist perspective and the anthropological perspectives of Gell, Latour, and Sloterdijk is a shared desire to remove the dialectic between fact and artifact. If making a thing is an act of imagination, as Napier believes, then a tool, a thing, and an artifact are all objects of speculation.

This is no easy equivalency, however. An object of speculation does not allow for tools to take up an essential, natural, and untroubled meaning. There are two problems with the speculative realist appeal to the clarity of tools, particularly in the context of digital technology and software: on the one hand, code is executable because it exists in a self-actualising system. The code only runs because it is validated by layers of programmed systems that have been written for it to hold logical value down to the binary level. Without the supplementary layers of languages, computer code is worthless. On the other hand, digital technologies are always only products of the culture from which they emerge and are, therefore, still only ever historical discourse. It must then also be said that, in Heidegger’s post-war writings, he continues to ruminate on technological meaning after the atom bomb. In his later essay, "The Thing" (1950), he shifts this essentialism of the hand to the essentialism of the Thing. In the shadow of a mushroom cloud, Heidegger fears what "the explosion of the atom bomb could bring with it. [Modern society] does not see that the atom bomb and its explosion are the mere final emission of what has long since taken place, has already happened" (1971, 166). When thinking of a nuclear bomb, Heidegger suggests, we must think of its development and understand the systems that produced its technology. The creation of the atom bomb by the Manhattan Project could only happen after Enrico Fermi’s so-called Chicago Pile demonstrated a controlled, self-sustaining nuclear chain reaction. The history of technological development is as important as how a technology is used. The explosion is the logical side-effect of making these weapons. The speculative realists are direct inheritors of this thinking of the Thing, but these technological things are firmly within the realm of discourse. Indeed, Alain Badiou’s Logics of Worlds: Being and Event II (2009) may be seen as a contemporary expression of this particular line of postwar thought. The simplicity of Badiou’s thesis—"There are bodies and languages, except that there are truths" (2009, 4)—may elide the underlying complexity of his philosophical gesture. Humans are embodied, which he concedes to phenomenology; humans understand the world through language, which he concedes to deconstruction; but Badiou adds that there are, quite simply, truths. Things exist and the human consciousness can know them without physical senses or faculties of language. While Heidegger simply says, "[t]he jug remains a vessel whether we represent it in our minds or not" (1971, 167), Badiou appeals to a classical logic proof of the materiality of human existence through the existence of atoms. His philosophical gesture represents what he calls "a calculated phenomenology" (2009, 38) or even a "transcendental algebra" (2009, 194). Here, he turns to mathematics to prove material existence without direct human experience: "π(x) = ∑ {Id(b,x) / b ∈ B} is an atom" (2009, 263). For Badiou, it is the calculable definition of existence that allows for his evolution away from deconstruction’s free play of signifiers and endless churning of discourse. Yet, when I consider the precautionary principle, I must argue that this realist turn does not give us a glimpse of the real. After all, Badiou’s calculation is meaningless outside of the discursive context he creates.
The self-reflexive and self-validating systems of algebra that Badiou uses to prove material existence must be seen as a warning to digital humanists to be wary of the self-reflexivity of software systems and the potentially self-validating conclusions of markup and digitisation.
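
The earlier claim that code "only runs because it is validated by layers of programmed systems" can be sketched in miniature. This is an illustration only, not an account of any particular DH tool: a single line of source is inert text until the interpreter compiles it into bytecode, which is in turn meaningful only to the virtual machine and hardware beneath it.

    import dis

    source = "total = 2 + 2"                           # human-readable text, inert on its own
    code_object = compile(source, "<sketch>", "exec")  # one layer down: bytecode for the virtual machine
    dis.dis(code_object)                               # display the lower layer the line depends on

    namespace = {}
    exec(code_object, namespace)                       # only at execution does the text "do" anything
    print(namespace["total"])                          # prints 4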

A coherent and pragmatic humanist methodology must temper the tendency to assume that digital tools are fundamentally instrumental and offer transparent insights. Indeed, these specific technologies carry an implicit philosophy, theory, and ideology that require analysis. This division cuts to the very core of the meaning made with digital tools and engages with a very long philosophical tradition that must be acknowledged. Perhaps, then, it is more productive to argue that digital tools are theoretical insofar as the performance of computer language—the executable expression of code—represents nothing less than the full expression of language made manifest. The digital humanities holds the promise that language and discourse can be an executable expression of writing in the world. Computer code is writing that does something. As the fulfillment of performative language that surpasses J.L. Austin’s How to Do Things with Words and the postmodern critical gesture, code triggers technology to act. In this context, the displacement of agency becomes the new object of cultural critique. The division or blending of human and machine readers is fast becoming a critical concern. Because there are no credible criteria by which coding can be separated from our theoretical understanding of language, it is necessary to incorporate our current best understanding of the performance of language and the digital manifestations of code. The Heideggerean equivalence between tool use and thought has not disappeared, but it also remains unclear if these digital tools and the hands that make them carry a similar hierarchy of privileged access, or, as Latour describes it, the fabrication of scientific facts and technical artifacts (2008, 21). Latour’s critique of the scientific rhetoric of fact and certainty in works like Science in Action (1987) positions him as Meillassoux’s greatest critic, but both thinkers share a desire to delineate the terms of scientific thought. However, Meillassoux moves away from discourse that finds human experience, interests, and concerns at its centre: "All those aspects of the object that can give rise to a mathematical thought (to a formula or to digitization) rather than to a perception or sensation can be meaningfully turned into properties of the thing not only as it is with me, but also as it is without me" (2008, 3). It is remarkable in this context that the philosophical expression of digital humanistic thought removes the human altogether.

The limits of human subjectivity might now be observed by comparing a processor's clock speed with the human time scale. After all, the speed of calculation is radically mismatched with the pace of human experience, as George Dyson has recently observed in Turing’s Cathedral (2012). Meillassoux’s argument stems from a basic thesis: "it is unthinkable that the unthinkable be impossible" (2008, 41). In a world awash in digitised text, a holistic assessment of cultural production on the internet is similarly unthinkable; however, simply because the amount of content available outstrips an individual's ability to understand it does not mean it does not exist. In other words, Meillassoux’s desire to think a world without thought is a useful correlate to distant reading (Moretti 2000, 56), and the need to read without reading. Moretti’s term is born of a desire to read world literature "as a collective system" across languages (2005, 4). While the reliability of reading any genre as a finite system is certainly debatable here, the availability of digitised literary data and computational systems capable of parsing them makes these pursuits at least theoretically possible. Or, perhaps more cautiously phrased, it places such a project within the realm of our imagination. The creative and critical use of computational tools now dictates our ability to assess and understand digital culture. The frequently quoted phrase, "[w]e shape our tools and thereafter our tools shape us," is often ascribed to Marshall McLuhan because he often used it in conversation, but the line was in fact coined by J.M. Culkin (1967). An alternative formulation may be more fruitful: we shape our tools and therefore our tools erase us. While it would be wrong to rely too heavily on such an apocryphal phrase, because it overemphasizes the ontological aspect of tool making, we must accept that our tools also operate on a time scale beyond our experience and are therefore removed from the site of our own speculation. The way McLuhan defines new media as containers for older media may be more useful. In other words, film became an amalgam of photography, audio recording, and theatrical performances. Just as vaudeville, still photography, and audio recording matured into established forms, a new technology emerged to aggregate old technologies into something new. Early print history reworked blackletter typefaces to acknowledge the previous manuscript culture, and now digital media reworks page turns and paper-based metaphors in user interface design: desktops, files, and documents are all manifestations of this technological and cultural absorption of media forms. However, computer systems are so fast and complex that the possibility of achieving something like the craftsman-like handiwork that interested Heidegger is now long past. The very human admission of our own failures in the face of technological systems, or even our own erasure, says something very profound about the human relationship to technology. Such a relationship demands that humanists acknowledge our inability to ever truly master technology and that the meaning we seek through computational tools is always in the making.
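
As a closing illustration of that mismatch of timescales, a back-of-the-envelope sketch: the 3 GHz clock rate and 300-millisecond blink below are assumed figures chosen for illustration, not measurements drawn from Dyson.

    clock_hz = 3_000_000_000   # assumed processor clock: three billion cycles per second
    blink_seconds = 0.3        # assumed duration of a human eye blink

    cycles_per_blink = clock_hz * blink_seconds
    print(f"Cycles elapsed during one blink: {cycles_per_blink:,.0f}")
    # Roughly 900,000,000: close to a billion discrete machine events inside a
    # single moment of human perception, none of them available to experience.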


Notes

[1] Speculative realism is named for a conference of the same name held at the University of London in 2007. The conference was moderated by Alberto Toscano of Goldsmiths College, and featured presentations by Ray Brassier of the American University of Beirut, Iain Hamilton Grant of the University of the West of England, Graham Harman of the American University in Cairo, and Quentin Meillassoux of the École normale supérieure in Paris. Credit for the name "speculative realism" is generally ascribed to Brassier, though Meillassoux concludes After Finitude by bemoaning philosophy’s detour through transcendentalism rather than the realist approach he calls "speculative materialism" (2008, 121).

[2] Tara McPherson’s essay "Why Are the Digital Humanities So White?" (2012) is an excellent example of work that situates computing within history and issues of social justice. By describing the evolution of UNIX and early computing through the context of the U.S. civil rights movement, computing becomes racialized in a way that takes on the social valences of technology and lends political validity to the digital humanities discourse.

[3] While I cannot account for the full contexts in which Heidegger is cited in digital humanities writing, I would like to offer a brief catalogue of instances of his invocation. For example, Alan Liu’s early essay "Transcendental Data" (2004) describes how "web pages" exist in the world in a sense similar to Heideggerean "thrownness" (Liu 2004, 61), which Heidegger uses to mark the existential experience of everyday life; Matt Ratto offers a response to Chalmers and Galani’s easy relationship with Heidegger’s notion of "ready-to-hand" (Ratto 2007, 24); in the proceedings of the 2010 conference entitled "Computational Turn," Yuk Hui frames his analysis of networks and the Internet through Heidegger’s concept of the "world picture," drawn from the ontological changes that follow from a broader Newtonian concept of global society; David Berry frames his edited collection Understanding Digital Humanities by describing "computationality" as a Heideggerean "ontotheology" governed by intelligibility and cognition as metaphysical descriptors of computer code (Berry 2012, 16). The relationship between handedness and the "digital" has recently received attention from Anna Chen in her essay "In One’s Own Hand." Her chronicle of the "cultural fears about the loss or degradation of human embodiment" in expressive mediums of all kinds lends a nostalgic tone to the anxiety associated with all technological shifts. For Chen, the expressiveness of handwriting remains meaningful despite the inherently inconclusive basis of such interpretations. Chen’s essay discusses the cultural and social connotations and historical impacts of handwriting. By discussing the handwriting of Emily Dickinson and Ronald Reagan, Chen argues that "the cultural perception of handwriting" has become more pronounced "in an increasingly technologized world" (2012, 3). While Chen’s essay struggles with the highly subjective interpretations that any reading of handwriting undertakes, she quite rightly identifies how an "embodied presence is embedded within the cultural value that has been historically placed upon handwriting" (Chen 2012, 30).


Works Cited

Badiou, Alain. 2009. Logics of Worlds: Being and Event II. London: Continuum.

Berry, David M., ed. 2012. Understanding Digital Humanities. New York: Palgrave Macmillan.

Brown, Bill. 2013. "Thing Theory." Critical Inquiry 28.1 (Autumn 2001): 1-22. JSTOR.

Chen, Anna. 2012. "In One's Own Hand: Seeing Manuscripts in a Digital Age." Digital Humanities Quarterly. 6.2.

Culkin, J.M. 1967. "A Schoolman’s Guide to Marshall McLuhan." Saturday Review (March 18): 51-53, 71-72.

Derrida, Jacques. 2008. "Heidegger’s Hand (Geschlecht II)." Psyche: Inventions of the Other, Vol. II. Ed. Peggy Kamuf and Elizabeth Rottenberg. Trans. John P. Leavey Jr. and Elizabeth Rottenberg. Stanford: Stanford UP.

Galey, Alan and Stan Ruecker. 2010. "How a Prototype Argues." Literary and Linguistic Computing 25.4: 405-424.

Gallagher, Catherine and Stephen Greenblatt. 2000. Practicing New Historicism. Chicago: U of Chicago P.

Galloway, Alexander R. 2012. The Interface Effect. Cambridge: Polity.

Galloway, Alexander R. 2013. "The Poverty of Philosophy: Realism and Post-Fordism." Critical Inquiry 39.2: 347-366. JSTOR.

Gell, Alfred. 1998. "The Technology of Enchantment and the Enchantment of Technology." Art and Agency. Oxford: Clarendon P, 41-63.

Gitelman, Lisa. 2006. Always Already New. Cambridge: MIT P.

Harman, Graham. 2002. Tool-being: Heidegger and the Metaphysics of Objects. Chicago: Open Court.

Hayles, N. Katherine. 1999. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago: U of Chicago P.

Hayles, N. Katherine. 2012. "How We Think: Transforming Power and Digital Technologies." Understanding Digital Humanities. Ed. David M. Berry. New York: Palgrave, 42-66.

Heidegger, Martin. 1926. Being and Time. Trans. John Macquarrie and Edward Robinson. New York: Harper.

Heidegger, Martin. 1992. Parmenides. Trans. André Schuwer and Richard Rojcewicz. Bloomington: Indiana UP.

Heidegger, Martin. 1971. "The Thing." Poetry, Language, Thought. Trans. Albert Hofstadter. New York: Harper, 165-186.

Heidegger, Martin. 1977. "The Question Concerning Technology." The Question Concerning Technology and Other Essays. Trans. William Lovitt. New York: Garland, 3-35.

Kirschenbaum, Matthew G. 2002. "Editing the Interface: Textual Studies and First Generation Electronic Objects." Text: An Interdisciplinary Annual of Textual Studies 14: 15-51.

Kirschenbaum, Matthew G. 2008. Mechanisms: New Media and the Forensic Imagination. Cambridge: MIT P.

Latour, Bruno. 2008. "A Cautious Prometheus? A Few Steps Toward a Philosophy of Design (with Special Attention to Peter Sloterdijk)." Presented at the Networks of Design meeting in Falmouth, Cornwall, 3 Sept 2008. www.bruno-latour.fr/article.

Latour, Bruno. 1999. Pandora’s Hope: Essays on the Reality of Science Studies. Cambridge: Harvard UP.

Liu, Alan. 2009. "The End of the End of the Book: Dead Books, Lively Margins, and Social Computing." Michigan Quarterly Review 48.4: 499–520.

Liu, Alan. 2004. "Transcendental Data: Toward a Cultural History and Aesthetics of the New Encoded Discourse." Critical Inquiry 31.1 (Autumn 2004): 49-84.

Liu, Alan. 2012. "Where Is Cultural Criticism in the Digital Humanities?" Debates in the Digital Humanities. Ed. Matthew K. Gold. Minneapolis: U Minnesota P, 490-509.

Manoff, Marlene. 2006. "The Materiality of Digital Collections: Theoretical and Historical Perspectives." portal: Libraries and the Academy 6.3: 311-325.

Manoff, Marlene. 2004. "Theories of the Archive from Across the Disciplines." portal: Libraries and the Academy 4.1: 9-25.

Manovich, Lev. 2012. "How to Compare One Million Images?" Understanding Digital Humanities. Ed. David M. Berry. New York: Palgrave Macmillan, 249-278.

Manovich, Lev. 2001. The Language of New Media. Cambridge: MIT P.

McGann, Jerome. 2001. Radiant Textuality: Literature After the World Wide Web. New York: Palgrave.

McPherson, Tara. 2012. "Why are the Digital Humanities so White?" Debates in the Digital Humanities. Ed. Matthew K. Gold. Minneapolis: U Minnesota P, 139-160.

Meillassoux, Quentin. 2008. After Finitude: An Essay on the Necessity of Contingency. Trans. Ray Brassier. London: Continuum.

Meillassoux, Quentin. 2012. The Number and the Siren: A Decipherment of Mallarmé’s Coup de Dés. Trans. Robin MacKay. United Kingdom: Urbanomic.

Moretti, Franco. 2000. "Conjectures on World Literature." New Left Review 1: 54-68.

Moretti, Franco. 2005. Graphs, Maps, Trees: Abstract Models for Literary History. New York: Verso.

Napier, John. 1993. Hands. Revised by Russell H. Tuttle. Princeton: Princeton UP.

Ramsay, Stephen and Geoffrey Rockwell. 2012. "Developing Things: Notes Towards an Epistemology of Building in the Digital Humanities." Debates in the Digital Humanities. Ed. Matthew K. Gold. Minneapolis: U Minnesota P, 75-84.

Ratto, Matt. 2007. "The Rhetoric of Seamlessness: Resources and Future Directions." International Review of Information Ethics 8 (December 2007): 20-27. JSTOR.

Rockwell, Geoffrey and Stéfan Sinclair. 2008. "There’s a Toy in My Essay: Problems with the Rhetoric of Text Analysis (Draft of Chapter of Hermeneuti.ca)." http://www.philosophi.ca/pmwiki.php/Main/TheRhetoricOfTextAnalysis.

