It may be worth asking whether we consider the DH as a set of non-traditional, computational tools that are useful to humanists in the pursuit of their traditional scholarly goals, in both research and pedagogy, or as an entirely new paradigm for knowledge work in the humanities, one that transforms not just the methods and techniques of inquiry and interpretation but the very nature of humanistic inquiry and interpretation. To put it slightly differently: do the DH simply apply computational techniques to traditional data sets, or do they transform the very nature of the data we humanists consider in our investigative and interpretive practices? This transformation is often seen as parallel to the advent of a “fourth paradigm” in human knowledge: data-intensive scientific discovery (http://research.microsoft.com/en-us/collaboration/fourthparadigm/4th_paradigm_book_complete_lr.pdf).

These two views of the DH can be characterized, respectively, as the “translational” and the “transformative” view. Is DH just a translation onto the digital platform of values, methods, and procedures largely shaped in a pre-digital world? Or is DH an entirely new form of generative knowing, one that actively shapes new cognitive values (perhaps more compatible with those of the social and biological sciences) along with new methods of inquiry?

According to the translational view, digital tools must translate onto the digital platform proven, pre-digital philological and historical techniques and methodologies, and integrate them with computational procedures. Data curation is first and foremost data preservation, which also includes preserving the pre-digital cognitive and interpretive modes and frames embedded in analog documents and artifacts. Example: text encoding (sketched below).
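To make the translational view concrete, here is a minimal, hypothetical sketch of TEI-style text encoding in Python (the example passage and attribute values are invented for illustration, and the fragment is not schema-valid TEI). The element names del and add follow TEI conventions for recording a scribal deletion and correction, carrying a pre-digital editorial practice onto the digital platform.

```python
# A minimal sketch of the translational view: a TEI-style encoding that
# carries a pre-digital editorial convention (marking a scribe's deletion
# and correction) onto the digital platform. Illustrative only.
import xml.etree.ElementTree as ET

line = ET.Element("l")  # a verse line, per TEI convention
line.text = "Nel mezzo del "

# The editor records that "camin" was struck through in the source...
deleted = ET.SubElement(line, "del", rend="strikethrough")
deleted.text = "camin"
deleted.tail = " "

# ...and that "cammin" was added above the line.
added = ET.SubElement(line, "add", place="above")
added.text = "cammin"
added.tail = " di nostra vita"

print(ET.tostring(line, encoding="unicode"))
# <l>Nel mezzo del <del rend="strikethrough">camin</del> <add place="above">cammin</add> di nostra vita</l>
```

The point of the sketch is that nothing here is methodologically new: the markup simply preserves, in machine-readable form, distinctions that critical editions have recorded for centuries.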
According to the transformative view, by contrast, “the advent of Digital Humanities implies a reinterpretation of the humanities as a generative enterprise: one in which students and faculty alike are making things as they study and perform research, generating not just texts (in the form of analysis, commentary, narration, critique) but also images, interactions, cross-media corpora, software, and platforms” (Burdick et al., Digital_Humanities). This translates into newly conceived data sets and models for the humanities.

It is worth noting that the generative conception of the DH adopts a scientific-technological model that is rapidly becoming dominant, even hegemonic, in our culture. As Evelyn Fox Keller (Professor Emerita of the History and Philosophy of Science at MIT and a historian of biology) wrote ten years ago in Making Sense of Life (Cambridge: Harvard University Press, 2002, p. 203), we “live and work in a world in which what counts as an explanation has become more and more difficult to distinguish from what counts as a recipe for construction.” The gulf between understanding (the primary cognitive mode of the idiographic humanities) and explaining (the primary cognitive mode of the nomothetic sciences) is widening: or is it? Fox Keller argues that the nature of explanation is itself changing in the age of data-intensive scientific discovery.

This also seems to be the cognitive attitude reflected in this quote from Trevor Owens’ “Defining Data for Humanists”: “as constructed things, data are a species of artifact” (Zoe also picked up on this quote in her post). A similar point is made by Tara Zepel, who sees visualization as a self-standing (sub)discipline within the DH: “Visualization is an entire framework for building, communicating, and most importantly experiencing knowledge.” This point of view is advocated even more radically by the Manifesto: “The theory after Theory is anchored in MAKING.” A constructivist ethos clearly pervades the DH.

Interestingly enough, Owens goes back to textual practices as templates for data modeling conceived as an interpretive practice. Conceived or constructed as artifacts, data (notice the plural), according to Owens, “can be interpreted as texts, and can be computed in a whole host of ways to generate novel artifacts and texts which are then open to subsequent interpretation and analysis… In short, data as text, artifact, and processable information… [is] a multifaceted object which can be mobilized as evidence in support of an argument.” Since “the production of a data set requires choices about what and how to collect and how to encode the information,” these choices can be interpreted as we interpret the argument made in a text (one may ask whether this interpretive or critical attitude toward data sets is peculiar to the humanities). “Humanists can, and should interpret data as an authored work… while a reader-response theory approach to data would require attention to how a given set of data is actually used, understood, and interpreted by various audiences”; and again: “data is not a kind of evidence; it is a potential source of information that can hold evidentiary value.”
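Owens’ observation that producing a data set requires choices about collection and encoding can be made concrete with a small, hypothetical Python sketch (the letter and its dates are invented for illustration): two encodings of the same source embed two different interpretive arguments about what counts as information.

```python
# A hypothetical sketch of Owens' point that encoding choices are
# interpretive acts: the same source yields different "data" depending
# on the decisions the encoder makes.
raw_letter = {"date": "ye 3rd of May, 1712", "text": "I receiv'd yr letter"}

# Choice A: normalize aggressively. Easy to compute over, but it erases
# orthographic evidence a book historian might care about.
normalized = {
    "date": "1712-05-03",              # ISO 8601; the original dating formula is lost
    "text": "I received your letter",  # modernized spelling
}

# Choice B: preserve the witness. Harder to query, but the artifact's
# texture survives, and the normalization is explicit and reversible.
diplomatic = {
    "date_as_written": "ye 3rd of May, 1712",
    "date_normalized": "1712-05-03",
    "text_as_written": "I receiv'd yr letter",
    "text_modernized": "I received your letter",
}
```

Either dictionary is a “data set,” but each encodes an argument about what in the source counts as information, which is precisely what makes data readable, in Owens’ sense, as an authored and interpretable text.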

In conclusion, data modeling in the DH is indeed “transformative”: new types of data and data sets are assembled and analyzed thanks to computational techniques. But it must also be “translational”: interpretive practices typical of the humanities and their various disciplinary fields must be adapted to the new objects (artifacts) of inquiry. A perfect compromise?

Further examples for discussing this (false or true) dichotomy
Translational or Transformative?
Projects of the LitLab at Stanford:
Mapping Galileo
The Salons Project
The DensityDesign sets; see in particular Brain Houses as an example of a transformative panoramic view (based on an existing visual genre or genres, but innovative and “transformative” in its interpretive application).
