Category: Pre-Workshop Reflection (page 2 of 2)

Authentic [Digital] Manuscripts

Lou Burnard pushes against the notion of digitized-manuscript-as-facsimile, implying that digital humanists should not try to reproduce a material manuscript on a screen, as that involves defining an impossible authenticity, a nebulous “higher purer reality.” Burnard’s vision of a single, structured encoding system for manuscript digitization does, however, include standardizing markup in such a way as to create a more predictable experience with a digitized manuscript. A standard markup practice, if it is widely implemented, has the potential to become practically invisible over time as users begin to take it for granted. If we learn to expect certain manuscript viewing experiences on our screens, then readers might eventually see through markup (or its effects) whenever necessary in order to access the text in a slightly more direct (“authentic”?) way.* It will be interesting to see how our hermeneutic practices change as a result of TEI’s near-ubiquitous use for manuscript encoding.


*Standard markup practices may also increasingly permit users to control how visible, or how invasive, markup appears to be on any digitized text. Amanda Gailey provides one example of how this might work when she mentions giving the viewer the option to turn dialect translations on (accessing searchable text) or off (accessing text-as-written) in the works of Joel Chandler Harris.
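The kind of toggle Gailey describes maps naturally onto TEI's `<choice>` element, which pairs a reading as written (`<orig>`) with a regularised form (`<reg>`), leaving the choice of which to display to the interface. A minimal sketch, with an illustrative dialect spelling of the sort found in Harris's texts:

```xml
<!-- A dialect spelling paired with its regularised form.
     A display layer can show either reading: <orig> for the
     text-as-written, <reg> for searchable regularised text. -->
<p>
  I'm <choice>
        <orig>gwine</orig>
        <reg>going</reg>
      </choice> home.
</p>
```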

The two readings raise some crucial points concerning editing practices. Gailey’s piece, in particular, shows how TEI is not only a useful tool in practical terms but also (and, in some respects, more importantly) contributes to creating a “theory of the text” (132). In other words, TEI helps us think about both formal issues concerning the text (e.g., what is a line or a stanza? Should we ‘regularise’ the language of the ‘original’ text?) and substantive questions about the text, i.e. what a text is. Digitised texts and critical editions encourage us to think of the text as fluid. Moreover, digital critical editions exhort us to reconsider the ‘authority’ of the editor(s), as they offer readers not only the opportunity to interpret the text but also to suggest or propose different editorial solutions.
Burnard’s piece focuses on the varied nature of the information a digital critical edition may provide through its markup system. In particular, the markup system takes into account compositional (i.e., rather formal) features of a text, contextual features, and interpretative features. The markup translates human interpretation of the text into a set of codes. Text encoding thereby also helps us think more theoretically about both formal and more content-related elements of the text.

Pre-Workshop Tasks

1. Read this article: Lou Burnard. “On the hermeneutic implications of text encoding.” Domenico Fiormonte and Jonathan Usher (eds.) New Media and the Humanities: Research and Applications. Oxford: Humanities Computing Unit, 2001. 31-38. http://users.ox.ac.uk/~lou/wip/herman.htm
2. Read this article: Amanda Gailey. “A Case for Heavy Editing: The Example of Race and Children’s Literature in the Gilded Age.” Amy E. Earhart and Andrew Jewell (eds.) The American Literature Scholar in the Digital Age. University of Michigan Press, 2011. (Attached to the email you received.)
3. Once you have read the articles, post a short (100 word or so) reflection on them on our CampusPress class website, under the tab “Pre-Workshop Reflections.” Feel free to comment on others’ reflections, too. Select “Pre-Workshop Reflections” as the category of your post so it appears here!
