Paper Name: Literary Criticism
Assignment Topic: Digital Humanities
Name: Solanki Pintu V
Sem: 2
Roll No: 31
Enrollment No: PG15101037
Submitted to: M.K. Bhavnagar University
Digital Humanities and Computer-Assisted Literary Criticism
What Is Digital Humanities?

This chapter sets out a definition of “digital humanities,” also known as “humanities computing”: a field of study, research, teaching, and invention concerned with the intersection of computing and the disciplines of the humanities. It is methodological by nature and interdisciplinary in scope; at its core, digital humanities is more akin to a common methodological outlook than an investment in any one specific set of texts or even technologies. It involves the investigation, analysis, synthesis, and presentation of information in electronic form, and it studies how these media affect the disciplines in which they are used, as well as what these disciplines have to contribute to our knowledge of computing.
Yet it is also a social undertaking. It harbors networks of people who
have been working together, sharing research, arguing, competing, and
collaborating for many years. The chapter also traces how digital humanities
went from being a term of convenience to something of a movement, and presents
reasons why digital humanities is associated with English departments.
People who say that the last battles of the computer revolution in
English departments have been fought and won don’t know
what they’re talking about. If our current use of
computers in English studies is marked by any common theme at all, it is experimentation at the most basic level. As a profession, we are just learning
how to live with computers, just beginning to integrate these machines
effectively into writing- and reading-intensive courses, just starting to
consider the implications of the multi-layered literacy associated with
computers.
—Cynthia Selfe, “Computers in English
Departments: The Rhetoric of Technopower”
A New Computer-Assisted Literary Criticism, by Raymond G. Siemens
1. Literary Studies and Humanities Computing: Modeling Points of Intersection
Perhaps the best historical model for documenting the accepted points of
intersection shared by literary studies and humanities computing is that
expressed several decades ago by John Smith in his seminal article, “Computer Criticism.” Within, one finds
computing applications for language and literary studies divided into two
groups based on their resultant products: one consisting of those “in which the computer was used to produce through textual
manipulation conventional aids for future research (dictionaries, concordances,
etc.),” and the other made up of “those in which the computer was used in the actual analysis of
specific works of literature (thematic analyses, stylistic studies, etc.)”.
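Smith’s first category, the conventional research aid, is easy to picture in concrete terms. As a purely illustrative sketch (the code is mine, not Smith’s or Siemens’s, and the sample passage is only for demonstration), the short Python program below builds a simple keyword-in-context concordance of the kind early humanities computing produced “through textual manipulation.”

# Illustrative sketch only: a minimal keyword-in-context (KWIC) concordance,
# the kind of "conventional aid for future research" in Smith's first category.
import re

def concordance(text, keyword, width=30):
    """List every occurrence of `keyword` with `width` characters of context."""
    entries = []
    for match in re.finditer(re.escape(keyword), text, flags=re.IGNORECASE):
        start, end = match.start(), match.end()
        left = text[max(0, start - width):start].rjust(width)
        right = text[end:end + width].ljust(width)
        entries.append(f"{left} [{text[start:end]}] {right}")
    return entries

sample = ("But soft, what light through yonder window breaks? "
          "It is the East, and Juliet is the sun.")
for entry in concordance(sample, "light"):
    print(entry)

A thematic or stylistic study, Smith’s second category, would then interpret the patterns that such aids make visible, rather than merely compile them.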
Escaping my quotation above, but clearly
evident in Smith’s larger argument, is the founding of
each in and on the literary text in electronic form. Indeed, and as the
humanities computing community has reminded itself a number of times, literary
studies is largely defined by its reliance on and its attention to the literary
text, broadly construed: the textual artefact and its intellectual contents.
Not surprisingly, the literary text, in the computing-enabled form in which our community has explored it, has for some time been accepted as the central point
in the relationship between literary studies and computing.
While such a focus has remained constant, not all has been static. Of
note is that the idea of the literary text in its ultimate electronic scholarly
form – the electronic scholarly edition of historical
texts and what we might call the “electronic literature” of contemporary texts – has undergone
considerable change, invention, and reinvention since Smith’s work of the late 1970s. Equally significant is the considerable
rise in acceptance of computing approaches within the literary studies
community since that time.
And yet, even with such change in the electronically cast object of our focus and the increasing acceptance of computing-enhanced approaches, a model with the widespread application and utility of that expressed by Smith, a model that might best assist us in broad-scoped consideration of the changing and increasingly positive relationship between literary studies and humanities computing, has rarely been articulated since Smith’s expression over two decades ago; the several exceptional literary-computing theories that have seen expression of late – such as those that have treated hypertext and its embodiment of literary theoretical principles, narrative studies as it relates to the electronic medium, and other aspects of electronic literary textuality – focus on points of intersection shared by literary studies and computing that are of the utmost importance, to be sure, but operate with a scope considerably less than that of Smith’s work.
2. Computing Tools and Computer Criticism / High and Low Criticism
It is well worth establishing something as basic, and essential, as the
foundation of a general model that allows us to examine the intersection of
humanities computing techniques and the pursuits of those in literary studies
in a broad way, in an environment typified by changing notions of the literary
text and, perhaps, with reference to changing levels of acceptance of
computing-influenced work. Such a foundation is most clearly informed by Smith’s work, but that model does not explicitly take into account the
relationship among the many types of work carried out in the literary studies
community. For this purpose in particular, a model worth presenting alongside
Smith’s is one more recently articulated by
literary/textual scholar Tim William Machan.
In the introduction to his Medieval Literature: Texts and
Interpretation, Machan succinctly expresses a division of literary critical and
scholarly work into two chief categories: what he terms “Lower Criticism,” which is chiefly textual
and bibliographical in nature, and “Higher Criticism,” which is typified by interpretive studies. Lower criticism, Machan
notes, is most “commonly viewed as the more factual or ‘scientific’; it provides numerical,
analytical, and categorical information which is used to define . . . realities”; higher criticism is often seen as “the
spirit which gives life to the letters established by the Lower Criticism; it
is the intellectual and aesthetic activity which, depending on one’s critical viewpoint, reveals, constitutes, or disassembles the
meanings of a text”. As one might expect – and as one who works with either knows –
the relationship between the two is mutually influential, for “without the traditional Lower Criticism’s
constructing of texts, there can be no focus for the theorizing of Higher
Criticism, just as without the traditional Higher Criticism’s interpretation of texts there can be no contexts within which
Lower Criticism can identify facts”. In short, each is
somewhat distinct, but each also necessarily assists in the definition and
development of the other.
Recalling the central role of the electronic literary text in the
intersection of computing and literary studies, it is important also to note
that one such embodiment of that text, the electronic scholarly edition,
occupies an important place when we think about that which both Machan and
Smith address: respectively, the influence of lower criticism on higher
criticism and, further, the influence of humanities computing tools on higher
literary critical concerns in the form of what Smith calls “computer criticism.” In addition to being a
flagship of sorts today for the work of humanities computing in the field of
literary studies, electronic editions of several sorts – primarily dynamic (which combine electronic text and text-analysis
software such that the text indexes and concords itself) and hypertextual
(which use links to facilitate a reader’s interaction
with the apparatus that traditionally accompanies scholarly editions) – represent the culmination of decades of humanities computing work
that has both supported and directly participated in interpretive studies.
Dynamic interaction with a text – a process which is,
essentially, enacting accepted lower critical practices upon a text – is a critical process that duplicates the sorts of tasks that Smith
outlined as making up much of computer criticism; restated, such interaction
is, itself, part of an interpretative process, with the computer enabling the
lower-critical tasks to be carried out swiftly and seamlessly.
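To make the distinction between these two kinds of edition concrete, the toy Python class below sketches what “dynamic” and “hypertextual” behaviour might look like at the smallest possible scale; the class name, the sample lines, and the note are hypothetical illustrations, not drawn from any actual edition project described here.

# Hypothetical sketch of a "dynamic" edition: the text indexes itself on demand,
# and the apparatus is reached by line number (a link) rather than read linearly.
from collections import defaultdict

class DynamicEdition:
    def __init__(self, lines, notes):
        self.lines = lines        # the electronic text, one string per line
        self.notes = notes        # editorial apparatus keyed by line number
        self.index = defaultdict(list)
        for number, line in enumerate(lines, start=1):
            for word in line.lower().split():
                self.index[word.strip(".,;:?!")].append(number)

    def occurrences(self, word):
        """Lower-critical task: report the lines on which a word appears."""
        return self.index.get(word.lower().strip(".,;:?!"), [])

    def note(self, line_number):
        """Hypertextual task: jump from a line to its editorial note, if any."""
        return self.notes.get(line_number, "(no note)")

edition = DynamicEdition(
    ["Gallop apace, you fiery-footed steeds,",
     "Towards Phoebus' lodging."],
    {2: "Phoebus: the sun god; his 'lodging' lies below the western horizon."},
)
print(edition.occurrences("steeds"))  # -> [1]
print(edition.note(2))                # -> the editorial note for line 2

The point of the sketch is simply that the indexing and the linking, Machan’s lower-critical labour, happen on the reader’s behalf, freeing attention for the higher-critical work of interpretation.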
Truly, it is through the electronic scholarly edition that, today, one
can most easily witness the influence of that which is chiefly textual and
bibliographical in nature upon that which is more interpretive by nature – as well as the concomitant influence that schools of interpretation
exert upon that which is bibliographic in nature; this latter point is best
evinced by Schreibman’s paper, second in this
collection, and the former is given considerable support by Best.
Such a meeting and mutual information of high and low critical
endeavours in the electronic literary text is implicit in most papers in this collection
– as is the observation that the electronic scholarly
edition is only one type of such a text; truly, as Schreibman and Best both
note in their consideration of aspects of the edition, even this type of
electronic literary text is undergoing considerable change, reflecting intended
or possible applications well beyond those of earlier-generation editions. At
their very essence, Winder suggests, recent literary critical schools and
methodologies have combined with computing technology to force us to reconsider
aspects of the literary text and its textuality –
aspects not as disparate as one might think, Van Pelt convinces us, from the
meaning that we are able to construct from its contents. Indeed, and as treated
most directly by the contributions of Soules, Rockwell, and Grigar, new forms
of textual narrative and communicative interaction in new electronic literary
texts have themselves opened up previously unavailable points of intersection
between the humanities computing and literary studies communities.
3. Papers Towards a New Computer-Assisted Literary Criticism
The papers of this collection demonstrate well
the broad range of new work in computing-influenced areas of literary
criticism. They suggest a number of things both positive and valuable: that
trends within the literary studies community at large have expanded that
community’s notion of how computing relates to it – both explicitly and implicitly; that, while at times disputed,
there is a strong sense of continuity between past work in humanities computing that addresses literary studies and similar work being carried out at present;
and that there is a strong sense of continued promise for, and easily apparent
value in, work taking place at the intersection of literary studies and
computing.
Expounding and exemplifying the benefits of the electronic edition,
Michael Best’s “The Text of
Performance and the Performance of Text in the Electronic Edition” explores the notion of the “performance
crux” – a moment, puzzling to the director and actors,
that calls for some kind of stage business to justify or explain action – in the surviving texts of many of Shakespeare’s plays. Using the example of such a crux in Romeo and Juliet, he
suggests how a modern, multimedia electronic edition can provide tools for the
reader or actor to explore the possibilities both of the basic text and the
performance that grows from it, ultimately treating the mutual illumination of
text and performance in the dramatic electronic scholarly edition.
In her article, “Computer-mediated Texts and
Textuality: Theory and Practice,” Susan Schreibman
continues this concern with the scholarly electronic edition, beginning with the observation that the majority of literary archives in electronic form have been conceived more as digital libraries than as disquisitions that utilise the medium as a site of
interpretation – tracing this situation to the
underlying philosophy of texts and textuality implicit in TEI-SGML. In her
treatment of electronic textual theory, she urges that our understanding of
electronic texts and textuality deepens as advances in technology allow for the
realization of presentations and readings of electronic textual materials that
could not, previously, be implemented in HTML or SGML.
Beginning with the observation that one high literary critical mode,
French neostructuralism, is built directly on the achievements of structuralism
using electronic means, William Winder’s “Industrial Text and French Neo-structuralism” discusses that mode in the context of its origins in reaction to
French post-structuralist theorization and examines a number of exemplary
approaches to text analysis in this vein. Further, he considers how
computer-assisted accumulation of text-based expertise in the world at large
complements this approach, ultimately concluding that we can anticipate the
direction of critical studies to be radically altered by the sheer size of the
economic stakes implied by a new kind of text, the industrial text which lies
at the centre of an information society.
Exploring further the cross-fertilization of theoretical approaches and
computing is Tamise Van Pelt’s “The Question Concerning Theory: Humanism, Subjectivity, and
Computing.” Within, Van Pelt surveys the shift from
humanist, to anti-humanist, to posthumanist assumptions in literary critical
circles and questions whether today’s computing
environments can still be approached through late twentieth century
anti-humanist theories or whether electronic texts demand new, media-specific
analyses.
Current work in new media, she asserts,
suggests that the dominant discourse on the subject –
the rational individual of the humanistic enlightenment, which gave way to the
constructed subject of the mid-twentieth century (the discourse underlying much
contemporary critical theory) – is being challenged by
an emergent discourse of the posthuman.
Marshall Soules, in his “Animating the Language
Machine: Computers and Performance,” explores how we
consider a recently-emergent type of text – the computer-mediated
writing space – as a unique performance medium with
characteristic protocols. Drawing on contemporary performance theory, literary
criticism, and communication theory, Soules proposes that technologists,
academics, and artists are developing idiomatic rhetorics to explore the
technical and expressive properties of the new “language
machines” and their hypertextual environments.
The role of improvisation, and
its cross-disciplinary protocols, provides a further focus in the discussion of
computing practice and performance.
In “Gore Galore: Literary Theory and Computer
Games,” Geoffrey Rockwell provides a brief history of
another recently-emergent type of text, the computer game, and asserts that such games have not been adequately theorized. Rockwell develops a typology of
computer games and a theory, based on Bakhtin’s poetics
of the novel, that views them as rhetorical artifacts well-suited for critical
study.
Serving as a bookend to this introduction is Dene Grigar’s
examination of the genre of adaptive narrative. In her “Mutability, Medium, and Character,” Grigar
explores the future of literature created for and with computer technology,
focusing primarily on the trope of mutability as it is played out with the new
media. In its speculation about the possibilities of this new genre, the essay explores ways in which we may want to think when developing future theories about literature – and all types of writing – generated by and for electronic environments.