Susan Hockey
Professor (emerita) of Library and Information Studies and Director of the School of Library, Archive, and Information Studies at University College London and founding member and chair of the Association for Literary and Linguistic Computing (1984–97)
During my career, I have been involved in a variety of projects and I have worked in a research laboratory, an academic computing service, a library and two different academic departments, all in different institutions. My interests have mostly concentrated on the creation, manipulation and delivery of text-based humanities resources. In this talk I will attempt to show how key issues in some areas have led to new developments in other areas. I will look at some of the challenges facing early users and consider which of these have been to a large extent resolved by developments in technology and which still raise serious intellectual questions.
In broad terms I would characterize the history of digital humanities as a combination of intellectual curiosity and the development and use of tools, resources and know-how to meet the needs of humanities researchers, of funding bodies and, more recently, of libraries. Given the cost of creating and managing resources, how to coalesce these needs into some common reusable applications has presented serious intellectual issues, which are still apparent as more and more information exists only in digital form. Moreover, in some ways, work in digital humanities has been ahead of its time, making it more difficult for other researchers, and funders, to understand why it is needed.
For a long time practitioners in humanities computing formed a fairly small community, which met at regular conferences, where most of the papers were about individual research projects. Intellectual challenges excited this audience, but they also found that they had to take on wide-ranging organizational issues and, with the advent of the Internet, projects like the TEI, which crossed international and cultural boundaries.
The advent of the World Wide Web in the early 1990s enabled almost anyone to be a publisher. It brought in many more potential users and made it much easier for people to promote and publicize their work. With so much information on the Web, but no gatekeeper, it became more difficult for new users to build on existing know-how in the humanities computing community. It was easier to re-invent the wheel. Teaching the next generation became more important, but questions arose about what to teach and what qualifications might be appropriate. Libraries also now play an important role in the delivery of electronic resources, but they, too, have had to adapt to new kinds of user needs and new requirements for metadata.
Elisabeth Burr
Professor of French/Francophone and Italian Linguistics, Universität Leipzig
During the last decade, i.e. ever since I took up my chair at Leipzig University in September 2005, the question of how to establish Digital Humanities at the university has occupied my mind. Right from the start I tried to introduce modules like Corpus Linguistics into traditional curricula and to establish contact with colleagues from Computer Science; I organised a workshop on “Text Markup & Database Design” and, together with other European universities, twice the “Culture and Technology” European Seminar Series. I managed to sign contracts which allowed the exchange of modules with Information Systems and Computer Science, took part in many discussions about Digital Humanities / e-Humanities curricula, organised a meeting on a European Master in Digital Humanities, and was on the board of the Leipzig eHumanities seminar and the eHumanities Innovation Award. In the long run, however, it would seem that projects like the European Summer University or the environment for collaborative annotation and knowledge creation say quite a lot by themselves about teaching and learning the new epistemology of Digital Humanities and thus should be reflected on more closely. In my contribution I will try to recall some of the concepts and challenges which paved my way and draw some tentative conclusions.
Wilhelm Ott
Universität Tübingen and Pagina Publikationstechnologien
"The conventional approach to design of computing tools for the humanities is", as Willard McCarty says in "Humanities Computing" (Palgrave Macmillan 2014, p. 217), "exemplified by the Tübingen System of Text Processing Programs": it "demonstrates a cogent and practical design, which in turn raises questions we need to be asking: Are these the right sort of primitives? Are they at the right level of abstraction?"
The questions asked in, and the material examined by, "digital humanities" or "e-humanities" have of course shifted since 1966, when I started computing in the humanities and was employed by the Computing Center of the University of Tübingen to continue my own research and to provide computer support for other humanities projects.
Nevertheless, even today "close reading" questions and the corresponding tools play an essential role in the text-based humanities. It may therefore be useful to take a closer look at some key insights we have gained from giving technical advice and support to projects from the whole spectrum of text-based humanities research, insights which we have always regarded as important for designing a toolbox for scholarly text data processing.
One of the key features of this toolbox, which Kuno Schälkle and I have developed since 1970, is its consistent modularity: it is made up of a set of programs, each covering and concentrating on a relatively elementary function of text data handling required in the different phases of a text-based project, from data entry through data analysis and data manipulation up to publication in print, on electronic media, or on the web. Each of these programs takes a text data file as its input and writes its output to a new text data file. Parameters may be used to define the details of processing. The individual modules can be combined in almost arbitrary ways in order to provide a solution for each single step to be performed, including problems not foreseen by the developer, and to establish a workflow for the whole project.
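To make the modular principle concrete, here is a minimal sketch in Python (purely illustrative, not TUSTEP or TXSTEP code): each hypothetical module reads a text file, performs one elementary operation whose details are controlled by parameters, and writes a new text file, so that modules can be chained into a workflow. All file names and operations are invented for the example.

```python
# Illustrative sketch only: a Python imitation of the modular
# file-in/file-out principle described above, not TUSTEP/TXSTEP code.
from pathlib import Path


def normalize(infile: str, outfile: str, lowercase: bool = True) -> None:
    """Elementary module: copy a text file, optionally lowercasing it."""
    text = Path(infile).read_text(encoding="utf-8")
    Path(outfile).write_text(text.lower() if lowercase else text, encoding="utf-8")


def select_lines(infile: str, outfile: str, contains: str) -> None:
    """Elementary module: keep only the lines containing a given string."""
    lines = Path(infile).read_text(encoding="utf-8").splitlines()
    kept = [line for line in lines if contains in line]
    Path(outfile).write_text("\n".join(kept) + "\n", encoding="utf-8")


def sort_lines(infile: str, outfile: str, reverse: bool = False) -> None:
    """Elementary module: sort the lines of a text file."""
    lines = Path(infile).read_text(encoding="utf-8").splitlines()
    Path(outfile).write_text("\n".join(sorted(lines, reverse=reverse)) + "\n", encoding="utf-8")


if __name__ == "__main__":
    # A workflow is a chain of modules, each writing a new file
    # that serves as input to the next step (file names are invented).
    normalize("corpus.txt", "step1.txt", lowercase=True)
    select_lines("step1.txt", "step2.txt", contains="herr")
    sort_lines("step2.txt", "result.txt")
```

The point of the sketch is only the design: because every module speaks the same plain-text interface, steps can be recombined freely, including for problems the developer never anticipated.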
The paper will give a short account of how we arrived at this concept very early on, and it will give a short example of its implementation and application in TXSTEP, the XML-based user interface to this proven toolbox.
Fotis Jannidis
Professor of Computational Philology and Modern German Literary History, Universität Würzburg
It is well known that at some point a change in quantity becomes a change in quality too. In the beginning, quantitative text analysis could be viewed as a form of reading support, using basically the same texts a human reader would use, offering insights into, for example, the distributions of interesting phrases or cooccurrences, and allowing the human researcher to zoom in on interesting passages of the text under scrutiny. This view was already a challenge for many traditional scholars, but it shared many basic assumptions with them: the important part of the analysis was done by a human reader. The key concept I am interested in is the shift to the use of very large text collections for textual analysis, which at first looks like just more (much more) of the same, but really changes the rules. It shifts the focal point of the researcher's activity from perusing lists to setting up complex pipelines of natural language processing tools and doing statistical analysis of the outcomes. The competences needed to do this kind of research differ dramatically from those of the older research setups. And there is another change: because at the moment we do not have enough texts of scholarly quality in any language to build such large text collections, a main tenet of philological work is suspended, even if this is understood to be only temporary. The development of this approach to textual analysis was supported by concepts and needs which had arisen in textual studies without any link to the digital. The critique of the canon in literary studies, with its implied call to reassess literary history and to keep an open eye for popular genres, the rise of cultural history in many disciplines in the Humanities, and an interest in everyday culture: all this marks an interest in the non-elite, in the masses, and creates a demand for tools which are able to handle the large amounts of text used in this kind of research. The concept of "distant reading" (Moretti) was, before it went digital, a way of looking at large-scale historical developments by using metadata on the texts and by compiling information laid down in scholarly articles instead of reading everything yourself. The metaphor "distant reading" has been challenged by some, who proposed "macroanalysis" (Jockers), thus emphasizing the analytical part, or "scalable reading" (Mueller), emphasizing the connection to the reading of texts. I will try to cover some of the more prominent steps in the complex history of this concept, but will have to concentrate on the English and German research literature, though it is obvious that the negotiations around this concept can differ quite a lot, depending on the self-conception of the Humanities and other factors in the national and international research community.
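As a hedged illustration of what such a pipeline shift can look like in practice (the abstract itself names no tools; everything below, including the directory name and the crude tokenizer, is an assumption made for the sketch), a minimal Python version might walk a directory of plain-text files, tokenize them, and compute corpus-wide frequency statistics; a real project would substitute proper NLP components and more serious statistical analysis.

```python
# Minimal, purely illustrative pipeline over a plain-text collection.
# Directory name, tokenization and statistics are assumptions for the
# sketch, not part of the research described in the abstract.
import re
from collections import Counter
from pathlib import Path

TOKEN = re.compile(r"\w+", re.UNICODE)


def tokenize(text: str) -> list[str]:
    """Very crude tokenizer; a real pipeline would use proper NLP tools."""
    return [t.lower() for t in TOKEN.findall(text)]


def corpus_statistics(corpus_dir: str) -> tuple[Counter, int]:
    """Aggregate token frequencies over every .txt file in a directory."""
    counts: Counter = Counter()
    n_texts = 0
    for path in Path(corpus_dir).glob("*.txt"):
        counts.update(tokenize(path.read_text(encoding="utf-8", errors="ignore")))
        n_texts += 1
    return counts, n_texts


if __name__ == "__main__":
    counts, n_texts = corpus_statistics("corpus")  # hypothetical directory
    total = sum(counts.values())
    print(f"{n_texts} texts, {total} tokens, {len(counts)} types")
    for word, freq in counts.most_common(20):
        print(f"{word}\t{freq}\t{freq / total:.5f}")  # relative frequency
```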
Geoffrey Rockwell
Professor of Philosophy and Humanities Computing, University of Alberta
Woven through the history of the digital humanities are practices of thinking through the analysis of texts. Analytical tools have been presented as a telescope for the mind that extends our sight. This paper is about the thinking that happens through analytics. I will draw on what tool developers have said about the thinking interaction with the text that their tools enable. I will show experiments in replicating past analytic techniques and how such experiments can be documented through literate programming notebooks. To conclude, I will argue that thinking-through is a more inclusive way of understanding the praxis of the digital humanities.
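As a hedged illustration (these are not Rockwell's own experiments), replicating one of the earliest analytic techniques, a keyword-in-context concordance, can be documented in a single notebook-style cell; the file name and parameters below are invented for the sketch.

```python
# Illustrative notebook-style cell: a minimal keyword-in-context (KWIC)
# concordance, one of the earliest text-analysis techniques.  File name
# and parameters are invented for this sketch.
import re
from pathlib import Path


def kwic(text: str, keyword: str, width: int = 30) -> list[str]:
    """Return each occurrence of keyword with `width` characters of context."""
    lines = []
    for match in re.finditer(re.escape(keyword), text, flags=re.IGNORECASE):
        start, end = match.start(), match.end()
        left = text[max(0, start - width):start].replace("\n", " ")
        right = text[end:end + width].replace("\n", " ")
        lines.append(f"{left:>{width}} | {text[start:end]} | {right}")
    return lines


text = Path("sample_text.txt").read_text(encoding="utf-8")  # hypothetical file
for line in kwic(text, "nature", width=25)[:10]:
    print(line)
```

Keeping the code, its output, and a prose commentary together in one notebook is what makes such a replication both readable and repeatable.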
Hans Walter Gabler
Professor (emeritus) of English Philology and Scholarly Editing, LMU München
While the book remains and will remain with us in which to read the texts of our cultural heritage, the native ground for the scholarly edition is, and will increasingly become, the digital medium: for the digital medium is where the scholarly edition is now, and will yet more comprehensively be, established, circulated, and used. This premise harbours a complex set of challenges. To edit works and their texts in the digital medium differs in manifold ways from the traditional enterprise of scholarly editing in (private) seclusion that culminated, and for many practical purposes ended, in printed books. The scholarly edition of the future as digital edition is to be conceived of, rather, as processually dynamic, relational, and interactive: to open it on and as a web platform is not a making-public of an end product of (pre-public) editorial labour; it is the public beginning of its life as a crossroads of recording, image documentation and diachronic stratification, of texts, of knowledge retrieval and cumulation (vulgo: commentary), and of ongoing, individual as well as communal, research. It is on this spectrum of considerations that I wish to remark, if selectively, in my paper.
Kurt Gärtner
Professor (emeritus) of Older German Philology (Language History), Universität Trier
Looking back at the history of textual criticism, I choose three examples, two from my own field: firstly the ‘Arme Heinrich’ by Hartmann von Aue, secondly the ‘Parzival’ by Wolfram von Eschenbach, both works belonging to classical German literature of around 1200, and thirdly the New Testament. All three examples are strongly connected with the establishment of critical editions, and with the founding fathers of Germanistik, the brothers Grimm and Karl Lachmann.
I will follow more extensively the history of editing the ‘Arme Heinrich’, beginning with the first critical edition by the brothers Grimm and ending with my own edition, the new ways in which it was created in the digital age, and how the transmission has been made available. Much shorter is the history of editing the ‘Parzival’, because the first critical edition by Karl Lachmann, published in 1833, has never been replaced. However, a new critical edition is on its way, making use of all the support DH has to offer.
A short look at present-day New Testament studies should demonstrate how the transmission of the text can be made available today and used for establishing a critical text, opening up new horizons for all those interested in editing the New Testament, although a critical edition should still appear in print as a long-term reliable source for references. This represents the perfection of a model which the editors of the ‘Arme Heinrich’ and the ‘Parzival’ are striving to follow, although the connection of their critical text to the transmission might vary.
Joachim Veit
Professor at Universität Paderborn, Editorial Director of the Carl-Maria-von-Weber-Gesamtausgabe
Musical editing, as a central field of musicology, has been in an ongoing state of change for several decades, and despite the special conditions of "musical objects" it was heavily influenced by the challenges of text editing starting in the 1970s. Nevertheless, the "digital turn" seems to have an even more fundamental and dramatic influence on musical editing and even on musicology in general. The first step of this "turn" was a more technical one: new tools for digital editions of music facilitated the work of editors and allowed them to make editorial decisions more transparent to the reader. At the same time the focus of editing began to shift, because a new interest in the act of writing, copying and printing resulted from intensive use of these new tools. But nothing has proved to have a more revolutionary impact on music editing than the next step of this "turn": the rise of new forms of representing "musical objects". In particular, the development of the new scholarly standard by the Music Encoding Initiative (MEI) leads to new concepts of editing and at the same time provides the basis for new forms of research, not only in the editorial context but also in several other fields of musicology. The relevance of this important step should not be underestimated, and in my paper I shall try to outline a few of the consequences which are already visible on the horizon, without being silent about the huge problems which we still confront on our way into the new "digital world".
Peter Robinson
Professor for Digital Methods in the Humanities, University of Saskatchewan
Scholarly editions, traditionally made by a very few people and looked at by only a very few more, may seem an unlikely starting point for an ambition to change the world. But consider this: almost all we know of the past and present comes to us through documents. Sometimes the documents are paper; increasingly, they are electronic and digital: television news, email messages, Kindle readers, and (of course) the World Wide Web. Scholarly editing is all about documents: how they are made, transmitted, altered, read, republished, understood and misunderstood. That is what we do. A succession of scholars over the last decades, notably Jerome McGann on one side of the Atlantic and Don McKenzie on the other, have broadened the scope of textual scholarship to every text in every document and argued for a shift of textual scholarship to the centre of scholarly discourse. The skills I need to disentangle the texts of Chaucer are the same as those I need to test politicians' statements about immigration. The advent of mass document digitization and digital text encoding offers powerful tools we might use to realize a world where everyone may test every statement, every document. First, the TEI and its partner technologies offer means to encode, link and distribute our understandings (and misunderstandings). Second, the advent of open licensing regimes may open access to knowledge without barriers. However, there are powerful forces acting against this vision: not least, the desire of many academics to retain their privileged ownership of knowledge. This paper will reflect on what we, as scholars and digital humanists, can do within this changing environment.
DH-Concepts
Technische Universität Darmstadt
Institut für Sprach- und Literaturwissenschaft
Dr. Sabine Bartsch
Dr. des. Michael Bender
dh-concepts(at)linglit.tu-darmstadt.de
www.dh-concepts.tu-darmstadt.de
Visiting address:
Landwehrstraße 50A
Gebäude S4|23
64293 Darmstadt
Postal address:
Dolivostraße 15
64293 Darmstadt