Nowadays, Digital Humanities is a booming field, but it has a long history. “Quantitative Literary Studies” at the University of Stuttgart investigates how computational methods have been used for the analysis and interpretation of language and literature since the early 19th century. The German Research Foundation (DFG) has now approved the continuation of the project. In the upcoming funding period, project manager Dr. Toni Bernhart (Institute of Literary Studies, NDL II) wants to explore the pioneering contributions of the philosopher Max Bense and the computer scientists Rul Gunzenhäuser and Theo Lutz to quantitative literary studies.
As early as the 19th century, researchers examined how the aesthetic, formal and content-related peculiarities of texts could be captured and explored through quantification and statistical standardization. But until well into the 20th century, such approaches were considered marginal phenomena in the humanities, often disparaged as “word counting” or simply not taken seriously. Rather, it was representatives of other disciplines who engaged mathematically with literature at the time: Thomas Young, for example, known for his (unsuccessful) attempts to decode Egyptian hieroglyphs with the help of probability calculations, was a physician. Augustus De Morgan, who tried to clarify the authorship of the letters of the Apostle Paul, was a mathematician. The meteorologist Thomas C. Mendenhall dealt with the disputed authorship of Shakespeare's dramas. Literary scholars dismissed such experiments as the curious activities of unskilled dilettantes. It was only in the 20th century, in the course of digitalization, that they themselves increasingly adopted mathematical methods, at first quite simple ones.
Counting machines and “human” computers
With mathematics, stylistic phenomena in the broadest sense can be investigated, i.e. characteristics that are specific to individual poets, epochs or genres. In large projects, researchers initially relied on counting machines or “human computers”, counting and calculating assistants, mostly women. “The efficiency of the research practice as well as the scope of the researched material is amazing,” says project manager Bernhart after completing the initial project phase. Friedrich Kaeding, for example, who created the first frequency dictionary of the German language around 1890, examined eleven million words with the help of up to 1,000 unpaid assistants. This scale was only exceeded in 1972 at the IT department of the University of Hamburg, using what was then state-of-the-art punch-card technology.
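At its core, such a frequency dictionary records how often each word form occurs in a corpus. As a purely illustrative aside, a minimal Python sketch of that counting step might look as follows; the sample sentence is a placeholder, not material from Kaeding's dictionary or from the Stuttgart project.

```python
from collections import Counter
import re

def word_frequencies(text: str) -> Counter:
    """Count how often each word form occurs, ignoring case."""
    words = re.findall(r"\w+", text.lower())
    return Counter(words)

# Placeholder sample text; Kaeding's helpers counted some eleven million words by hand.
sample = "Der Mond ist aufgegangen, die goldnen Sternlein prangen am Himmel hell und klar."
for word, count in word_frequencies(sample).most_common(5):
    print(word, count)
```

What once took up to 1,000 assistants years of manual tallying reduces, in principle, to a few lines of code today, which illustrates the shift in scale that digitalization brought.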
The beginnings of Artificial Intelligence
Two professors from the University of Stuttgart played an extremely important role in the computerization of quantitative literary studies in the 20th century: the mathematician and computer scientist Rul Gunzenhäuser and the philosopher and science theorist Max Bense. In the same environment, the world's first computer-generated lyrical texts were created by Theo Lutz in the late 1950s, which amount to a kind of reversal of direction: if texts can be analyzed with the help of statistical methods, then, so Lutz's idea, it must also be possible to produce texts artificially using statistical methods. This is, in essence, the beginning of Artificial Intelligence (AI).
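Lutz's idea of generating text from statistics can be illustrated, very loosely, by filling a fixed sentence pattern with randomly drawn words. The following Python sketch is a hypothetical illustration only; the word lists and the sentence pattern are invented here and are not Lutz's original vocabulary or program.

```python
import random

# Invented placeholder vocabulary, chosen only for illustration.
subjects = ["THE COUNT", "THE STRANGER", "THE VILLAGE", "THE CASTLE"]
predicates = ["OPEN", "QUIET", "DISTANT", "NEAR"]

def stochastic_line(rng: random.Random) -> str:
    """Fill a fixed sentence pattern with randomly chosen words."""
    return f"{rng.choice(subjects)} IS {rng.choice(predicates)}."

rng = random.Random(1959)  # fixed seed only so the sketch prints the same lines each run
for _ in range(4):
    print(stochastic_line(rng))
```

The point is not the output itself but the reversal of perspective: the same word statistics that serve analysis can, read the other way, serve production.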
In the next project phase, Bernhart wants to investigate the interaction of mathematics, computer science and the humanities in the 1950s and 1960s. This early period in particular was a time when scientists were eager to work together and try out new things, and it represents an important step in the history of science on the threshold of the modern digital humanities. Internationally outstanding examples are the Stuttgart group around Bense, Gunzenhäuser and Lutz, as well as a working group at RWTH Aachen, led by the physicist Wilhelm Fucks, that has so far received hardly any attention. Also planned are a book, “Quantitative Literary Studies”, which is to be published in 2020, and a conference in cooperation with the German Literature Archive (Deutsches Literaturarchiv Marbach) to conclude the project.
The project “Quantitative Literary Studies” was established in 2015 at the Institute of Literary Studies (NDL II) and at the Stuttgart Research Center for Text Studies, both part of the University of Stuttgart, and is associated with the study program “Digital Humanities”. The new funding phase will run until March 2020. Bernhart, who comes from Freie Universität Berlin, was free to choose the university at which to carry out the project. He opted for the University of Stuttgart, “because the excellent reputation of the Institute of Literary Studies, combined with the research focus on digital humanities and the corresponding study program, as well as the proximity to computer science subjects such as Natural Language Processing, provide an ideal working environment.”
Expert Contact:
PD Dr. Toni Bernhart, University of Stuttgart, Institute of Literary Studies (NDL II), phone: +49 (0)711/685 83061, E-Mail