Arts and Humanities / Literature and Literary Theory

Digital Humanities and Scholarship

Description

This cluster of papers explores the intersection of digital technology and the humanities, with a particular focus on literary studies. It covers topics such as digital visualization, text analysis, cultural analytics, markup semantics, and collaborative digital scholarship, and discusses the application of social network analysis and data mining to the understanding of literary history and scholarly editions.

Keywords

Digital Humanities; Literary History; Text Analysis; Cultural Analytics; Markup Semantics; Scholarly Editions; Data Visualization; Social Network Analysis; Text Mining; Collaborative Digital Scholarship

This work provides an analysis of the ways that new media are experienced and studied as the subjects of history, using the examples of early recorded sound and digital networks. In Always Already New, Lisa Gitelman explores the newness of new media while she asks what it means to do media history. Using the examples of early recorded sound and digital networks, Gitelman challenges readers to think about the ways that media work as the simultaneous subjects and instruments of historical inquiry. Presenting original case studies of Edison's first phonographs and the Pentagon's first distributed digital network, the ARPANET, Gitelman points suggestively toward similarities that underlie the cultural definition of records (phonographic and not) at the end of the nineteenth century and the definition of documents (digital and not) at the end of the twentieth. As a result, Always Already New speaks to present concerns about the humanities as much as to the emergent field of new media studies. Records and documents are kernels of humanistic thought, after all, part of and party to the cultural impulse to preserve and interpret. Gitelman's argument suggests inventive contexts for humanities computing while also offering a new perspective on such traditional disciplines as literary history. Making extensive use of archival sources, Gitelman describes the ways in which recorded sound and digitally networked text each emerged as local anomalies that were yet deeply embedded within the reigning logic of public life and public memory. In the end, Gitelman turns to the World Wide Web and asks how the history of the Web is already being told, how the Web might also resist history, and how using the Web might be producing the conditions of its own historicity.
indeed may not be quite possible. But I have no doubt it is worth a try. It has to do with the nature of thought and with one of its uses. It has been traditional to treat thought, so to speak, as an instrument of reason. Good thought is right reason, and its efficacy is measured against the laws of logic or induction. Indeed, in its most recent computational form, it is a view of thought that has sped some of its enthusiasts to the belief that all thought is reducible to machine computability. But logical thought is not the only or even the most ubiquitous mode of thought. For the last several years, I have been looking at another kind of thought (see, e.g., Bruner, 1986), one that is quite different in form from reasoning: the form of thought that goes into the construction not of logical or inductive arguments but of stories or narratives. What I want to do now is to extend these ideas about narrative to the analysis of the stories we tell about our lives: our autobiographies. Philosophically speaking, the approach I shall take to narrative is a constructivist one, a view that takes as its central premise that world making is the principal function of mind, whether in the sciences or in the arts. But the moment one applies a constructivist view of narrative to the self-narrative, to the autobiography, one is faced with dilemmas. Take, for example, the constructivist view that stories do not happen in the real world but, rather, are constructed in people's heads. Or as Henry James once put it, stories happen to people who know how to tell them. Does that mean that our autobiographies are constructed, that they had better be viewed not as a record of what
In this groundbreaking book, Franco Moretti argues that literature scholars should stop reading books and start counting, graphing, and mapping them instead. In place of the traditionally selective literary canon of a few hundred texts, Moretti offers charts, maps and time lines, developing the idea of “distant reading” into a full-blown experiment in literary historiography, in which the canon disappears into the larger literary system. -- Publisher's website.
In the text-based disciplines, psychoanalysis and Marxism have had a major influence on how we read, and this has been expressed most consistently in the practice of symptomatic reading, a mode of interpretation that assumes that a text's truest meaning lies in what it does not say, describes textual surfaces as superfluous, and seeks to unmask hidden meanings. For symptomatic readers, texts possess meanings that are veiled, latent, all but absent if it were not for their irrepressible and recurring symptoms. Noting the recent trend away from ideological demystification, this essay proposes various modes of "surface reading" that together strive to accurately depict the truth to which a text bears witness. Surface reading broadens the scope of critique to include the kinds of interpretive activity that seek to understand the complexity of literary surfaces, surfaces that have been rendered invisible by symptomatic reading.
From the Publisher: Is there a significant difference in attitude between immersion in a game and immersion in a movie or novel? What are the new possibilities for representation offered by the emerging technology of virtual reality? As Marie-Laure Ryan demonstrates in Narrative as Virtual Reality, the questions raised by new, interactive technologies have their precursors and echoes in pre-electronic literary and artistic traditions. Formerly a culture of immersive ideals—getting lost in a good book, for example—we are becoming, Ryan claims, a culture more concerned with interactivity. Approaching the idea of virtual reality as a metaphor for total art, Narrative as Virtual Reality applies the concepts of immersion and interactivity to develop a phenomenology of reading. Ryan's analysis encompasses both traditional literary narratives and the new textual genres made possible by the electronic revolution of the past few years, such as hypertext, electronic poetry, interactive movies and drama, digital installation art, and computer role-playing games. Interspersed among the book's chapters are several interludes that focus exclusively on either key literary texts that foreshadow what we now call virtual reality, including those of Baudelaire, Huysmans, Ignatius de Loyola, Calvino, and science-fiction author Neal Stephenson, or recent efforts to produce interactive art forms, like the hypertext novel Twelve Blue, by Michael Joyce, and I'm Your Man, an interactive movie. As Ryan considers the fate of traditional narrative patterns in digital culture, she revisits one of the central issues in modern literary theory—the opposition between a presumably passive reading that is taken over by the world a text represents and an active, deconstructive reading that imaginatively participates in the text's creation. About the Author: Marie-Laure Ryan is an independent scholar and former software consultant. She is the author of Possible Worlds, Artificial Intelligence, and Narrative Theory and the editor of Cyberspace Textuality: Computer Technology and Literary Theory.
THE PHENOMENOLOGICAL THEORY of art lays full stress on the idea that, in considering a literary work, one must take into account not only the actual text but also, and in equal measure, the actions involved in responding to that text. Thus Roman Ingarden confronts the structure of the literary text with the ways in which it can be konkretisiert (realized). The text as such offers different 'schematised views' through which the subject matter of the work can come to light, but the actual bringing to light is an action of Konkretisation. If this is so, then the literary work has two poles, which we might call the artistic and the aesthetic: the artistic refers to the text created by the author, and the aesthetic to the realization accomplished by the reader. From this polarity it follows that the literary work cannot be completely identical with the text, or with the realization of the text, but in fact must lie halfway between the two. The work is more than the text, for the text only takes on life when it is realized, and furthermore the realization is by no means independent of the individual disposition of the reader, though this in turn is acted upon by the different patterns of the text. The convergence of text and reader brings the literary work into existence, and this convergence can never be precisely pinpointed, but must always remain virtual, as it is not to be identified either with the reality of the text or with the individual disposition of the reader.
Our projects begin by exploring the relationships between citizens and sources - between members and groups of the public and that diverse body of institutions, knowledges, and disciplinary specialists that we term science. We ask questions such as: What do people mean by science? Where do they turn for scientific information and advice? What motivates them to do so? How do they relate this information or advice to everyday experience and to other forms of knowledge? We focus on the diverse encounters with science and expertise that typify everyday experience, a central analytical issue being the construction of authority. Some important prior points must be emphasized:
Ordinarily the word "document" denotes a textual record. Increasingly sophisticated attempts to provide access to the rapidly growing quantity of available documents raised questions about what should be considered a "document." The answer is important for any definition of the scope of Information Science. Paul Otlet and others developed a functional view of "document" and discussed whether, for example, sculpture, museum objects, and live animals could be considered "documents." Suzanne Briet equated "document" with organized physical evidence. These ideas appear to resemble notions of "material culture" in cultural anthropology and "object-as-sign" in semiotics. Others, especially in the U.S.A. (e.g., Jesse Shera and Louis Shores), took a narrower view. New digital technology renews old questions and also old confusions between medium, message, and meaning. © 1997 John Wiley & Sons, Inc.
Over the past two decades or so, historians of science have lamented the limitations of internalist history and celebrated the rise of contextual history. Historians of technology, however, have not accepted the location, by historians of science, of technology within the context of science. Historians of technology see an interaction, rather than contextual dependency. A few historians and sociologists of science and technology are now suggesting 'networks' and 'systems' as the preferred version of the interactive approach, with the interaction occurring not simply between science and technology, but also among a host of actors and institutions. Networks and systems eliminate many categories in favour of a 'seamless web', which may lead to a new appreciation of the complex narrative style.
D. C. Makinson (Worcester College, Oxford), "The paradox of the preface", Analysis, Volume 25, Issue 6, June 1965, Pages 205–207. https://doi.org/10.1093/analys/25.6.205
This second edition of Jay David Bolter's classic text expands on the objectives of the original volume, illustrating the relationship of print to new media, and examining how hypertext and other forms of electronic writing refashion or "remediate" the forms and genres of print. Reflecting the dynamic changes in electronic technology since the first edition, this revision incorporates the Web and other current standards of electronic writing. As a text for students in composition, new technologies, information studies, and related areas, this volume provides a unique examination of the computer as a technology for reading and writing.
SITUATING DOCUMENT DESIGN. What is Document Design? Evolution of the Field: Contextual Dynamics. OBSERVING READERS IN ACTION. How Documents Engage Readers' Thinking and Feeling. The Impact of Poor Design: Thinking about Ourselves as Users of Texts and Technology. Seeing the Text: The Role of Typography and Space. The Interplay of Words and Pictures. RESPONDING TO READERS' NEEDS. What Document Designers Can Learn from Readers. Appendices. Bibliography. Indexes. Colophon.
The way we record knowledge, and the web of technical, formal, and social practices that surrounds it, inevitably affects the knowledge that we record. The ways we hold knowledge about the past -- in handwritten manuscripts, in printed books, in file folders, in databases -- shape the kind of stories we tell about that past. In this lively and erudite look at the relation of our information infrastructures to our information, Geoffrey Bowker examines how, over the past two hundred years, information technology has converged with the nature and production of scientific knowledge. His story weaves a path between the social and political work of creating an explicit, indexical memory for science -- the making of infrastructures -- and the variety of ways we continually reconfigure, lose, and regain the past. At a time when memory is so cheap and its recording is so protean, Bowker reminds us of the centrality of what and how we choose to forget. In Memory Practices in the Sciences he looks at three memory epochs of the nineteenth, twentieth, and twenty-first centuries and their particular reconstructions and reconfigurations of scientific knowledge. The nineteenth century's central science, geology, mapped both the social and the natural into a single time package (despite apparent discontinuities), as, in a different way, did mid-twentieth-century cybernetics. Both, Bowker argues, packaged time in ways indexed by their information technologies to permit traffic between the social and natural worlds. Today's sciences of biodiversity, meanwhile, database the world in a way that excludes certain spaces, entities, and times. We use the tools of the present to look at the past, says Bowker; we project onto nature our modes of organizing our own affairs.
Contents: Introduction. Part I: The Visual Writing Space. The Computer as a New Writing Space. Writing as Technology. The Elements of Writing. Seeing and Writing. Part II: The Conceptual Writing Space. The Electronic Book. The New Dialogue. Interactive Fiction. Critical Theory and the New Writing Space. Part III: The Mind as a Writing Space. Artificial Intelligence. Electronic Signs. Writing the Mind. Writing Culture. Conclusion.
Lulled into somnolence by five hundred years of print, literary analysis should awaken to the importance of media-specific analysis, a mode of critical attention which recognizes that all texts are instantiated and that the nature of the medium in which they are instantiated matters. Central to repositioning critical inquiry, so it can attend to the specificity of the medium, is a more robust notion of materiality. Materiality is reconceptualized as the interplay between a text's physical characteristics and its signifying strategies, a move that entwines instantiation and signification at the outset. This definition opens the possibility of considering texts as embodied entities while still maintaining a central focus on interpretation. It makes materiality an emergent property, so that it cannot be specified in advance, as if it were a pregiven entity. Rather, materiality is open to debate and interpretation, ensuring that discussions about the text's “meaning” will also take into account its physical specificity as well. Following the emphasis on media-specific analysis, nine points can be made about the specificities of electronic hypertext: they are dynamic images; they include both analogue resemblance and digital coding; they are generated through fragmentation and recombination; they have depth and operate in three dimensions; they are written in code as well as natural language; they are mutable and transformable; they are spaces to navigate; they are written and read in distributed cognitive environments; and they initiate and demand cyborg reading practices.
As digital humanists have adopted visualization tools in their work, they have borrowed methods developed for the graphical display of information in the natural and social sciences. These tools carry with them assumptions of knowledge as observer-independent and certain, rather than observer co-dependent and interpretative. This paper argues that we need a humanities approach to the graphical expression of interpretation. To begin, the concept of data as a given has to be rethought through a humanistic lens and characterized as capta, taken and constructed. Next, the forms for graphical expression of capta need to be more nuanced to show ambiguity and complexity. Finally, the use of a humanistic approach, rooted in a co-dependent relation between observer and experience, needs to be expressed according to graphics built from interpretative models. In summary: all data have to be understood as capta and the conventions created to express observer-independent models of knowledge need to be radically reworked to express humanistic interpretation.
If I address the crisis of the humanities in the face of the problem of social technology, I want to do so first of all from the point of view of the United Kingdom, and more particularly from the perspective of the growth and development of cultural studies such as it is in Britain. Specifically, this will be from my own experience at the Centre for Cultural Studies, where, if one believes in origins, the term cultural studies first appeared in its modern manifestation. But this is neither a search for origins nor a suggestion that Birmingham was the only way to do cultural studies. Cultural studies was then, and has been ever since, an adaptation to its terrain; it has been a conjunctural practice. It has always developed from a different matrix of interdisciplinary studies and disciplines. Even in Britain, the three or four places bold enough to say they are offering courses in cultural studies have different disciplinary roots, both in the humanities and the social sciences. There should be no implication in my remarks that Birmingham did it the right way or even that there was any one Birmingham position; indeed, there is no such thing as the Birmingham School. (To hear the Birmingham School evoked is, for me, to confront a model of alienation in which something one took part in producing returns to greet one as a thing, in all its inevitable facticity.) My own memories of Birmingham are mainly of rows, debates, arguments, of people walking out of rooms. It was always in a critical relation to the very theoretical paradigms out of which it grew and to the concrete studies and practices it was attempting to transform. So, in that sense, cultural studies is not one thing; it has never been one thing. In trying to sight the problem of the humanities and social technology from the standpoint of cultural studies, a particular sense of irony takes over insofar as cultural studies in Britain emerged precisely from a crisis in the humanities. Many of us were formed in the humanities; my own degrees are in literature rather than in sociology. When I was offered a chair in sociology, I said, Now that sociology does not exist as a discipline, I am happy to profess it. But the truth is that most of us had to leave the humanities in order to do serious work in
A visible presence for some two decades, electronic literature has already produced many works that deserve the rigorous scrutiny critics have long practiced with print literature. Only now, however, with Electronic Literature by N. Katherine Hayles, do we have the first systematic survey of the field and an analysis of its importance, breadth, and wide-ranging implications for literary study. Hayles' book is designed to help electronic literature move into the classroom. Her systematic survey of the field addresses its major genres, the challenges it poses to traditional literary theory, and the complex and compelling issues at stake. She develops a theoretical framework for understanding how electronic literature both draws on the print tradition and requires new reading and interpretive strategies. Grounding her approach in the evolutionary dynamic between humans and technology, Hayles argues that neither the body nor the machine should be given absolute theoretical priority. Rather, she focuses on the interconnections between embodied writers and users and the intelligent machines that perform electronic texts. Through close readings of important works, Hayles demonstrates that a new mode of narration is emerging that differs significantly from previous models. Key to her argument is the observation that almost all contemporary literature has its genesis as electronic files, so that print becomes a specific mode for electronic text rather than an entirely different medium. Hayles illustrates the implications of this condition with three contemporary novels that bear the mark of the digital. Included with the book is a CD, The Electronic Literature Collection, Volume 1, containing sixty new and recent works of electronic literature with keyword index, authors' notes, and editorial headnotes. Representing multiple modalities of electronic writing - hypertext fiction, kinetic poetry, generative and combinatory forms, network writing, codework, 3D, narrative animations, installation pieces, and Flash poetry - the ELC 1 encompasses comparatively low-tech work alongside heavily coded pieces. Complementing the text and the CD-ROM is a website offering resources for teachers and students, including sample syllabi, original essays, author biographies, and useful links. Together, the three elements provide an exceptional pedagogical opportunity.
A pseudo-autobiographical exploration of the artistic and cultural impact of the transformation of the print book to its electronic incarnations. Tracing a journey from the 1950s through the 1990s, N. Katherine Hayles uses the autobiographical persona of Kaye to explore how literature has transformed itself from inscriptions rendered as the flat durable marks of print to the dynamic images of CRT screens, from verbal texts to the diverse sensory modalities of multimedia works, from books to technotexts. Weaving together Kaye's pseudo-autobiographical narrative with a theorization of contemporary literature in media-specific terms, Hayles examines the ways in which literary texts in every genre and period mutate as they are reconceived and rewritten for electronic formats. As electronic documents become more pervasive, print appears not as the sea in which we swim, transparent because we are so accustomed to its conventions, but rather as a medium with its own assumptions, specificities, and inscription practices. Hayles explores works that focus on the very inscription technologies that produce them, examining three writing machines in depth: Talan Memmott's groundbreaking electronic work Lexia to Perplexia, Mark Z. Danielewski's cult postprint novel House of Leaves, and Tom Phillips's artist's book A Humument. Hayles concludes by speculating on how technotexts affect the development of contemporary subjectivity. Writing Machines is the second volume in the Mediawork Pamphlets series.
In this volume, Matthew L. Jockers introduces readers to large-scale literary computing and the revolutionary potential of macroanalysis--a new approach to the study of the literary record designed for probing the digital-textual world as it exists today, in digital form and in large quantities. Using computational analysis to retrieve key words, phrases, and linguistic patterns across thousands of texts in digital libraries, researchers can draw conclusions based on quantifiable evidence regarding how literary trends are employed over time, across periods, within regions, or within demographic groups, as well as how cultural, historical, and societal linkages may bind individual authors, texts, and genres into an aggregate literary culture. Moving beyond the limitations of literary interpretation based on the close-reading of individual works, Jockers describes how this new method of studying large collections of digital material can help us to better understand and contextualize the individual works within those collections.
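To make the kind of macroanalysis described here concrete, the following is a minimal Python sketch, not Jockers's own pipeline, of counting a linguistic feature across a directory of digitized texts and aggregating the counts by decade; the corpus folder, file-naming scheme, and keyword are hypothetical.

```python
# A minimal macroanalysis sketch: count word frequencies across many digitized
# texts and aggregate by decade. The corpus layout is assumed, not taken from
# Jockers: corpus/1851_moby_dick.txt, corpus/1872_middlemarch.txt, ...
import re
from collections import Counter, defaultdict
from pathlib import Path

CORPUS_DIR = Path("corpus")  # hypothetical directory of plain-text files

def tokenize(text):
    """Lowercase word tokens; a deliberately crude stand-in for real NLP."""
    return re.findall(r"[a-z']+", text.lower())

counts_by_decade = defaultdict(Counter)
for path in CORPUS_DIR.glob("*.txt"):
    year = int(path.name.split("_")[0])          # publication year in file name
    decade = (year // 10) * 10
    counts_by_decade[decade].update(tokenize(path.read_text(encoding="utf-8")))

# Relative frequency of one keyword per decade: quantifiable evidence of a trend.
for decade in sorted(counts_by_decade):
    counter = counts_by_decade[decade]
    total = sum(counter.values())
    print(decade, round(counter["railway"] / total * 1e6, 2), "per million words")
```

The sketch only illustrates the shape of the method (feature extraction, aggregation by metadata, trend comparison); a real study would add tokenization, normalization, and statistical testing appropriate to the corpus.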
How do we think? N. Katherine Hayles poses this question at the beginning of this bracing exploration of the idea that we think through, with, and alongside media. As the age of print passes and new technologies appear every day, this proposition has become far more complicated, particularly for the traditionally print-based disciplines in the humanities and qualitative social sciences. With a rift growing between digital scholarship and its print-based counterpart, Hayles argues for contemporary technogenesis, the belief that humans and technics are coevolving, and advocates for what she calls comparative media studies, a new approach to locating digital work within print traditions and vice versa. She examines the evolution of the field from the traditional humanities and how the digital humanities are changing academic scholarship, research, teaching, and publication. She goes on to depict the neurological consequences of working in digital media, where skimming and scanning, or hyper reading, and analysis through machine algorithms are forms of reading as valid as close reading once was. Hayles contends that we must recognize all three types of reading and understand the limitations and possibilities of each. In addition to illustrating what a comparative media perspective entails, Hayles explores the technogenesis spiral in its full complexity. She considers the effects of early databases such as telegraph code books and confronts our changing perceptions of time and space in the digital age, illustrating this through three innovative digital productions: Steve Tomasula's electronic novel, TOC; Steven Hall's The Raw Shark Texts; and Mark Z. Danielewski's Only Revolutions. Deepening our understanding of the extraordinary transformative powers digital technologies have placed in the hands of humanists, How We Think presents a cogent rationale for tackling the challenges facing the humanities today.
We live in a world, according to N. Katherine Hayles, where new languages are constantly emerging, proliferating, and fading into obsolescence. These are languages of our own making: the programming languages written in code for the intelligent machines we call computers. Hayles' latest exploration provides an exciting new way of understanding the relations between code and language and considers how their interactions have affected creative, technological, and artistic practices. My Mother Was a Computer explores how the impact of code on everyday life has become comparable to that of speech and writing: as language and code have grown more entangled, the lines that once separated humans from machines, analog from digital, and old technologies from new ones have become blurred. My Mother Was a Computer gives us the tools necessary to make sense of these complex relationships. Hayles argues that we live in an age of intermediation that challenges our ideas about language, subjectivity, literary objects, and textuality. This process of intermediation takes place where digital media interact with cultural practices associated with older media, and here Hayles sharply portrays such interactions: how code differs from speech; how electronic text differs from print; the effects of digital media on the idea of the self; the effects of digitality on printed books; our conceptions of computers as living beings; the possibility that human consciousness itself might be computational; and the subjective cosmology wherein humans see the universe through the lens of their own digital age. We are the children of computers in more than one sense, and no critic has done more than N. Katherine Hayles to explain how these technologies define our culture and us. Heady and provocative, My Mother Was a Computer will be judged as her best work yet.
We live in the era of Big Data, with storage and transmission capacity measured not just in terabytes but in petabytes (where peta- denotes a quadrillion, or a thousand trillion). Data collection is constant and even insidious, with every click and every like stored somewhere for something. This book reminds us that data is anything but raw, that we shouldn't think of data as a natural resource but as a cultural one that needs to be generated, protected, and interpreted. The book's essays describe eight episodes in the history of data from the predigital to the digital. Together they address such issues as the ways that different kinds of data and different domains of inquiry are mutually defining; how data are variously cooked in the processes of their collection and use; and conflicts over what can -- or can't -- be reduced to data. Contributors discuss the intellectual history of data as a concept; describe early financial modeling and some unusual sources for astronomical data; discover the prehistory of the database in newspaper clippings and index cards; and consider contemporary dataveillance of our online habits as well as the complexity of scientific data curation. Essay authors: Geoffrey C. Bowker, Kevin R. Brine, Ellen Gruber Garvey, Lisa Gitelman, Steven J. Jackson, Virginia Jackson, Markus Krajewski, Mary Poovey, Rita Raley, David Ribes, Daniel Rosenberg, Matthew Stanley, Travis D. Williams
| Cambridge University Press eBooks
The article is devoted to the possibilities of situating research on the history of language within the context of the new humanities. The reflection covers the digital, engaged, and cognitive humanities, posthumanism, and the artistic humanities. The observations lead to the following conclusions: (1) the history of language should be treated as a subdiscipline of linguistics that serves as a source of knowledge about culture; (2) besides describing and interpreting phenomena relating to language and communication themselves, the history of language should strive to formulate diagnostic and forward-looking conclusions that contribute to solving current problems of people and the world; (3) a necessary premise is to cultivate and deepen the methodological openness of the history of language towards other disciplines and fields of scholarship; (4) in line with the assumptions of the new humanities, the history of language should value qualitative research and, where possible and depending on the research topic, reinforce it with quantitative data; (5) historical-linguistic reflection within the domains of the new humanities confirms that these domains intersect, and are thus complementary and open to a holistic approach; (6) observations engaging the various currents of the new humanities are visible in research on contemporary language, and it is worth extending them to historical inquiry.
Laura Kokko | AVAIN - Kirjallisuudentutkimuksen aikakauslehti
I examine the reception of the author Volter Kilpi (1874–1939) in the Finnish newspaper and periodical press from 1900 to 2019, using methods of close and distant reading on digitized material. I argue that Kilpi's reception forms a self-reinforcing narrative that has shaped readers' horizons of expectation up to the present day, even though the cultural context of the arguments that originally produced his reputation has changed. I illustrate the slow change in Kilpi's reception by analyzing attributes that recur ("difficult"), strengthen ("classic"), and fade ("peculiar") in the reviews. The study offers a cultural-historical perspective on how classic literature has been treated in the press. I approach Kilpi's reputation mnemohistorically, examining how it has been inherited and transformed in cultural memory. I ask what the study of vocabulary reveals about the construction of Kilpi's reputation, and how the phenomena I observe reflect, more broadly, changing conceptions of literature in Finland. Among other things, the analysis shows how the social role of literature has been understood at different times and how this is visible in the choices made in literary histories. The article also demonstrates how digitized press material and the flexible use of digital methods open up new possibilities for studying the long-term reception history of authors. Examining the particular features of Kilpi's extensive reception in the press over the long term, 1900–2019, I show that the view of Kilpi, still repeated today, as a classical author of interest to only a few goes back to the reviews of his debut work (Bathseba, 1900). I illustrate the change in the meanings attached to Kilpi's reception by tracing how the words "omituinen" (peculiar) and "omalaatuinen" (idiosyncratic) fade from texts assessing Kilpi as we approach the present day. The study of Kilpi's long reception thus opens a cultural-historical perspective on how art, in this case literature labelled as classic, has been and is treated in the press.
Natalie Wilkinson | Environmental History
This paper outlines a methodology for creating an educational and informative communication system for non-specialised audiences in order to preserve and pass on the heritage of ideas and practices adopted in the medieval political and administrative sphere. Through the combined use of digital technologies (such as GISs, 3D modelling and virtual tours), historical sources can potentially reveal how political and administrative aspects affected different areas within the medieval city, not just the main seats of power. Bologna, a prestigious medieval university metropolis, is chosen as a case study because of the remarkable wealth of documentation in its archives from the city’s political culture in the Middle Ages. Written historical sources, including documentary and narrative texts, are among the primary tools employed in the study of European medieval urban communities in general. Documentary sources help us understand and reconstruct the complexities of civic administration, urban policies and the economy, as well as how citizens experience them daily. The involvement of citizens in the political and administrative life of late medieval cities is explored through the management and digital processing of historical documentation. Digital humanities tools can facilitate this analysis, offering a perspective that sheds light on the formation of the pre-modern state. Although digital databases and repositories have significantly contributed to preserving and digitally archiving historical sources, these are often aimed exclusively at the academic level and remain underutilised as privileged didactic and educational tools for a broad audience.
Automated story writing has been a subject of study for over 60 years. Today, large language models can generate narratively consistent and linguistically coherent short fiction texts. Despite these advancements, rigorous assessment of such outputs in terms of literary merit—especially concerning aesthetic qualities—has received scant attention. In this paper, we address the challenge of evaluating AI-generated microfiction (MF) and argue that this task requires consideration of literary criteria across various aspects of the text, including thematic coherence, textual clarity, interpretive depth, and aesthetic quality. To facilitate this, we present GrAImes: an evaluation protocol grounded in literary theory; specifically, GrAImes draws from a literary perspective to offer an objective framework for assessing AI-generated microfiction. Furthermore, we report the results of our validation of the evaluation protocol as answered by both literature experts and literary enthusiasts. This protocol will serve as a foundation for evaluating automatically generated microfiction and assessing its literary value.
Bibliographic ontologies are crucial to make the most of networked library metadata, but they show interoperability limitations in the Semantic Web. Following a research study on the subject, this paper presents a possible solution to such limitations by means of a reference ontology (RO) intended to allow integration of different ontologies without imposing a common central one and to overcome limitations of mapping techniques, such as crosswalks and application profiles, most used in interconnecting bibliographic ontologies. Interoperability issues of Resource Description and Access (RDA) and Bibliographic Framework Initiative—BIBFRAME (BF) ontologies are addressed using real-world examples from the Library of Congress (LoC) and Biblioteca Nacional de España (BNE) datasets. For a proof of concept of the RO, this paper is focused on two specific interoperability problems that are not solvable with the usual data transformative techniques: misalignments concerning the definition and representation of Work and Expression classes; and the absence of formalization of properties essential to whole-part relationships, namely transitivity, nonreflexivity and asymmetry. The potential of the RO for solving such problem examples is demonstrated by making in-depth use of Resource Description Framework Schema/Web Ontology Language (RDFS/OWL) semantic reasoning and inference mechanisms, combined with Shapes Constraint Language (SHACL), when restrictions are needed to impose data constraints and validation. The RO innovation consists in the formulation of an independent high-level ontology, through which the elements of different source-ontologies are interlinked without being modified or replaced, but rather preserved, and in using semantic mechanisms to generate additional elements needed to consistently describe the relationship between them.
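To illustrate the kind of formalization the abstract describes, here is a minimal Python sketch using rdflib of how a reference ontology might declare a whole-part property as transitive, irreflexive and asymmetric while aligning, rather than replacing, the source classes; the ro: namespace, property name, and specific alignments are assumptions for illustration, not the paper's actual RO.

```python
# A sketch of a high-level reference ontology (RO) that interlinks BIBFRAME and
# RDA elements without modifying them, and formalizes whole-part semantics.
# The ro: namespace and the chosen alignments are hypothetical.
from rdflib import Graph, Namespace, RDF, RDFS, OWL

RO = Namespace("http://example.org/ro#")                 # hypothetical RO namespace
BF = Namespace("http://id.loc.gov/ontologies/bibframe/") # BIBFRAME
RDA = Namespace("http://rdaregistry.info/Elements/c/")   # RDA classes

g = Graph()
g.bind("ro", RO)

# One high-level whole-part property with the axioms the source ontologies lack.
g.add((RO.hasPart, RDF.type, OWL.ObjectProperty))
g.add((RO.hasPart, RDF.type, OWL.TransitiveProperty))    # part-of chains compose
g.add((RO.hasPart, RDF.type, OWL.AsymmetricProperty))    # no mutual containment
g.add((RO.hasPart, RDF.type, OWL.IrreflexiveProperty))   # nothing is part of itself

# Source-ontology classes are aligned to RO classes, preserved rather than replaced.
g.add((BF.Work, RDFS.subClassOf, RO.Work))
g.add((RDA.C10001, RDFS.subClassOf, RO.Work))            # assumed IRI for the RDA Work class

print(g.serialize(format="turtle"))
```

An OWL reasoner run over data typed against these declarations could then infer the additional part-whole links and detect violations, with SHACL shapes layered on top where hard validation constraints are needed, as the abstract indicates.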
Christopher A. Russell | Bulletin of the Center for Children's Books
Astri Dwi Andriani | Advances in computational intelligence and robotics book series
This chapter explores the transformative role of artificial intelligence in reshaping the landscape of digital content production across text, image, video, and real-time formats. The authors discuss how technologies such as GPT-4, DALL·E 3, and Synthesia enable content creators to accelerate and personalize production processes while addressing ethical implications, copyright concerns, and regulatory challenges. Through analysis of use cases from marketing, journalism, and education, this chapter highlights both the opportunities and potential risks presented by AI-generated content. The authors emphasize the importance of cultural sensitivity, brand reputation, and transparency in integrating AI tools into content strategies. By examining the dynamic interplay between human creativity and machine intelligence, this chapter provides a critical perspective on the future of content creation in an increasingly automated digital ecosystem.
The goals of scholarly editing are limited by what can be accomplished in reality. What can be hoped for or aimed at may be the inspiration, but not the goal, of scholarly editing. Well-argued disagreements among scholars demonstrate that variation in interests, methods, and values for documents, texts, works, history and art both places perfection out of bounds and puts valuable triumphs within reach. Textual evidence requires editorial presentation, which requires intellectual added value. Just as there are many ways to get it wrong, there are many ways to get it right.
| Textual Cultures
STS
Michael Fox | Textual Cultures
The documentary record of interrelated, multimedia literary works is riddled with indefiniteness. The order of their materials is often uncertain, as is the medium that should serve as their point of reference. This article argues that such common and related indefiniteness, like ambiguity and genetic lineage, whether revealed through traditional scholarly methods or computational ones, can be modeled using graph technology. Graph technology, moreover, can be used to model much more about these literary works, from their atomic documentary features to their higher-order features. The article uses as its example Jaime de Angulo’s Old Time Stories, a Modernist American masterpiece consisting of voice, text, and image. Graph modeling, or graph editing, the work results in a fine-grained, computationally accessible representation of it as it really is in all its indefinite and networked nature. Such a representation lends itself to typical hermeneutic investigations enhanced by the power of inferential queries, and it can even serve as an actually authentic source for more quantitative investigations. Most important, like all non-digital natural-language based artifacts, it can also be endlessly modified by future editors without ever giving up from its structure the history of its own making.
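As a rough illustration of what graph editing of such a work could look like, the following Python sketch uses networkx with entirely hypothetical node names and attributes (not the article's own data model): documentary units become nodes, relations such as witnessing and uncertain ordering become typed, attributed edges, and a simple inferential query runs over the result.

```python
# A hypothetical graph model of a multimedia literary work: documentary features,
# a higher-order episode, and explicitly uncertain ordering between materials.
import networkx as nx

G = nx.MultiDiGraph()

# Atomic documentary features: a typescript page, an audio recording, a drawing.
G.add_node("ts_p12", kind="text", medium="typescript")
G.add_node("tape_03", kind="audio", medium="reel-to-reel")
G.add_node("sketch_07", kind="image", medium="ink drawing")

# Higher-order feature: an episode of the work, linked to its witnesses.
G.add_node("episode_fox", kind="episode")
for witness in ("ts_p12", "tape_03", "sketch_07"):
    G.add_edge("episode_fox", witness, relation="has_witness")

# Indefiniteness is modeled rather than resolved away: the order of two materials
# is asserted with a confidence attribute that later editors can revise.
G.add_edge("ts_p12", "tape_03", relation="precedes", certainty=0.4)

# A simple inferential query: every documentary unit reachable from the episode.
print(list(nx.descendants(G, "episode_fox")))
```

Because every assertion is an attributed edge, later editors can add, qualify, or contest relations without erasing earlier states, which is the sense in which the representation keeps the history of its own making.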
| University of Toronto Press eBooks
Abstract: In apartheid South Africa in the 1960s, official discourse about automation and computerization was racially inflected because apartheid racialized the division of labor. In the same era, writers in different countries programmed computers to generate poetry and prose. One of those writers was J. M. Coetzee. Coetzee’s early fiction bears the trace of his even earlier experiments in computer poetry more than hitherto recognized, and engages critically with the peculiar inflection, in its moment, of apartheid discourse on labor.
Aya Ismail Khairy, Mohamed Ali El Araby, Rania Reda Salama | Journal of Design Sciences and Applied Arts
B Shalini | Shanlax International Journal of Arts Science and Humanities