Transitions: A Prologue and Preview of Digital Humanities Research in Canada

Keywords

digital humanities, Canadian research / humanités numériques, transitions, recherche canadienne

How to Cite

Bonnett, J., & Kee, K. (2010). Transitions: A Prologue and Preview of Digital Humanities Research in Canada. Digital Studies/le Champ Numérique, 1(2). DOI: http://doi.org/10.16995/dscn.106


Introduction: In which we present our theme for this issue -- Transitions

It is a truism in the digital humanities, a constant and a good one, that the field is always in a state of transition. Such an observation is not surprising, since the instrument upon which it relies – the computer – is itself in a state of flux. For the moment, the computer remains firmly in the grip of Moore's Law, its computational power increasing exponentially as the decades pass. Scholars, whether they want them or not, are constantly being presented with new paradigms of computing -- be it cloud computing, ubiquitous computing, or high performance computing -- and new tools and markup schemes to express, treat and analyze content. In any publication devoted to the digital humanities, then, it would seem superfluous to mention that change is our constant condition and our constant preoccupation, a trite observation best left unsaid. We sympathize with this view. But when it comes to describing digital humanities scholarship generally, and computationally supported scholarship in Canada particularly, we think it is wrong. In Canada and abroad, a number of important developments have recently emerged that will impinge on the practice and future trajectory of our inter-discipline. They are new, they are important, and they are reflected in the contributions to this issue. They are of sufficient moment and frequency that we feel justified in giving this issue of Digital Studies the thematic stamp it now bears: that of transition.


1.0 Preview: In which we present the objects of our article – Prologue and Preview

1.1 Prologue of the Past

Our first purpose in this introduction is to present a history of digital humanities scholarship in Canada, in essence to provide a brief prologue to the work featured here. The founding of this journal, instantiated in its recently released first issue – New Paths For Computing Humanists – was an institutional transition. Prior to its establishment, there were few domestic outlets for Canadian digital humanists to publish their research. Certainly, they had access to journals situated in their first disciplines, such as English literature and History, but such venues often relied on peer reviewers who were sceptical of scholarship supported by computational methods. In 1996, prompted by the emergence of humanities computing scholarship in the early 1980s, and by the establishment of COCH/COSH (the Consortium for Computers in the Humanities / Consortium pour ordinateurs en sciences humaines) in 1986, Canadian digital scholars established their first vehicle for research dissemination: Computing in the Humanities Working Papers. More recently, with the move of Text Technology to McMaster University in 2002, Canadian researchers gained a second domestic outlet for their work. The advent of Digital Studies marked the emergence of a third. But for our purposes, it also marked the emergence of something else: an opportunity to give, in this its second issue, an account of the tradition of scholarship that has given rise to it. The last major surveys of digital humanities scholarship in Canada were generated in 2002 and subsequently published in the book Mind Technologies (Siemens and Moorman, 2006).

These were valuable surveys, particularly the historical overview provided by Ian Lancashire (Lancashire, 2006). But as the first decade of this century draws to a close, new centers devoted to digital humanities scholarship have emerged, and new initiatives such as the INKE project and Synergies consortium are now being undertaken. There is a need, therefore, to briefly take stock of the current state of digital humanities scholarship in Canada, a mandate this article will undertake to fulfill.


1.2 Preview of Things to Come

Our second purpose is to provide a preview, and here we mean preview in two distinct senses. As is the case in any editorial introduction, our purpose is to provide an overview of the articles featured in this special issue. You will find here an expression of recent work undertaken in Canada. We make no pretence that it is a comprehensive survey. Indeed, there are important domains of the digital humanities that are not included here, including recent work relating to the visualization of text, treatments of cyber-culture, and feminist interpretations of digital culture. We do contend, however, that it is representative of some of the best recent work that Canada has to offer, touching on fields ranging from historical GIS and agent-based simulations to discussions of cyberinfrastructure and the application of RDF schemas to historical text data. Our second sense of preview touches on a different issue. Here, our concern is not primarily the substance of the papers but rather their larger significance. We suggest that in recent years a number of new and important trends have emerged in the digital humanities and related disciplines in the social sciences, trends that will impinge on how scholars treat, aggregate, disseminate and analyze their content. The second purpose of our preview, therefore, is to identify those trends, and to use them in turn as a framework to suggest the significance of the contributions presented here.


2.0 Prologue

2.1 Origins

What is the state of digital humanities scholarship in Canada? The picture has changed significantly in recent years – new programs and projects have been crafted at a remarkable pace. Before embarking on a review of the contemporary situation, however, we need to understand where we began. In the pages that follow, we review the initial digital humanities projects, in both English- and French-speaking Canada, focused on both text analysis and demographic data. We then turn from these progenitors to their inheritors – the major ventures currently underway. Upon this foundation we review, from West to East, the specific scholars at work in Canada today.

A brief history of digital humanities in Canada begins, fittingly, in Québec. At the Programme de recherche en démographie historique (PRDH, or Research Programme in Historical Demography) at the Université de Montréal, formed in 1966, researchers such as Hubert Charbonneau and Jacques Légaré applied the methodologies of the Annales School and historical demography to questions concerning the French population of New France, from its arrival in the seventeenth century onward. Drawing on demographic data from baptism, marriage and burial certificates, they created a computerized population register. The original project has grown into a multi-purpose database, now served on the Web, which is used for a wide variety of research projects in disciplines ranging from medicine to genealogy, and which has produced 200 publications to date.

At the Ontario Institute for Studies in Education, the historian Michael Katz, his colleagues, and his graduate students were similarly inspired by the work of French demographers and those who followed them. Foremost among these was Peter Laslett, whose landmark book, The World We Have Lost (1965), challenged the dominant models of social change in the historical and social-scientific literature. Most historians assumed that the high mobility of mid-twentieth-century populations was a contemporary phenomenon. Katz and others who formed the “Canadian Social History Project” used mainframe computers to apply statistical analyses to census records drawn from the mid-nineteenth century, with a special focus on the linkage of people from one census to another. Echoing similar findings in Europe and the United States, they showed that in places such as Hamilton, Ontario, both social and geographical mobility were higher than expected.

While historians were using mainframes to analyze census records, literary scholars were applying computing to text analysis – studying the content and style of manuscripts and books. In Quebec, Jean-Guy Meunier was one of Canada’s first digital humanists. Starting as a graduate student in the late 1960s, Meunier initiated a three-decade-long research agenda dedicated to computer-supported text analysis, and to the creation of software to support it. From his position at the Université du Québec à Montréal, he helped launch the university’s Centre d’application des médias technologiques à l’enseignement et à la recherche (CAMTER) in 1972. Meunier also formulated the concept of CARAT – “Computer-Assisted Reading and Analysis of Texts” – a research construct that has proven so powerful that the Society for Digital Humanities recently cited Meunier for his role “in establishing CARAT as an area of inquiry unto itself” (SDH-SEMI, 2007). In later years, Meunier’s efforts were reinforced by scholars such as Jacques Ladouceur at Laval and François Daoust and Luc Dupuy at the Université du Québec à Montréal (UQAM), who were respectively responsible for producing software such as DAT (Découpage automatique de textes) and SYREX (Système de reconnaissance des expressions) at Laval, and SATO (Système d'analyse de texte par ordinateur) at UQAM (McCarty, 1986; McCarty, 1989).

The important initiatives instigated by Meunier and succeeding Québécois scholars were paralleled in Ontario with the release of the PRORA concordance software in 1966, and the concurrent publication of Manual for the Printing of Literary Texts by Computer. Developed for the IBM 1401 by Robert Jay Glickman and Gerrit Joseph Staalman, the software was praised in the journal Computers and the Humanities for imposing the “minimum possible knowledge of computers on the part of literary scholars” and for providing “very explicit instructions…to initiate a prospective user of the program” (Lieberman, 1966: 12). Beginning in the 1970s, Ian Lancashire and others at the University of Toronto incorporated databases and other tools into projects such as Records of Early English Drama (REED, which edits and publishes primary source documents relating to early English drama and theatre history), and extended the Early Modern English Dictionaries Database (EMEDD, now the Lexicons of Early Modern English, or LEME) to the Web. Lancashire also brought scholars together through his administrative and organizational efforts, creating the Centre for Computing in the Humanities at the University of Toronto, and helping launch the Consortium for Computers in the Humanities / Consortium pour ordinateurs en sciences humaines (now the Society for Digital Humanities / Société pour l'étude des médias interactifs (SDH/SEMI)) in 1986.


2.1.2 Significant Projects from the Past Decade

The progenitors above are joined today by a host of new centres, programs and projects that span the spectrum of the digital humanities. Many extend the text analysis research begun by Lancashire and others to new domains and new platforms. The most established and influential of these is the Text Analysis Portal for Research (TAPoR), led by Geoffrey Rockwell at the University of Alberta. Originating as a network of six Canadian humanities computing centres, and based at McMaster, TAPoR enables researchers to aggregate and access tools for advanced textual analysis. The community has grown beyond the original contributors thanks to a Web portal accessible to anyone interested in experimenting with the tools. Beyond this infrastructure, TAPoR has also sponsored conferences such as CaSTA, the Canadian Symposium on Text Analysis, and collaborated with journals such as Text Technology.

Open access is also a feature of The Orlando Project, another Canadian digital humanities success story. Founded by Patricia Clements, Isobel Grundy and Susan Brown at the University of Alberta, the project brings together scholars from Canada, the US, the UK and Australia, who have developed what they call a “textbase” of women’s writing – one that integrates readable text with the functionality of a database. Published in both print and electronic form by Cambridge University Press, Orlando: Women’s Writing in the British Isles from the Beginnings to the Present permits users to query the writings of nearly a thousand British writers, as well as “a great deal of contextual historical material on relevant subjects, such as the law, economics, science, writing by men, education, medicine, politics” (The Orlando Project, 2000).

In contrast to long-established projects such as TAPoR and Orlando, the Implementing New Knowledge Environments (INKE) project emerged only this year. Described in detail in an essay in this collection, INKE rests on four research agendas respectively devoted to textual studies, information studies, usability and interface design. Its more fundamental purpose, however, is to answer a simple question: which past practices of the book should be appropriated and adapted to next-generation digital knowledge environments? A second fundamental concern will be to explore the implications of that transfer: how will the act of reading change when text is ensconced in digital environments? Led by Ray Siemens at the University of Victoria, the project is a collaboration of 35 scholars from Canada, the US and the UK, together with 21 agencies.

Providing researchers with open access to the publications of their peers is the goal of Synergies: The Canadian Information Network for Research in the Social Sciences and Humanities. Built from Érudit, a Quebec-based publication platform; the Open Journal Systems (OJS) and Open Conference Systems (OCS), produced by the Public Knowledge Project; and DSpace, produced by MIT Libraries and Hewlett-Packard, the Synergies platform will act as both a research and a dissemination tool. Published articles, papers not yet published, data sets, presentations and electronic monographs will all be accessible to Canadian researchers, providing them with a tool similar in nature to Project MUSE or JSTOR.

If the projects above can trace their roots back to early experiments in text analysis, the Canadian Century Research Infrastructure (CCRI), led by Chad Gaffield at the University of Ottawa, is an outgrowth of the application of computing to census records by historians in the 1960s and 1970s. As Anthony Di Mascio and Adam Green point out in their paper in this issue, the CCRI has brought together researchers in the humanities and social sciences from seven Canadian universities to create interrelated databases built on the Canadian censuses from 1911 to 1951. With this project, data related to the ethnic, religious and geographic histories of individuals, households and communities is becoming accessible in a fundamentally new way.


2.1.3 Description of Digital Humanities Scholarship Today -- Every Region south of the 60th parallel

The leaders of the projects above are joined by an expanding network of colleagues across Canada. Like any survey, the one that follows is limited by the authors’ time and the publisher’s space. Omissions are therefore inevitable and unavoidable. With that apology as prologue, what follows is a brief overview, from West to East.

The University of Victoria has become a major hub of the digital humanities, thanks in part to the pioneering work of Michael Best and Peter Baskerville. Best, the recipient of the Society for Digital Humanities 2009 Award for Outstanding Achievement, is well known for his role in founding the Internet Shakespeare Editions. Placed on-line in 1996, the ISE is, in its own words, dedicated to presenting Shakespeare’s works “in a form native to the medium of the Internet: scholarly, fully annotated texts of Shakespeare's plays, multimedia explorations of the context of Shakespeare's life and works, and records of his plays in performance” (Internet Shakespeare Editions, 2005). Aside from the ISE, Best also contributed his considerable expertise to the development of the TAPoR project, for which he was a principal investigator. Peter Baskerville, now at the University of Alberta, is one of Canada’s most distinguished social and quantitative historians. A co-leader of the CCRI and a member of the Royal Society of Canada, he is also the author, with Chad Gaffield, of prescient articles that predicted the emergence of on-line knowledge portals such as the Synergies platform. Their articles were important because, among other things, they predicted a defining characteristic of knowledge portals: their capacity to support within one repository tasks that traditionally have been supported by separate institutions – tasks relating to knowledge generation, knowledge dissemination, and data storage (Baskerville and Gaffield, 1983-84; Gaffield and Baskerville, 1985; Kondratova and Goldfarb, 2003).

Working from the Department of English, Ray Siemens is continuing the tradition of digital scholarship initiated by Best and Baskerville. In addition to his work with the INKE project, Siemens is the founder of the University of Victoria’s annual Digital Humanities Summer Institute and of the electronic journal Early Modern Literary Studies. He has also extended his considerable energy to organizations such as the Society for Digital Humanities / Société pour l'étude des médias interactifs, where he is currently President (English). His colleague in the Department of History, John Lutz, is the co-founder, with Ruth Sandwell, of the Great Unsolved Mysteries in Canadian History project, which uses murder mysteries from Canadian history to teach students historical method. Lutz has long encouraged historians to look beyond lectures, seminars and assigned readings when considering how they teach. His and Sandwell’s website, which exposes students to primary sources and forces them to wrestle with competing interpretations of the same, stands as a remarkable contribution to history education. Peter Liddell, founding director of the Humanities Computing and Media Centre (HCMC), was an early adopter at UVic, through work that explored the use of computing to support language learning (CALL). Through his directorship of the HCMC, Liddell has been central in supporting important computing initiatives at UVic, including the Internet Shakespeare Editions.

At Vancouver Island University (formerly Malaspina University College), Patrick Dunae – in conjunction with John Lutz and Jason Gilliland – is using historical GIS to explore Victoria’s social history in the late 19th and early 20th centuries. GIS, or Geographic Information Systems, are suites of software that enable researchers to juxtapose attribute data – such as a person’s class or race – with spatial data, such as the locale in which the individual lived. The three are using GIS to support a project titled “Turning Space Inside Out: Racial Space in Victorian Canada.” The purpose of the project is to challenge the prevailing assumption in multiple literatures that racial groups in 19th century Victoria were typically segregated into distinct enclaves. Basing their findings on discourse and statistical analyses, the three argue that “Victoria’s commercial centre and residential districts were more integrated than historians and urban geographers have indicated,” and that “the boundaries of race…may not have been as fixed as we assumed them to be” (Lutz, Dunae, Gilliland, 2009). From his post in the Department of History, Dunae has been a long-time advocate for computer-mediated history research and learning, as editor of The Homeroom: British Columbia’s history of education web site, and is Project Director of viHistory.ca, a searchable database of historical data.
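
For readers unfamiliar with the mechanics, the sketch below (in Python, with entirely hypothetical addresses, coordinates and attributes – it does not reproduce the Victoria datasets) shows the basic operation that underlies this kind of analysis: joining a table of individual-level attribute data to geocoded spatial data, so that social characteristics can be mapped and queried by location.

```python
# Minimal sketch of the core GIS operation described above:
# joining attribute data (class, race, occupation) to spatial data (coordinates).
# All names and values below are hypothetical illustrations.

attributes = [
    {"person_id": 1, "address": "12 Fisgard St", "occupation": "merchant", "origin": "China"},
    {"person_id": 2, "address": "45 Yates St",   "occupation": "clerk",    "origin": "England"},
]

# Geocoded spatial data: address -> (latitude, longitude)
locations = {
    "12 Fisgard St": (48.4289, -123.3657),
    "45 Yates St":   (48.4255, -123.3640),
}

# Join the two tables so each person carries both attribute and spatial data
joined = [
    {**person, "coords": locations[person["address"]]}
    for person in attributes
    if person["address"] in locations
]

for record in joined:
    print(record["origin"], "household at", record["coords"])
```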

Marshall Soules, Department of Media Studies, has also been an important contributor to digital humanities scholarship at VIU. Director of the University’s Media Research Lab, and a researcher who has contributed on a variety of topics ranging from Canadian media theory to graffiti, Soules is now undertaking a project titled The Image Dialogue, which he characterizes as “a web-based application for the study and archiving of political images found in the public sphere in Canada and Cuba” (Soules, 2008).

Across the Georgia Strait at the University of British Columbia, Teresa Dobson, in the Department of Language and Literacy Education, researches digital literacy and engagement, and the integration of digital technologies in secondary and post-secondary humanities education. An INKE researcher, she is also the Director of the new Digital Literacy Centre, which focuses on research and teaching in the use of digital media in literacy education. In the Department of French, Hispanic and Italian Studies / Département d'études françaises, hispaniques et italiennes, William Winder has taken on editorial roles with the journals Text Technology and Computing in the Humanities Working Papers, and administrative roles with SDH/SEMI. Winder’s research addresses the impact of writing technology in French-speaking societies, as well as the implementation and use of computer-assisted interpretation and analysis of language.

Over the Rockies, the digital humanities have flourished at the University of Alberta, thanks in large measure to the founding research and activities undertaken by Stephen Reimer. Reimer was first exposed to the field as a student at the University of Toronto, where he attended some of the earliest humanities computing courses offered in Canada. He subsequently took an active role in promoting humanities computing at the University of Alberta, initially by offering a series of graduate seminars on the topic in the mid-1980s. He also played an important role in recruiting personnel such as Terry Butler and Susan Hockey, and in acquiring computing infrastructure to support initiatives such as the Orlando Project (Siemens, 2009). The late Terry Butler had a hand in many of the programs and projects now thriving there: he founded the university’s Technology for Learning Centre, now the Arts Resource Centre, and helped develop the university’s M.A. in Humanities Computing. He collaborated on TAPoR, Orlando and the Coleridge Notebooks Index Project, and led the TechEdge Project, which focused on the IT competencies of liberal arts graduates.

Patricia Clements, who arrived at the Department of English in 1970, built a research program and team focused on women’s writing and on the development of software tools to support historical and literary analysis, the result of which was the Orlando Project. Another key player in Orlando is Isobel Grundy, now Professor Emeritus in the Department of English and Film Studies. Grundy is a specialist on English women writers from the medieval period to the 18th century. That research focus, combined with her interest in digital research methods, induced Grundy to participate in the creation of Orlando’s online textbase. It also led her to author Vindicating Their Sex: Pre-Victorian Women's Writing in the British Isles, the first volume of The Orlando History of Women's Writing in the British Isles, a three-volume set that will be published next year by Cambridge University Press (Susan Brown is the author of Volume Two, Contradictions and Continuities: Women's Writing in the British Isles, 1820-1890, while Patricia Clements, Jo-Ann Wallace, and Rebecca Cameron will contribute Volume Three, FreeWoman: Modern Women's Writing in the British Isles). Stan Ruecker, also in the Department of English and Film Studies, contributes a research agenda that draws on his background as a book designer and computer systems analyst. A collaborator on several projects, including the INKE Project, the MONK Project (Metadata Offer New Knowledge), the Orlando Project, and the Humanities Visualization Project, he is currently pursuing projects related to human-computer interface design, information visualization, and information design.

In the Department of History and Classics, Sean Gouglas is applying GIS software to support research in domains ranging from archaeology to historical demography, environmental history and the history of medicine. Of particular note, Gouglas is currently undertaking a project that challenges prevailing interpretations of settlement patterns in Western Canada in the 19th century. The literature has traditionally pointed to cultural and social factors as the governing variables, but Gouglas argues that environmental and demographic variables were as important, if not more important, in accounting for variations in historic settlement patterns (Gouglas, 2007). Director of the VITA (Visualisation and Information Technologies in the Arts) Research Studio, Gouglas now focuses much of his energy on the university’s Humanities Computing Programme, of which he is the Director. The newest member of this impressive group is the principal investigator of the TAPoR project, Geoffrey Rockwell, who arrived recently from McMaster University. At McMaster, Rockwell was the director of the Humanities Media and Computing Centre. While there, he also developed an undergraduate Multimedia program which was subsequently amalgamated with the university’s interdisciplinary Communication Studies programme to become the Department of Communication Studies and Multimedia, and he contributed to the development of the undergraduate program the department now delivers. Appointed to the University of Alberta’s Department of Philosophy, Rockwell will continue to lead TAPoR and help develop a Ph.D. component for the university’s Digital Humanities programme.

In the Prairies at the University of Saskatchewan, Brent Nelson, Department of English, is a specialist in English Renaissance literature who employs computer-supported text analysis to assist him in his research. A collaborator on INKE, he organized the 2008 Canadian Symposium on Text Analysis conference, and is now editing the proceedings of that conference with Melissa Terras. Nelson is also the editor of a recent collection of papers devoted to the history of the book, which appeared in the 2008 edition of Computing in the Humanities Working Papers. Peter Stoicheff and Lin Yiu have also been important drivers of digital humanities scholarship at the U of S. Both, with graduate student Jon Bath, are currently involved in the Ege Project, an initiative to digitally reconstruct 40 medieval books. Early in the 20th century, the collector Otto Ege opted to divide the manuscripts in his possession into separate leaves, which were then assembled into boxes, each containing 50 leaves, and sold. To date, Stoicheff and his colleagues have located 33 of the 39 boxes potentially available for retrieval. At the University of Manitoba, the late Paul Fortier, who taught in the Department of French, Spanish and Italian, focused on statistical and computer-aided techniques for the study of literature, and developed software to support his research.

In Central Canada at the University of Western Ontario, William Turkel, Department of History, is combining his interests in computing with his research in environmental history and the history of science and technology. A project director of NiCHE: The Network in Canadian History and Environment, Turkel has recently turned his interests to hacking, which he defines as the use of computing devices and the physical environment as instruments to support historical expression. From his Lab for Humanistic Fabrication, Turkel is using physical computing devices – such as 3D printers – and mobile computing devices – such as the computer tablet – to support the generation and display of physical and virtual objects, objects used to express location-based historical interpretations that are situated in physical environments. The research of Jason Gilliland, Department of Geography, is similarly cross-disciplinary. Director of the Urban Development Program, he integrates architecture, urban planning and human geography to research the dynamics of social and physical landscape change. GIS forms a central component of this program, as is evident from his and Matthew Novak’s contribution to this collection.

Based at the University of Guelph, Susan Brown, one of the founders of the Orlando Project, with Patricia Clements and Isobel Grundy, is now its Project Director. Since the mid-1990s, Brown’s digital humanities research has been devoted to interface design, tool development, and the collaborative production of born-digital scholarship so well instantiated in Orlando. She is also a specialist in Victorian literature, and is the author, as noted above, of Volume II of the forthcoming The Orlando History of Women's Writing in the British Isles.

McMaster University, and specifically its Department of Communication Studies and Multimedia, has been a central node in Canada’s digital humanities network. Stéfan Sinclair’s research is devoted to the design and development of text analysis tools to support literary research, with a special emphasis on visualization. A collaborator with Rockwell on TAPoR, Sinclair was a founder of the Experimental Reading Workshop while at the University of Alberta. He is also the creator of multiple software tools, including HyperPo, a text exploration and analysis program specifically designed to enable easy navigation between source documents and the representations of those documents that HyperPo generates. Andrew Mactavish, current Director of the Humanities Media and Computing Centre, and another collaborator on TAPoR, specializes in digital games and gaming culture. His research focuses on the affective dimensions of gaming – how visual and audio effects add to or detract from the pleasure of gameplay – and the social implications of gaming – how games and gamers “play” with, and within, cultural and social arrangements of power. His current research project, G-ScalE: Game Scalability Environment, is dedicated to determining, in his words, “the effects of visual scale on gaming experience and to apply these findings to the development of games that scale from the very small to the very big” (Mactavish, 2009).

At the end of the Niagara Peninsula at Brock University, John Bonnett and Kevin Kee are pursuing research interests devoted to the aesthetics, pragmatics and pedagogy associated with computer games and virtual worlds. Drawing on his background as an historian, and as a director and project director at the National Film Board of Canada, Kee focuses his research on best practices for the design, development and use of computer simulations and serious games for history. Out of his Simulating History Research Lab, Kee has developed virtual environments, online Flash games and mobile geo-spatial interactive tours for both education and public history. In addition to his research program, Kee has worked to build Brock and Niagara into an emerging node in the digital humanities network by organizing the biennial Interacting with Immersive Worlds conference, and by helping to build the university’s new Interactive Arts and Sciences undergraduate program and the region’s Niagara Interactive Media Generator.

John Bonnett is the developer of the 3D Virtual Buildings Project, an initiative designed to do two things: to teach students the methods associated with the reconstruction of historic buildings, using photographs, fire insurance maps and 3D modeling software; and to develop students’ critical thinking skills, by showing them that models must be distinguished from the objects they purport to represent (Bonnett, 2003; Bonnett, 2006). Bonnett is currently pursuing a project titled HistorySpace. Its premise is that historians will eventually opt to use virtual worlds as platforms for scholarly publishing. Its purpose is to support that enterprise by designing and testing tools and workflows that will support expression, documentation and peer review in virtual environments. Bonnett is also pursuing research interests related to the emerging medium of Augmented Reality and, as indicated by his contribution to this collection, High Performance Computing.

Digital humanities scholarship at the University of Toronto has benefited enormously, as we have indicated, from the research and organizational initiatives of Ian Lancashire. Alan Galey, a new arrival to the Faculty of Information, is continuing the university’s commitment to computer-aided textual analysis and expression, with a focus on interface design traditions that are native to the humanities. A specialist in Shakespeare and the Renaissance, he has already designed an interface for the Electronic New Variorum Shakespeare. A member of the INKE project team, he also recently co-edited with Ray Siemens a special issue of the journal Shakespeare devoted to digital methods for teaching and interpreting the Bard’s writings. The special issue was titled Reinventing Digital Shakespeare. At the Ontario Institute for Studies in Education, Ruth Sandwell, a long-time collaborator with John Lutz, was the co-developer of the award-winning website Who Killed William Robinson?, the first in their series of history education websites collectively titled the Great Unsolved Mysteries in Canadian History project.

Chad Gaffield, principal investigator of the Canadian Century Research Infrastructure, has been a pioneer of computationally supported scholarship among Canadian historians. A historian of nineteenth- and twentieth-century Canada in the University of Ottawa’s Department of History, he is a former president of the Canadian Historical Association and of the Canadian Federation for the Humanities and Social Sciences, and is the current president of the Social Sciences and Humanities Research Council of Canada. Through his leading role in the CCRI and participation in other initiatives such as The Canadian Families Project, The Vancouver Island Project, and the Lower Manhattan Project, Gaffield has lent his considerable organizational talents to support the growth of quantitative, census-based historical research in Canada. Adam Green, also in the Department of History, is undertaking similar work. A member of the CCRI team since 2004, and a team leader since 2006, Green has centred much of his work in that initiative – outlined in the paper he co-authors in this collection – on developing a project User Guide, one that would make the data, and the infrastructure delivering it, accessible to researchers from multiple disciplines. If one conceives of digital humanities research in Canada as arrayed along an axis, with historical demography at one end and text analysis at the other, then the University of Ottawa can lay claim to a position at either end. In addition to the work of Gaffield and Green, Christian Vandendorpe, in the Département de français, has been a long-time student of the application of hypertext theory to the history of the book. His most recent monograph, Du papyrus à l’hypertexte, has generated international attention and has been translated into English and Spanish (Vandendorpe, 1999; Vandendorpe, 2003; Vandendorpe, 2009).

At the Université de Montréal, as noted above, the tradition that began with the Programme de recherche en démographie historique (PRDH) continues. Lisa Dillon, in the Département de Démographie, has used the rich databases of the PRDH, and specifically their micro data on Quebec, to inform her studies of historical demography of the family and ageing in Canada and the United States. Her writings have especially emphasized the importance of intergenerational relations, marriage timing, and concepts of ageing during the 18th and 19th centuries. The analysis, expression and dissemination of text is also a significant focus at U de M. Jean-Claude Guédon, in the Département de littérature comparée, has been a long-time advocate for open access e-publication projects. He founded the first Canadian scholarly electronic journal, Surfaces, in 1991 and currently sits on the Advisory Board of the Open Humanities Press. Dominic Forest, in the École de bibliothéconomie et des sciences de l'information, is contributing research to the CARAT (Computer-Assisted Reading and Analysis of Texts) research agenda initiated by Meunier. Through applications such as NUMEXCO, Forest is creating software to support inductive text classification and categorization (Forest and Meunier 2005). Michael Eberle-Sinatra, in the Département d’études anglaises, is a specialist in 19th century British Literature, with a specific interest in British Romantic Literature. He has extended his research interests to the Web and digital publications, as a founding editor of the electronic peer-reviewed journal, Romanticism on the Net (now Romanticism and Victorianism on the Net), and as founding editor of two websites, the Leigh Hunt Archive, and the British Women Playwrights around 1800 website. He is President of the Synergies project, described above, and is also the President (French) of the Society for Digital Humanities.

Expression and analysis in digital environments are also a focus on the other side of Mont Royal, at the Université du Québec à Montréal. Bertrand Gervais, Director of Figura: Centre de recherche sur le texte et l'imaginaire, as well as NT2: Le laboratoire de recherches sur les œuvres hypermédiatiques, draws on his research and experience as an essayist and novelist to re-imagine and express hypermedia fiction. His paper in this collection addresses the principles that have informed his work at NT2, and the lessons that he and his team have learned. While Gervais’ work focuses on multimedia expression, Jean-Guy Meunier, Département de philosophie, has emphasized, as indicated above, analysis through CARAT, the computer-assisted reading and analysis of text. CARAT is a method that combines insights from semiotic text interpretation theories with the combinatorial logic of the computer (Meunier, Biskri and Forest, 2005) (De Pasquale and Meunier, 2003). It is a method that can encompass deductive methods for text analysis, but over the past decade Meunier, through the development or co-development of applications such as SATIM and NUMEXCO, has championed and continues to champion an inductive approach, one with the potential to support “lexical analysis, automatic generation of hypertext links, knowledge extraction and thematic analysis” (Forest and Meunier, 2005). Work in digital history has also emerged at UQAM, under the leadership of José Igartua. A historian of Montreal merchants during the Conquest and recently of the evolution of national identities in English Canada after World War II, Igartua has made a formative contribution to history teaching with computers through the development of Web sites including Histoire-Hypermédia, which champions the use of primary source documents to develop student research skills and teach history.
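
As a generic illustration of what such an inductive approach can look like in practice – offered only as an analogue of this family of methods, not as a description of SATIM or NUMEXCO themselves – the following sketch groups a handful of toy documents by vocabulary alone, with no categories defined in advance. It assumes the scikit-learn library is available.

```python
# Generic sketch of inductive text classification: documents are grouped
# by vocabulary similarity, with no categories defined in advance.
# The documents and cluster count are toy examples, not project data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

documents = [
    "the settlers cleared land along the river",
    "farms and settlers spread along the river valley",
    "the printing press changed the circulation of texts",
    "early printed texts circulated widely in Europe",
]

# Represent each document as a weighted term-frequency vector
vectors = TfidfVectorizer().fit_transform(documents)

# Let the algorithm induce two thematic groups from the vectors alone
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for doc, label in zip(documents, labels):
    print(label, doc)
```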

Continuing east to the Université de Sherbrooke, Léon Robichaud in the Département d'histoire is pursuing projects devoted to Web access to primary historical sources, and to re-constructions of urban environments using three-dimensional models. A long-time practitioner in the field of history and computing, Robichaud contributed to the historical research and database development for Old Montreal / Vieux Montréal, a 3D interactive model of pre-industrial Montreal constructed in the early 1990s. He subsequently made the project’s database available on the Web. He also collaborated on the Virtual Historic Savannah Project, and was a co-developer of La torture et la vérité: Angélique et l’incendie de Montréal / Torture and the Truth: Angélique and the Burning of Montreal, one of the websites affiliated with the Great Unsolved Mysteries in Canadian History project.

In the Maritimes, at the University of New Brunswick, Margaret Conrad, Department of History, is the creator and director of the Atlantic Canada Portal, developed in collaboration with UNB’s Electronic Text Centre (ETC), located within the UNB library system. A specialist in the history of Atlantic Canada, Conrad is using the portal to host a comprehensive bibliography of works on the region, virtual archives of primary source material, a directory, a research forum and listserv, as well as a repository of teaching resources. Conrad is aided in this task by Erik Moore, the Director of the ETC, and Lisa Charlong, the Assistant Director and Coordinator of XML Initiatives. Alan Burke, now retired, was a force behind the expansion of the ETC, and developed software that has been redeployed to other contexts, such as a prototype for finding electronic texts that was incorporated into TAPoR, enabling users to find projects and collections for analysis.

At Acadia University, Richard Cunningham, Department of English, is researching the relationship between the user’s perception and processing of electronic text and neuroplasticity (the adaptation of brain structure to stimuli from the environment). How is human cognition affected by the use of electronic text? The Director of the Acadia Digital Culture Observatory, a usability facility for research into reading and interacting with electronic material, Cunningham is also a project leader with the INKE project, and is the Secretary and Webmaster for the Society for Digital Humanities.

At Dalhousie University, Ronald Tetreault, Department of English, is a founder of the university’s Electronic Text Centre, which provides online access to documents, research projects, and Dalhousie-based electronic journals. Tetreault, a specialist in the history of print culture, Romantic literature, and electronic texts, is also the co-editor of an electronic scholarly edition of Lyrical Ballads by Samuel Taylor Coleridge and William Wordsworth. Dean Irvine, also in the Department of English, is the Principal Investigator of EMiC (the Editing Modernism in Canada project). The principal concern of EMiC is to produce critically edited texts by modernist Canadian authors. The project comprises 33 partners, and will be supported by digital humanities centres situated across the country. At the School of Information Management, Fiona Black is also a print culture historian, and uses historical GIS to support her research. With a focus on the late 19th and early 20th centuries, Black is an editor of the History of the Book in Canada project.

At Memorial University in St. John’s, Newfoundland and Labrador, the Department of History is the base for two digital humanities researchers. Sean Cadigan, a social historian and a Team Leader on the CCRI project, has used census data to study merchants, family and the state in Labrador. He is also an economic and ecological historian who has conducted studies on marine and fisheries management, and on the forestry and pulp-and-paper industry. Robert Sweeny’s geographic focus has been farther afield: industrialization and immigration in Montreal in the nineteenth and twentieth centuries. Through the “Montréal l'avenir du passé – Montreal, The Future of the Past” project, which he describes in a contribution to this collection, Sweeny and his collaborators have created a knowledge repository that will enable researchers to link spatial and attribute data at six different points in Montreal’s history, and to undertake research projects on topics ranging from the impact of major urban renewal projects to child mortality and the location of social networks.


3.0 Preview -- Contributions in this Issue

Our intent, thus far, has been to characterize the past of digital humanities scholarship in Canada. We now turn our attention to its present, and specifically to the contributions that comprise this issue of Digital Studies. In this section, we address three specific challenges that we believe underline with special force our contention that the digital humanities are currently in a state of transition. These are challenges that digital humanities scholars are rising to meet, and challenges which we believe usefully encapsulate the significance of the contributions presented here. These three challenges are:


  • The Challenge of New Platforms
  • The Challenge of the Topographic Revolution
  • The Challenge of Aggregation

3.1 The Challenge of New Platforms

The word “platform” has multiple meanings depending on the context in which it is applied. In computing, it can refer to software such as Java, an operating system such as Windows, or hardware such as a supercomputer. We define “platform” to mean any repository in which content is instantiated, stored and distributed. By this measure, the book is a platform, and so was the scroll. With the advent of computing, of new computing paradigms, and of new objects for representation, we suggest that new platforms are emerging, and that more will emerge in the future. There will not be one singular platform. There will be many, as researchers devise new technologies, new software, and new combinations of technology and software to express and distribute content.

One new platform currently emerging is the electronic book. It will support the dynamic expression of text. Potentially, it will support much more, including the expression of computer-generated 3D objects (Billinghurst, Kato and Poupyrev, 2001). A second platform will involve location-based media. Computer-generated objects will be integrated with the viewer’s perception of real space, an innovation that will have a bearing on domains ranging from computer gaming and advertising to art and historical expression. A third platform will be virtual worlds, where users will aggregate text and 2D and 3D objects to support tasks in domains ranging from heritage to public safety and emergency preparedness.

One key point that should accompany any discussion of computer-based platforms is that we live in a time of convergence. It is now possible, in principle, to combine devices like iPhones and electronic paper to create forms of representation that we would not have anticipated even ten years ago. To be sure, there are limits on what we can do, limits imposed by time, knowledge, imagination, technology and, of course, funding. But we submit that as a field we are nowhere near approaching those limits. And we would further submit that we would do well to begin exploring where those limits are located.

It is in this context – the formulation of new platforms, and the determination of their capacities and implications – that the contributions of the INKE project, Teresa Dobson, Adam Green and Anthony Di Mascio, Léon Robichaud, Bertrand Gervais and John Bonnett emerge as important points of departure for digital humanities research. In “Implementing New Knowledge Environments,” Ray Siemens and his collaborators in the INKE project start with a simple proposition: the design of a new platform, any platform, cannot proceed from a vacuum. Its design must be informed by a frame of reference. The frame that INKE project members have chosen is the one that governs the primary platform we now employ: the book. In the estimation of the INKE project, an international collaboration of 35 researchers from 20 institutions, there is much the book can teach us about the design of new knowledge environments. Some of its conventions will transfer into current and next-generation knowledge environments, and persist. Further, the workflows that presently and historically have governed the production and distribution of books will serve as valuable frames of reference for designing and implementing the operational infrastructure that will be necessary to sustain scholarly communication in digital environments.

In “Interrupting Processes of Inquiry,” Teresa Dobson and Vetta Vratulis consider the pedagogical implications of an important new platform: the Wiki. In so doing, the two ask readers to consider a fundamental point: when a new invention emerges, the first generation of users rarely understands its importance. Users fail to appreciate how and where the new invention will be applied. And they fail to understand how and where it will change their practices and those of their successors. In their study, the two present findings that suggest social software will have dramatic and disruptive effects on the production of knowledge, within the academy and without. For social software to function effectively, users will have to be willing to pay a price: that of forgoing proprietary control over the content they contribute to a given site and a given project. In the age of social software, there is a good chance that in many contexts the conception of authorship will need to shift from the individual to the social level of human organization. As the two authors show, making that shift is not always easy for users, even university students who have not been deeply socialized into the norms of a given professional domain, and who are comfortable with other forms of computer software.

For the digital humanities, the implication of this article is that social and collaborative software will prompt a shift in conceptions of authorship, not only in informal settings such as those employing Wikis, but also in more formal knowledge environments such as the ones Siemens and the INKE project propose to design and build. In many settings, scholars will have no choice. Content creation in digital environments will require the contribution of many hands, as it does in film. To generate digital content and exploit it effectively, scholars in the digital humanities will need to change the working culture that has traditionally prevailed in the humanities. Generally, humanities scholars work in isolation. They prize their independence, and they are not obliged to learn the social and practical skills required to function in teams. If Dobson and Vratulis are right, that situation will need to change. And the best way to begin effecting the needed cultural change is in the classroom, at all levels of education.

In “The Canadian Century Research Infrastructure: Enabling Humanities and Social Science Research in the Digital Age,” Anthony Di Mascio and Adam Green reflect on the research effort behind the development of the Canadian Century Research Infrastructure (CCRI). The goal of the CCRI is to create a “research infrastructure” – a large-scale, publicly accessible database of the Canadian censuses of 1911, 1921, 1931, 1941 and 1951, delivered to Canadians via the Web. The range and diversity of participants behind this initiative are remarkable: a collaboration of over 100 researchers and staff, drawn from history, sociology, geography, demography, economics, statistics and interdisciplinary studies at seven Canadian universities, in partnership with IBM and Statistics Canada. A secondary goal of the CCRI is to marry its storehouse of quantitative data with qualitative data, such as newspaper articles, in order to provide context to the raw numbers. Over the past five years, the project has digitized nearly 20 million images.

Di Mascio and Green revisit the steps taken to build and populate the database: capturing and then cleaning the data, coding it (linking each response to a code in a specific coding scheme), providing contextual data (such as newspaper coverage or political debates drawn from Hansard), and finally developing a user guide. As they relate, the capturing of the data proved to be one of the biggest challenges of the project, one which, in the end, forced participants to create their own data acquisition software. Di Mascio and Green trace the evolution of the software’s design, the results obtained from the project’s workflow, and the implications that the CCRI’s experience presents for future digital infrastructure projects. As digital humanists formulate new platforms, we will encounter challenges similar to those faced by the CCRI team: the emergence of new technologies will force us to develop new strategies that govern how we work. At what point do we build on what the commercial market has established, and when do we develop, as the CCRI team did, our own software?
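
To illustrate the coding step in the abstract – and only in the abstract, since the CCRI’s actual coding schemes and software are not reproduced here – the sketch below links free-form census responses to numeric codes in a made-up scheme and sets aside unmatched responses for manual review.

```python
# Hypothetical illustration of coding census responses: each free-form
# answer is linked to a code in a (made-up) coding scheme, and unmatched
# answers are set aside for manual review.

religion_scheme = {
    "roman catholic": 100,
    "church of england": 200,
    "methodist": 300,
    "presbyterian": 400,
}

responses = ["Roman Catholic", "Ch. of England", "Methodist", "Presbyterian"]

coded, needs_review = [], []
for response in responses:
    key = response.strip().lower()
    if key in religion_scheme:
        coded.append((response, religion_scheme[key]))
    else:
        needs_review.append(response)   # e.g. abbreviations, misspellings

print("coded:", coded)
print("manual review:", needs_review)
```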

This is a question that Léon Robichaud has faced many times. A digital historian since the 1980s, Robichaud has developed databases and interfaces for initiatives such as Old Montreal / Vieux Montréal and the Répertoire du patrimoine culturel du Québec. Robichaud is interested in the digitisation of heritage inventories and their distribution on the Web. He is also interested in the role that historians can play in the development of these tools, and in the way that historians can use these platforms to better connect the public with their heritage. In “L’histoire, le patrimoine et le public: la diffusion de l’histoire grâce aux inventaires numériques disponibles sur le Web,” Robichaud examines several standout heritage inventory projects, situated in Quebec, to articulate best practices for their development and use. Along the way, he explores the larger relationship between historians, built heritage and sites of memory, and the public.

Robichaud finds that effective development and delivery of heritage inventories requires the identification of separate target audiences, including those which make policy and law concerning built heritage (such as governments), researchers, teachers and students, residents and businesses, tourists and their guides, and those audiences that will emerge in the future. The content situated in each inventory, he points out, must be presented in such a way that it meets the user’s need for sufficient, but not excessive, information. The database must be flexible and capable of adaptation, and inventories must be built to accommodate new uses and platforms (such as GPS-enabled smartphones). Indeed, he points out that an effective heritage inventory will be one that is open to users and platforms previously unimagined.

Finally, in “High-Performance Computing: An Agenda for the Social Sciences and the Humanities in Canada,” John Bonnett considers the implications that Canada’s new High Performance Computing platform presents for humanities and social science scholarship in Canada. High Performance Computing, or HPC, is a paradigm of computing distinguished from desktop computing by its rapidity and by the quantity of information it can process. Typically, an HPC computer is not a single computer at all, but rather a network of computers clustered together by high-speed optical lines. Because such clusters typically link hundreds if not thousands of processors into a single unit, they can rapidly work through files that might occupy a desktop machine for a week or more, and they support tera-scale and peta-scale levels of computing, meaning they can respectively perform one trillion calculations on one trillion bytes of data per second, or one quadrillion calculations on one quadrillion bytes of data per second (Bonnett, Rockwell, Kuchmey, 2008).
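
A rough back-of-the-envelope comparison makes the scale of the difference concrete. Assuming, purely for illustration, a desktop machine sustaining on the order of 10^10 operations per second, a peta-scale cluster sustaining 10^15, and (unrealistically) perfect parallel scaling:

```latex
t_{\text{cluster}} \;\approx\; t_{\text{desktop}} \times \frac{R_{\text{desktop}}}{R_{\text{cluster}}}
\;\approx\; 1\ \text{week} \times \frac{10^{10}}{10^{15}}
\;\approx\; 6\ \text{seconds}
```

Real workloads rarely scale this well, but the arithmetic conveys why a job that occupies a desktop for a week can become, on a large cluster, a matter of seconds or minutes.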

Scholars, put simply, are being presented with more computational power than they have ever had access to before. Bonnett suggests, however, that most scholars in the humanities and social sciences have not deeply considered how to exploit this emerging capacity. HPC will doubtless support important initiatives relating to text and data mining, but Bonnett argues that there are other opportunities that scholars can explore. In the social sciences, he suggests that researchers could combine the computational power of HPC with agent-based simulations to create serious games to support training and policy formation in domains such as public safety and emergency preparedness. In the humanities, with the emergence of new platforms such as Virtual Worlds and Massive Multi-user Online Environments, Bonnett argues scholars could use HPC to support their development as media for scholarly communication. Scholars will need to devise and test expressive, attestive and narrative conventions for digital environments. They will further need to devise and test workflows to support the generation, dissemination and distribution of scholarly content in multi-media environments.


3.2 The Challenge of the Topographic Revolution

While digital scholars will be happily tasked with exploring the applications and implications of digital platforms in the years and decades ahead, we suggest that they will also be faced with a second important challenge: learning how to apply a new class of expressive object, one which will populate present and future generations of digital platforms. This class will have a pronounced impact on the expressive and analytical practices of the humanities, social sciences and sciences. Its import is such that, even now, we feel justified in arguing that the humanities, social sciences and sciences are in the midst of what we refer to as the topographic revolution.

The topographic revolution, we suggest, is the appropriation by research and expressive domains of formalisms that have some or all of the following properties:


  • They are topographic, meaning they have two or three dimensions
  • They are dynamic, meaning they move, and change their morphology or configuration over time
  • And they are autonomous, meaning that they perform prescribed rules of behaviour, but do so independently of the direct manipulation of the author or programmer. The effect is akin to a group of words organizing themselves into a sentence.

One way that the topographic revolution has impinged on scholarship is through the development of methods and applications that interpolate perceptual data – such as the spatial data expressed by maps and 3D objects – with abstract data, such as the numeric data contained in databases. Juxtaposing the two forms of data often enables viewers to see patterns and relationships in data they would not otherwise perceive, and master concepts they would not otherwise understand. The simultaneous use of multiple formalisms can be a powerful vehicle to support research and learning in multiple disciplinary contexts.

This insight has proven especially true for disciplines that have opted to use Geographic Information Systems (GIS) to support their research. The software associated with GIS interpolates attribute data – data describing characteristics such as the ethnic or class profile of a given city – with spatial data – coordinates situated on a map – and has proven to be a valuable support for disciplines ranging from print history to historical geography, the research domain that concerns Matt Novak and Jason Gilliland. In “‘Buried Beneath the Waves’: Using GIS to Examine the Physical and Social Impact of a Historical Flood,” the two reflect on their use of historical GIS (HGIS) to better understand the effects of the Thames River flood, which occurred in London West, Ontario on July 11, 1883 and killed 17 or 18 people, leaving many more homeless. Inspired by the leadership of pioneers in the use of HGIS such as Anne Kelly Knowles, Novak and Gilliland explore how its tools might be applied to study natural disasters that previously have been overlooked or underreported by historians, due to a lack of traditional primary sources. In their contribution, the authors interpolate a variety of data sources within GIS to determine the physical and social repercussions of the flood.
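The join at the heart of a GIS can be pictured with a minimal sketch of our own devising (the lot numbers, field names and coordinates below are invented, and stand in for what GIS software does internally): attribute records are linked to map coordinates through a shared identifier, so that social patterns can be queried spatially.

```python
# Illustrative sketch: link an attribute table to spatial coordinates through
# a shared lot number, then run a simple spatial query.
attributes = {  # keyed by lot number (hypothetical)
    "lot-12": {"households": 4, "occupation": "mill worker"},
    "lot-13": {"households": 2, "occupation": "merchant"},
}
coordinates = {  # lot number -> (easting, northing) in metres (hypothetical)
    "lot-12": (480120.0, 4759340.0),
    "lot-13": (480155.0, 4759362.0),
}

# Join the two tables and select lots within 50 m of a point of interest.
poi = (480130.0, 4759350.0)
for lot, attrs in attributes.items():
    x, y = coordinates[lot]
    dist = ((x - poi[0]) ** 2 + (y - poi[1]) ** 2) ** 0.5
    if dist <= 50:
        print(lot, attrs, f"{dist:.1f} m from point of interest")
```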

HGIS, they point out, can bring together data sources from both the past and present. In their case, Novak and Gilliland created three-dimensional models of the town of London West, raised the level of the water in these 3D models to hypothetical heights, and then compared the results to newspaper reports, historic photographs and archaeological data. By modelling scenarios in this way they were able to determine the extent and severity of the flood’s impact: 90% of the town was affected, due to the river rising six to eight metres over its normal height. Historical GIS, they conclude, can be an important tool when historical records are sparse. By enabling researchers to integrate multiple data sources, and supporting spatial analysis, HGIS can support the reconstruction of historical landscapes and events. The two argue, however, that historical GIS should not be reserved for situations where archival records are no longer available. It can also challenge accounts based on “traditional” sources, such as newspaper articles, which were often written by publishers with goals that extended beyond accurate reporting. It is indeed a new way of “seeing and explaining the past”.
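The kind of calculation behind “raising” a river in a terrain model can be sketched briefly. The example below is ours, not Novak and Gilliland’s actual model: the elevations, cell size and flood stage are invented, and the point is simply that every cell of an elevation grid lying below a hypothetical flood level is counted as inundated, yielding the share of the town affected.

```python
# Illustrative sketch: flag cells of a toy elevation grid that fall below a
# hypothetical flood stage, then report the flooded share and area.
import numpy as np

cell_size = 10.0                      # metres per grid cell (assumed)
elevations = np.array([               # toy digital elevation model (m a.s.l.)
    [236.0, 238.5, 243.2, 249.0],
    [235.5, 237.0, 244.0, 248.4],
    [235.0, 236.8, 241.6, 247.0],
])
normal_stage = 235.0                  # river's normal level (assumed)
flood_stage = normal_stage + 7.0      # six to eight metres above normal

flooded = elevations < flood_stage
share = flooded.mean()                # fraction of cells under water
area = flooded.sum() * cell_size**2   # inundated area in square metres
print(f"{share:.0%} of the grid flooded ({area:.0f} m^2)")
```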

In “Behaviour Space: Simulating Roman Social Life and Civil Violence,” Shawn Graham mirrors a key point made by Novak and Gilliland: analytic software, properly applied, can open domains of inquiry that were previously inaccessible to scholars. This insight induced Graham to pose two questions which he pursues in this article: what social conditions prompted the civil violence that became a recurring feature of Roman society in the second century A.D.? And what, if anything, could have been done by Rome’s élite to preserve the Pax Romana, which ended with the death of Marcus Aurelius in 180 A.D.? Traditionally, Graham writes, scholars have not had access to a rigorous method for considering these questions, save for reading and interpreting first-hand reports and archaeological data. That approach was fine as far as it went, but historical inquiries into social processes based on traditional hermeneutics suffer from two major shortcomings. First, they offer no way to assess the relative validity of one hypothesis over another: two competing hypotheses can both be consistent with the data, yet say very different things. Second, extant methods of interpretation and exposition do not require scholars to explicitly and rigorously formalize the assumptions underlying their hypotheses. Agent-based simulations, Graham writes, offer a promising way to meet the interpretive, methodological and computational constraints imposed by historical methods that rely wholly, or predominately, on a scholar’s interpretation of textual and material data.

Agent-based simulations are programs that permit scholars to reconstruct a social process by defining the system’s actors -- or agents -- and the rules governing their conduct. They are significant because the agents perform the prescribed rules autonomously, without the direct intervention of the modeller. They are further significant because they represent virtual systems that display emergent properties, or structures. In essence, the programmer defines the “agent,” which can be anything from a microbe to a multi-national corporation to an empire, or some combination of these and other types of agents, and then hits the “on” switch, activating the simulation to see what kind of pattern emerges. Finally, agent-based simulations are significant because they can and do produce emergent patterns that are markedly similar to those produced in real-world systems, be they biological, social, cultural or economic. They offer researchers a way to formulate and then test candidate hypotheses. Given his data and problem sets, Graham finds agent-based simulation an ideal method; he uses it to argue that much of Rome’s unrest can be traced to the specific nature of its economy -- which rested on networks of patrons and clients -- and its specific method of wealth distribution: the sportulae.
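The logic of emergence can be illustrated with a toy example that is emphatically not Graham’s Roman model: the standard “random exchange” wealth model, in which agents start with equal wealth and follow one simple rule, yet a markedly unequal distribution emerges from the bottom up, without any central direction by the modeller.

```python
# Toy agent-based simulation (standard random-exchange wealth model, used
# here only to illustrate emergence): agents repeatedly hand one unit of
# wealth to a randomly encountered agent.
import random

random.seed(42)
wealth = [10] * 100            # 100 agents, 10 units of wealth each

for step in range(10_000):
    giver, taker = random.sample(range(len(wealth)), 2)
    if wealth[giver] > 0:      # rule: give one unit to a randomly met agent
        wealth[giver] -= 1
        wealth[taker] += 1

wealth.sort()
richest_tenth = sum(wealth[-10:]) / sum(wealth)
print(f"Share of wealth held by the richest 10% of agents: {richest_tenth:.0%}")
```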


3.3 The Challenge of Aggregation

Aside from the dissemination and representation of content, digital humanists face a third important challenge, and that is to seriously and critically consider the implications of our increasing power to aggregate content. Due to the Internet, search engines, software and mark-up languages, we now have the power to quickly and easily locate and aggregate text, sound, graphical and three-dimensional objects. For scholars, this development has been a cause for both celebration and concern, and here, through the papers of Bruce Robertson, Bertrand Gervais, Robert Sweeny, and Steven High and David Sworn, we present contributions that represent both views. Robertson, of course, is well known to digital humanists as the founder of The Historical Event Markup and Linking Project. In earlier work, Robertson devised an XML schema known as HEML (Historical Event Markup Language), a schema designed to quickly locate and aggregate primary and secondary sources related to a given date. His first project, HEML-Cocoon, also used XSLT style sheets to generate maps, lists and graphical timelines of the documents retrieved by HEML. In his contribution here, “‘Fawcett’: A Toolkit to Begin an Historical Semantic Web,” Robertson reports on the latest efforts of his project to harness the data discovery methods of the Semantic Web, particularly through use of RDF (Resource Description Framework) schemas.

Robertson’s purpose in this article is two-fold. The first is to present the suite of tools associated with Fawcett, tools that convert documents governed by multiple metadata schemas into a common RDF schema, to support the easy aggregation of content. Here, Robertson also describes tools designed to visualize the findings of a search query, to enable the user to rapidly determine the significance of a document relative to its counterparts. But Robertson also has a larger purpose in this article, one that brings it into conversation with Dobson’s and Vratulis’ contribution. Robertson believes that content aggregation can make a marked contribution to history, classics and related disciplines in the historical sciences, but only if scholars change the cultural norms governing their practice. By default, most scholars work in isolation, and they do not share their data. Tools such as Fawcett, he argues, will only work if scholars make it a normative practice to deposit their data on-line so colleagues can use it.
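The underlying idea can be sketched in a few lines. The example below does not reproduce Fawcett’s actual schema; it uses the rdflib library and an invented “hev” vocabulary to show how event records drawn from differently structured sources might be re-expressed as RDF triples in one shared vocabulary, so that they can be pooled and queried together.

```python
# Illustrative sketch: two event records from heterogeneous sources expressed
# in a common (invented) RDF vocabulary, ready for aggregation.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

HEV = Namespace("http://example.org/historical-event#")  # hypothetical namespace
g = Graph()
g.bind("hev", HEV)

flood = URIRef("http://example.org/events/london-west-flood-1883")
g.add((flood, RDF.type, HEV.Event))
g.add((flood, HEV.date, Literal("1883-07-11")))
g.add((flood, HEV.place, Literal("London West, Ontario")))

plague = URIRef("http://example.org/events/antonine-plague")
g.add((plague, RDF.type, HEV.Event))
g.add((plague, HEV.date, Literal("0165")))
g.add((plague, HEV.place, Literal("Roman Empire")))

print(g.serialize(format="turtle"))
```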

In “Arts et littératures hypermédiatiques: éléments pour une valorisation de la culture de l’écran”, Bertrand Gervais describes the principles behind his development of the Répertoire des arts et littératures hypermédiatiques, a component of the larger NT2: le Laboratoire de recherche sur les œuvres hypermédiatiques at the Université du Québec à Montréal. The mandate of NT2 is the study, reading, creation, and archiving of new forms of “hypermedia” works, which bring together image, text, video and sound. In this article, Gervais points out that the old descriptive tools, grounded in literature, cinematography and art, are no longer sufficient to describe the new hypermedia works now being developed. Gervais uses David Clark’s “Ulysses 101”, a reimagining of James Joyce’s Ulysses (1922) in hypermedia form, as a heuristic; how, he asks, should we describe and analyze this work?

Gervais created the Répertoire in response; it represents an effort to create a new vocabulary for, and offer a critical perspective on, hypermedia works. He outlines the three principal issues faced in the Répertoire’s creation: finding the appropriate multidisciplinary mix to analyze the works under consideration, developing strategies to find and identify hypermedia productions, and articulating new protocols to describe them accurately. In the end, the Répertoire became much more than a repository: Gervais and his colleagues developed an online community where researchers could work together, and share strategies and tools. This, Gervais underlines, is perhaps the biggest lesson to be drawn from this pioneering project: effective work in the digital humanities involves more than just observing and reporting with computers. Instead, it requires that we participate in the emergence of new forms and practices. We must link theory and practice, research and creation, changing attitudes and making tools.

While Robertson and Gervais plainly believe in the power, place and potential of content aggregation to support historical research and literary expression, their colleagues in this section, Robert Sweeny, Steven High and David Sworn, are more inclined to voice cautions. Their stance is based on a worry that has long preoccupied historians: when you splice all or part of a document from its original context and place it in a different context, you risk doing violence to that content. You risk ascribing a meaning to it that is at variance with what the original author intended. Robert Sweeny, in his contribution “Rethinking Boundaries: interdisciplinary lessons from the Montréal l’avenir du passé (MAP) project,” lays particular stress on this point. The MAP project is devoted to creating a historical GIS of 19th-century Montreal, one that will support analyses relating to the social history and historical geography of the city. As originally conceived, participants intended to respect the traditional scholarly division of labour, with geographers attending to maps and historians attending to the project’s databases. They further intended to centralize all sources from all time periods into a single database.

However, Sweeny reports that as project members engaged with their sources, they soon found that those sources revealed distinct constructs of space that hindered easy integration, and content that weighed against traditional scholarly divisions of labour: “It became clear that both the historical logic of the maps and the spatial dynamics of the sources needed to be respected.” In response, Sweeny and his colleagues opted to change the architecture of their historical GIS, choosing a de-centralized system composed of multiple applications, each one organized around a map from the three periods they are studying: “Each historical source is a standalone database, with its own user interface….It will be possible to create coherent partial datasets from each source, but it will not be possible from within our software to isolate fields from the essential context provided by the whole record…. This design is modular, open-ended and cumulative.”

Steven High and David Sworn, for their part, focus on the use of the computer in the context of oral history. High began capturing interviews with an audio cassette recorder, then moved to a VHS camera, before arriving at today’s Palmcorder, so he is no stranger to the challenges of obtuse and constantly changing technology. Whatever the device, the authors point out, the result is often the same: historians transcribe the recording to text, and the original capture of the interview is left in a locked file cabinet. High and Sworn are drawn to the “humanity” of oral history, and in “After the Interview: The Interpretative Challenges of Oral History Video Indexing”, they describe their efforts to use the computer to put “a face and a name to the past” -- to honour the speakers and their memories.

From the Centre for Oral History and Digital Storytelling at Concordia University, the authors developed an indexed digital video database of oral history interviews focused on the displaced paper workers of Sturgeon Falls, Ontario. They then tested the efficacy of the database with students, not always to great fanfare. That we learn more from our mistakes than our successes is a truism, they point out, but one that researchers sometimes avoid – it is far easier to report on what we did right. Readers have much to gain from High and Sworn’s frank retelling of the development of their database. The authors realized that the video indexing software they employed provided access to useful “clips,” but at the cost of seeing the larger context of the interview, and of the speaker’s experience. In the end, they decided to create their own indexing tool. Research in the digital humanities, they point out, necessitates the development of technologies that meet the needs of humanists. Far better, in their case, to “adapt nascent indexing software to historical practice, rather than the reverse.”
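The data-structure problem they describe can be pictured with a brief sketch of our own (the interview identifiers, time codes and keywords below are invented, and this is not their actual database): each indexed segment carries keywords and time codes, but also a pointer back to the full interview, so that a retrieved “clip” is never severed from its larger context.

```python
# Illustrative sketch: a clip index that keeps every segment tied to its
# source interview, so retrieval never loses the surrounding context.
from dataclasses import dataclass

@dataclass
class Clip:
    interview_id: str      # link back to the complete recording
    start: str             # time code, e.g. "00:14:30"
    end: str
    keywords: list

index = [
    Clip("sturgeon-falls-07", "00:14:30", "00:19:05", ["mill closure", "layoffs"]),
    Clip("sturgeon-falls-07", "00:42:10", "00:45:55", ["family", "relocation"]),
    Clip("sturgeon-falls-12", "00:03:20", "00:08:40", ["union", "severance"]),
]

def find(term):
    """Return every clip tagged with the term."""
    return [c for c in index if term in c.keywords]

for clip in find("mill closure"):
    print(f"{clip.interview_id}: {clip.start}-{clip.end}  (full interview available)")
```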


4.0 Conclusion

Such, then, are three important challenges that digital humanists face here in Canada and abroad. Doubtless, there are more. And doubtless, more will emerge in the decades to come, as new paradigms of computing, and new methods for generating, expressing and disseminating content, come to the fore. Whether we like it or not, change – transition – will be our constant condition and constant preoccupation in the years and decades to come. And -- in the end -- that state may not be an altogether unhealthy thing for the humanities generally and the digital humanities particularly.

The communication theorist Harold Innis certainly thought so. In his communication writings Innis argued that there were two stances that cultures could take toward knowledge. The first was passive – the Written Tradition – while the second was active – the Oral Tradition. In essence, Innis argued that it is not a good thing for cultures, and scholars particularly, to immerse themselves in interpretation and theory and forget the all-important domain of praxis. In Empire and Communications Innis, quoting Nietzsche, criticized Alexandrian Greece for devoting itself exclusively to analysis and interpretation and for forgetting that healthy scholarship requires the scholar to take an active, innovative hand in the domain that he or she is criticizing:

The power of the written word made the Alexandrine age one of "erudition and criticism," of specialists rather than poets and scholars. The Alexandrine man was "a librarian and corrector of proofs and who, pitiable wretch, goes blind from the dust of books and printers' errors" (Nietzsche). Collectomania and large libraries accompanied taste and respectability. Aesthetic opinions were crystallized and the dilettante appeared (Innis, 1950: 112-113).

Innis' counsel, in the end, was that scholars should subvert and re-formulate aesthetic constructs, that they be practitioners of the Oral, not the Written Tradition. Innis' basis for this stance was his belief that there was a relationship between the way we articulate knowledge and the way we subsequently construct it. When we express content using novel expressive forms, we create new relationships between percepts, a step that supports the creation of new concepts, new understandings of the content or ideas that we are seeking to express. Because ancient Greece was willing to experiment with novel forms of poetry, Innis argued, it retained the cognitive and intellectual flexibility necessary to shift its conception of natural history, from one that placed the burden of proof on theology, to another that placed the burden on science (Innis, 1950: 76-77).

The central theme of this introduction -- in the end -- is that digital humanists should never permit themselves to be too comfortable. They should be practitioners of the Oral Tradition. They should critically engage with changes in thought and techne as and when they encounter them. The substance of this introduction has been that digital humanists are doing precisely that. They are meeting the challenges imposed by new platforms, by the topographic revolution, and by the emerging power of aggregation. And they are considering how new formalisms and new methods can support analysis and expression. Whether by default or by design, scholars are following Innis' advice to conceive anew, and construct anew. In the end, they are recognizing his point that if the humanities are to remain an innovative and creative space, then their practitioners must ensure that their thought and practice continually remain in a state of transition.



Works Cited

Baskerville, Peter A. and Chad M. Gaffield. “The Vancouver Island Project: Historical Research and Archival Practice." Archivaria 17 (1983-84): 173-187. Print.

Billinghurst, Mark, Hirokazu Kato, and Ivan Poupyrev. “The MagicBook: a transitional AR interface.” Computers and Graphics 25.5 (2001): 745-755. Print.

Bonnett, John. “Following in Rabelais’ Footsteps: Immersive History and the 3D Virtual Buildings Project.” Journal of the Association for History and Computing 6.2 (2003). Web. 11 June 2009. <http://mcel.pacificu.edu/jahc/2003/issue2/articles/panel/bonnett.php>.

───. “Mediating the Past in 3D and How Hieroglyphics Get in the Way.” Mind Technologies: Humanities Computing and the Canadian Academic Community. Eds. Ray Siemens and David Moorman. Calgary: U of Calgary P, 2006. 201-224. Print.

───, Geoffrey Rockwell and Kyle Kuchmey. “High Performance Computing in the Arts and Humanities.” SHARCNET Website (The Shared Hierarchical Academic Research Computing Network). 2008. Web. 10 June 2009. <http://www.sharcnet.ca/Documents/HHPC/hpcdh.html>.

De Pasquale, Jean-Fréderic, and Jean-Guy Meunier. “Categorisation Techniques in Computer-Assisted Reading and Analysis of Texts (CARAT) in the Humanities.” Computers and the Humanities 37.1 (2003): 111-118. Print.

Forest, Dominic and Jean-Guy Meunier. “NUMEXCO: A Text Mining Approach to Thematic Analysis of a Philosophical Corpus.” Computing in the Humanities Working Papers A.32. August 2005. Web. 10 June 2009. <http://www.chass.utoronto.ca/epc/chwp/Casta02/Forest_casta02.htm>.

Gaffield, Chad and Peter Baskerville. “The Automated Archivist: Interdisciplinarity and the Process of Historical Research.” Social Science History 9.2 (1985): 167-184. Print.

Gouglas, Sean. “Research: Historic Censuses in Canada: Incorporating Landscape into Statistical Models of Settlement.” Sean Gouglas: Humanities Computing and History and Classics, Personal Web Page. 2007. Web. 10 June 2009. <http://www.ualberta.ca/~sgouglas/research.html>.

INKE. “Implementing New Knowledge Environments: A Project Funded by the SSHRC Major Collaborative Research Initiatives Program.” INKE Website. 2009. Web. 9 April 2009. <http://www.inke.ca>.

Innis, Harold. Empire and Communications. Oxford, UK: Clarendon, 1950. Print.

Internet Shakespeare Editions. “About the Internet Shakespeare Editions.” Internet Shakespeare Editions Website. 2005. Web. 10 June 2009. <http://internetshakespeare.uvic.ca/Foyer/about.html>.

Kondratova, Irina and Ilya Goldfarb. "Design Concepts for Virtual Research and Collaborative Environments." The 10th ISPE International Conference on Concurrent Engineering: The Vision for Future Generations in Research and Applications, July 26-30, 2003. Eds. J. Cha et al. Lisse, the Netherlands: Swets and Zeitlinger, 2003. 797-803. Print; rpt. 1 April 2008. Web. <http://iit-iti.nrc-cnrc.gc.ca/iit-publications-iti/docs/NRC-45830.pdf>.

Lancashire, Ian. "Text Analysis and Research Innovation. "Mind Technologies: Humanities Computing and the Canadian Academic Community. Eds. Ray Siemens and David Moorman. Calgary: U of Calgary P, 2006. xix-xxxii. Print.

Laslett, Peter. The World We Have Lost. New York: Scribners, 1965. Print.

Lieberman, David. “Reviews: Robert Jay Glickman and Gerrit Joseph Staalman. Manual for the Printing of Literary Texts by Computer. Toronto: U of Toronto P, 1966.” Print.

Lutz, John, Pat Dunae and Jason Gilliland. “Locating Race in the Queen City: Towards a GIS of Racial Space in Victoria, British Columbia, 1891.” Canadian Association of Geographers / L’association canadienne des géographes. Congress 2009 General Papers. 2009; rpt. 2009. Web. 10 June 2009. <http://ocs.sfu.ca/fedcan/index.php/cag2009/cag2009/paper/view/1736/0>.

Mactavish, Andrew. “Andrew Mactavish: Home.” Andrew Mactavish, Personal Web Page. 2009. Web. 10 June 2009. <http://www.humanities.mcmaster.ca/~mactavis/>.

McCarty, Willard. Software Fair Guide: A Guide to the Software Fair held in conjunction with the conference “Computers and the Humanities: Today’s Research, Tomorrow’s Teaching.” Toronto, ON: Centre for Computing in the Humanities, U of Toronto, 1986. Print.

───. Tools for Humanists, 1989: Software and Hardware Fair Guide. Toronto, ON: Centre for Computing in the Humanities, U of Toronto, 1989. Print.

───. Humanities Computing. Basingstoke, UK: Palgrave Macmillan, 2006. Print.

Meunier, Jean-Guy, Ismail Biskri and Dominic Forest. “A Model for Computer Analysis and Reading of Text (CARAT): The Satim Approach.” Text Technology 2 (2005): 123-151. Print.

The Orlando Project. “The Orlando Project.” The Orlando Project: A History of Women’s Writing in the British Isles Website. 2000. Web. 10 June 2009. <http://www.ualberta.ca/ORLANDO/>.

SDH-SEMI. “2007 Award for Outstanding Achievement.” Society for Digital Humanities / Société pour l’étude des médias interactifs Website. 2007. Web. 10 June 2009. <http://sdh-semi.org/award2007.php>.

Siemens, Ray. E-mail to Kevin Kee and John Bonnett. 13 May 2009.

─── and David Moorman, Eds. Mind Technologies: Humanities Computing and the Canadian Academic Community. Calgary: U of Calgary P, 2006. Print.

Soules, Marshall. “Vancouver Island University – Media Studies Department – Faculty Profile – Marshall Soules, Ph.D.” Vancouver Island University Website. 2009. Web. 10 June 2009. <http://mediastudies.viu.ca/profile.php?id=1>.

Rockwell, Geoffrey. “Introduction: Thinking through Text Technology.” Hermeneuti.ca: The Rhetoric of Text Analysis Website. By Stéfan Sinclair and Geoffrey Rockwell. 2009. Web. 8 Oct 2009. <http://hermeneuti.ca/rhetoric/intro>.

UCLA Mellon Seminar on Digital Humanities. Digital Humanities Manifesto 2.0. 2009. Web. 8 Oct 2009. <http://www.digitalhumanities.ucla.edu/images/stories/mellon_seminar_readings/manifesto20.pdf>.

Vandendorpe, Christian. Du papyrus à l’hypertexte: essai sur les mutations du texte et de la lecture. Montréal: Boréal, 1999. Print.

───. del Papiro Al Hipertexto: Ensayo Sobre Las Mutaciones del Texto y La Lectura. San Diego: Fondo de Cultura Económica USA, 2003. Print.

───. From Papyrus to Hypertext: Toward the Universal Digital Library. Trans. Phyllis Aronoff and Howard Scott. Urbana, IL: U of Illinois P, 2003. Print.

Weinstein, Fred. "Psychohistory and the Crisis of the Social Sciences." History and Theory 34 (1995): 299-319. Print.


Endnotes

[1] The authors would like to thank the following individuals for their response to our queries, comments on earlier drafts, and sharing of source material. Their generosity greatly assisted in the composition of this article: Geoffrey Rockwell, Stan Ruecker, Ray Siemens, Stéfan Sinclair. Thanks also to Nicki Darbyson for her assistance in copy-editing the articles displayed in this issue of Digital Studies. All errors in fact and reportage are, of course, the responsibility of the authors.

Authors

John Bonnett (Brock University)
Kevin Kee (Brock University)

Licence

Creative Commons Attribution 4.0
