Introduction
In an essay entitled “Humanities Computing,” Willard McCarty (2003) discusses the failure of the traditional model of bibliographic scholarship to capture the work of computing humanists. Bibliographies that focus only on publications miss the intellectual work that goes into things like tools, infrastructure, websites, games and other digital resources. This has made it difficult for the Digital Humanities (DH) to create its own historiography, to know itself through its history of intellectual contributions. This paper is about one type of resource, the tool directory, developed to try to keep track of the tools Digital Humanists have made. We will discuss the problems of tool development, discovery, and preservation; provide a brief history of tool directories; and finally, provide an in-depth look at the infrastructure and subsequent merging of two tool directories, TAPoR 3.0 (Text Analysis Portal for Research) and the DiRT Directory (Digital Research Tools).
Tool knowledge
“Part of the reason instruments have largely escaped the notice of scholars and others interested in our modern techno-scientific culture is language, or rather its lack. Instruments are developed and used in a context where mathematical, scientific, and ordinary language is neither the exclusive vehicle of communication nor, in many cases, the primary vehicle of communication. Instruments are crafted artifacts, and visual and tactile thinking and communication are central to their development and use.”
As Baird points out in Thing Knowledge (2004), tools have been ignored in the Humanities disciplines that deal in discourse. Humanists not only tend to study discourse as a privileged form of expression, but we also think of (print) discourse as the medium for our academic exchanges. This has been a perennial problem in the Digital Humanities because it means that the tools or new media works that we both study and express ourselves in are difficult to value in the academy (Rockwell 2011). A set of tools like Voyant (voyant-tools.org) might have hundreds of thousands of users a year, but it is difficult to formally justify it as a scholarly contribution to a tenure and promotion committee that counts publications.
The problem is not limited to the scholarly value of tool building. Most would agree that tools and their associated documentation can bear meaning, but DH is still struggling with ways to formally evaluate them without the apparatus of journals and peer-review. The problem is the infrastructure of valuation starting with the ways we remember what has been done and why. This is a problem the Digital Humanities shares with overlapping fields like Instructional Technology and Game Studies, both of which also value software things as objects of study and objects of creation (for example, see Newman 2012 on the preservation of games). To properly value software things we need a stack of infrastructure, starting with records of what was done, as software has a way of disappearing so quickly as to be almost ephemeral. For example, FANGORN and SNAP are both historical tools that were designed specifically for Humanists to assist with text analysis, but they are no longer maintained for active use (TAPoR 2019). There are some organizations doing this preservation work. For example, the Internet Archive’s Software Library preserves decades of computer software that can be accessed and used through their JSMESS emulator (Internet Archive 2014).
TAPoR 3.0 and the DiRT Directory (2019) are tool discovery portals that try to meet the need for knowledge about tools by recording enough information about tools and other resources for them to be discovered and surveyed; unlike the Internet Archive, however, they do not preserve the software itself. Instead, they provide access to metadata and other important information about a host of digital tools and software so that researchers can determine the best tool for their project and can understand where the tools came from by examining their history. Nonetheless, this is only one model for how knowledge about tools can be gathered and organized.
The role of tool directories in the digital age should concern Digital Humanists, as tool directories have a long history of supporting and providing recognition for DH software work. While tool directories began as published lists and collections of tools, such as Stephen Reimer’s “TCRUNCHERS: A Collection of Public Domain Software and ShareWare for Writers” (Lancashire 2017), today they take an online format that requires continuous upkeep to maintain accessibility and relevance in a rapidly changing digital context (Dombrowski forthcoming). The DiRT Directory and TAPoR 3.0 are two well-known digital tool directories in the English-speaking DH community. The DiRT Directory evolved from Project Bamboo, which developed “Bamboo DiRT” from Lisa Spiro’s “DiRT Wiki” (Dombrowski forthcoming). Meanwhile, the TAPoR project was initially developed as a full portal that could coordinate text analysis web services (Rockwell 2006). When this proved hard to maintain, the creators of TAPoR developed text analysis tools, such as Voyant, separately from TAPoR, which was later redeveloped to support the discovery of text analysis tools and code to help Humanists in their research (Rockwell and Sinclair 2016).
Of the challenges faced by tool discovery portals, sustainability has proven to be the most intractable. In the summer of 2017, Quinn Dombrowski, at that time an IT staff member in UC Berkeley’s Research IT group, approached Geoffrey Rockwell about the possibility of merging the DiRT Directory with TAPoR 3.0. The elimination of funding for Dombrowski’s Digital Humanities-focused position, and her transition into a research computing role, meant that she could no longer dedicate time to maintaining the organizational structure of the volunteer-run tool directory (Dombrowski forthcoming). This decommissioning of DiRT illustrates a set of problems in the field of Digital Humanities around treating tool directories and tool development as academic contributions. Tool development, in general, has not been considered a “scholarly” activity and often suffers from a lack of ongoing support (Ramsay and Rockwell 2012). This is even more the case for the development and maintenance of tool directories, which require a great deal of time and some degree of curation, but do not align well with existing frameworks for incentivizing and rewarding work in a scholarly context. However, when tool discovery portals are no longer maintained, Digital Humanities knowledge and history can be lost if the data is not, at the very least, archived.
Archiving data in a widely used text-based format (such as CSV or JSON) may preserve this knowledge for certain kinds of Digital Humanities audiences, but for many Humanities scholars, a mediating web-based interface is a de facto requirement for data to be meaningfully usable. Scholars and developers who are more comfortable working with data (such as a content dump from a defunct tool directory) may be able to restore access to this content for their less-technical colleagues by ingesting it into a new interface, but the amount of work required to do so depends on how well aligned the source metadata is with the data model underpinning the new interface. While volunteer-based directories require less outright funding, managing and motivating those volunteers to ensure that they remain actively involved in directory upkeep requires a vast amount of ongoing work (Dombrowski forthcoming). This raises the question: how can Digital Humanists better support digital tool directories, and, more broadly, tool development?
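To make the archiving scenario concrete, the following is a minimal sketch, in Python, of how a content dump from a defunct directory might be re-serialized from CSV to JSON for preservation and later ingestion. The file name and column names (title, url, description, category) are hypothetical and do not reflect the actual DiRT or TAPoR schemas; the real work lies in aligning such fields with the data model of the new interface.

```python
import csv
import json

# Hypothetical column names; a real directory dump would use its own schema.
FIELDS = ["title", "url", "description", "category"]

def csv_dump_to_json(csv_path, json_path):
    """Convert an archived directory dump (CSV) into JSON for later reuse."""
    records = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Keep only the fields of interest, dropping empty values.
            record = {k: (row.get(k) or "").strip() for k in FIELDS
                      if (row.get(k) or "").strip()}
            if record.get("title"):  # skip rows with no tool name
                records.append(record)
    with open(json_path, "w", encoding="utf-8") as f:
        json.dump(records, f, indent=2, ensure_ascii=False)
    return len(records)

if __name__ == "__main__":
    count = csv_dump_to_json("directory_dump.csv", "directory_dump.json")
    print(f"Preserved {count} tool records")
```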
The Digital Humanities is often described as a type of scholarship, grounded in collaboration and public visibility, that differs from traditional Humanities research (Kirschenbaum 2012). This nature is at the heart of discovery portals, which centralize multiple tools and different pieces of code in one place (Dombrowski 2014). Thus, it is not surprising that discovery portals serve as common starting points for scholars new to Digital Humanities, but there has been little discussion of the portals outside of this role. While these portals are generally viewed as useful, the difficulty of maintaining them makes it important to be clear about the nature and extent of their value, who their audience truly is, and under what conditions they can be sustained. Furthermore, Digital Humanists could further engage the scholarly labour of tool directories by recognizing them in their scholarly work and providing links from blogs and library subject guides. While some of this recognition is already happening, it is not a formal practice in the field.
Tool directories: A history
Directories for finding tools are not new to the Digital Humanities. In the “Prospect” of the first issue of Computers and the Humanities the editor talks about reducing the “wasteful duplication of key-punching and programming that exists” by publishing lists of “programs designed to solve humanistic problems” (Prospect 1966, 2). To that end, tool reviews were published in Computers and the Humanities starting with the first issue, which included a review of PRORA (Lieberman 1966), a concording tool developed at the University of Toronto.
Another approach to documenting tools that lasted for only a few years was the yearbook. Ian Lancashire and Willard McCarty published two issues of the Humanities Computing Yearbook, one for 1988 (Lancashire and McCarty 1988) and one for 1989–90 (Lancashire 1991). The Yearbook organized descriptions of tools and other resources under disciplines, but also had sections for general tools, like bibliographic management tools, that crossed disciplines. The Yearbook was probably the most ambitious attempt in print to document the resources, both digital and in print, of interest to computing humanists. Alas, only two issues were published before the task of keeping up across all the disciplines in the Humanities became too difficult, not unlike the struggles of online tool directories. That said, yearbooks published by reputable imprints do have the advantage that they look like publications and can be preserved in libraries.
Lists, reviews and yearbooks are three approaches to documenting tools; another is software exhibits and their associated catalogues. One of the most notable was the exhibit and accompanying catalogue at the first joint ACH-ALLC conference held in Toronto in 1989. Along with a conference guide entitled “The Dynamic Text,” Willard McCarty edited a software tool guide called “Tools for Humanists, 1989,” which described seventy-four systems that were displayed at the 1989 hardware and software fair of the same name (McCarty 1989).
With the accessibility of the Internet it became both harder and easier to keep track of tools and other resources. On the one hand, there has been an explosion of DH websites, so the task has expanded; on the other hand, companies like Google have provided useful industrial tools for searching the web. The TAPoR project was originally funded in 2002 to provide a vertical portal that would bring together services with information about available tools, especially those that could be used through the portal as web services. Web directories like TAPoR 3.0 and DiRT are, in principle, easier to maintain and can even be maintained by a community, as is the case with the learning resource directory MERLOT. MERLOT (2019) enlists editors from the community to curate disciplinary sub-portals for learning tools, resources and documentation. In another example, TERESAH (2019) is a tool registry managed by DARIAH that aims to provide a listing of active tools for Social Sciences and Humanities researchers in Europe. Moreover, the DH Toychest offers a continually developing set of free “guides, tools, and other resources for practical work in the Digital Humanities by researchers, teachers, and students” (Liu 2013). These examples offer services similar to those of TAPoR 3.0 and DiRT but use different methods to provide access to tools.
Maintaining infrastructure
Directories of tools are essentially a form of infrastructure. They do not present original research, though research may go into their design. If they are to work well they should a) support other activities like research, and b) be maintained over time so they are accessible to scholars. It is therefore challenging, as many have noted, to build infrastructure, especially fast-changing digital infrastructure, using one-time grants (Bement 2007, Green 2007, Rockwell 2010). The Canada Foundation for Innovation (CFI) program that initially funded TAPoR recognizes this to some extent by asking for a maintenance plan and by providing ongoing funds for up to eight years. For Humanities infrastructure projects, and, for that matter, any successful infrastructure, eight years (including the years of initial development) is too little. This means that infrastructure leads have to continually seek new sources of funding, which in turn means adapting to new contexts and partnering with projects that could use the infrastructure. The TAPoR project has now lasted over 15 years from when the CFI grant was first awarded in 2002. The portal has been completely redeveloped (i.e. reprogrammed from scratch) twice, leading to its current designation as TAPoR 3.0.
TAPoR 3.0 and DiRT Directory both represent the latest iterations of past tool directories that have been updated or adapted for new purposes. Most importantly, the continuation of these tool directories in this ad hoc fashion highlights a core issue with their development—the need for uninterrupted support. What this support could look like is not yet clear. Nonetheless, both projects have key lessons and tactics for maintaining directory projects in the long term.
First, keep infrastructure small and simple enough that it can survive during dry funding spells. While the initial idea of the TAPoR “portal” was to integrate various resources from social media, text repositories, tools as web services, and ways of chaining tools in one place, this proved very hard to maintain. There were, and are, better resources available that the TAPoR project was trying to replicate in order to have a full-service portal. In version 2.0, TAPoR narrowed its focus to the discovery of tools. The tools themselves were spun off into projects like TAPoRware, TATToo, and most importantly, Voyant. TAPoRware was a set of tools designed specifically for TAPoR 2.0. They were simple tools that could be deployed as demonstration web services. TATToo was an embeddable toolbar that could be put into other websites where it would operate on the content of whatever page it was on (See Rockwell et al. 2010).
Next, scale infrastructure down to what can be led and maintained by a faculty member with university support. Faculty already have access to a certain number of resources, depending on local computing support. Faculty at most research-intensive universities can obtain small local grants, involve research assistants and students, apply for external funding, and so on. Infrastructure that is scaled to the support that a faculty member can obtain on their own can survive the dry years; however, this necessitates a faculty lead for the project, rather than a librarian, IT staff (such as Dombrowski), or other alt-ac roles.
Keep infrastructure modular so that it can connect with other projects easily, rather than trying to create a vertical portal that includes everything and would be complicated to maintain. In version 2.0, TAPoR focused on doing one thing well that others weren’t doing, and on doing it in a way that could fit with other projects. This can take multiple forms. While DiRT focused on technological integration by developing an API, the TAPoR project took the approach of “political” integration by making it easy to be written into other projects’ grant proposals. The latter approach does not lead to further proliferation of infrastructure that must be maintained, making it, by definition, more sustainable. Do one thing well and then build out features as new opportunities, partners and projects need them. Version 3 of TAPoR began adding features that made sense for projects like Text Mining the Novel (https://novel-tm.ca/), which was contributing funding. As well, projects could take new features that needed to be implemented and implement them in a more broadly reusable and integrated fashion within an existing framework. This may take some rework of the existing code and appear more cumbersome, but it avoids bloated code in the long run. TAPoR 3.0 has, wherever possible, been implemented by the University of Alberta’s Arts Resource Centre in a way that any custom code can be reused (and maintained) for other projects.
Finally, beware the siren call of crowdsourcing. Since the success of the Suda On Line (Mahoney 2009) there has been the hope that projects could draw human labour from the crowd. The DiRT Directory has shown that the work of motivating and organizing volunteers can often be as time consuming as the work those volunteers do. As promising as crowdsourcing is, its value lies more in how it can engage a broader community than in how much work it saves (Rockwell 2012). As shown above, there are other ways of securing ongoing support that allow for more ambitious projects; this is how the TAPoR project has survived, and how we hope it will continue to survive. With the addition of DiRT Directory’s tool data and a larger mandate, TAPoR 3.0 plans to involve more scholarly associations in the support of the infrastructure, which may provide other avenues for sustainability.
DiRT Directory’s successes and failures in this regard may prove informative. In order to reduce the risk of being shut down by UC Berkeley’s central IT division on account of being unrelated to Berkeley-specific IT service offerings, ownership of DiRT Directory was formally transferred to centerNet, an ADHO member organization. In this arrangement, centerNet provided an organizational home for the project and would coordinate opportunities for partnerships and joint development with other centerNet projects (such as the project directory DHCommons, which itself faced sustainability challenges similar to DiRT’s). In principle, centerNet’s member centers would serve as an ongoing source of volunteers for maintaining DiRT. In practice, however, the volunteer model is crucially dependent on the active involvement of a project director, and the arrangement with centerNet had no provisions for financially supporting the director position. Such support could have taken the form of a buy-out of time, ensuring the director could continue to work on DiRT even in the absence of a DH-specific position funded by her employer, or of provisions for replacing the director if she became unavailable, perhaps through a dedicated, and more financially sustainable, graduate student position. Fundamentally, a tool directory’s survival is dependent on funding—which may be modest but must be fairly consistent—to pay for a position of some sort that can ensure the currency of the listings, either through their own labor or through engaging a community of volunteers.
Absorbing DiRT
The decision to merge the DiRT Directory with TAPoR 3.0 was a difficult one. It not only highlights the lack of ongoing support for tool discovery portals, but it also represents the end of a project (Ruecker et al. 2012). As a part of this project, we specifically aimed to merge the two directories together into a larger tool discovery portal that combined the best parts of both original projects. This process used the following steps:
First, we examined the metadata structure of the DiRT Directory to see what information the site was holding on each tool. At the same time, we explored possible ways to integrate DiRT’s data with TAPoR 3.0’s.
Next, we mapped a crosswalk of the metadata on DiRT and the metadata on TAPoR 3.0. This allowed us to see which fields are shared between the sites and determine which fields would need to be added to TAPoR 3.0, and which fields on TAPoR 3.0 would need to be populated for the DiRT Directory’s tools. This process required meeting with programmers at the University of Alberta’s Arts Resource Centre. Kamal Ranaweera and Omar Rodriguez-Arenas worked with us, providing technical support and advice throughout the project, and, most importantly, completing the actual migration of the data to TAPoR 3.0.
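A crosswalk of this kind can be thought of as a simple mapping table. The sketch below, in Python, illustrates the idea with invented field names on both sides; the actual DiRT and TAPoR 3.0 metadata schemas are richer and differ from this illustration. Fields that do not map cleanly are flagged for manual review, which corresponds to deciding which fields needed to be added to TAPoR 3.0 or populated by hand.

```python
# Illustrative crosswalk between hypothetical DiRT-style and TAPoR-style fields.
# The actual field names in both directories differed from these examples.
CROSSWALK = {
    "tool_name": "title",
    "description": "detail",
    "platform": "web_usable",
    "cost": "type_of_license",
}

def crosswalk_record(dirt_record):
    """Translate one DiRT-style record into TAPoR-style fields."""
    tapor_record, unmapped = {}, {}
    for field, value in dirt_record.items():
        if field in CROSSWALK:
            tapor_record[CROSSWALK[field]] = value
        else:
            unmapped[field] = value  # flag for manual review
    return tapor_record, unmapped

record, leftovers = crosswalk_record(
    {"tool_name": "Example Tool", "platform": "web", "audience": "students"}
)
print(record)     # {'title': 'Example Tool', 'web_usable': 'web'}
print(leftovers)  # {'audience': 'students'} -> needs a new field or manual handling
```

Keeping the unmapped fields visible, rather than silently discarding them, makes it easier to decide whether a new field is warranted in the target directory.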
After finalizing the data and fields mapping, we moved on to data cleaning. Quinn Dombrowski provided a spreadsheet in comma-separated-value format of all 988 tools. This file was uploaded to OpenRefine (http://openrefine.org/), which was used to make overarching changes to the data. For example, using our fields maps, we relabelled DiRT Directory’s platform data to match TAPoR 3.0’s web-usable data. The final step of this process was to delete any duplicates or empty tools. This process brought the total tool count to 950.
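Although the cleaning itself was done in OpenRefine, its logic can be sketched in Python: relabel platform values to a web-usable vocabulary, then drop empty and duplicate entries. The field names, value mappings, and file name below are illustrative assumptions rather than the actual DiRT export.

```python
import csv

# A rough equivalent of the OpenRefine cleaning steps described above;
# the field names and value mappings here are illustrative only.
PLATFORM_TO_WEB_USABLE = {"web": "yes", "browser": "yes", "desktop": "no"}

def clean_tools(rows):
    """Relabel platform values, then drop empty and duplicate entries."""
    seen, cleaned = set(), []
    for row in rows:
        name = (row.get("tool_name") or "").strip()
        if not name or name.lower() in seen:   # empty or duplicate tool
            continue
        seen.add(name.lower())
        platform = (row.get("platform") or "").strip().lower()
        row["web_usable"] = PLATFORM_TO_WEB_USABLE.get(platform, "unknown")
        cleaned.append(row)
    return cleaned

with open("dirt_tools.csv", newline="", encoding="utf-8") as f:
    tools = clean_tools(list(csv.DictReader(f)))
print(f"{len(tools)} tools remain after cleaning")
```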
The final step of the project was to hand over the data to the Arts Resource Centre and allow the ingestion process to begin.
Overall, the process went very smoothly. We began the project in September 2017 and successfully integrated the tools from DiRT into TAPoR 3.0 in May 2018. While the integration process is complete, there is an ongoing data cleaning project: we not only integrated almost a thousand new tools into TAPoR 3.0, but we also added some new fields to the descriptive metadata of the tools and expanded the scope of TAPoR 3.0 beyond text analysis. Most obvious is a lack of consistency in the descriptions across the tool directory. Moving forward, we continue to try to find the most effective way to share important information about tools through trial and error.
Discussion
Records of software take many forms, and in this essay, we have outlined a history of one form, directories of Digital Humanities tools, but there are other types of records like grant proposals, design and development documents, manuals, brochures, web documentation, reviews, conference papers and code. Developing memory infrastructure like directories is not as simple as preserving documentation. It is also a matter of structuring the records of tools so that they can be managed and found (Bowker and Star 2000). As Bowker (2008) points out, the development of memory infrastructure is a structuring process of developing practices that are supported by infrastructure and in turn reinforce the need for infrastructure. This is where we are in the Digital Humanities; we have experimental infrastructure, but it hasn’t yet been woven into the practices of the field partly because the practices are still emerging. DH has not become disciplinary in the sense of a self-perpetuating field that has stable practices and infrastructure. This means that there isn’t yet the recognition and support for infrastructure like directories and portals. It is possible to get grants to build them as experimental infrastructure, but we haven’t found a way to weave them into a changing discipline so that they are maintained. By contrast, we have developed journals in the field that do have long term support. This paper documents attempts to develop disciplinary infrastructure at, and as a moment of, disciplinary formation. The attempts, failures, and successes say much about our formation.
Inevitably one wonders how directories of tools could be better supported. How might knowledge about tools be preserved and made available? Are directories the best way to do so, or should we give up and depend on Google to manage our history? Some directions suggest themselves: It may be time to go back to including reviews or notices about tools in journals. Journals in the Digital Humanities like DHQ (http://www.digitalhumanities.org/dhq/) have proven maintainable and could integrate tool reviews and support for directories into their online practices. Notices about notable new tools could be included in journal issues and then archived in a tool directory like TAPoR 3.0.
As mentioned above, we could learn from the MERLOT model, where editors get credit for maintaining a sub-portal on the best learning resources for a discipline (https://www.merlot.org). Tool directories could develop Associate Editor positions that would offer scholarly credit for curating and managing a list of specified tools. This would help maintain the directory while also providing an opportunity to move tool directory maintenance into the traditional scholarly outputs that are more easily recognized by tenure and hiring committees. Finally, it may be prudent to recognize that tool directories have a life span tied to funding, and thus that indefinite support may be unnecessary. The important thing is to find a way to preserve the data so that it can be passed on and reused as new projects arise with new models (Rockwell et al. 2014).
Conclusion
In conclusion, absorbing the DiRT Directory into TAPoR 3.0 forced our team to wrestle with some of the major problems facing the Digital Humanities as a field. First and foremost is the debate over the importance of tools and tool development, as well as the role of tool directories in encouraging the maintenance of tools and software. Furthermore, the process of integrating the two directories encouraged us to consider issues of long-term access and support. This leaves us with an important question to consider: what will happen if TAPoR 3.0 can no longer be maintained or supported? If the experience and data are archived in an accessible form, does it really matter if any particular tool like TAPoR 3.0 disappears?
Acknowledgements
We would like to thank Dr. Andrew Piper (McGill University) and NovelTM: Text Mining the Novel, a project funded by a SSHRC Partnership Grant, for their support with this project.
Competing interest
The authors have no competing interests to declare.
Author roles
Kaitlyn Grant, University of Alberta, kgrant1@ualberta.ca – kg
Quinn Dombrowski, Stanford University, qad@stanford.edu – qd
Dr. Kamal Ranaweera, University of Alberta, kamal.ranaweera@ualberta.ca – kr
Omar Rodriguez-Arenas, University of Alberta, omar.rodriguez@ualberta.ca – ora
Dr. Stéfan Sinclair, McGill University, stefan.sinclair@mcgill.ca – ss
Dr. Geoffrey Rockwell, University of Alberta, grockwel@ualberta.ca – gr
The order of the authors reflects the level of contribution to the overall research project, including the research, programming, writing, and funding of the project.
Conceptualization – kg, gr, qd, ss
Data curation – kg
Formal analysis – kg, gr
Funding acquisition – ss
Investigation – kg, gr, qd
Methodology – kg, gr
Project administration – kg
Resources – kg, gr, qd, ss
Software – kr, ora
Supervision – gr, ss
Writing – original draft – kg, gr, qd
Writing – review & editing – kg, gr, qd, kr, ora, ss
Editorial contributors
Congress 2018 Special Editor: Dr. Constance Crompton, University of Ottawa
Section Editor/Copy Editor: Darcy Tamayose, University of Lethbridge Journal Incubator
Layout Editor: Mahsa Miri, University of Lethbridge Journal Incubator
References
Baird, Davis. 2004. Thing Knowledge: A Philosophy of Scientific Instruments. Berkeley: University of California Press.
Bement, Arden L. Jr. 2007. “Shaping the Cyberinfrastructure Revolution: Designing Cyberinfrastructure for Collaboration and Innovation.” First Monday 12: 6. Accessed October 17, 2019. https://firstmonday.org/ojs/index.php/fm/article/view/1915/1797.
Bowker, Geoffrey. 2008. Memory Practices in the Sciences. Cambridge, MA: MIT Press.
Bowker, Geoffrey, and Susan Star. 2000. Sorting Things Out: Classification and Its Consequences. Cambridge, MA: MIT Press.
DiRT (Digital Research Tools). 2019. Accessed October 18. https://web.archive.org/web/20180716234416/http://dirtdirectory.org/.
Dombrowski, Quinn. 2014. “What Ever Happened to Project Bamboo?” Literary and Linguistic Computing 29(3). DOI: http://doi.org/10.1093/llc/fqu026
Dombrowski, Quinn. Forthcoming. “The Directory Paradox.” Debates in the Digital Humanities. Accessed October 17, 2019. http://dhdebates.gc.cuny.edu/.
Green, David. 2007. “Cyberinfrastructure for Us All: An Introduction to Cyberinfrastructure and the Liberal Arts.” Academic Commons. Accessed September 16, 2019. http://www.academiccommons.org/commons/essay/cyberinfrastructure-introduction.
Internet Archive. 2014. “Software Library.” Internet Archive. Accessed September 10, 2019. https://archive.org/details/softwarelibrary.
Kirschenbaum, Matthew. 2012. “What is Digital Humanities and What is it Doing in English Departments?” Debates in the Digital Humanities. Accessed September 16, 2019. https://dhdebates.gc.cuny.edu/read/untitled-88c11800-9446-469b-a3be-3fdb36bfbd1e/section/f5640d43-b8eb-4d49-bc4b-eb31a16f3d06.
Lancashire, Ian. 1991. The Humanities Computing Yearbook 1989–90. Oxford, UK: Clarendon Press.
Lancashire, Ian. 2017. “CSDH/SCHN Beginnings.” Accessed September 16, 2019. http://csdh-schn.org/2017/05/31/csdhschn-beginnings-reflections-by-ian-lancashaire/.
Lancashire, Ian, and Willard McCarty. 1988. The Humanities Computing Yearbook 1988. Oxford: Clarendon Press.
Lieberman, D. 1966. “Review of Manual for the Printing of Literary Texts by Computer by Robert Jay Glickman and Gerrit Joseph Staalman.” Computers and the Humanities 1(1): 12. DOI: http://doi.org/10.1007/BF00188014
Liu, Alan. 2013. “DH Toychest.” Digital Humanities Resources for Project Building. Updated 2017. http://dhresourcesforprojectbuilding.pbworks.com/w/page/69244243/FrontPage.
Mahoney, Anne. 2009. “Tachypaedia Byzantina: The Suda On Line as Collaborative Encyclopedia.” Digital Humanities Quarterly 3(1). Accessed September 16, 2019. http://www.digitalhumanities.org/dhq/vol/3/1/000025/000025.html.
McCarty, Willard. 1989. Software Fair Guide: A Guide to the Software Fair Held in Conjunction with the Conference Computers and the Humanities: Today’s Research, Tomorrow’s Teaching. Toronto, ON: Centre for Computing in the Humanities, University of Toronto.
McCarty, Willard. 2003. “Humanities Computing.” In Encyclopedia of Library and Information Science, 1224–35. New York: Marcel Dekker.
MERLOT (Multimedia Educational Resources for Learning and Online Teaching). 2019. Accessed July 17. http://info.merlot.org/merlothelp/topic.htm#t=Who_We_Are.htm.
Newman, James. 2012. Best Before: Videogames, Supersession and Obsolescence. New York: Routledge. DOI: http://doi.org/10.4324/9780203144268
“Prospect.” 1966. Computers and the Humanities 1(1): 1–2. DOI: http://doi.org/10.1007/BF00188009
Ramsay, Stephen, and Geoffrey Rockwell. 2012. “Developing Things: Notes Toward an Epistemology of Building in the Digital Humanities.” Debates in the Digital Humanities. Accessed September 20, 2019. https://dhdebates.gc.cuny.edu/read/untitled-88c11800-9446-469b-a3be-3fdb36bfbd1e/section/c733786e-5787-454e-8f12-e1b7a85cac72#ch05.
Rockwell, Geoffrey. 2006. “TAPoR: Building a Portal for Text Analysis.” In Mind Technologies: Humanities Computing and the Canadian Academic Community, edited by R. Siemens, and D. Moorman, 285–99. Calgary: University of Calgary Press.
Rockwell, Geoffrey. 2010. “As Transparent as Infrastructure: On the Research of Cyberinfrastructure in the Humanities.” In Online Humanities Scholarship: The Shape of Things to Come. Proceedings of the Mellon Foundation Online Humanities Conference at the University of Virginia, March 26–28, 2010, edited by J. McGann, 461–87. Houston: Rice University Press.
Rockwell, Geoffrey. 2011. “On the Evaluation of Digital Media as Scholarship.” MLA Profession, 152–68. DOI: http://doi.org/10.1632/prof.2011.2011.1.152
Rockwell, Geoffrey. 2012. “Crowdsourcing the Humanities: Social Research and Collaboration.” In Collaborative Research in the Digital Humanities, edited by M. Deegan, and W. McCarty, 135–54. Farnham, Surrey: Ashgate.
Rockwell, Geoffrey, Shawn Day, Joyce Yu, and Maureen Engel. 2014. “Burying Dead Projects: Depositing the Globalization Compendium.” Digital Humanities Quarterly 8(2). Accessed September 16, 2019. https://era.library.ualberta.ca/items/34ec3dc1-f8b5-481a-826d-2428f5283ce8.
Rockwell, Geoffrey, and Stéfan Sinclair. 2016. Hermeneutica: Computer-assisted Interpretation in the Humanities. Cambridge: MIT Press. DOI: http://doi.org/10.7551/mitpress/9522.001.0001
Rockwell, Geoffrey, Stéfan Sinclair, Stan Ruecker, and Peter Organisciak. 2010. “Ubiquitous Text Analysis.” Poetess Archive Journal 2(1).
Ruecker, Stan, Geoffrey Rockwell, Daniel Sondheim, Mihaela Ilovan, Jennifer Windsor, Mark Bieber, Luciano Frizzera, Omar Rodriguez-Arenas, Kamal Ranaweera, Carlos Fiorentino, Stéfan Sinclair, Milena Radzikowska, Teresa Dobson, Ann Blandford, Sarah Faisal, Alejandro Giacometti, Susan Brown, Brent Nelson, and Piotr Michura. 2012. “The Beginning, the Middle, and the End: New Tools for the Scholarly Edition.” Scholarly and Research Communication 3(4). Accessed September 16, 2019. http://www.src-online.ca/index.php/src/article/view/57.
TAPoR (Text Analysis Portal for Research). 2019. Text Analysis Portal for Research. Accessed September 16. http://tapor.ca/pages/about_tapor.
TERESAH (Tools E-Registry for E-Social science, Arts and Humanities). 2019. “About TERESAH.” Accessed July 15. http://teresah.dariah.eu/about.