[3] It brings digital tools and methods to the study of the humanities with the recognition that the printed word is no longer the main medium for knowledge production and distribution.
Here, [they] use the method in a more illustrative and creative sense, showing the range of activities, practices, skills, technologies and structures that could be said to make up the digital humanities, with the aim of providing a high-level map.
[16] Other scholars began using mainframe computers to automate tasks like word-searching, sorting, and counting, which was much faster than processing information from texts with handwritten or typed index cards.
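As a rough illustration of the kind of task those early projects automated, the short Python sketch below counts and sorts word frequencies in a plain-text file; the filename and tokenization rule are illustrative assumptions rather than a reconstruction of any historical system.

    import re
    from collections import Counter

    # Read a plain-text transcription of a source document (hypothetical path).
    with open("corpus.txt", encoding="utf-8") as f:
        text = f.read().lower()

    # Crude tokenization standing in for the concordance rules that were
    # once worked out on handwritten or typed index cards.
    words = re.findall(r"[a-z']+", text)

    # Count and rank by frequency -- the sorting and counting that mainframes
    # first made fast.
    for word, n in Counter(words).most_common(20):
        print(word, n)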
[18][19] As Tara McPherson has pointed out, the digital humanities also inherit practices and perspectives developed through many artistic and theoretical engagements with electronic screen culture beginning in the late 1960s and 1970s.
[14] TEI helped shape the field of electronic textual scholarship and led to Extensible Markup Language (XML), which is a tag scheme for digital editing.
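As a minimal sketch of what such a tag scheme looks like in practice, the fragment below parses a small TEI-style XML excerpt with Python's standard library; the element names follow common TEI conventions, but the sample and its structure are assembled here purely for illustration.

    import xml.etree.ElementTree as ET

    # A tiny TEI-style fragment: verse lines wrapped in descriptive tags.
    tei_fragment = """
    <text>
      <body>
        <lg type="stanza">
          <l n="1">Whan that Aprill with his shoures soote</l>
          <l n="2">The droghte of March hath perced to the roote</l>
        </lg>
      </body>
    </text>
    """

    root = ET.fromstring(tei_fragment)
    # Walk the markup and print each encoded line with its number.
    for line in root.iter("l"):
        print(line.get("n"), line.text.strip())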
Researchers also began experimenting with databases and hypertextual editing, which are structured around links and nodes, as opposed to the standard linear convention of print.
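One way to picture that links-and-nodes structure is as a small graph in which passages point to one another rather than following a single sequence; the node names and links below are purely hypothetical.

    # A hypothetical hypertext edition: each node is a passage, and links are
    # the connections a reader may follow in any order, unlike print's
    # fixed sequence.
    nodes = {
        "preface": "Editor's preface to the electronic edition.",
        "chapter-1": "Opening chapter of the primary text.",
        "note-12": "Annotation on a disputed reading in chapter 1.",
    }
    links = {
        "preface": ["chapter-1"],
        "chapter-1": ["preface", "note-12"],
        "note-12": ["chapter-1"],
    }

    # From any node, list where a reader could go next.
    for node in nodes:
        print(node, "->", ", ".join(links[node]))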
"[30] Open access is designed to enable anyone with an internet-enabled device and internet connection to view a website or read an article without having to pay, as well as share content with the appropriate permissions.
A growing number of researchers in digital humanities are using computational methods for the analysis of large cultural data sets such as the Google Books corpus.
"[47] In this context, the field is sometimes known as computational humanities.Digital humanities scholars use a variety of digital tools for their research, which may take place in an environment as small as a mobile device or as large as a virtual reality lab.
They are the most extensive serial records for the history of Africans in the Atlantic World and also include valuable information on the indigenous, European, and Asian populations who lived alongside them.
[63] A specific example involves initiatives in which archivists help scholars and academics build their projects by drawing on their experience in evaluating, implementing, and customizing metadata schemas for library collections.
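For instance, a customized descriptive record for such a collection might layer local fields on top of a standard schema; the Dublin Core-style field names below follow a real convention, but the record itself and the local extension are invented for illustration.

    # A hypothetical catalogue record using Dublin Core-style field names.
    record = {
        "dc:title": "Letters from a provincial schoolteacher, 1872-1881",
        "dc:creator": "Unknown",
        "dc:date": "1872/1881",
        "dc:type": "Text",
        "dc:rights": "Public domain",
        # A locally customized field added alongside the standard elements.
        "local:conservation_status": "fragile; digitization priority",
    }

    # Check that the fields this project treats as mandatory are present.
    required = {"dc:title", "dc:date", "dc:rights"}
    missing = required - record.keys()
    print("missing fields:", ", ".join(sorted(missing)) or "none")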
[64] "Cultural analytics" refers to the use of computational method for exploration and analysis of large visual collections and also contemporary digital media.
The lab has been using methods from the field of computer science called computer vision to analyze many types of both historical and contemporary visual media—for example, all covers of Time magazine published between 1923 and 2009,[65] 20,000 historical art photographs from the collection of the Museum of Modern Art (MoMA) in New York,[66] one million pages from manga books,[67] and 16 million images shared on Instagram in 17 global cities.
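A much-simplified sketch of this kind of image analysis is given below: it computes two low-level visual features (mean brightness and mean saturation) for a folder of images, the sort of measurements often used to plot large collections. The folder path is an assumption, and this is not the lab's actual pipeline.

    from pathlib import Path
    from PIL import Image, ImageStat

    # Extract simple per-image features from a folder of JPEGs (hypothetical path).
    features = []
    for path in Path("images").glob("*.jpg"):
        img = Image.open(path).convert("HSV")
        hue, saturation, value = ImageStat.Stat(img).mean  # per-channel means
        features.append((path.name, value, saturation))

    # Order the collection from darkest to brightest image.
    for name, brightness, saturation in sorted(features, key=lambda t: t[1]):
        print(name, round(brightness, 1), round(saturation, 1))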
How can we "observe" giant cultural universes of both user-generated and professional media content created today, without reducing them to averages, outliers, or pre-existing categories?
WordHoard (begun in 2004) is a free application that enables scholarly but non-technical users to read and analyze, in new ways, deeply-tagged texts, including the canon of Early Greek epic, Chaucer, Shakespeare, and Spenser.
[78] The term is an American neologism first described in a 2010 Science article called "Quantitative Analysis of Culture Using Millions of Digitized Books", co-authored by Harvard researchers Jean-Baptiste Michel and Erez Lieberman Aiden.
The study went on to use more advanced natural language processing techniques to discover macroscopic trends in history and culture, including gender bias, geographical focus, technology, and politics, along with accurate dates for specific events.
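A hedged, minimal sketch of the underlying idea (tracking how often a term occurs in each year of a dated corpus) might look like the following; the tiny corpus and the query term are invented, and the actual study worked with the Google Books n-gram data at vastly larger scale.

    from collections import Counter, defaultdict

    # A toy dated corpus of (year, text) pairs standing in for millions of books.
    corpus = [
        (1900, "the telegraph and the railway changed daily life"),
        (1950, "television reached most households during this decade"),
        (2000, "the internet reshaped how information circulates"),
    ]

    term = "television"
    freq_by_year = defaultdict(float)

    # Relative frequency of the term per year -- the basic quantity behind
    # n-gram trend plots.
    for year, text in corpus:
        tokens = text.lower().split()
        freq_by_year[year] = Counter(tokens)[term] / len(tokens)

    for year in sorted(freq_by_year):
        print(year, round(freq_by_year[year], 3))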
MLA Commons[81] offers an open peer-review site (where anyone can comment) for their ongoing curated collection of teaching artifacts in Digital Pedagogy in the Humanities: Concepts, Models, and Experiments (2016).
[82] The Debates in the Digital Humanities platform contains volumes of the open-access book of the same title (2012 and 2016 editions) and allows readers to interact with material by marking sentences as interesting or adding terms to a crowdsourced index.
In 2012, Matthew K. Gold identified a range of perceived criticisms of the field of digital humanities: "a lack of attention to issues of race, class, gender, and sexuality; a preference for research-driven projects over pedagogical ones; an absence of political commitment; an inadequate level of diversity among its practitioners; an inability to address texts under copyright; and an institutional concentration in well-funded research universities".
[88] Similarly, Berry and Fagerjord have argued that digital humanities should "focus on the need to think critically about the implications of computational imaginaries, and raise some questions in this regard.
"[89] Alan Liu has argued, "while digital humanists develop tools, data, and metadata critically, therefore (e.g., debating the 'ordered hierarchy of content objects' principle; disputing whether computation is best used for truth finding or, as Lisa Samuels and Jerome McGann put it, 'deformance'; and so on) rarely do they extend their critique to the full register of society, economics, politics, or culture.
"[90] Some of these concerns have given rise to the emergent subfield of Critical Digital Humanities (CDH): Some key questions include: how do we make the invisible become visible in the study of software?
The literary theorist Stanley Fish claims that the digital humanities pursue a revolutionary agenda and thereby undermine the conventional standards of "pre-eminence, authority and disciplinary power".
Its distinctive contributions do not obliterate the insights of the past, but add and supplement the humanities' long-standing commitment to scholarly interpretation, informed research, structured argument, and dialogue within communities of practice".
[97] There has also been critique of the use of digital humanities tools by scholars who do not fully understand what happens to the data they input and place too much trust in the "black box" of software that cannot be sufficiently examined for errors.
"[99] Similar problems can be seen at a lower level, with databases used for digital humanities analysis replicating the biases of the analogue systems of data.
Amy E. Earhart criticizes what has become the new digital humanities "canon" in the shift from websites using simple HTML to the use of TEI and visuals in textual recovery projects.
[103] Works that have been previously lost or excluded were afforded a new home on the internet, but much of the same marginalizing practices found in traditional humanities also took place digitally.
[105][106] As the field matures, there has been a recognition that the standard model of academic peer-review of work may not be adequate for digital humanities projects, which often involve website components, databases, and other non-print objects.