
Digital Game-Based Pedagogies: Developing Teaching Strategies for Game-Based Learning

Abstract

In this paper, we discuss pedagogical strategies for supporting digital game-based learning in K–12 classrooms, based on a study of 34 teachers. We identify nine strategies, digital game-based pedagogies, that represent common characteristics of exemplary teaching with digital games, and discuss how a professional development session may have aided in the teachers’ use of these strategies. To create effective digital game-based learning environments, we argue, teachers need to be provided with professional development sessions that focus on the cultivation of pedagogical skills.

Introduction

Researchers and enthusiastic practitioners have long argued for the effectiveness of digital games as a means for teaching subject-specific skills while also motivating and engaging students (Gee 2008; Annetta 2008; Squire and Jenkins 2003). As games require players to solve complex problems, work collaboratively, and communicate with others in both online environments and the physical spaces where gameplay takes place, they are said to support students’ development of twenty-first century competencies (Spires 2015).

The teacher’s role in designing and facilitating learning environments that support digital game-based learning (DGBL), including adapting content to suit the needs of diverse learners, is a critical component of effective DGBL. As McCall makes clear, “by itself…a…game is not a sufficient learning tool. Rather, successful game-based lessons are the product of well-designed environments” (2011, 61). Chee, in his book on using digital games in education, argues:

It is vital to understand that games do not “work” or “not work” in classrooms in and of themselves. They possess no causal agency. The efficacy of games for learning depends largely upon teachers’ capacity to leverage games effectively as learning tools and on students’ willingness to engage in gameplay and other pedagogical activities—such as dialogic interactions for meaning making—so that game use in the curriculum can be rendered effective for learning. (2016, 4)

On this view, the focus shifts from the games, game systems, and game content to “what teachers need to know” pedagogically (Mishra and Koehler 2006, 1018), including how to create space for digital games in the curriculum, organize classroom activities around the use of games, support students during both gameplay and their engagement with DGBL activities (Sandford et al. 2006; Allsop and Jessel 2015), and, as we have argued elsewhere, assess student learning (Hébert, Jenson, and Fong 2018). Groff, Howells, and Cranmer make clear, “game-based learning approaches need to be well planned and classrooms carefully organized to engage all students in learning and produce appropriate outcomes” (2010, 7).

In this paper we discuss our attempt to articulate a series of digital game-based practices carried out by teachers as they used a digital game in their classrooms. Specifically, we detail nine strategies—what we are calling digital game-based pedagogies—that were common in all classrooms we observed and utilized with varying degrees of success. As most of the teachers in the study attended a professional development (PD) session, we also draw connections between the content of the PD and these pedagogical strategies. We begin with a literature review of pedagogy and professional development in relation to DGBL, then discuss the structure of the study, and finally detail the digital game-based pedagogies identified from that significant qualitative work.

Related Literature: Digital Games, Pedagogy and Professional Development

When learning is reduced to knowledge transmission and a game is offered merely as a medium for delivering content, the role of the teacher is similarly narrowed to that of an intermediary, offering the game to students and stepping back in order to let learning through gameplay take place. On this view, the game and its design are a central focus, including “integrating learning objectives with[in] th[is] delivery medium” (Becker 2017, 156). Many studies of game-based learning focus on how a game is designed, with researchers either attempting to streamline best practices for designing games (Aslan and Balci 2015; Arnab et al. 2015; Alaswad and Nadolny 2015; Van Eck and Hung 2010) or discussing the design process of a specific game for use in the classroom (Tsai, Yu, and Hsiao 2012; Barab et al. 2005; Sánchez, Sáenz, and Garrido-Miranda 2010; Lester et al. 2014). We argue that simply focusing on how a game is designed is problematic as it places responsibility for student learning in the hands of designers who “may never have had direct or lived experiences of classroom teaching, [and who] are advocating on behalf of the learning and literacy offered by games without having to take into account the real and varied challenges faced by today’s diverse learners” (Nolan and McBride 2013, 597–98). It also has the potential effect of further exacerbating the divide between games and classrooms, positioning the game as a silo that operates outside of curricular decisions and pedagogical practices.

Absent from these discussions is the pivotal role of the teacher in the classroom. In fact, terms in the literature that might signal a discussion of teaching, such as instructional approaches, instructional methods, pedagogy, pedagogical approach, digital pedagogy, game-based learning techniques, and curriculum development (Charsky and Barbour 2010; Egenfeldt-Nielsen, Smith, and Tosca 2016; Becker 2009; Clark 2007; Rodriguez-Hoyos and Gomes 2012; Shabalina et al. 2016) are typically used in DGBL research to refer to the design of the game and accompanying materials that support learning (e.g. quizzes, assessment guides, and other paper and pencil tasks) instead of the actions of a classroom teacher. The assumption here seems to be that games can support student learning irrespective of the role of the teacher, and, importantly, without consideration of the larger classroom environment and curricular structures put into place for digital game-based learning. Baek, for instance, has noted that games must be “mapped into curricula for their maximum effective utilization” (2008, 667). Similarly, Raabe, Santos, Paludo, and Benitti have argued that for DGBL, “the planning of the class is the most important stage and must involve the participation [of] teachers in choosing the content that should be supported by the use of the game…according to the goals [of] learning to be achieved” (2012, 688).

Teaching and pedagogy as they relate to DGBL have been taken up in some of the literature. Nousiainen, Kangas, Rikala, and Vesisenaho discuss teacher-identified competencies around pedagogy essential to game-based learning, including “curriculum-based planning”: understanding how games can be used within the curriculum, how students can be involved in the curricular design process, and how to “plan game-based activities for supporting students’ academic learning and broader key competencies” (2018, 90). Moving away from pedagogical strategies specifically, Marklund and Taylor outline the roles teachers shift between during DGBL, including: the “gaming tutor,” as teachers aid students with more technologically focused elements of gameplay (e.g. manipulating controls); the “authority and enforcer of educational modes of play,” as teachers monitor student progress toward learning goals and direct play when necessary; and the “subject matter anchor,” as they draw out connections between the game and course content, including calling students’ attention to certain aspects of the game or breaking down complex concepts as they pertain to the game (2015, 363–365). Similarly, Hanghoj offers a series of “next-best practices” for teachers’ use in supporting DGBL, suggesting that teachers might “set the stage” by “providing relevant game information” for students, “recognize and challenge the students’ game experience by articulating different interpretations of a game session,” and “support students in their attempts to construct, deconstruct and reconstruct relevant forms of knowledge—both in relation to the game context, curricular goals and real live phenomena” (2008, 235).

One means of helping teachers consider their role in using games to support learning in the classroom is through professional development that might focus on teaching strategies, alongside creating a classroom ecology for DGBL. And yet, much like the research on DGBL more broadly, professional development for using games in classrooms rarely addresses pedagogy. For example, Ketelhut and Schifter’s (2011) research on developing PD for DGBL outlines the types of platforms used (e.g. face-to-face, online, blended) and how they compared to one another, rather than discussing the content of the PD and its connection to pedagogy. And while Chee, Mehrotra, and Ong’s PD centered on a particular teaching method, dialogic pedagogy, the authors examined teacher dilemmas rather than the explicit pedagogical strategies reviewed in the PD sessions. At the same time, their findings call attention to the importance of pedagogy as teachers work to shift their teaching for DGBL. They state that teachers were “mostly accustomed to subject matter exposition followed by assigning student[s] worksheets to complete,” but with the digital games, had “to work in real time with the ideas that students were contributing, based on their gameplay experiences” (2014, 429). Consequently, this required shift in pedagogy “unsettled them” (2014, 429).

There are, of course, exceptions. The Software and Information Industry Association’s report on best practices for game use in the K–12 classroom recognizes the significance of pedagogy for DGBL, arguing that teachers should receive at least a half day of PD in order to become familiar with the theoretical underpinnings of DGBL, learn about the specific game they will be using in class, obtain practical information about creating game accounts and manipulating the game mechanics, and gain a better understanding of the “roles and responsibilities of teachers and students” (2009, 25) during gameplay. And Simpson and Stansberry provide an overview of working with teachers on the “G.A.M.E.” lesson planning model, which involves various stages, from taking the perspective of the game designer to better understand how and to what extent games are engaging, to asking students to contemplate their gameplay by “reflect[ing] on the decisions made and evaluat[ing] the consequences” (2009, 182).

As this review demonstrates, there is scant empirical research related to digital game-based pedagogies, and a critical need for more discussion of and research on this topic. In the next section, we describe the study, which examined teachers’ pedagogical practices for DGBL in K–12 classroom spaces and the relationship between these practices and a professional development workshop.

The Study

Timeline of the Research

The project took place over an eight-month period, during the 2015–2016 school year, with data analysis completed at the beginning of the 2016–2017 school year. While the project was initially intended to run over a single school year, a work-to-rule ban on extracurricular activities put forth by the Elementary Teachers Federation in the province delayed the start date of the project by four months. The professional development session took place February 10–11, 2016, followed by observations from February 22–May 16. Interviews overlapped with observations, with teachers whose classes were visited in February beginning interviews in early March, and ran until the end of June. Data analysis took place concurrently with data collection and was completed in October 2016.

The Game

Two educational games were used in this study: Sprite’s Quest: The Lost Feathers and Sprite’s Quest: Seedling Saga, aligned with the grade seven and grade eight Ontario geography curricula, respectively. The games were designed by Le centre d’innovation pédagogique in collaboration with the Ontario Ministry of Education and selected as the focus for this study by our funding partner, the Council of Ontario Directors of Education. Both versions of Sprite’s Quest are 2D platformer games intended to aid in the development of physical and human geography concepts. The games also have accompanying student activity guides and teacher manuals available through an online platform. While the games can be downloaded by anyone through the Apple App Store or Google Play,[1] access to the web version of the game, along with the student activity guides and teacher manuals, is granted through individual boards of education through the Ministry of Education’s e-learning Ontario site. As this article focuses on the professional development element of the project, we do not provide a detailed overview of the game, the activity guides, or the teacher manuals here, but have done so elsewhere (Hébert, Jenson, and Fong 2018). None of the teachers had used Sprite’s Quest prior to this project.

Research Question

This study sought to identify pedagogical practices that supported DGBL. We asked: What teaching practices were common to teachers observed in the study?

Participants & Professional Development

Participants were recruited by the funding partner in conjunction with participating school boards. Altogether, 34 teachers (17 female, 17 male) from 10 school boards and 25 schools across Ontario, Canada took part in the study. Sixteen of these teachers taught straight grade 7 classes, seven taught grade 8, and one taught grade 9, while a number of teachers, especially those in smaller schools, taught split classes: one taught a grade 6/7/8 split, one a grade 6/7 split, and eight a grade 7/8 split. Twenty-eight teachers attended a professional development session held at a university over a two-day period; teachers were released from their classrooms for that time. The two full days were organized and run by the authors (see Appendix B for a detailed schedule of the session). The professional development consisted of three main components:

Becoming Familiar with the Games: Walkthroughs and Content

First, as noted, none of the teachers had seen or played Sprite’s Quest before, so they were given time to become familiar with the two versions of the game during the PD session. Because the teachers did not have time to play either game in its entirety during the session, we produced “walkthroughs,” textual and visual overviews of key elements of a game, which were reviewed during the PD session and made available to teachers throughout their play sessions, during lesson planning, and while teachers were using the games in their classrooms. Second, we drew attention to how the games provided geographic content. For example, we looked at how fact bubbles pop up during play, questions are presented at the beginning of each level, and background information is offered about the geographic location (e.g. the Himalayas) through which the sprite moves. Teachers were also instructed to encourage students to make note of the facts and the answers to the questions, and to pay attention to the backgrounds of the games, when using them to support student learning.

Exploring the Teacher Manual and the Activity Guide

Given that teacher manuals and activity guides for these games were available and in fact had been produced to support the implementation of the game in classrooms, we wanted to ensure that teachers had the time to examine them closely and to draw connections between these resources and their curriculum. To this end, we led teachers through a guided examination of the resources, reviewing the overall structure of the games as they aligned with the sections of the student activity guide and the teacher manual. We also summarized the information made available in the teacher manual and student activity guide and provided the summaries as a supplementary electronic handout.

Discussing Curricular Connections and Collaborative Lesson Planning

There were three key concepts in the games related to physical and human geography: place, liveability, and sustainability. The teacher manual and the PD session emphasized these, including drawing direct connections to the Ontario Ministry of Education grade 7 and 8 geography curriculum. Further, the PD supported collaborative lesson planning, which focused on creating learning goals, success criteria, and expectations for and evaluation of students. Finally, teachers were provided time to complete a unit plan, begin constructing individual lessons for the unit, and create assessments to use during the digital game-based unit, which they then shared with the whole group.

The remaining six teachers, selected at random, participated in the study but did not attend the PD session. In lieu of PD, they were invited to attend a two-hour meeting at their board office, where they were introduced to the study and told how to access the teacher and student activity guides. They were given time to play the games, but only while the researchers were speaking individually with teachers to organize some of the logistics around classroom visits.

Data Collection and Analysis

This qualitative study included observations of all teachers as they taught the DGBL unit, field notes based on observations, videos, and still photos taken during classroom visits, and interviews with teachers after the unit was completed.

Observations

Researchers visited each teacher’s classroom two to three times during the delivery of the unit, for 45 minutes to 1.5 hours per visit, documenting how teachers took up the three central elements of the PD: how familiar teachers were with the game, how the teacher and student activity guides were used, and how the lessons and assessments created in the PD were taken up. Researchers also made note of the classroom environment created to support DGBL and practices within it. With respect to teachers’ practices in particular, this included lesson content and connections to the game, how class periods were structured and facilitated by the teacher, teacher focus on student learning (including asking questions of students during play and guiding their play toward learning), connections between the game and the curriculum as well as cross-curricular connections outside of geography, and teacher knowledge and understanding of the game. For student activities in the classroom, our observations centered on time students spent on/off task; if, how, and in what ways students were engaged with the game; and conversations among students about the game and/or geography more broadly. Detailed field notes were taken along with videos, audio recordings, and still photos. Field notes were analyzed thematically using NVivo (Clarke and Braun 2017; Nowell, Norris, White, and Moules 2017).

Interviews

Teachers were interviewed at the completion of the study. They were asked to provide information about their curriculum, including lesson sequencing, assessments, time required to plan, and decisions they made about whether or not to use any of the game’s resources located in the activity guides; gameplay, including student experiences and learning, such as interactions with one another, individual students who excelled or struggled, whether students were making connections between the game world and the world outside of the game, and how the room was organized for gameplay; and the PD sessions, including whether they would participate in future PD, feedback on the sessions, and how the PD sessions impacted their use of the game and whether or not they would use it in the future. Interviews ranged in length from 25 to 80 minutes and were analyzed thematically using NVivo to identify common themes that would aid the researchers in understanding teachers’ experiences (see Appendix A for the interview questions).

The next section extrapolates from the data and analysis described briefly here and offers a framework for digital game-based pedagogies, based on our nearly 100 hours of classroom observations and over 34 hours of interviews with teachers. The intent is to demonstrate, based on evidence gathered, a pedagogical framework that can be taken up and used by others who might expand on and modify it to best suit divergent contexts.

Digital Game-Based Pedagogies

Having analyzed the data, it was possible to identify with some clarity approaches to teaching with games, or digital game-based pedagogies, that were particularly supportive of DGBL. While the content of the PD and the skills teachers developed within it, including familiarity with the game, their use of activities and the teacher guide, and their adoption of the lessons planned during the PD, were an initial point of interest during observations, what became apparent was how teachers used this knowledge to shape their pedagogical practice. A teacher sitting off to the side during gameplay might have demonstrated knowledge of Sprite’s Quest by answering a student’s question when approached, but that teaching practice was less meaningful than that of a teacher who illustrated their knowledge by circulating around the classroom during gameplay, asking questions, directing students’ attention to elements of the game, and providing practical tips on how to navigate specific levels. That teachers used activities from the teacher guide was important, but among those who did, what type of learning activity was selected and how the lesson was structured around that activity told us more about impactful DGBL than simply whether or not an activity was used. And the quality of the lesson and assessment content and the pace of the unit created around the game shaped the nature of the DGBL experience. Consequently, through our observations, we began to identify a set of practices: pedagogical strategies that best supported DGBL in the classroom.

What follows are details of these nine digital game-based pedagogies, grouped according to three general categories: gameplay, lesson planning and delivery, and framing technology and the game.

Gameplay: (1) teacher knowledge of and engagement with the game during gameplay; (2) focused and purposeful gameplay; (3) collaborative gameplay.

Lesson Planning and Delivery: (4) meaningful learning activities; (5) cohesive curricular design: structured lessons; (6) appropriate lesson pacing and clear expectations.

Framing Technology and the Game: (7) technological platforms not a point of focus; (8) game positioned as a text to be read; (9) connections to prior learning and to the world beyond the game environment.

Gameplay

1. Teacher knowledge of and engagement with the game during gameplay

Teachers demonstrated knowledge of the game in group discussions and one-on-one conversations with students. They regularly spoke of their own experiences during gameplay, including when helping students overcome obstacles in the game. Teachers talked with students about how to focus play on the learning task at hand, including what to pay attention to during gameplay. Teachers were also engaged with gameplay. For example, they circulated to ask students questions, direct student attention to various facets of the game, and connect the game to the learning activity.

2. Focused and purposeful gameplay

During gameplay, teachers directed student focus to a specific learning activity. In this respect, gameplay was always purposeful, targeted at the completion of a particular learning activity. For instance, students might play a few levels of the game, directed by the teacher to pay attention to the climate, or to compare regions with respect to vegetation.

3. Collaborative gameplay

Teachers facilitated whole class discussions that focused gameplay, and connected game content to the curriculum more broadly. The teacher encouraged students to play together or to complete learning activities collaboratively. For example, one teacher asked students to work in groups to respond to the question of whether they would like to live in China (one of the locations featured in Sprite’s Quest) based on their experiences playing the game, and another, to respond to discussion questions.

Lesson Planning and Delivery

4. Meaningful learning activities

Teachers assigned learning activities that involved the application of higher-order skills such as analysis or creation. For instance, one teacher asked students to produce a travel video for a specific geographic region in the game, another, to debate the merits of restricting mountain access in a particular region, and a third teacher, to construct an argumentative paragraph about whether hotels should be permitted to privatize beaches. If students were asked to jot down facts or information obtained through gameplay, it was in support of an additional, higher-order learning activity. In some instances, these materials were extracted from the student activity guide, while in others, they were created by teachers.

5. Cohesive curricular design: Structured lessons

Gameplay was integrated into the curriculum by the teacher. An introductory lesson that rooted play in learning preceded gameplay, and a learning activity followed play. For example, one teacher facilitated a lesson on garbage disposal practices around the world before asking students to play a level of the game that focused on garbage disposal in a specific region. The teacher also created a follow-up activity wherein students composed a letter about disposal to a government official in that region of the world.

6. Appropriate lesson pacing and clear expectations

Teachers provided students with concrete time frames for the completion of tasks. Often, periods were structured in such a way that multiple activities were to take place. For instance, a 5-minute introductory activity might be followed by 20 minutes of structured and targeted gameplay, with the final 15 minutes of the period allotted to small group discussions around a learning activity. Teachers regularly reminded students of the time remaining to complete tasks.

Framing Technology and the Game

7. Technological platforms not a point of focus

Some teachers chose to use the electronic activity guide or board-based platforms for the completion of learning activities. In these cases, the activities, rather than the technology, remained the point of focus. When technology malfunctioned (e.g. students had difficulty logging into the board site or material was not uploading to the activity guide), teachers continued to place emphasis on the significance of the learning taking place, asking students to share resources to complete the activity, or directing them to alternative materials such as pen, pencil, and paper. In so doing, teachers maintained the pace of the lesson.

8. Game positioned as a text to be read

Teachers framed the game as a text that students could reference in support of their learning in the classroom, extending DGBL beyond learning during play. To do so, they facilitated connections between the game and other material such as videos viewed in class, textbooks, class discussions, and so on. For example, to respond to an activity question, one teacher guided students in using material learned from both the game and the textbook to support their answers.

9. Connections to prior learning and to the world beyond the game environment

Teachers connected gameplay to prior learning and to material examined outside of the game context. They reminded students of learning during previous play sessions and of subject-specific and cross-curricular learning during class discussions. For example, one teacher connected a level of the game that explored waste management to a recently completed assignment examining the Great Pacific Garbage Patch. Another teacher connected learning around a water-locked region in the game to a history lesson about expeditions. Teachers also drew parallels between game locations and the local community. For instance, a teacher engaged the class in a heated debate, comparing garbage collection and pollution in certain areas of the game to garbage collection in the school and pollution in the local city and surrounding area.

Possible Impact of Professional Development on Teachers’ Digital Game-Based Pedagogies

What we offer in the previous section are descriptions of exemplary pedagogical practices that support DGBL. Not all of the teachers who participated in the study engaged with these practices in the way described above. In fact, 26% of the teachers were what we would label as highly successful in engaging in the digital game-based pedagogies outlined in this article. Another 29% were somewhat successful, at times adopting some of these pedagogical strategies and not others, creating meaningful learning activities and highly structured lessons that were adequately paced, for example, but then failing to connect the game to prior learning and the world beyond the game environment, not requiring students to collaborate with one another, and not positioning the game as a text to be read. And the final 45% were labeled as unsuccessful, adopting DGBL in the classroom in a more haphazard manner, offering pedagogical practices that did not reflect the digital game-based pedagogies detailed in this text. (See Figure 1)

Figure 1: Pie-chart showing teacher alignment with the digital game-based pedagogies. The graph indicates that 45% of teachers were unsuccessful, 29% were somewhat successful, and 26% were highly successful.
Figure 1. Teacher alignment with digital game-based pedagogies.

Given the significance of the digital game-based pedagogies and the extent to which these practices were common across the classrooms we visited, we wanted to determine if there was any connection between these practices and whether or not teachers had received PD.

In the category of strong alignment (corresponding to the highly successful group above), 29% of teachers who received PD employed pedagogical strategies that met that criterion, compared to 17% of the teachers who did not receive PD. For moderate alignment, 32% of the teachers who received PD were grouped in this category, compared to 17% of the teachers who did not receive PD. And finally, 39% of the teachers who received PD were weakly aligned with these practices, compared to 66% of the teachers who did not receive PD. (See Figures 2 and 3)

 

Figure 2: Pie-chart showing alignment for teachers who did receive professional development. The graph indicates that 39% of teachers had weak alignment, 32% had moderate alignment, and 29% had strong alignment.
Figure 2. Teachers who received PD.

 

Figure 3: Pie-chart showing alignment for teachers who did not receive professional development. The graph indicates that 66% of teachers had weak alignment, 17% had moderate alignment, and 17% had strong alignment.
Figure 3. Teachers who did not receive PD.

 

Conclusion

This research has two main limitations: 1) given the geographical scope of the project and the number of participants, it was not possible to spend more than three to four hours in each classroom, though the study would have benefited from observation of the entire unit as it was delivered; and 2) we did not have a large enough number of participants to generate meaningful quantitative comparative data, which would certainly be of interest in future work. There is, of course, very rich data that, due to word limits, we were not able to detail further. However, we hope to have highlighted the importance of pedagogy for creating environments conducive to DGBL, calling attention to best practices around structuring and conceptualizing gameplay, planning and delivering content, and framing both technology and the game. While these pedagogical strategies were not the focus of our professional development session, it is clear from our observations of participants’ teaching after the PD that the session impacted their teaching as it pertained to these best practices. More broadly, and considering gaps in the practices of teachers involved in our study, these digital game-based pedagogies provide a framework for better understanding not only what good teaching with games looks like but also areas where teachers require additional support.

We have argued that very little research on digital game-based learning examines teacher pedagogies and that even fewer studies of professional development for teachers on DGBL either focus on pedagogy or study the impact of professional development on teacher practice. By observing thirty-four teachers in their classrooms after providing a professional development session, we identified a common set of digital game-based pedagogies that supported digital game-based learning. While our professional development session did not explicitly address these pedagogical strategies, either through discussion or modeling, we did recognize areas in which the content of our professional development session overlapped with some of the strategies teachers employed. This research can inform future PD efforts that attempt to better understand the impact of modeling and discussing these digital game-based pedagogies within professional development sessions. This work, importantly, offers a potential framework for providing teachers with the practical skills required to support students in digital game-based learning in classroom spaces. It also signals a need for future studies that focus specifically on pedagogies that best support DGBL.

Bibliography

Alaswad, Zina, and Larysa Nadolny. 2015. “Designing for Game-Based Learning: The Effective Integration of Technology to Support Learning.” Journal of Educational Technology Systems 43 (4): 389–402. doi:10.1177/0047239515588164.

Allsop, Yasemin, and John Jessel. 2015. “Teachers’ Experience and Reflections on Game-Based Learning in the Primary Classroom.” International Journal of Game-Based Learning 5 (1): 1–17. doi:10.4018/ijgbl.2015010101.

Annetta, Leonard. 2008. “Video Games in Education: Why They Should Be Used and How They Are Being Used.” Theory Into Practice 47 (3): 229–39. doi:10.1080/00405840802153940.

Arnab, Sylvester, Theodore Lim, Maira B. Carvalho, Francesco Bellotti, Sara De Freitas, Sandy Louchart, Neil Suttie, Riccardo Berta, and Alessandro De Gloria. 2015. “Mapping Learning and Game Mechanics for Serious Games Analysis.” British Journal of Educational Technology 46 (2): 391–411. doi:10.1111/bjet.12113.

Aslan, Serdar, and Osman Balci. 2015. “GAMED: Digital Educational Game Development Methodology.” Simulation 91 (4): 307–19. doi:10.1177/0037549715572673.

Barab, Sasha, Michael Thomas, Tyler Dodge, Robert Carteaux, and Hakan Tuzun. 2005. “Making Learning Fun: Quest Atlantis, a Game without Guns.” Educational Technology Research and Development 53 (1): 86–107. doi:10.1007/BF02504859.

Becker, Katrin. 2009. “Video Game Pedagogy: Good Games = Good Pedagogy.” In Games: Purpose and Potential in Education, edited by Christopher Thomas Miller, 73–125. New York: Springer. doi:10.1007/978-0-387-09775-6_5.

———. 2017. Choosing and Using Digital Games in the Classroom: A Practical Guide. Switzerland: Springer.

Boyatzis, Richard. 1998. Transforming Qualitative Information: Thematic Analysis and Code Development. Thousand Oaks, CA: Sage Publications, Inc.

Charsky, Dennis, and Michael K Barbour. 2010. “From Oregon Trail to Peacemaker: Providing a Framework for Effective Integration of Video Games into the Social Studies Classroom.” In Proceedings of Society for Information Technology & Teacher Education International Conference, edited by D. Gibson and B. Dodge, 1853–60. Chesapeake, VA: Association for the Advancement of Computing in Education.

Chee, Yam San, Swati Mehrotra, and Jing Chuan Ong. 2014. “Professional Development for Scaling Pedagogical Innovation in the Context of Game-Based Learning: Teacher Identity as Cornerstone in ‘Shifting’ Practice.” Asia Pacific Journal of Teacher Education 43 (5): 423–37.

Clark, Richard. 2007. “Learning from Serious Games? Arguments, Evidence, and Research Suggestions.” Educational Technology May-June: 56–59.

Clarke, Victoria, and Virginia Braun. 2017. “Thematic Analysis.” Journal of Positive Psychology 12 (3): 297–98. doi:10.1080/17439760.2016.1262613.

Egenfeldt-Nielsen, Simon, Jonas Heide Smith, and Susana Pajares Tosca. 2016. Understanding Video Games: The Essential Introduction. 3rd ed. New York: Routledge.

Gee, James Paul. 2008. “Learning and Games.” In The Ecology of Games: Connecting Youth, Games, and Learning, edited by Katie Salen, 21–40. Cambridge, MA: The MIT Press.

Groff, Jen, Cathrin Howells, and Sue Cranmer. 2010. “The Impact of Console Games in the Classroom: Evidence from Schools in Scotland.” Future Lab. http://www.futurelab.org.uk/resources/documents/project_reports/Console_Games_report.pdf.

Hanghoj, Thorkild. 2008. “Playful Knowledge: An Explorative Study of Educational Gaming.” PhD diss., University of Southern Denmark.

Hébert, Cristyne, Jennifer Jenson, and Katrina Fong. 2018. “Challenges with Measuring Learning through Digital Gameplay in K–12 Classrooms.” Media and Communication 6 (2): 112–25. doi:10.17645/mac.v6i2.1366.

Ketelhut, Diane Jass, and Catherine C. Schifter. 2011. “Teachers and Game-Based Learning: Improving Understanding of How to Increase Efficacy of Adoption.” Computers & Education 56: 539–46.

Lester, J. C., H. A. Spires, J. L. Nietfeld, and J. Minogue. 2014. “Designing Game-Based Learning Environments for Elementary Science Education: A Narrative-Centered Learning Perspective.” Information Sciences, 1–29. http://www.sciencedirect.com/science/article/pii/S0020025513006385.

Marklund, Björn Berg, and Anna-Sofia Alklind Taylor. 2015. European Conference on Game-Based Learning.

Mishra, Punya, and Matthew J Koehler. 2006. “Technological Pedagogical Content Knowledge: A Framework for Teacher Knowledge.” Teachers College Record 108 (6): 1017–54.

Nolan, Jason, and Melanie McBride. 2013. “Beyond Gamification: Reconceptualizing Game-Based Learning in Early Childhood Environments.” Information, Communication & Society 4462 (June 2013): 1–15. doi:10.1080/1369118X.2013.808365.

Nousiainen, Tuula, Marjaana Kangas, Jenni Rikala, and Mikko Vesisenaho. 2018. “Teacher Competencies in Game-Based Pedagogy.” Teaching and Teacher Education 74: 85–97.

Nowell, Lorelli S., Jill M. Norris, Deborah E. White, and Nancy J. Moules. 2017. “Thematic Analysis: Striving to Meet the Trustworthiness Criteria.” International Journal of Qualitative Methods 16 (1): 1–13. doi:10.1177/1609406917733847.

Oblinger, Diana. 2004. “The Next Generation of Educational Engagement.” Journal of Interactive Media in Education 8: 1–18.

Rodriguez-Hoyos, Carlos, and Maria Joao Gomes. 2012. “Beyond the Technological Dimension of Edutainment: An Evaluation Framework with a Curricular Perspective.” In Handbook of Research on Serious Games as Educational, Business and Research Tool, edited by Maria Manuela Cruz-Cunha, 818–37. Hershey, PA: Information Science Reference.

Sánchez, Jaime, Mauricio Sáenz, and Jose Miguel Garrido-Miranda. 2010. “Usability of a Multimodal Video Game to Improve Navigation Skills for Blind Children.” ACM Transactions on Computing Education 3 (2): 7:1–7:29. doi:10.1145/1857920.1857924.

Sandford, Richard, Mary Ulicsak, Keri Facer, and Tim Rudd. 2006. “Teaching with Games.” Future Lab. Vol. 112. http://www.groupe-compas.net/wp-content/uploads/2009/08/untitled1.pdf.

Shabalina, Olga, Peter Mozelius, Pavel Vorobkalov, Christos Malliarakis, and Florica Tomos. 2016. “Creativity in Digital Pedagogy and Game-Based Learning Techniques; Theoretical Aspects, Techniques and Case Studies.” In IISA 2015 – 6th International Conference on Information, Intelligence, Systems and Applications. doi:10.1109/IISA.2015.7387963.

Simpson, Elizabeth, and Susan Stansberry. 2009. “Video Games and Teacher Development: Bridging the Gap in the Classroom.” In Games: Purpose and Potential in Education, edited by Christopher Thomas Miller. Boston, MA: Springer.

Software and Information Industry Association. 2009. “Best Practices for Using Games & Simulations in the Classroom.” Guidelines for K–12 Education. Vol. 8. doi:10.1080/15332690902813786.

Spires, Hiller A. 2015. “Digital Game-Based Learning.” Journal of Adolescent & Adult Literacy 59 (2): 125–30. doi:10.1002/jaal.424.

Squire, Kurt, and Henry Jenkins. 2003. “Harnessing the Power of Games in Education.” Insight 3: 5–33.

Tsai, Fu-hsing, Kuang-chao Yu, and Hsien-sheng Hsiao. 2012. “Exploring the Factors Influencing Learning Effectiveness in Digital Game-Based Learning.” Educational Technology & Society 15 (3): 240–50.

Van Eck, Richard, and Woei Hung. 2010. “A Taxonomy and Framework for Designing Educational Games to Promote Problem Solving.” Videogame Cultures & the Future of Interactive Entertainment Annual Conference of the Inter-Disciplinary.Net. http://www.inter-disciplinary.net/wp-content/uploads/2010/06/eckpaper.pdf.

Appendix A

Interview Questions

  1. How long have you been teaching?
  2. Do you have a master’s degree?
  3. Have you done any administrative work?
  4. Have you completed any additional qualification (AQ) courses? If so, which courses?
  5. When did you complete your Bachelor of Education (BEd) degree?
  6. What subject did you major in in university? What are your teachables?
  7. Can you please give us some information about your school? What is the student population? The socioeconomic status of students?
  8. Can you please tell us a bit about your class? How many students do you have on IEPs? With behavioural issues? Do you have any support for these students in the form of EAs or pull out programs?
  9. What about the students in your class on IEPs and/or those on the autism spectrum? What are they normally doing? Do they often play games in the classroom (when other students are not)?
  10. What types of things did you have to wrangle to do this project? (e.g. booking labs or computer carts, speaking with other teachers, requesting exclusive internet use in the school)
  11. Did you use iPads? Computers? When students completed the activities, did they use iPads and computers? iPads and paper? Just iPads or computers? What was the reasoning behind this choice?
  12. Did you look at any of the teacher resources? The activity guide? What parts did you use? Was anything helpful particularly? Would you like to have seen something that wasn’t included in the guides?
  13. Walk us through your lesson sequencing – what did students do? What was the pace? What was the culminating activity? Did they complete a final project?
  14. How long did you initially plan for? How long did you end up spending? What changed (if anything)?
  15. How did you evaluate the unit?
  16. What was your best and worst day with the game? What would you change about the worst day with the game?
  17. We weren’t there every day. How many times did things not work (internet down, couldn’t access computers, lab not available, etc.)?
  18. Can you please talk about a few students who excelled with the game? A few who didn’t do well? A student who you had a set of expectations about (thought they would love or hate the game) and who acted contrary to your expectations?
  19. Did you get the sense that students were making connections between the game and the real world?
  20. What did you notice about student interactions with one another?
  21. How did you organize the room for the game play?
  22. Given everything that happened with this project, would you consider doing something like this again? Why or why not?
  23. Would you have used this game in the classroom if it were not for this workshop/project and why?
  24. Would you use Sprite’s Quest in your class in the future?
  25. Can you talk a bit about your experience with the workshop? What you liked, what you didn’t like, what you would change, what you found helpful…
  26. What kinds of supports would you need in the future to make using games in the classroom possible?
  27. What is your teaching philosophy? What is your responsibility to your students?  What is your relationship with the parents of the students in your class like?


About the Author

Cristyne Hébert is Assistant Professor, Assessment and Evaluation in the Faculty of Education, University of Regina. Her research focuses on assessment and evaluation, new media and technologies, and curriculum in teacher education and K-12 education in Canada and the United States.

Jennifer Jenson is Professor, Digital Languages, Literacies and Cultures in the Languages and Literacy Department, Faculty of Education, The University of British Columbia. She has published on digital games and learning, gender and videogames, and technology policies and practices in K-12 education. She currently is the lead researcher on a large, international research partnership grant, “Re-Figuring Innovation in Games” (www.refig.ca) that is examining inequities in digital game industries and cultures.


Creating Dynamic Undergraduate Learning Laboratories through Collaboration Between Archives, Libraries, and Digital Humanities

Abstract

In an environment of rapid change in higher education in which institutions strive to lure prospective students with unique curricula, there is a growing need to provide innovative pedagogical experiences for students through collaborations among archives, libraries, and digital humanities. Three colleagues at a small liberal arts university—a digital librarian, a historian-archivist, and a historian-digital humanist—planned an integrated set of assignments and projects in an “Introduction to Digital Humanities” course that introduced students to archival management and digitization of archival material. This article demonstrates how we developed this signature course and curriculum on a limited budget in the context of a liberal arts university, and illuminates how it capitalized on relationships forged among the archives, the library, the history department, and the digital humanities program. We first describe our collaborative workflow, and how we involve undergraduate student-workers in these efforts. Next, we provide a detailed lesson plan for an Introduction to Digital Humanities course that integrates traditional archival materials, in this case photographs and blueprints of campus structures, into a digital archive. Finally, we share how our students converted these photographs and blueprints into digital 3D models via Sketchup, a powerful architectural modeling application.

Introduction

In an environment of rapid change in higher education in which institutions strive to lure prospective students with unique curricula, there is an increasing need to provide innovative pedagogical experiences for students through collaborations among libraries, archives, and digital humanities. There is also a growing body of literature—on research support for scholarship, curriculum development, collaborative publishing, and on shared values across these organizations and disciplines—about how historians, librarians, archivists, and digital humanists can forge mutually supportive relationships (Locke 2017; Middleton and York 2014; Rutner and Schonfeld 2012; Svensson 2010, para. 39; Vandgrift and Varner 2013). Kent Gerber (Digital Library Manager), Diana Magnuson (archivist at the History Center and historian), and Charlie Goldberg (Digital Humanities coordinator and historian) are colleagues who set out to do just that at Bethel University, a small Christian liberal arts university in St. Paul, Minnesota. Applying insights from these literatures to the ever-evolving landscape of humanities teaching in higher education, the three planned an integrated set of assignments and projects that spanned a new “Introduction to Digital Humanities” course. “Introduction to Digital Humanities” was the first course in the new Digital Humanities major, and was designed to: engage and motivate students early in the curriculum with “hands-on, experiential, and project-based learning … where students think critically with digital methods” (Burdick et al. 2012, 134); “develop a broader set of skills … essential to students’ success in their future careers” (Karukstis and Elgren 2007, 3); and give students meaningful experiences and agency as a form of “professional scholarship” rather than placing them in a position of fulfilling “menial labor in a large-scale project” (Murphy and Smith 2017, para. 8). Our thinking about the design of this course was influenced by the pedagogical theory of Brett D. Hirsch, Paulo Freire, and Claire Bishop (Murphy and Smith 2017). Collaborative teaching always poses special challenges, but we anticipated that our diverse backgrounds and training would result in a rewarding and distinctive experience for our students.

This article will explain how we developed this signature course and curriculum in the context of a liberal arts university, and illuminate how it capitalized on relationships forged among the archives, the library, the history department and the digital humanities program. Built on the foundation of the material holdings of the History Center (Magnuson), the Digital Library (Gerber) was able to grow connections and extend the reach of these materials through an infrastructure of digital skills and collections. This combination provided a robust environment for the campus community to seek and eventually establish a Digital Humanities program, including a new major and a new faculty member (Goldberg) to develop and coordinate the program. The curriculum developed along the lines of Cordell’s four principles of how to incorporate digital humanities into the classroom, including starting small, integrating when possible, scaffolding everything, and thinking locally (2016). These relationships and principles enabled the development of a course, “Introduction to Digital Humanities,” which engages the archives, the digital library, and digital humanities domains in a mutually supportive and emergent cycle of learning and research.

Opportunities for undergraduate digital humanities scholarship and pedagogy are burgeoning, particularly at more prestigious liberal arts institutions. Occidental College in California, Dickinson College in Pennsylvania, and Hamilton College in New York all maintain well-funded centers for either Digital Liberal Arts or Digital Humanities focused on undergraduate students. Several noteworthy inter-institutional collaborations among liberal arts schools have also received collaborative grants from the Andrew W. Mellon Foundation: COPLACDigital (comprising more than twelve schools), the Five Colleges of Ohio (Oberlin College, Denison University, Kenyon College, Ohio Wesleyan University, and the College of Wooster), the Five Colleges Consortium in New England (Amherst, Hampshire, Mount Holyoke, and Smith Colleges, and the University of Massachusetts Amherst), and St. Olaf, Macalester, and Carleton Colleges in Minnesota.

At schools (like our own) that lack these resources, it can sometimes feel like the unique pedagogical opportunities afforded by this support are beyond the reach of faculty and staff. Our aim here is to describe our collaborative experiences and provide other scholars with a model of how the archive can intersect with digitization efforts and undergraduate pedagogy at smaller institutions of higher education. Together, these assignments and projects produced learning outcomes related to concepts in the humanities, archival research methods, digital competencies, information literacy, and digital humanities tools and software (Association of College and Research Libraries 2016; Bryn Mawr College n.d.).

History Center

The History Center, the archives at Bethel University, contains the institutional records of the university and its founding church denomination, known as Converge (formerly the Baptist General Conference). The History Center provides stewardship of manuscript and digital materials, collects historically relevant materials, curates three-dimensional objects, offers access to special collections, assists researchers, documents the story of its institutions and supports the mission of Bethel University and Converge. The types of collections housed at the History Center include but are not limited to: institutional records of Bethel University (college and seminary); Baptist General Conference (and all its iterations) minutes and annual reports; conference and university publications; church and district records (from both active and closed churches); home and foreign mission records; Swedish Bibles and hymnals; bibliographic records on conference pastors and lay persons; photographs and other media.

The director of archives position at Bethel University (held by Magnuson) is a part-time appointment created in 1998 and filled by a full-time faculty member in the history department. This dual appointment uniquely positions the faculty member to provide a bridge for students between academic and public history. Students in her classes work with a variety of primary source materials, regardless of the level of the history course. Through the faculty member’s engagement with students in the classroom, Magnuson identifies students with a proclivity for detail, curiosity about archival work, and willingness to explore a variety of primary source material. Sometimes, just by working with primary sources, or hearing a description of archival work or records management, a student reacts enthusiastically to the physical and intellectual encounter: “This is so cool, where can I have more of this kind of experience?” Over and over again, the experience of encountering a primary source in its original form is at once awe-inspiring and profoundly transformative for the student. It is one thing to read about one of the first professing Baptist believers in nineteenth-century Sweden and the impact this life had on Swedish Baptists in America, but it is quite another kind of experience to encounter in person the artifact of his diary (Olson 1952).

Two or three students each year are invited to work with the director of archives as student archive assistants. Students with a major or minor in history are given preference in the application process. Once hired, over the course of an academic year, students are exposed to and trained in: initial stages of archival control; digital inventory projects; arrangement and description; digital metadata entry; and patron assistance. For example, our students have contributed to developing collections focusing on photographs, film, artifacts, and institutional records such as catalogs, yearbooks, and student publications.

In 2009 Bethel University hired someone for the newly created position of Digital Library Manager (Gerber). Since then, both the History Center and the Digital Library have transformed into dynamic learning laboratories for our undergraduate students to experience first-hand the tools of the professions of history and digital librarianship. The now nearly decade-long partnership between the History Center and the Digital Library is characterized by lively and productive collaboration on a number of fronts. For example, students hired by the history department are trained and work with both the Director of Archives and the Digital Library Manager. Major equipment that benefits both the History Center and the Digital Library has been purchased through mutual consultation and contribution of funding, such as an overhead book scanner and 3D scanner. Monthly meetings identify and advance projects, workflows, grant applications, institutional initiatives, and web presence, and address troubleshooting as the need arises. At the behest of the Director of Archives and the Digital Library Manager, two foundational committees were formed to anchor our institutional conversations about our cultural heritage: the Cultural Heritage Committee and the Digital Library Advisory Committee, respectively. These committees support the History Center and the Digital Library through institution-wide input, drawing committee members from faculty, staff, and administration. In tandem, we are significantly growing the breadth, depth, and reach of our collections, not only to our Bethel community, but to the world.

The Digital Library as Infrastructure and Bridge between the Archive and the Classroom

Of the Bethel Digital Library’s twenty-six collections spanning five major themes—Bethel History, Art and Creative Works, Faculty and Student Scholarship, Natural History, and the Student Experience—the majority of the content comes from the cultural heritage materials held in the History Center. Digitization of these unique materials broadens their availability to the community for teaching and research while simultaneously preserving the originals from wear because they do not need to be handled as frequently. Regular conversation between the Digital Library Manager and the Director of the Archives developed the library and archive as an infrastructure of values, practices, and workflows enabling a deeper understanding of Bethel’s cultural holdings and a broader reach of those materials to the Bethel community and beyond (Gerber 2017; Mattern 2014). In one of his series of four seminal articles on digital humanities, Director of the HUMLab in Umea University, Patrick Svensson discusses how the research-oriented infrastructure of technology, relationships, and practices, called “cyberinfrastructure” can be built specifically for humanities teaching and research (2011). Magnuson and Gerber’s collaboration developed a cyberinfrastructure at Bethel with the scanners, software, networked computing, meetings, digital collections, and committees mentioned above. The shape and scale of these resources influences a broad range of digital humanities literacies and competencies, as Murphy and Smith point out in their introduction to the special issue of Digital Humanities Quarterly focused on undergraduate education (2017, para. 7).

Bringing student workers into this cyberinfrastructure of the Digital Library and History Center also continued this cooperation and cross-pollination of knowledge and skills, and introduced them to information literacy skills and digital competencies. The first set of concepts these students learn comes from the Association of College and Research Libraries Framework for Information Literacy for Higher Education. Information literacy is "the set of integrated abilities encompassing the reflective discovery of information, the understanding of how information is produced and valued, and the use of information in creating new knowledge and participating ethically in communities of learning." Knowledge of these skills has particular potency due to the influence of Mackey and Jacobson's (2014) concept of "metaliteracy," which expanded information-literacy abilities with respect to the networked digital environment, rapidly changing media, increased consumption and production of media, and critical reflection upon one's self and the information environment (ACRL 2015, para. 5). The Framework consists of six frames, or core concepts, of information literacy, each marked by certain knowledge practices and dispositions as a learner moves through a threshold of awareness from novice to expert. The six frames are, in alphabetical order: 1) Authority is Constructed and Contextual; 2) Information Creation as a Process; 3) Information Has Value; 4) Research as Inquiry; 5) Scholarship as Conversation; 6) Searching as Strategic Exploration.

Digital competencies, as developed at Bryn Mawr College, are a useful complement to information literacy, spanning media and disciplines, specifically focused on the digital environment, and developed within the context of a small, liberal arts college. This model of skills is organized into five focus areas and can be used as learning objectives or as descriptions of skills one already has. The five focus areas are: 1) Digital Survival Skills; 2) Digital Communication; 3) Data Management and Preservation; 4) Data Analysis and Presentation; and 5) Critical Making, Design, and Development (Bryn Mawr, n.d.).

Informed by their work in the archives with historical materials, student workers in the archive and the Digital Library are exposed to and develop skills and competencies related to the ACRL information literacy frames and the Bryn Mawr digital competencies described above. They accomplish this by learning the processes of digitization, learning how to use scanning equipment and image manipulation software, writing descriptive metadata, and encoding finding aids for public display in a version of XML called Encoded Archival Description (EAD). Going through these processes introduces students to the information literacy frames of "Information Creation as a Process" and "Information Has Value" as well as digital competencies like "Digital Survival Skills," "Data Management and Preservation," and "Data Analysis and Presentation" (Association of College and Research Libraries 2015; Bryn Mawr College, n.d.). As students begin work with the Digital Library, they start to realize the limits of their own skills and abilities with technology and recognize how they can grow their awareness and competencies in "Digital Survival Skills," particularly in the subcategory of "metacognition and lifelong learning." The competency of "Data Management and Preservation" includes learning to use more sophisticated hardware, like flatbed scanners, and software environments such as spreadsheets. The flatbed scanner process involves scanning at a resolution high enough for the resulting image to represent the original in print or digital formats and to allow the very close examination that zooming in a digital format affords. Some students had not used spreadsheets before and learned how to navigate one, organize different categories of data, and store multiple records of items. Students are introduced to the domain of "Data Analysis and Presentation" through classification methodologies and learn how to navigate a digital archive to research a topic of interest. The skills students learned from these experiences motivated them to learn more and prepared them for further study in graduate school or employment in the cultural heritage sector. This work built a culture of trust, common understanding, and shared competencies between both units and set a foundation for further integration of Bethel's cultural heritage in the classroom and the establishment of the Digital Humanities major (Bryn Mawr College, n.d.).
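For readers unfamiliar with EAD, the following is a minimal sketch of the kind of finding-aid structure student workers encode, expressed in Python so the elements can be generated and inspected programmatically. The element names follow the EAD 2002 schema, but the identifier, titles, dates, and extent are invented for illustration and do not describe an actual Bethel collection.

    # A minimal, illustrative EAD 2002 finding-aid skeleton built with Python's
    # standard library. Element names follow the EAD 2002 schema; all descriptive
    # values below are invented examples, not records of an actual collection.
    import xml.etree.ElementTree as ET
    from xml.dom import minidom

    ead = ET.Element("ead")

    header = ET.SubElement(ead, "eadheader")
    ET.SubElement(header, "eadid").text = "example-collection-001"  # hypothetical identifier
    filedesc = ET.SubElement(header, "filedesc")
    titlestmt = ET.SubElement(filedesc, "titlestmt")
    ET.SubElement(titlestmt, "titleproper").text = "Guide to an Example Photograph Collection"

    archdesc = ET.SubElement(ead, "archdesc", level="collection")
    did = ET.SubElement(archdesc, "did")
    ET.SubElement(did, "unittitle").text = "Example Photograph Collection"   # hypothetical
    ET.SubElement(did, "unitdate").text = "1905-1965"                        # hypothetical
    ET.SubElement(did, "physdesc").text = "2 boxes of photographic prints"   # hypothetical

    # Pretty-print the XML so the skeleton can be inspected or pasted into an EAD editor.
    print(minidom.parseString(ET.tostring(ead)).toprettyxml(indent="  "))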

While these competencies and literacies were building in student workers, it was necessary to integrate the learning of these concepts more broadly into the general curriculum so that more students could benefit. Some classroom opportunities emerged as a result of the digitization activities in various classes and disciplines. For their journalism projects, students researched historical events and trends through increased access to documents in collections like the historical student newspaper, engaging the frames of "Research as Inquiry" and "Searching as Strategic Exploration." Students in a computer science course on data mining were also able to use the corpus of metadata from collections like the student newspaper, college catalogs, and faculty research as an object of study in their projects to identify trends in course offerings, changes in campus space, changes in school mascots through the years, and profiles of particular individuals in Bethel's history. These assignments and experiences also built some familiarity with ways to engage archival material in classes other than a history class. Some of these students were excited to make these discoveries and developed a heightened interest in the history of the institution, but their ability to pursue it in any depth was limited by the length of a single assignment.

In 2017, two developments in Bethel’s cyberinfrastructure improved the scope and scale for student learning anchored in these concepts: the launch of Bethel’s Makerspace in the Library and the creation of a Digital Humanities major. Bethel’s Makerspace is the result of a purposeful design discussion consisting of a cross-disciplinary group of faculty, staff, and administrators. This discussion resulted in a technology-infused space in the Library to explore innovative, creative technologies and encourage collaboration and experiential classroom experiences through the use of 3D scanners, 3D modeling and media production software, photo studio equipment, movable furniture, 3D printers, and meeting space for groups and classes. With the Digital Humanities program in place beginning in 2017, a new opportunity emerged for students to use the Makerspace as a lab to learn information-literacy concepts and digital competencies as demonstrated by other programs (Locke 2017, para. 8–49; White 2017, 399–402), and to engage more fully in the physical archive and the digital collections.

The Archive in Digital Humanities Pedagogy

Powerful technology has never been more accessible to educators, even, as we describe above, to educators at smaller schools like our own. Yet there remains the assumption that the digital humanities are best left to R1 institutions with deep pockets and deep rosters of instructors and support staff (Alexander and Frost Davis 2012; Battershill and Ross 2017, 13–24). However, there is a growing conversation and community of practice around undergraduate and liberal-arts–oriented digital humanities education, like the Liberal Arts Colleges section of the Digital Library Federation, that seeks ways for smaller institutions to thrive (Buurma and Levine 2016; Christian-Lamb and Shrout 2017; Locke 2017, para. 7). Bethel has been able to do this through incremental financial investments in technology and intentional partnerships like the efforts of the History Center and Digital Library described above. In 2016–2017, Bethel designed and launched a new undergraduate Digital Humanities major, informed by concepts from digital humanities pedagogy, that capitalized on existing technical and relational investments of the kind available to faculty even at institutions with limited means (Brier 2012; Cordell 2016; Wosh, Hajo, and Katz 2012).

We have benefited greatly from our archival holdings in the History Center. A particular challenge to incorporating digital humanities in the classroom is avoiding the technological black hole, whereby the technology used to make something becomes the focus of the thing itself, demanding the attention of both instructor and student at the expense of the humanistic subject. The archive, as an essential repository of humanistic data, can help anchor the traditional humanities at the center of digital humanities pedagogy. Here, we share an example of a lesson plan that aims to do just this—to craft an undergraduate archival project that is at once technologically sophisticated yet true to traditional humanistic values—all without the use of expensive equipment.

This project was inspired by a research trip Goldberg made to Rome as a graduate student at Syracuse University. On a day off from research, he visited Cinecittà, a large film studio just outside the city that housed the set for the 2005 HBO series Rome. The studio still maintains the set, featuring a scale replica of the ancient Roman forum, and allows visitors to traipse the grounds as part of a tour. As a Roman historian, standing in a replica of the forum was a powerful experience for Goldberg, and delivered a new sense of historical place and space that examining traditional scholarly materials—maps, plans, and written descriptions—couldn’t match.

When Goldberg arrived at Bethel in the Fall of 2016 and began designing the Digital Humanities curriculum, he looked for ways to emulate his experience abroad. Digital 3D modeling, including virtual reality applications, can provide such an immersive experience for the viewer, and holds a special value for bringing archival materials to life (Goode 2017). Working in tandem with Magnuson and Gerber, Goldberg found that the History Center archive contained a treasure trove of materials pertaining to the university's spatial past: photographs of historical groundbreaking ceremonies, architectural blueprints, and design sketches. Particularly alluring were plans and renderings for campus expansions that never panned out; such materials suggested alternative campus realities that would have fundamentally altered the contexts in which students, faculty, and staff interact with one another on a daily basis.

During a summer meeting in the History Center, we began to design a six-week lesson plan for Goldberg's semester-long Introduction to Digital Humanities course. Our primary pedagogical goals were twofold: 1) to introduce students to archival digitization practices, culminating in the creation of digital records for traditional archival materials; and 2) to create immersive, experiential worlds based on the History Center's architectural records. We determined that Trimble's Sketchup, a 3D modeling program used by architects, interior designers, and engineers, was the best software tool for goal #2. Better still, Trimble provides 30-day trial versions of its Pro software for educators and students, long enough to cover the three weeks dedicated to 3D modeling in this assignment. Trimble also now offers Sketchup for Web, an entirely online, cloud-based version of the software, which eliminates the need to install the software on campus or student computers, though it does lack certain key features of the Pro version.

For this assignment, students chose a building, actually built or only existing in design plans, from the campus’s present or past. They chose two photographs or other visual records of it from the History Center (such as blueprints or design illustrations), and were tasked with incorporating these as entries into the Digital Library. This aspect of the assignment was structured over three weeks, and gave students an introduction to many of the professional archival practices and digitization fundamentals described above, providing a hands-on “experiential” learning opportunity that immersed them in the fabric of our institutional history. Finally, students were to create 3D digital models of their structure using Sketchup. This final step also took three weeks.

As Digital Library manager, Gerber took the lead in the first half of the assignment. Because most students were freshmen, we assumed no previous exposure to the archival setting. We therefore took a field trip to the History Center, where Magnuson gave an overview of her work there and introduced students to basic archival practices. We then reassembled as a class to learn some basic digital competencies, such as how medium shapes the experience and meaning of an archival item, by comparing and contrasting physical and digital versions of the same object. Once introduced to this framework, the class focused on how any object, be it a photograph, document, or physical artifact, possesses a range of features unique to it, and on how to attach a description to a digital file so that the object is intelligible and findable by both humans and computers. To use a nonarchival example, an action figure is made of a certain material (e.g., "plastic"), is a certain size (e.g., "8 inches tall"), and was made by a certain company (e.g., "Mattel") in a certain year. This basic principle is a crucial aspect of proper digital asset management, and it allowed us to introduce the concept of "metadata," or information about an object that describes its characteristics. We stressed the importance of metadata in the archival setting and introduced our students to the Dublin Core Metadata Initiative, an international organization dedicated to maintaining a standard and best practices for describing and managing any kind of information artifact, including archival material. At its heart, Dublin Core consists of fifteen common elements necessary to describe the metadata of any archival object (e.g., "Title," "Creator," "Subject," "Description"). We then looked at how items catalogued in the Digital Library store this metadata and apply local standards, like the Bethel Digital Library Metadata Entry Guidelines, adapted from the Minnesota Digital Library Metadata Entry Guidelines, to determine what kind of information is needed in each element. We focused particularly on the purpose of the Title and Description elements in the Historical Photographs Collection and analyzed the quality of entries based on how well they provided context and facilitated discovery of an item by a potential researcher. For example, the Title element for the image of Esther Sabel, a prominent woman in Bethel's history, was used to demonstrate the levels of quality seen in Table 1: Poor – "Woman"; Good – "Portrait of Esther Sabel"; Better – "Portrait of Esther Sabel, Head of Bible and Missionary Training School." Finally, building on this scaffolding, students were given the assignment of analyzing two images in the Historical Photographs Collection with insufficient or erroneous metadata and improving the records in a Metadata Improvement Worksheet (a shared Google Spreadsheet).

Poor Descriptive Titles (these titles lack specificity and do NOT assist users in finding materials): "Woman"; "Crowd of People"

Good Descriptive Titles (examples of basic descriptive titles): "Portrait of Esther Sabel"; "Group of students sitting on grass"

Better Descriptive Titles (these titles provide users with more specific information and relay exactly what is in the image): "Portrait of Esther Sabel, Head of Bible and Missionary Training School"; "Group of seven students outside signing yearbooks"
Table 1. Excerpt from Bethel Digital Library Metadata Entry Guidelines.
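To make the shape of such a Digital Library record concrete, here is a minimal sketch in Python of a Dublin Core–style entry for the Esther Sabel photograph discussed above. Only a handful of the fifteen elements are shown; the Title comes from Table 1, while the remaining field values are hypothetical placeholders rather than the actual catalogue record.

    # A minimal, illustrative Dublin Core-style record for a historical photograph.
    # Only a subset of the fifteen elements is shown; apart from the Title tiers in
    # Table 1, the values are hypothetical placeholders, not the actual record.
    record = {
        "Title": "Portrait of Esther Sabel, Head of Bible and Missionary Training School",
        "Description": "Studio portrait of Esther Sabel. Part of the "
                       "Historical Photographs Collection.",  # hypothetical wording
        "Creator": "Unknown photographer",                    # hypothetical
        "Date": "circa 1920",                                 # hypothetical
        "Type": "Image",
        "Format": "Photographic print",                       # hypothetical
        "Rights": "Copyright status undetermined",            # hypothetical
    }

    # The kind of check students perform on the Metadata Improvement Worksheet:
    # does the Title go beyond a generic label like "Woman" or "Crowd of People"?
    weak_titles = {"Woman", "Crowd of People"}
    print("Needs improvement" if record["Title"] in weak_titles else "Descriptive title")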

In the second week, we gathered several folders of photographs, blueprints, and architectural renderings awaiting catalogue entry in the History Center, and had our students spend some time perusing their contents. Since the assignment was quite long at six weeks and culminated in a large finished digital work that some students found intimidating, students greatly appreciated this unstructured exploration, or "tinkering" (Sayers 2011), time. These photographs provided intimate glimpses into the university's past and unrealized future(s), and motivated our students to find out more about the students who came before them. Students then selected two images for entry into the Digital Library, and received their second assignment: tracking down the necessary metadata. Their submissions would become a publicly available part of the Historical Photograph Collection, adding a "real-world" incentive to the assignment. Some images were easier to provide metadata for than others; some arrived with dates or a list of subjects written on the back, and photographs of ground-breaking ceremonies could be dated by looking up construction dates for buildings on campus. Others required reasoned speculation: dates for difficult photographs could be estimated from the style of clothing of the people photographed, for example.

In the third week, students wrote a two- to three-page blog post synthesizing Digital Library records into a narrative of a past campus event. Some students chose to write on their dorms or the campus building they had previously studied, while others wrote on a key historical event, such as Martin Luther King Jr.’s scheduled visit to campus in the 1960s. This aspect, since it required close reading of a text or texts, was the component of the assignment most aligned with the traditional humanities, and helped alleviate some anxiety in the instructors that this digital-centered project might stray from core humanities values.

In the second half of the assignment, students created three-dimensional digital models of their campus building using Trimble's Sketchup. Sketchup is a popular software tool with an active and enthusiastic online support community. Having access to a wide range of tutorial walkthroughs and videos greatly reduced the learning curve for acclimating both instructor and student to the software. There are also several guides and tutorials written specifically for the digital humanities community, which provide helpful tips for applying Sketchup in the humanities classroom. In particular, Goldberg benefited from the step-by-step guide for creating 3D models from historical photographs written by Hannah Jacobs at Duke University's Wired! Lab, as well as Kaelin Jewell's use of Sketchup to bring medieval building plans to life (Jewell 2017). We have made Goldberg's intro and advanced tutorials available online. The fourth week was devoted to installing Sketchup on student computers and learning the basics. Students with experience playing video games tended to get up to speed faster than others, as the software's simulated three-dimensional environment can be disorienting at first. Students (and instructors) should be encouraged simply to search Google if there is a particular process they are struggling with, since there are many helpful tutorials on YouTube.

After we learned the basic functionality, we began to translate our photographs into architectural models in Sketchup. The program allows the user to upload an image and transform the two dimensions represented within it into the three dimensions of digital space. It does this by overlaying axis lines on the image and "pushing" the façade of the building back into a third dimension, as demonstrated in the two screenshots below:

Figure 1. Screenshot of a building in Sketchup showing how 2D images are projected into 3D space.
 
Figure 2. Screenshot of a building in Sketchup showing early stages of 3D modeling from a 2D photo of a building.
 

Because this process involves transforming a two-dimensional image into three-dimensional space, it is imperative to start with the right kind of image. The one used above demonstrates the proper perspective; essentially, the image must contain a vanishing point. Head-on images do not allow the user to determine how deep the actual physical building is and are therefore not usable in this process.

Next, the user can begin to add features to their model, referring back to the two-dimensional image as necessary. In our class, we allowed students two additional weeks to complete this process. We found this to be necessary since none of our students were previously familiar with Sketchup. This time therefore allowed them to troubleshoot errors as they came up. Class time was dedicated to working on our models together. Students and instructors collaborated with one another and shared strategies and tips. Finally, the completed models were rendered with V-Ray, a plugin for Sketchup which places the models in simulated environments, adding convincing lighting and other scenery effects.

Many projects succeeded. Graham McGrew, for example, started with an unbuilt plan from the 1960s for an A-frame building to house the university’s seminary chapel, as shown in this final rendered image:

Figure 3. Final rendering of un-built “A”-frame chapel made by Graham McGrew.
 

Another student, Bobbie Jo Chapkin, chose to model the existing Seminary building, as shown in this final rendered image:

Figure 4. Final rendering of 3D model of Bethel Seminary.
 

There are clear challenges to incorporating archival practices into digital humanities pedagogy. In our lesson, students lacking familiarity with video games or other three-dimensional computing tools may find orienting themselves to Sketchup challenging. And, as with any large project, the quality of the final products will depend entirely on the effort and energy students put in. Still, this project successfully combined a focus on the humanistic value of the archive with a modern software application, creating a sophisticated experience that recreated episodes from our campus's past.

Expanding from this specific project to consider the collaborative efforts described here generally, the intersection between three diverse academic disciplines might be thought to be a difficult place for three busy researchers and teachers to land upon. However, we feel that the best strategy for effecting meaningful interdisciplinary pedagogy in the archive and the humanities is to encourage organic opportunities to develop at their own pace, and to scaffold larger projects such as this one upon the foundations already laid. Our efforts were long in the making—Bethel’s archivist position was created in 1998, its digital librarian position in 2009, and its Digital Humanities position in 2016. Rome, even as a modern HBO set, wasn’t built in a day. Though incorporating the traditional archive into digital undergraduate pedagogy is a relatively recent effort, it still rests primarily on tried-and-true humanities principles like thoughtful reading, analysis, and attention to detail.

Bibliography

Alexander, Bryan and Rebecca Frost Davis. 2012. “Should Liberal Arts Campuses do Digital Humanities? Process and Products in the Small College World.” In Debates in the Digital Humanities, edited by Matthew K. Gold. Minneapolis: University of Minnesota Press.

Association of College and Research Libraries. 2015. “Framework for Information Literacy for Higher Education.” American Library Association, Association of College and Research Libraries Division. February 2, 2015. Accessed October 29, 2018. http://www.ala.org/acrl/standards/ilframework.

Battershill, Claire and Shawna Ross. 2017. Using Digital Humanities in the Classroom: A Practical Introduction for Teachers, Lecturers, and Students. London; New York: Bloomsbury Academic.

Brier, Stephen. 2012. “Where’s the Pedagogy? the Role of Teaching and Learning in the Digital Humanities.” In Debates in the Digital Humanities, edited by Matthew K. Gold. Minneapolis: University of Minnesota Press.

Bryn Mawr College. n.d. “What Are Digital Competencies?” Accessed October 29, 2018. https://www.brynmawr.edu/digitalcompetencies/what-are-digital-competencies.

Burdick, Anne, Johanna Drucker, Peter Lunenfeld, Todd Presner, and Jeffrey Schnapp. 2012. Digital_Humanities. Cambridge, MA: MIT.

Buurma, Rachel Sagner and Anna Tione Levine. 2016. “The Sympathetic Research Imagination: Digital Humanities and the Liberal Arts.” In Debates in the Digital Humanities, edited by Matthew K. Gold. Minneapolis: University of Minnesota Press.

Christian-Lamb, Caitlin and Anelise Hanson Shrout. 2017. “‘Starting from Scratch’? Workshopping New Directions in Undergraduate Digital Humanities.” Digital Humanities Quarterly 11 (3). http://www.digitalhumanities.org/dhq/vol/11/3/000311/000311.html.

Cordell, Ryan. 2016. “How Not to Teach Digital Humanities.” In Debates in the Digital Humanities, edited by Matthew K. Gold. Minneapolis: University of Minnesota Press.

Gerber, Kent. 2017. “Conversation as a Model to Build the Relationship among Libraries, Digital Humanities, and Campus Leadership.” College & Undergraduate Libraries 24 (2–4): 418–433. https://doi.org/10.1080/10691316.2017.1328296.

Goode, Kimberly. 2017. “Blending Photography and the Diorama: Virtual Reality as the Future of the Archive & the Role of the Digital Humanities.” MediaCommons Field Guide: “What is the Role of the Digital Humanities in the Future of the Archive.” May 4, 2017. Accessed October 25, 2018. http://mediacommons.org/fieldguide/question/what-role-digital-humanities-future-archive/response/blending-photography-and-diorama-virtu.

Jewell, Kaelin. 2017. “Digital Tools and the Pedagogy of Early Medieval Visual Culture.” Peregrinations: Journal of Medieval Art and Architecture 6 (2): 30–39.

Karukstis, Kerry K. and Timothy E. Elgren, eds. 2007. Developing and Sustaining a Research-Supportive Curriculum: A Compendium of Successful Practices. Washington, DC: Council on Undergraduate Research.

Locke, Brandon T. 2017. “Digital Humanities Pedagogy as Essential Liberal Education: A Framework for Curriculum Development.” Digital Humanities Quarterly 11 (3). http://www.digitalhumanities.org/dhq/vol/11/3/000303/000303.html.

Mackey, Thomas P. and Trudi E. Jacobson. 2014. Metaliteracy: Reinventing Information Literacy to Empower Learners. Chicago: ALA Neal-Schuman.

Mattern, Shannon. 2014. "Library as Infrastructure." Places Journal, June 2014. https://doi.org/10.22269/140609.

Middleton, Ken and Amy York. 2014. “Collaborative Publishing in Digital History.” OCLC Systems & Services: International Digital Library Perspectives 30 (3): 192-202. https://doi.org/10.1108/OCLC-02-2014-0010.

Murphy, Emily Christina and Shannon R. Smith. 2017. “Introduction.” Digital Humanities Quarterly 11 (3).
http://www.digitalhumanities.org/dhq/vol/11/3/000334/000334.html.

Olson, A. 1952. A Centenary History: As Related to the Baptist General Conference of America. Chicago: Baptist General Conference Press.

Rutner, Jennifer and Roger C. Schonfeld. 2012. Supporting the Changing Research Practices of Historians. Ithaka S+R Report.

Sayers, Jentery. 2011. "Tinker-Centric Pedagogy in Literature and Language Classrooms." In Collaborative Approaches to the Digital in English Studies, edited by Laura McGrath. Computers and Composition Digital Press. https://ccdigitalpress.org/book/cad/Ch10_Sayers.pdf.

Svensson, Patrik. 2010. "The Landscape of Digital Humanities." Digital Humanities Quarterly 4 (1). http://digitalhumanities.org/dhq/vol/4/1/000080/000080.html.

Svensson, Patrik. 2011. “From Optical Fiber to Conceptual Cyberinfrastructure.” Digital Humanities Quarterly 5 (1). http://digitalhumanities.org/dhq/vol/5/1/000090/000090.html.

Vandegrift, Micah and Stewart Varner. 2013. “Evolving in Common: Creating Mutually Supportive Relationships between Libraries and the Digital Humanities.” Journal of Library Administration 53 (1): 67–78.

White, Krista. 2017. “Visualizing Oral Histories: A Lab Model using Multimedia DH to Incorporate ACRL Framework Standards into Liberal Arts Education.” College & Undergraduate Libraries 24 (2-4): 393–417.
https://doi.org/10.1080/10691316.2017.1325722.

Wosh, Peter J., Cathy Moran Hajo, and Esther Katz. 2012. "Teaching Digital Skills in an Archives and Public History Curriculum." In Digital Humanities Pedagogy: Practices, Principles and Politics, edited by Brett D. Hirsch, 79–96. Cambridge: Open Book Publishers. http://books.openedition.org/obp/1620.

Appendix – Technology Tools

Spreadsheets (Google Sheets, Excel)
CONTENTdm
Epson Expression XL 10000 Flatbed Scanner
Sketchup
V-Ray plugin for Sketchup

About the Authors

Kent Gerber, the Digital Library Manager at Bethel University, is responsible for the library’s digital collections, the Makerspace, and collaborative digital scholarship projects. He holds an MLIS and Certificate of Advanced Studies in Digital Libraries from Syracuse University and focuses on how libraries engage with technology, teaching, research, cultural heritage, and digital humanities through facilitating conversation. He serves on the Operations Committee for the Minnesota Digital Library and co-designed new Bethel programs including the Digital Humanities major and the Makerspace.

Charlie Goldberg is Assistant Professor of History and Digital Humanities Coordinator at Bethel University. He helped design and currently oversees Bethel’s undergraduate Digital Humanities major. He has a Ph.D. from Syracuse University, and his primary research pertains to gender and politics in ancient Greece and Rome.

Diana L. Magnuson is Professor of History at Bethel University and Director of Archives, History Center of Bethel University and Converge. She holds a Ph.D. from the University of Minnesota and teaches courses on American history, introduction to history, and geography. As Director of Archives, Magnuson stewards and provides access to manuscripts, media, three-dimensional objects, and digital materials that document the institutional history of Bethel University and Converge. Magnuson also curates the institutional history of the Minnesota Population Center at the University of Minnesota, Twin Cities Campus.

A closeup of a circuit board with several visible chips.
2

Teaching with Objects: Individuating Media Archaeology in Digital Studies

Abstract

Media archaeology presents a framework for understanding the foundations of digital culture in the social histories of technological media. This essay argues that a pedagogy focused on individual, physical artifacts of technological media involves students in constructing a constellation of insights around technology's mineral, global, and human history as well as its ecological future. By describing and reflecting on a series of assignments and exercises developed for my "Introduction to Digital Studies" class, I show how the intimacy of specific devices can connect to the exigencies of technological media through the lens of media archaeology. The core of this experience is a group project where students take apart an artifact like an old smartphone or game console, attempt to locate the origins of each component in that artifact, and present those origins in a map and timeline. The risks and rewards of this assignment sequence actively engage students in designing their own learning and encourage them to think critically and ethically about the media they consume, the devices that provide the foundation for that consumption, and the global economy of human labor that makes it all possible. In a step-by-step account, I consider how the practical and logistical challenges of this assignment sequence support the learning goals I identify as crucial to Digital Studies.

The study of digital culture includes many approaches to understanding the ways that technology has impacted and shaped human society. While a material turn within digital studies has—through subfields and discourses like software studies, platform studies, and media archaeology—effectively moved beyond or "behind" the screens of digital objects as the primary site for inquiry, the hidden, human costs of the labor required to produce these machines have remained relatively invisible. As Lisa Nakamura has explored in her analysis of Fairchild Semiconductor's plant in Shiprock, NM, that invisibility has intersectional origins and consequences. Nakamura writes that "looking inside digital culture means both looking back in time to the roots of the computing industry and the specific material production practices that positioned race and gender as commodities in electronics factories" (Nakamura 2014, 936).

Within the field of media archaeology, many scholars have demonstrated the value of analyzing the underlying physical artifacts of technological media as culturally situated objects of inquiry in their own right. In introducing his book What is Media Archaeology?, Jussi Parikka offers a practical explanation of the term "media archaeology" as "a way to investigate the new media cultures through insights from past new media, often with an emphasis on the quirky, the non-obvious apparatuses, practices and inventions" (Parikka 2012, 2). These insights can be seen as archaeological in Foucault's sense of "digging into the background reasons why a … media apparatus or use habit is able to be born and be picked up and sustain itself in a cultural situation" (Parikka 2012, 6), where the "background reasons" are situational and cultural, and may extend from the pressures exerted by privilege, class, and capital that organize society.

But despite its significance to these “background reasons,” the individual, personal, human labor that forms a part of that background remains particularly well hidden. In arguing for circuit-bending as a method of applied media studies, Nina Belojevic observes that the selective documentation of digital history creates wide blind spots, but that we can begin to fill those in through considered individuation of technological media. As Belojevic writes, “a close, material study with an awareness of singularity recognizes the significance of physical differences and how they affect our understanding of technologies” (2014), and understanding those differences allows the haecceity of specific objects to orient an inquiry toward those specific, hidden origins. Even when the economics of manufacturing mean that, “in the case of (video game) console production, material parts indicate that while users may be able to trace the circuit board back to the manufacturer, it becomes nearly impossible to find out where every element on a particular circuit board comes from” (Belojevic 2014), understanding and making contact with the fact that those parts came from somewhere is a meaningful reorientation of inquiry, especially for students.

In this article, I discuss how I have approached this reorientation of inquiry into digital culture through an assignment in which I direct students to investigate and, therefore, individuate specific artifacts of technological media. What follows is a step-by-step guide to this assignment and its related activities, but it is also my argument that there is pedagogical, historical, and cultural value in shifting the object of inquiry from, for example, "the Game Boy" as designed by Gunpei Yokoi and dominant in 90s youth culture to, instead, "this Game Boy" as given to me as a Christmas present in 1990. Focusing on the individuality of the way a digital object is used and received helps highlight the human dimensions and circumstances of its manufacture and assembly. In developing this assignment over multiple semesters, I continue to find that the specificity of this approach, which the hands-on methodology necessitates, helps drive home ideas about the contexts and consequences of digital materiality in a way that a series of readings never quite has.

At the University of Mary Washington, I teach a class called “DGST 101: Introduction to Digital Studies” where the goal is to engage students in using digital tools creatively, in understanding digital culture, and in employing digital methods to solve problems and answer questions. Teaching from the framework of media archaeology with an individuated exploration of technological media allows me to trace a thread through each of those three major, overarching outcomes of the class. The following sequence focuses students on a specific physical artifact of technological media as a way of helping them think about the materiality of digital technology in ways that hopefully illuminate the social challenges involved in the labor and mineral-extraction practices that make our devices possible, as well as the environmental legacy of technological obsolescence.

I call this unit of the class “Digital Archaeology,” as opposed to media archaeology, because while I do ask students to think about technology as cultural artifacts, as Parikka suggests, I also direct them to use the web to discover the origins of specific artifacts, which requires them to use investigative skills and reflect on the structure of the internet as a primary source and repository for cultural knowledge. In doing this work, especially when students reach an inevitable dead end, their investigations may encourage them to consider the web as yet another socially constructed, multi-layered repository of narratives about technological media, fraught with contingencies and gaps—as opposed to simply the supplier of answers. The failure of a Google search to yield a straightforward answer can be a powerful teachable moment.

The logistics of the assignment sequence are deceptively simple: I ask students to work in groups to 1) take apart (“sacrifice”) some technological artifact; 2) research the origins of the various components inside that artifact; and 3) use Omeka with Neatline to tell the story of their artifact via a map of its components’ origins. Despite this apparently simple outline, the nuances of this process pose several challenges to students and to me as the instructor. Mainly, since I’m asking students to investigate devices that they provide, I do not have a preconceived sense of what they should be able to learn about a particular device, which puts a significant amount of responsibility on the students in evaluating the knowledge that they are creating. This open-endedness underscores the networked nature of the digital field of inquiry I am asking them to enter into through this assignment.
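To give a sense of the data students assemble for the mapping step, here is a minimal sketch, in Python, of component records written to a CSV file of the kind that could be batch-imported into an Omeka site and then plotted with Neatline. The column names, coordinates, and component details are hypothetical, and the exact fields an import expects will depend on how a particular Omeka instance and its plugins are configured.

    # Hypothetical component records for the mapping step of the assignment.
    # Columns loosely mirror Dublin Core fields plus coordinates for map plotting;
    # the fields an actual import expects depend on the Omeka/Neatline setup in use.
    import csv

    components = [
        {
            "Title": "DRAM chip from a mid-1990s handheld console",  # hypothetical
            "Creator": "Example Semiconductor Co.",                  # hypothetical
            "Date": "1995",                                          # hypothetical
            "Coverage": "Example City, Japan",                       # hypothetical
            "Latitude": 35.0,                                        # hypothetical
            "Longitude": 135.0,                                      # hypothetical
        },
        {
            "Title": "Molded plastic case",                          # hypothetical
            "Creator": "Example Plastics Ltd.",                      # hypothetical
            "Date": "1996",                                          # hypothetical
            "Coverage": "Example City, China",                       # hypothetical
            "Latitude": 22.5,                                        # hypothetical
            "Longitude": 114.0,                                      # hypothetical
        },
    ]

    with open("components.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(components[0].keys()))
        writer.writeheader()
        writer.writerows(components)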

By offering this assignment to the readers of the Journal of Interactive Technology and Pedagogy, I don’t mean to suggest that what follows is the best or most obvious approach to implicating individual technological media in a digital studies pedagogy, nor do I want to presuppose that the solutions I present to some of the challenges I have faced in the years developing this assignment are necessarily the best solutions. Instead, I hope to offer a useful starting point for other instructors who may be interested in incorporating media-archaeological projects into their classes, and I look forward to hearing what other approaches and problems others have come across.

The narrative and sequence of steps that follows is a "best-case scenario" account of this assignment sequence, one that avoids the real-life complications of scheduling and organizing student groups. Most of the examples below are generic or hypothetical; in specific cases, I include them with the permission of the students I am citing or describing.

Learning Goals

By completing the assignments and activities in this unit, students will encounter several significant concepts and develop important, transferable skills. Morally, students may find themselves thinking about the economics of globalization and the way their position as consumers of technology is constructed through the labor, and often the exploitation, of workers in other parts of the world. Epistemologically, students will learn to think about the web beyond the first page of Google results as they find they must use critical and careful reasoning in the course of their investigation. As they think about these concepts and develop these skills, the secondary texts we'll use to access these conversations will engage different modalities, encouraging students to think about how one makes an argument in different media. Finally, students will gain some experience working with bibliographic and spatial metadata to tell the story of what they found in their research.

Step 1: Four Phases

The organizing narrative template for this assignment is the four phases of an electronic device's life cycle, as demonstrated in Molleindustria's persuasive game, "Phone Story" (2011):

  1. Resource extraction
  2. Manufacturing
  3. Consumption
  4. Recycling

For each of these phases, "Phone Story" presents a short, relatively easy arcade-style game that highlights a key problem within that area. For resource extraction, players act as soldiers threatening enslaved children in a "whac-a-mole"–style game mechanic. The factory stage adopts a mechanic similar to the classic Activision game Kaboom! (1981), except here players must operate nets to catch suicidal factory workers. The consumption phase lets players slingshot new "iThings" at consumers, and in the final "recycling" phase, players must sort components by color into the appropriate recycling mechanism.

Four images side by side depicting four levels of the game Phone Story: a top down view of slaves mining in the dirt, a face on view of a factory where workers are tumbling from a building, a retail storefront where workers fling smartphones at consumers, and a conveyor belt where workers separate color coded material into appropriate piles.
Figure 1. Screenshots of the four levels of “Phone Story.”

Like other works by Molleindustria, Phone Story gets its point across with a jarring blend of cheerful imagery and sound with disturbing contexts and implications. In this case, the game mechanics require players to occupy the position of the oppressor, or at least the one benefiting directly and measurably from the exploitation and abuse of fellow humans.

As Parikka describes it, “Molleindustria’s painfully simple game creates [a] map of this darker side of media materiality. This map is about nonorganic and organic materialities: mining, suicides, electronic waste, and planned or meticulously scheduled obsolescence form the perverted side of the attractive, entertaining end device” (Parikka 2015, 89).

Because of the game’s polemical stance and its cartoony tone, student responses to this game vary. Some will be offended by the way it treats violence so casually and trivially, and still others have found the opening level so disturbing that they refused to play the rest of the game.[1]

The debriefing discussion of this game is always interesting. In addition to setting up the basic topic of this unit, it also invites students to think about how the game makes its argument with the procedural rhetoric of each level’s design. We can and often do debate the efficacy of the specific arguments presented here, but acknowledging and being able to describe the way that this game makes arguments differently than a text-based work is one of the core digital studies outcomes for this class.

In the suicides level, for example, a rhetoric of failure makes a particularly chilling point. As Ian Bogost describes it, "if procedural rhetorics function by operationalizing claims about how things work, then videogames [that employ a rhetoric of failure] can also make claims about how things don't work" (Bogost 2007, 85). As players rush to save falling workers with a trampoline-style net, they may notice that it is nearly impossible to save everyone who jumps; and yet, passing the level does not actually require that players save everyone. Success in this level means that the procedural claim—that safety nets are a reasonable means to prevent suicide—is belied by the operational reality simulated within the game, whereby there is some acceptable number of worker suicides that will not harm the company's outcomes. This can be an uncanny realization, as players are invited to reflect on their assumptions about worker suicide and on how the game's latent encoding of an answer to the question "How many suicides are acceptable?" may or may not align with their own intuitive sense of morality or justice.

Ultimately, this game is valuable because it delineates a technological object’s four-stage life cycle by illustrating the human side of each. The connection with materiality is indelible, as Parikka concludes: “like labor, [information technology] is material. This materiality is made of components—mineral and chemical—and will some day end up somewhere. It won’t just disappear; both ends of this simple chain include labor and organic bodies, each of which are the registering surfaces for effects and affects of media” (Parikka 2015, 93).

Step 2: “Your Phone Was Made By Slaves”

After this initial introduction, we move into considering the four specific phases of a technological artifact’s life cycle, starting with the original phase where minerals are extracted from the earth, often under brutal and exploitative conditions. For this conversation, we read an excerpt from Kevin Bales’ Blood and Earth: Modern Slavery, Ecocide, and the Secret to Saving the World posted at LongReads.com under the provocative title, “Your Phone was Made By Slaves” (Bales 2016).

In this essay, Bales describes the cycles of abuse and corruption that make cheap minerals possible, and as with the Phone Story game, it is useful in class discussion to analyze the rhetorical moves in this essay. Why, for example, does the essay start with a story about tombstones? What is the link between tombstones and smartphones? How does the author create the sense of place for the Democratic Republic of the Congo, where his story takes place? Why is the author ultimately optimistic by the end of the essay, and is the solution he proposes to these problems actually plausible?

The specific mineral this essay focuses on is coltan, which we learn is used in all kinds of electronics. In class, students can do some initial web research to try to figure out how much coltan is used in their own model of smartphone, and share those results. The inevitable conclusion of this conversation and exercise—so what do we do about it?—offers a compelling moment to consider what responsibilities and power we do and do not have as consumers.

Step 3: The Apple Factory

The “production” level of “Phone Story” is based on the widely circulated now well-known story (Merchant 2017) about factory dormitories installing suicide prevention nets. This is a vivid image, but the truth, as usual, is more complicated.

To learn more about the truth of this phase of a product's life cycle, we listen to two episodes of the podcast This American Life. The first, "Mr. Daisey and the Apple Factory" (Glass 2012), can now only be read as a transcript because, as the later episode "Retraction" (Glass 2012) explains, the story was retracted after editors at This American Life found problems with it. Daisey, a monologist, tells a compelling story about visiting Foxconn factories in Shenzhen, but as a narrative artist he fails to take appropriate journalistic care in making several key claims about that trip, including that he met with underage workers at a Foxconn plant, that he met with a secret union of blacklisted workers, and that he encountered armed guards at one factory. The narrative is dramatic and compelling, but as host Ira Glass explains in the "Retraction" episode, many of those specific claims could not be verified.

In that follow-up episode, as we hear Glass and his producers confront Daisey about the claims in his story, they invite an important conversation about the ethics of representation in drama and journalism. Many students feel conflicted: it is easy to be angry with Daisey, but they also weigh his point that his story, while not factually true in some cases, still has a truth to it that needs to be told. Indeed, the alleged labor abuses of workers at Foxconn have been reported elsewhere, and Apple's increasing efforts at documentation via its "Transparency Report" indicate that public awareness of situations like the one in Daisey's story is leading to some positive change.

There’s also a striking moment in the “Retraction” episode, at about 29:05, when Ira Glass asks Mike Daisey, “So why not just tell us what really happened?” The excruciating 13 seconds of silence which follow this question perfectly demonstrate the specific modality of audio narrative, and parallel to the discussion of procedural rhetoric in Phone Story, this audio piece invites a conversation about how different media can use different affordances to connect with an audience.

Figure 2. Clip of Ira Glass confronting Mike Daisey in This American Life episode 460, “Retraction.”

Ultimately, this is a frustrating story, and I've wrestled with the decision to include it at this point in the assignment, because there's a risk of students "overcorrecting" in response to the inaccuracies in Daisey's account. "If," they might say, "all this is made up, then I can probably just dismiss any reporting about inhumane conditions at Chinese factories."

Nakamura points out that, despite the flaws in Daisey’s approach, his story did lead to subsequent investigations by the New York Times which revealed other abuses like excessive, mandatory overtime (Nakamura 2014, 941).

As such, one of the lessons of Mike Daisey's story is that this is difficult work. "Digital labor is usually hidden from users in closed factories in Asia, visible to us only as illegally recorded cell phone video on YouTube or through the efforts of investigative reporters who overcome significant barriers to access—again, nothing to see" (Nakamura 2014, 938). Furthermore, it is always important to document one's sources; when it comes to making claims of this magnitude, show your work. Hopefully, students bear this in mind as they prepare to conduct their own fact-finding in subsequent steps.

Step 4: Disassembly

This is my favorite part of this project: the day when we bring in the sacrificial devices and take them apart. To prepare for this, students have looked in their closets and trunks for discarded technology, or made trips to the local Goodwill to decide on an object to sacrifice. The advantage of taking apart their own devices is that they already know some parts of that object's life story, which will be useful later on when they tell the object's story. Generally, these are old smartphones, iPods, iPads, and e-readers. The disadvantage of working on these devices is that their parts tend to be rather small, and getting them apart can be somewhat tricky. Getting inside may require a heat gun to detach the screen; the screws are likely to be very, very small; and the component parts are likely all connected to very thin and fragile printed circuit boards (PCBs). Goodwill-sourced items, on the other hand, have tended to be old radios, VCRs, and printers, and they are somewhat easier to work with. I simply insist that students choose some sort of technological media device and let them evaluate the pros and cons within their group.

closeup of an open Super Nintendo case, including the main circuit board. A student is using a tool to pry it apart.
Figure 3. Students disassembling a Super Nintendo on deconstruction day.

Setup

The room I teach this class in is the "active learning classroom" at UMW's Convergence Center. This is a large room with six tables spread out along the walls. Each of these tables includes a PC and a monitor, seats up to six people comfortably, and has a large working area where objects can be arrayed. This arrangement of tables has actually influenced the design of the assignment: since I teach up to 30 students in this class, dividing them into six groups of five seems to be about right for the amount of work and division of labor I'm requesting.

Supplies

To prepare for disassembly day, I make sure I have on hand a few basic supplies:

  • First-aid kit: This usually isn’t necessary (see the “Safety” section below), but I think having it on hand helps convey to students that they need to be careful as they work.
  • Nitrile gloves: Some devices have sharp edges and messy components as they come apart (printers and VCRs are usually well-lubricated) so it’s useful to have these as an option.
  • Paper towels and cleaning wipes: It’s hard to predict what sort of mess you’ll make, and you always want to leave the classroom cleaner than we found it.
  • Sealable plastic bags: Students will be taking components with them to study, so it’s helpful to have a convenient way to keep their parts organized.
  • Tools: Small-tipped Phillips-head screwdrivers are especially convenient, but some Nintendo devices require proprietary bits such as tri-wing screwdrivers. A kit of these specialty bits is relatively cheap. A set of small hex wrenches and torx bits will also frequently be useful, along with a set of needle-nose pliers and wire-cutters. A plastic bicycle tire lever can be a good way to gently pry things open.

These supplies have normally been plenty, but for a few specialized cases requiring more robust efforts, UMW’s maker space, the “ThinkLab,” is equipped with rotary tools, heat guns, and soldering irons.

Safety

Because students are working with sharp tools to destroy complicated objects, it's important to take some practical steps to ensure everyone's safety. Some devices that students may wish to take apart are simply too dangerous to work on, while others need to be treated with particular care lest they become dangerous. As one categorical rule, I prohibit objects with CRT screens. Old TVs and PC monitors may be interesting from an artifactual or historical standpoint, but the cathode-ray tube poses a risk of implosion when damaged. And since these devices generally draw a large amount of power, there's a risk of electric shock from the power supply or flyback transformer, which can hold a charge even if the device has been powered off for many weeks.

A similar hazard with newer devices is that they likely contain a lithium battery, which can ignite if punctured. As students energetically probe their sacrificial technology with screwdrivers and pliers, this is a real risk. For this reason, I insist that their first step in taking apart their device—and I supervise this—is to isolate and remove the battery.

In the several years I’ve done this assignment, no one has been injured other than the occasional bloody knuckle.[2]

Process

The disassembly begins with an explanation of the available tools, a sternly-worded reminder about safety, and a general pep-talk about what they’re about to do. One member of each group is designated as a photographer, and it is their job to capture the original, un-disassembled device and each stage in its dissection. Documenting these layers is useful if students ever want to put the device back together again (they rarely do), but since they will eventually be researching individual components, it is important to document the original context where those components resided in their device.

Photographs are also important because some components—a ROM chip, for example—may be interesting in their own right as discrete parts, but are simply too hard to extract from a PCB without using a soldering iron. The manufacturer’s logo and other markings printed on the face of the chip may provide all the information a researcher needs to discover its origin.

Components

Ideally, each student leaves disassembly day with two or three components to learn about. The specific number of components will vary depending on the device, and herein lies the first of many ontological discussions about what constitutes a “component.” Is the molded plastic case a component, or a support for components? Is the PCB one component, or many? Is the design of the PCB specific enough to count as a component of its own? If there are two memory chips, are they individual components, or collectively a single component? Is each key on a keyboard a component, or is the keyboard itself a single component?

I simply define a component as something that can be individuated by tracing it to a specific material origin, but in some cases, when a device is relatively simple on the inside, I’ve had to help student groups think creatively about their objects. Whether or not students will succeed in tracing that origin remains to be seen in the next step.

Step 5: Research

This is the most valuable step in this assignment, and to students, often the most frustrating. Their task is to take their components and determine as specifically as possible when and where they were made. The achievable specificity will vary, but the objective is an exact date and street address. Our goal here is to understand the device, but more importantly, we want to have some way of thinking about the human labor behind our devices. By locating a factory with Google Street View, students find that the lives of these workers become much more visible. This only works once we refuse to accept “some time in the late 90s in China” as an adequate account of a component’s origin.

The other thing I love about this moment in the project is that it reveals to students something about how their knowledge is constructed from search engines and how much they take that for granted. Presented with a black rectangle with some numbers and symbols printed on it, the next steps to finding out more about it aren’t exactly clear. When simply Googling those numbers doesn’t give them the answer, several students have commented on how jarring this moment can be: the sudden awareness that the answer to a question is not immediately forthcoming can be an important epiphany.

Example: Sharp LH5264N4

What students find they must do is take what they know for sure about their device and use that knowledge to help make educated guesses about things they know less well. This is a process with several steps, which I will illustrate with an example taken from my early-90s Game Boy.

A closeup of a chip on a circuit board.
Figure 4. Sharp chip LH5264N4, from the author’s Game Boy’s circuit board.

When

I know the chip in Figure 4 is from a Game Boy, a model made in this body style from 1989 through the mid-1990s. This Sharp chip includes a four-digit code, “9209,” which by itself may not mean much, and searching for it doesn’t yield anything relevant. However, in context (see Figure 5), I see “9211” and another “9209” on two adjacent chips. The CPU chip includes a copyright date of 1989, but that isn’t as helpful since it only refers to the product’s design, not its manufacture. The “92” is the salient pattern here, and it turns out this is part of an industry-standard manufacturing date code (Woerner 2001) that here identifies the 9th and 11th weeks of 1992. I now have a pretty high degree of confidence about these three chips’ dates of origin.

A closeup of a circuit board with several visible chips.
Figure 5. Game Boy DMG-1 PCB.
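
To make that decoding step concrete, here is a minimal Python sketch, my own illustration rather than part of Woerner’s reference or the assignment’s required tooling, that converts a four-digit YYWW date code into a calendar week. The function name and the cutoff year are hypothetical choices, and manufacturers’ week numbering may differ slightly from ISO weeks.

```python
import datetime

def decode_date_code(code: str, earliest_year: int = 1980) -> str:
    """Decode a four-digit YYWW manufacturing date code, e.g. '9209'.

    The first two digits are the last two digits of the year; the final two
    are the week of that year. earliest_year resolves the century, so that
    '92' reads as 1992 rather than 2092.
    """
    year_part, week = int(code[:2]), int(code[2:])
    year = (earliest_year // 100) * 100 + year_part
    if year < earliest_year:
        year += 100
    # Approximate the Monday of that week; vendors' week numbering may not
    # match ISO weeks exactly, so treat this as a close estimate.
    monday = datetime.date.fromisocalendar(year, week, 1)
    return f"week {week} of {year} (the week beginning {monday})"

# The codes visible on the three chips in Figures 4 and 5.
for code in ["9209", "9211", "9209"]:
    print(code, "->", decode_date_code(code))
```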

Who

Finding the location where it was manufactured is a bit more challenging, but again, I can expand what I know via logical inference. I have an explicit country of origin, Japan, but I need to do better than that, and doing better starts with identifying the manufacturer. Conveniently, these chips are large enough to include the full name of their manufacturer, “Sharp,” but in many other cases, chips will include only a small logo. In the example shown in Figure 6—a 3-pin positive voltage-output regulator from a Super Nintendo console—the stylized “M” inside a rounded square identifies the manufacturer.

A black, rectangular chip with three leads.
Figure 6. 3-pin positive voltage-output regulator from a Super Nintendo, manufactured by Matsushita Panasonic.

There are a number of websites that list and identify these logos, so after skimming the reference page at SiliconInvestigations.com, I can confirm that this particular voltage regulator was made by Matsushita Panasonic.

What

Returning to my Sharp chip, the part number LH5264N4 is another clue, and because the Game Boy is such a popular item for collectors and hobbyists, this is a rare case where a web search actually yields useful information about the component. A database hosted by Joonas Javanainen lists these LH5264 chips as either VRAM or WRAM, and the list includes several Game Boys (also known as “DMG,” for “Dot-Matrix Game”) with similar chip date codes.

For chips with a less famous application, part numbers can still lead to useful information about what a particular chip actually does, particularly if one finds datasheet references at websites like AllDatasheet.com. There is quite a lot of technical data in these documents that goes far beyond what we need for this inquiry, but in this case, the datasheet for the related LH51264 chip series, which appears in some slightly earlier Game Boys, identifies it as 64K of static RAM.

Where

The Sharp Corporation is a huge company with a long history, and as part of its public and investor relations, the Sharp Global website proudly tells the story of that history. Using the information on the “Sharp Journey” chronology page, I can identify the plants that would have been operational in 1992, when these chips were manufactured: Tanabe, Hirano, Yamato Koriyama (now Nara), Hiroshima, Tochigi, Shinjo (now Katsuragi), and Fukuyama.

Since Tanabe, Hirano, Hiroshima, and Tochigi all seem to have been created to produce consumer electronics rather than components, that leaves just three plants to choose from. A site-search for information about each of these three yields some more detailed history, including the fact that the Shinjo plant was created to manufacture solar technology. A bit later, in 1985, the Fukuyama plant began producing integrated circuits and semiconductor chips of the sort I’m researching. Based on this information, I can infer with some confidence that this is the plant that produced my chip, located at 1 Asahi, Daimon-cho, Fukuyama-shi, Hiroshima Prefecture 721-8522, Japan, which I can find in a satellite view on Google Maps (see Figure 7).

A screenshot of an aerial or satellite view of a factory in Japan. Nearby there is a river and a residential area.
Figure 7. Screenshot of Google Maps’ satellite view of Sharp’s Fukuyama plant, the probable origin of the RAM chips in my Game Boy.

This example is the best-case scenario for this research: I’ve learned what this chip is and when and where it was manufactured; I can explore the area on Google Street View to learn more about the neighborhood and get a sense of the community; and I can use this location data later on to tell the archival story of my Game Boy.

Students are not always this lucky, since some manufacturers will be harder to track down than others, and some countries (China, mainly) are far less accessible via Google’s satellite view. For the purposes of the assignment, however, it’s the process that’s most important, so I demonstrate it with my chips and make sure that students document each step in their research so that, when they inevitably hit a dead end, they can at least describe how they got there.

Step 6: Recycling and e-Waste

After we’ve disassembled the sacrificial devices and students have begun their individual research, we turn to a consideration of e-Waste and electronics recycling. This is an appropriate point in the assignment because the acts of deconstruction we’ve just committed are very similar to the first stages of processing e-Waste for recycling. To learn more about this process, we look at two videos that document some of the challenges that come with e-Waste. The first is a PBS NewsHour segment featuring an investigation by Jim Puckett, who tracks electronics submitted for recycling to their final destination in China, where they are processed in unsafe, illegal conditions (PBS 2016).

The narrative of the piece is dramatic, but it ends on an optimistic note as it describes safer processes being developed and an encouraging response from Goodwill Industries to its findings. This association with Goodwill carries some extra weight because students will often have “rescued” their items from Goodwill, thus preventing them from entering this problematic supply chain to China.

A man (Jim Puckett) looks at a pile of electronic waste.
Figure 8. Jim Puckett of the Basel Action Network investigates an illegal recycling facility in China in a 2016 NewsHour segment.

A second documentary film, E-Wasteland (2012), directed by David Fedele, takes a different approach. With intertitles instead of voice-over narration or commentary, the film simply presents footage of workers processing electronic waste for recycling in Agbogbloshie, Accra, Ghana. Young men burn bundles of wire to extract the copper, a man rests in a shelter made from old refrigerators, and children smash apart a CRT TV with hammers and rocks. The tone is stark and almost eerie, and, like the earlier discussions of Phone Story’s procedural rhetoric and “Retraction”’s audial specificity, comparing the two films’ styles helps draw students’ attention to the modality of the visual medium and to the ways their different genres fulfill different expectations for their audiences.

Two children in a junk yard pry the screen off of a television or computer monitor.
Figure 9. Children smash apart a CRT TV in E-Wasteland (2012).

The PBS segment is critical but informative and optimistic, while the documentary’s flat presentation and general quietness make it more emotionally compelling to many students. It also raises the question of what we will do with our devices at the conclusion of this project.[3]

Step 7: Working with Omeka and Neatline

Finally, after students have gathered their data, compared notes via their group’s Slack channel, and gotten as close as possible to chronological and geographic specificity regarding the origins of their components, it is time to share that data via a website, which we create using Omeka. In recent years, I have directed students in using the Neatline plugin, but most recently, I found Omeka S’s Mapping Module to be sufficient.

Working with Omeka can be a challenge, but after trying more accessible but incomplete alternatives like StoryMap, TimeMapper, and Timeline JS, I’m convinced that Omeka S is the best choice for this project for several reasons. Though it currently lacks a timeline presentation, the map function is essential to the assignment’s key learning outcomes; the page editor and site builder encourage groups to think of their projects in a sequential, narrative form; and, most importantly, entering data directly in Omeka (as opposed to importing a spreadsheet) encourages student groups to normalize their metadata. Working in a single platform also makes it easier for me to find my students’ work and to let them access each other’s.

While the basic task of entering data in Omeka is relatively simple, the challenge for many students comes in thinking through what the metadata of their artifacts should be. The terminology in Dublin Core can be vague and contradictory, and we have to do extra work to understand what a field like “Creator” means for an artifact of digital technology: Is that the designer? The distributor? The factory where it was made?

I’ve found it difficult to push students through that kind of thinking at the same time as they’re learning a new platform, so recently, I’ve given each group a Google Spreadsheet with headings for each kind of metadata we’re interested in as well as a sample row of data and comments to explain what kind of information each field is capturing. Letting groups work in a shared Google Spreadsheet also helps them see each other’s work, and compare notes as they work simultaneously.
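
For readers who want a sense of what that shared sheet looks like, here is a minimal sketch in Python of one possible row; the column names are hypothetical Dublin Core-style stand-ins rather than the exact headings I distribute, and the sample values come from the Game Boy example above.

```python
import csv

# Hypothetical Dublin Core-style headings for a shared component-metadata sheet.
FIELDS = ["Title", "Creator", "Date", "Coverage", "Source", "Description"]

# One sample row, drawn from the Sharp RAM chip traced earlier.
sample_row = {
    "Title": "Sharp LH5264N4 RAM chip",
    "Creator": "Sharp Corporation (Fukuyama plant)",
    "Date": "1992, week 9 (date code 9209)",
    "Coverage": "1 Asahi, Daimon-cho, Fukuyama-shi, Hiroshima Prefecture, Japan",
    "Source": "Nintendo Game Boy (DMG)",
    "Description": "64K static RAM; origin traced via date code and Sharp's corporate history pages.",
}

# Write the headings plus the sample row so each group starts from a worked example.
with open("components.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerow(sample_row)
```

Students then transcribe and normalize rows like these as they enter their items into Omeka by hand.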

At this point in the semester, students have been working with WordPress consistently, and most find Omeka relatively easy to use, even if it’s not quite as intuitive. Ultimately, while it is our goal to create a public-facing resource that is useful to interested audiences, the learning in this assignment comes more from the process than from the value of the final product. Therefore, in terms of evaluating students’ work, I have realistic expectations for the quality of their maps. That said, many have been excellent. The projects from the most recent semester (Spring 2019) are available for browsing at devices.dgst101.net.

Student Responses and Outcomes

As I noted earlier, students’ most frequent comment about this assignment sequence acknowledges its difficulty, some of which comes from the usual headaches endemic to group work. The more germane frustration students describe is with the difficulty of the task I have set before them: I’ve asked them to discover things that I don’t know, so they have an unusual responsibility to evaluate their own work. As they go down the rabbit hole of their research, they have to decide when the answers they think they’ve found are accurate, credible, and good enough to bring back to their group’s project.

For this reason, I find that the difficulty of this assignment sequence is the kind of challenge that students feel proud to have completed. Not only do students learn about the material history and legacy of their devices, but they also gain a transferable skill in conducting research with the web that is more like investigative reporting than the research papers they’re accustomed to. In a comment that exemplifies this perspective, Stephanie writes,

After many long nights spent with countless tabs open, and even more pinned tabs, I feel that this project is finally coming to an end. While the work my group and I conducted wasn’t easy and challenged my faith in universal search engines, I feel as though I’ve learned a lot from this process.

Not only do I now have way more information about the processor inside of a Nintendo Game Boy Advance than I ever thought I would, but the concept that some of this information was so hard to find, or turned up absolutely nothing really left me wondering. Why? Why is it so hard to find this information?

Several students have emailed or even called companies to track down the information they needed. One student, Nikolas, found himself following the trail of coltan originating in the Congo down a rabbit hole that ended in an encrypted chat with someone claiming to be an accountant at a firm that managed supplies for the chip manufacturer Nikolas was researching.

After successfully connecting, this mysterious man behind the curtain told me that he was an accountant for Holtek in Taiwan named John. … I asked him if he worked at the HsinChu City location, but he wasn’t comfortable giving out that specific of information. After talking about the political climate in China, and how much censorship affects web usage in all of Asia, I got back on topic to coltan. I wanted to know if all the materials that Holtek used came without slave labor… John informed me that, for that product and the year it came out, the coltain [sic] most likely came from Australia.

It’s impossible to verify the authenticity of this informant, but the fact that Nikolas went so far into the data that he found himself in conversation with a potential primary source is an excellent example of the way networked digital tools enable the kind of inquiry necessary to investigate, understand, and act on the many issues and problems that converge on the material histories of digital artifacts.

Lisa Nakamura has articulated a need within digital cultural studies and media archaeology to look within the histories of our platforms to understand the configuration of labor as it exists today. In a study of the way Fairchild Semiconductor employed Navajo women at its plant in Shiprock, New Mexico, Nakamura writes, “immigrant women of color were hailed as the ideal workforce because they were mobile, cheap, and above all, flexible; they could be laid off at any time and could not move to look for alternative forms of work. … The notion that Indians were ‘inherently flexible’ both racializes and precedes the idea of flexible labor that informs much of the research on globalization in the information age” (Nakamura 2014, 926). Looking at the ways race, gender, geopolitics, and income inequality impact the economic model we are all implicated in as consumers of cheap electronics is one way to begin a conversation toward more equitable and humane solutions.

The deceptively simple task set forth in this sequence of conversations, assignments, and projects for my students in Introduction to Digital Studies—to take something apart and find out where its parts were made—leads students toward that critical evaluation of technological artifacts as archives of social meaning. In doing so, students gain experience in conducting web-based research, in analyzing different modalities of digital rhetoric, and in communicating complex information within a spatiotemporal content management system.

Notes

[1] Regrettably, some have taken that trivialization to heart and assume that these problems must not really be that bad if it’s OK to make a game about it. A refusal to play for moral reasons is always a justifiable excuse, in my view, but it is worth noting that some students will also find that they cannot play this game for technical reasons. It uses Flash, so most browsers now require additional configuration that some students may not realize they need to do. Because of the way the game mechanics seek to persuade by implicating the player in violence, I insist that students make their best effort to play it themselves, rather than watching someone else play and talk about it in a YouTube video.
[2] In a recent example, a student cut himself when he insisted on using a pocket knife he’d brought with him to cut some wires, instead of just using the wire cutters I provided. He just needed a small band-aid, fortunately.
[3] I volunteer to collect materials that students don’t plan to or know how to recycle. Some of this material has found secondary afterlives in students’ art projects.

Bibliography

Bales, Kevin. 2016. “Your Phone Was Made By Slaves: A Primer on the Secret Economy.” Longreads (blog). March 8, 2016. https://longreads.com/2016/03/08/your-phone-was-made-by-slaves-a-primer-on-the-secret-economy/.

Belovic, Nina. 2014. “Circuit Bending Videogame Consoles as a Form of Applied Media Studies.” NANO: New American Notes Online, no. 5. Accessed March 27, 2019. https://nanocrit.com/issues/issue5/circuit-bending-videogame-consoles-form-applied-media-studies.

Bogost, Ian. 2007. Persuasive Games: Videogames and Procedural Rhetoric. Cambridge, MA: The MIT Press.

cbgames. 2009. Kaboom Commercial [Activision] for Atari 2600. https://www.youtube.com/watch?v=FkNztWEVxD4.

Cook, Stephanie. 2017. “Digital Archaeology [Reflection] – Stephanie’s Space.” Accessed June 11, 2018. http://blog.stephaniemc.com/digital-studies/digital-archaeology-reflection/.

Duhigg, Charles, and David Barboza. 2012. “Apple’s iPad and the Human Costs for Workers in China.” The New York Times, January 25, 2012, sec. Business Day. https://www.nytimes.com/2012/01/26/business/ieconomy-apples-ipad-and-the-human-costs-for-workers-in-china.html.

Fedele, David. 2012. E-Wasteland. Documentary, Short, News. http://www.imdb.com/title/tt2414034/.

Foucault, Michel. 1972. The Archaeology of Knowledge. Translated by A. M. Sheridan Smith. New York: Pantheon.

Glass, Ira. 2012a. “Mr. Daisey and the Apple Factory.” This American Life. Accessed June 4, 2018. https://www.thisamericanlife.org/454/mr-daisey-and-the-apple-factory.

———. 2012b. “Retraction.” This American Life. Accessed June 4, 2018. https://www.thisamericanlife.org/460/retraction.

Javanainen, Joonas. 2018. “Game Boy (DMG).” Game Boy Hardware Database. June 3, 2018. https://gbhwdb.gekkio.fi/consoles/dmg/.

Kaplan, Larry and David Crane. 1981. Kaboom! Activision, Inc.

Kottke, Jason. 2012. “The Silence of Mike Daisey.” Kottke.Org (blog). Accessed June 4, 2018. https://kottke.org/12/03/the-silence-of-mike-daisey.

“LH5164A Datasheet(PDF) – Sharp Electrionic Components.” n.d. AllDatasheet.Com. Accessed June 11, 2018. http://www.alldatasheet.com/datasheet-pdf/pdf/42972/SHARP/LH5164A.html.

Merchant, Brian. 2017. “Life and Death in Apple’s Forbidden City.” The Observer, June 18, 2017, sec. Technology. http://www.theguardian.com/technology/2017/jun/18/foxconn-life-death-forbidden-city-longhua-suicide-apple-iphone-brian-merchant-one-device-extract.

Nakamura, Lisa. 2014. “Indigenous Circuits: Navajo Women and the Racialization of Early Electronic Manufacture.” American Quarterly 66 (4): 919–41.

Parikka, Jussi. 2012. What Is Media Archaeology? 1st ed. Polity.

———. 2015. A Geology of Media. Minneapolis: University of Minnesota Press. http://ebookcentral.proquest.com/lib/umw/detail.action?docID=1983521.

PBS NewsHour. 2016. “Watchdog Group Tracks What Really Happens to Your ‘Recycled’ e-Waste.” https://www.youtube.com/watch?v=3sUT1u4WMP4.

Phone Story. 2011. Android / iOS / Flash. Molleindustria.

Sharp Corporation. n.d. “1980-1984: Becomes Comprehensive OA Manufacturer.” Accessed June 11, 2018. http://www.sharp-world.com/corporate/info/his/h_company/1980_1984/index.html.

———. n.d. “1985: Making Products That Match Individual Lifestyles.” Accessed June 11, 2018. http://www.sharp-world.com/corporate/info/his/h_company/1985/index.html.

———. n.d. “A Sharp Journey: Chronology.” Accessed June 11, 2018. http://www.sharp-world.com/corporate/info/history/chronology/.

Silicon Investigations, Ltd. 2018. “Silicon Investigations Integrated Circuit Manufacturer Logo Identification Page.” Silicon Investigations. May 16, 2018. http://www.siliconinvestigations.com/Logos/logos.htm.

Woerner, Joerg. 2001. “Date Codes.” Datamath Calculator Museum. January 13, 2001. http://www.datamath.org/Datecodes.htm.

About the Author

Zach Whalen is an Associate Professor at the University of Mary Washington, where he teaches courses in Digital Studies at the intersections and overlaps between digital creativity, digital culture, and digital methodologies. His courses span topics such as electronic literature, videogames, graphic novels, writing for digital media, and creative coding. He has published journal articles and book chapters in many of these areas and coedited Disability in Comic Books and Graphic Narratives (2016) with Chris Foss and Jonathan W. Gray, and Playing the Past: History and Nostalgia in Video Games (2008) with Laurie N. Taylor.

Image of the Voyant-Tools interface.

“Imagining What We Don’t Know”: Technological Ignorance as Condition for Learning

Abstract

This article explores how digital interfaces matter in literary analysis. It takes “critical reading interfaces” as a set of criteria that engage a text’s formal aspects in order to facilitate the close reading of literature in electronic environments. The author examines two digital projects, Women Writers Online and Voyant Tools, which change the way that readers “see” literary texts to reveal new interpretive possibilities. This examination finds that the pedagogical benefits of digital resources stem from an interface that both makes explicit the text’s formal elements and encourages the reader to interact and experiment with these elements. It also finds that, though exposure to a project’s technical modeling (its digital encoding and formatting) allows readers to gain purchase over the formal structures that determine meaning-making, technological ignorance (if harnessed thoughtfully) might propel readers toward novel and unforeseen interpretations of a text’s formal aspects. The differences between the two projects present a space for teachers to consider the effects of critical reading interfaces in the English classroom.

Introduction

In Radiant Textuality: Literature after the World Wide Web, Jerome McGann presents a fictional dialogue between two allegorical figures, “Pleasure” and “Instruction.” Instruction complains of the hypocrisy of teaching poetry, particularly in its “pretense to freedom of thought,” in which the “discussion” between teacher and students is often guided by the agenda of the teacher (McGann 2001, 33). In response, Pleasure suggests a return to Susan Sontag’s proposal in her famous essay “Against Interpretation” which calls for an “erotics of reading” through the act of recitation: “Let’s get back to the words, to the language—to the bodies of our thinking. I’m ‘against interpretation,’ I’m for recitation” (McGann 2001, 33). Here, rather than ask students to explain what they think the poem means (which, according to Pleasure, is often a guessing game about what the teacher wants to hear), the students spend most of their class time reading the poems out loud. By “performing” the poem through recitation, Pleasure explains, students manifest the poem’s formal arrangement in sound, “something like the way musicians interpret a piece of music by rendering the score” (McGann 2001, 31). As the first chapter of McGann’s book, which sets the stakes of his intervention, this dialogue demonstrates quite literally how form—“the words … language … bodies of our thinking”—matters in interpretation (2001, 33). By making the students complicit in its re-creation, Pleasure and Instruction finally agree, performing poetry is the best way to teach it.

The practice of recitation, of reading poetry out loud, is a time-honored hermeneutical engagement between the student and literature in print. It is widely deployed in English classrooms to scaffold the activity of close-reading, or the thoughtful, critical analysis of a text that builds from significant details or patterns toward meaning. This paper questions how teachers might facilitate this kind of close analysis in the digital age. If rendering a musical score is an apt analogy for reading printed poetry out loud, a suitable analogy for engaging with texts on a screen might be the act of looking through a microscope or a telescope, which drastically changes the viewer’s vantage of the textual object. According to some literary critics, however, the shift in perspective provided by electronic formats evacuates the reader’s critical encounter with text. Mark Hussey, one of the editors of the digital archive Woolf Online, warns that many digitization efforts run the risk of disenfranchising readers who are limited by the way they interact with texts on a screen. Hussey explains, “The activity of reading is altered when the embodied process of turning the page is replaced by swiping, clicking, searching, or typing” (Hussey 2016, 268). Though he appreciates the accessibility that digital archives (like Woolf Online) provide, he cautions that other kinds of digital projects “require vigilant implementation to avoid the potentially reductive effects of readers becoming users” (264). Hussey’s concern that engaging with texts through a screen will make reading too easy is complemented by an alternative concern—that it will make things too difficult. Many instructors remain reluctant to incorporate tools that create steep learning curves for their students. According to the Pew Research Center, the worry over the “digital divide,” which has largely focused on access to technology (those who have versus those who have not), now includes a new metric, digital readiness (Horrigan 2016). As technology continues to proliferate, the question increasingly shifts to whether digital resources are sufficiently intuitive for learning. For instructors, then, technological usability is a double-edged sword—either digital resources will be so user-friendly that they evacuate the demands of close-reading, or they will be too complex to offer pedagogical benefits.

One way instructors in English might grapple with the usability concern is to approach the digital interface itself as a site for criticism. Offering their own telescope metaphor, Geoffrey Rockwell and Stephen Ramsay explain that digital projects and tools might function as “telescopes for the mind” (Rockwell and Ramsay 2012). By presenting information in a certain form, they explain, such projects make interventions on the act of seeing, and turn vision into a framework for theorizing. With digital archives and editions, for example, the display and appearance of the interface inflects the user’s interpretation of its contents. Such an interface might include an option for annotating the text and reading another’s annotations, or it may include a search bar that allows readers to search for keywords. These “social-reading” or “deep-searching” interfaces visually filter the textual object to both guide and provoke analysis of the text. Adopting Rockwell and Ramsay’s optical analogy, then, teachers might use digital resources for the way they change how readers “see” literary texts to reveal new hermeneutical possibilities.

However, Rockwell and Ramsay are quick to point out that the opacity of technological processes presents an obstacle for assessing how such processes work. Their argument extends the concern about digital readiness in the classroom to one about the critical potency of digital tools. When a digital tool’s construction is opaque to the user, who operates at several levels of remove from the underlying computational processes, it presents a paradox for practicing criticism: “The only way to have any purchase on the theoretical assumptions that underlie a tool would be to use that tool. Yet it is the purpose of the tool (and this is particularly the case with digital tools) to abstract the user away from the mechanisms that would facilitate that process” (Rockwell and Ramsay 2012). Here, the very technology that allows the scholar to “see” the object in a new way also prevents her from examining how this “visual” process works. The authors offer the example of a simple concordancing tool, which aggregates repeated words in a text into frequency lists. Although this word aggregator makes an implicit argument about the unity of a text and the author’s intentions, its underlying technology remains inaccessible to anybody using the interface (Rockwell and Ramsay 2012). It seems that, to be pedagogically effective, digital resources should be intuitive to use and offer insight into the technology used to create them. The question then becomes how one might experience the shift in perspective that a tool provides while accessing the underlying technical processes that power the tool. For close-reading instruction, then, the resource needs to present an intuitive interface that addresses its own construction.
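
To make concrete what such a concordancing tool does, here is a minimal Python sketch of a word-frequency aggregator; it is an illustrative stand-in of my own, not the tool Rockwell and Ramsay describe and not the code behind any existing platform.

```python
import re
from collections import Counter

def frequency_list(text: str, top: int = 10):
    """Aggregate repeated words into a simple frequency list."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(words).most_common(top)

sample = "The poem repeats itself; the poem insists; the reader listens."
for word, count in frequency_list(sample):
    print(f"{word}: {count}")
```

Even this toy version embeds an implicit argument: by flattening the text into a bag of words, it treats repetition, rather than sequence, as the salient feature.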

This paper assesses digital projects in English for how they deploy critical interfaces for reading that make explicit both the formal qualities of the source text and the computational tools or methods that engage those formal qualities. I examine the fine line between critically inhibitive and critically productive interfaces in two very different digital projects: first, a full-text collection of transcribed writings, encoded for smooth search and online reading, and second, a text analysis tool that instantly visualizes textual data into graphs, charts, and lists. I will frame my discussion of the pedagogical benefits of each project with more of McGann’s speculations from Radiant Textuality about the potential of close and critical engagements with texts in online environments. McGann’s first point, implied by the dialogue between Pleasure and Instruction, is already assumed as my first criterion—that digital projects in English studies ought to explicitly engage the formal (and therefore embodied) qualities of literature, attending to what Sontag calls the “erotics of reading.” Additionally, his points about deformance, “quantum poetics,” and speculation (examined below) will situate my approach for using online resources in the teaching of close-reading. These points together suggest how an interface can model and activate textual form for literary criticism, presenting a “telescope for the mind” that offers glimpses into its underlying technological processes.

Toward a Quantum Poetics

McGann, who writes more than a decade before Rockwell and Ramsay, has a positive take on the issue of technological opacity. Rather than aim to understand how a digital tool works, McGann emphasizes how the tool unleashes meaning. The unique affordance of digital environments, according to McGann, is that they allow for innumerable interventions upon the textual object. Just as reading poetry out loud “embodies” its formal qualities through vocal activity, so digitizing a text opens it up to various levels of formal manipulation. In the chapter “Deformance and Interpretation,” McGann and Lisa Samuels coin the term “deformative criticism,” which describes any activity that distorts, disorders, or re-assembles a literary text to discover new insights about its formal significance and meaning. They offer Emily Dickinson’s proposal of reading a poem backwards as a key example of deformative criticism. Here, “the critical and interpretive question is not ‘What does the poem mean?’ but ‘How do we release or expose this poem’s possibilities for meaning?’” Active first and contemplative later, deformance aims to “disorder … one’s senses of the work,” estranging the reader from the familiarity of the text (McGann 2001, 108). By privileging performance over intellection, this method regards “theory” as secondary, demoting it to an afterthought of practice.
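
The most literal of these deformances is easy to simulate. The short Python sketch below is my own illustration, not anything McGann and Samuels propose as software: it simply reverses the order of a poem’s lines, using the first stanza of Dickinson’s “Because I could not stop for Death” as its sample text.

```python
def read_backwards(poem: str) -> str:
    """Return the poem with its lines in reverse order, a literal deformance."""
    return "\n".join(reversed(poem.splitlines()))

# First stanza of Emily Dickinson's "Because I could not stop for Death" (public domain).
stanza = (
    "Because I could not stop for Death -\n"
    "He kindly stopped for me -\n"
    "The Carriage held but just Ourselves -\n"
    "And Immortality."
)
print(read_backwards(stanza))
```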

McGann’s proposed framework of deformance opens the text to new interpretive possibilities. In his chapter “Visible and Invisible Books in N-Dimensional Space,” McGann speculates that engaging with texts on a computer could be as intimate a process as engaging with them on paper, but with more sophistication and efficiency. Readers may not only have a “handle” on the object, but also be able to manipulate and transform it in virtually infinite ways. Ideally, the tool should work alongside the reader’s intuition, as a “prosthetic extension of that demand for critical reflection,” by which the reader is able to feel her way through the text (McGann 2001, 18). The tool could also be equipped to process literature from a variety of perspectives, addressing the different qualities of the text that emerge during the reading process. McGann introduces the term “quantum poetics” to indicate the volatile potentiality for meaning contained in every element of a literary text. He explains that, “Aesthetic space is organized like quantum space, where the ‘identity’ of the elements making up the space are perceived to shift and change, even reverse themselves, when measures of attention move across discrete quantum levels” (McGann 2001, 183). The meaning of particular words in a literary text depends upon a multitude of factors, from antecedent readings and pathways through that text to the significance of immanent elements such as typography and blank spaces, of which the reader can process only a limited amount at once. In its potentiality, McGann asserts, “Every page, even a blank page … is n-dimensional” (2001, 184). Accordingly, digital tools could expose literature’s inherent potentialities by carving new paths across familiar texts. In this way, McGann argues for tools that facilitate tactile and intuitive engagements with texts within an environment that opens itself up to multiple dimensions of reading.

Digital tools that “deform” text to reveal its quantum potential operate within a speculative mode of criticism. In his chapter “Editing as a Theoretical Pursuit,” McGann explains how his development of a print edition of Lord Byron’s works and the digital edition of the Rossetti Archive influenced his thinking about the different critical possibilities of print and digital editions. Paper-based editions, according to McGann’s experience, are inadequate and limited, and newer editions often “feed upon and develop from [their] own blindness and incapacities” (McGann 2001, 81). By contrast, digital editions can be designed for complex, reflexive, and ongoing interactions between reader and text. Indeed, “[a]n edition is conceivable that might undertake as an essential part of its work a regular and disciplined analysis and critique of itself” (McGann 2001, 81). Here, McGann explains that, because transforming print text into electronic forms necessarily “deforms” the text, changing one’s view of the original materials, each act of building the edition calls its original purpose into question. McGann points out that his work on the Rossetti Archive brought him to repeatedly reconsider his earlier conception and goals, asserting that the “Rossetti Archive seemed more and more an instrument for imagining what we didn’t know” (2001, 82). The technical experience of editing electronic texts encourages speculation about new potentialities in their presentation.

If the digital tool attends to a text’s form or the potentiality for formal manipulations, the result of interacting with it will always be unpredictable. In allowing the reader to “imagine what [she] do[esn’t] know” (McGann 2001, 18), deformative criticism relies on speculation, which implies a certain level of ignorance. The reader does not need to know where her deformative experiments will lead, nor does she need to understand the underlying computational processes that facilitate her experimentation. Instead, her focus ought to be on the present unfolding of the text (what Pleasure calls “recitation”) in the current unleashing of its quantum poetics. Thinking back to the telescope metaphor—as long as a telescope allows for experimentation, for switching in and out of focus, one need not break it open to see how it works. The reader’s ignorance here, if addressed thoughtfully, can be conducive to learning: free of the technical details about the text, she can focus on how the process of reading unleashes new meaning. As I begin my review of the following digital projects and tools, therefore, I will examine how the reader’s experience with the interface hinges on the balance between ignorance and discovery. My attention to the text’s current deformance, rather than an “ideal” or comprehensive version of the text, will suggest further ways that editions might fulfill their objectives or enhance the formal unities that are already evoked. What do these projects reveal about the “n-dimensional” qualities of their textual objects? How do their formal unities productively play on ignorance to lead readers to imagine what they don’t know? And how might the editors and developers enhance this revelatory process? My examination finds that the pedagogical benefits of digital resources stem from an interface that both makes explicit the text’s formal elements and encourages the reader to interact and experiment with these elements. In some cases, such benefits rely on the extent to which the projects engage not only with the source text, but with their own technical modeling.

Women Writers Online

The first project is Women Writers Online (WWO), a digital archive by the “Women Writers Project,” which collects unfamiliar texts by underrepresented women writers between 1400 and 1850 (Women Writers Project 2016). Active for nearly 30 years, the project is maintained by Julia Flanders, Director of the Digital Scholarship Group at Northeastern University, and its senior programmer, Syd Bauman. Writing in 2002, Flanders remarks that “Even now, most people—certainly most students—could name, if asked, only a handful of women writers active before 1830” (Flanders 2002, 50). WWO aims to correct this deficiency by offering hundreds of transcribed and encoded primary text documents, which users can search and read directly on their screens. WWO’s most impressive feature is its browsing interface, which facilitates corpus navigation across various “panes,” from the large textbase to the individual object (Figure 1). The “search pane” contains a search box and filters that allow the user to narrow results according to genre and date. The “results pane,” in the middle of the page, contains a list and timeline of the search results, which appear as full texts on the “text pane,” on the right side of the page. In moving from the left to the right side of the screen, the user progresses from keyword to specific text. This browsing interface allows her to sift through a large corpus in order to find something very specific.

 

Search results for "virginity" on the WWO interface
Figure 1. Search results for “virginity” on the WWO interface. Source: wwo.wwp.northeastern.edu.

Considering this search capability, the project’s most effective feature is its facilitation of discovery. Here, the handling of the search results, which eases the reader’s “zooming in” across results and chronology to the individual text, enacts the critical shift in perspective described by Rockwell and Ramsay’s telescope metaphor. Browsing through these search results engages the reader in online research that is smoother and more controlled than most academic or archival databases. To demonstrate the scholarly potential of the search functionality, a number of supplementary resources operate alongside the main website. Flanders asserts that

The pedagogical aims of the Brown University Women Writers Project (WWP) go back to its very origins; at its core, the WWP aims to improve teaching. In a deeper sense, however, this attempt arises from a relocation of the ground of teaching. That is, the project attempts to make student work more like that of professionals in the field; it attempts, in short, to make learning more like research. (2002, 49)

To “make student work more like that of professionals,” the WWP collects and publishes projects based on student research on WWO. It presents various “Exhibits” of work that supplement and contextualize the topics and texts in the database. These projects highlight fledgling scholars’ engagement with the results of their searches, using the texts as a starting point for further research. Another resource, the “Lab,” is an “experimental area” for the WWP developers, where they can explore the encoded XML data in the form of visualizations. The resulting “prototypes” consist of graphs, diagrams, and maps of the texts, accompanied by a description of the original source, computational processes, and suggestions for further research. In one example that compares the ratio of male to female speakers in two seventeenth-century plays, the author explains that “these at-a-glance comparisons can serve as the starting point … perhaps prompting questions about the different motives, audiences, and dramatic conventions shaping the two works” (“Visualizing Speakers in Drama by Gender” n.d.). The editors explain that some of these prototypes may eventually be incorporated into the functionality of the WWO website (“WWO Lab” n.d.).

Despite these resources, the WWO interface misses an opportunity to fully engage its potential for discovery by obscuring its own encoding. The editorial statement emphasizes the source texts as static, informational objects: “We treat the text as a document more than as a work of literature … As a result, we do not emend the text or create critical or synthetic editions; each encoded text is a transcription of a particular physical object” (“Methodology for Transcription and Editing” n.d.). The editors further assert that the interface presents the various texts on an even level, resisting traditional organizational schemes. As Flanders and Jacqueline Wernimont point out, one of the major interventions of the project is the way it dissolves generic categories: “The WWO search tools bring into a single view texts that conventionally fall into political, literary, dramatic, and imperial history genres … generat[ing] a different, more intimate experience of boundaries and their constructedness” (Wernimont and Flanders 2010, 428–9). While this documentary and democratic approach to presenting texts facilitates the navigation from corpus to object, the interface might account for the relationships within the corpus in ways that further the project’s goal of resisting traditional categorization for its individual texts. Instead of presenting the archive according to a tree structure, which progresses from keyword to results to individual texts, the project might engage contextual information within the individual texts, reflecting a more complex network of information about them. To explore these contexts, the project ought to reveal some of the information encoded by the editors onto the text. Encoded information includes anything from descriptions of authors to cultural references to key features of the text. For those who are unfamiliar with it, TEI encoding (the method used by the WWP) is a standard method for marking up texts in large-scale digitization projects in the humanities. By making explicit the underlying models that structure the individual texts, the interface would encourage the reader to sustain a deep reading within the primary document itself. Doing so would also allow the project to take full advantage of the extensive editorial work already begun by the WWP, who go through the long and arduous process of encoding a text for digitization.

In particular, revealing the TEI would enhance the activity of close-reading within the individual texts. Considering the project’s aim to recover women’s writing, it ought to attend to how the intellectual and technical labor that comes with creating such a resource determines a reader’s engagement with that resource. Because the encoding work for WWO is inaccessible behind a clean interface, readers cannot know many of the key features of the source text. Drawing attention to the encoding (which remains completely obscure to the casual user) would reveal how modeling these texts for digital formats implicitly affects meaning-making. For example, using TEI, editors “tag” the structural, renditional, and conceptual elements of text, including elements such as paragraph breaks, emendations, and personal or cultural references. In an editorial statement on the WWP website, the editors explain that “A single encoded transcription can be used to produce many different forms of output, and in a sense many different ‘editions.’ The current presentation of these texts represents one of these possible editions” (“Display Conventions for the Women Writers Online Interface” n.d.). The editors assert that this presentation prioritizes “readability,” and as such, suppresses many elements in the visual output which are originally encoded into the text:

In general, we have tried to present the information needed to grasp the visual language of the text—font shifts, alignment, and so forth—in a way which will also provide flexibility for an effective and consistent display on the web. The display you see in WWO represents some aspects of the source text fairly closely (such as capitalization and italics) but regularizes or suppresses others. (“Display Conventions for the Women Writers Online Interface” n.d.)

Besides “printed page numbers and most forme work,” the exact nature of the suppressed elements remains obscure to the reader (“How to Use the WWO Interface” n.d.). The editors do not fully account for them in their editorial statements, nor do they make publicly available any of the XML files that describe the encoding. These files likely contain various elements that would bear on interpretation and thus would be of use for close-reading activities. In fact, in a 2013 article for this journal, Kate Singer demonstrates how TEI can be used to teach close-reading to undergraduates. Her students’ painstaking work in tagging elements such as metaphor “reframe[s] reading as slow, iterative, and filled with formal choices” (Singer 2013). In modeling textual elements, encoding makes explicit the ways that digitizing and editing a text instills a specific reading or interpretation of that text. Speaking of TEI projects more generally, Singer points out that, “Because XML encoding often hides itself from view, TEI editions can give students a type of double-blind reading, where they can see a supposedly transparent document and then examine the editing and marking underneath” (Singer 2013). By revealing some of this underlying encoding work, WWO would more effectively facilitate an engagement with what McGann calls a text’s “quantum poetics.” The search function of the interface would then unravel new avenues for discovery within the individual texts in the corpus.
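
To suggest what exposing that encoding might look like in a classroom exercise, the following Python sketch parses a tiny TEI-flavored fragment and lists its tagged elements. The fragment is invented for illustration and does not come from WWO’s files, whose encoding remains unpublished.

```python
import xml.etree.ElementTree as ET

# A tiny, invented TEI-style fragment; WWO's actual encoded files are not public.
FRAGMENT = """<p>
  <persName>Margaret Cavendish</persName> writes of
  <hi rend="italic">atomes</hi> that dance in
  <placeName>London</placeName> air.
</p>"""

root = ET.fromstring(FRAGMENT)

# Pull out a few kinds of tagged elements that might anchor a close reading.
for tag in ("persName", "placeName", "hi"):
    for element in root.iter(tag):
        rend = element.get("rend")
        label = f"{tag} ({rend})" if rend else tag
        print(f"{label}: {element.text}")
```

Even this toy markup makes Singer’s point tangible: deciding which words receive a persName or hi tag is itself an interpretive act.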

Voyant Tools

Diverging significantly from archival forms, Voyant Tools is a web-based application that facilitates text analysis in real time, in the form of instant, dynamic visualizations of textual data (Rockwell and Sinclair 2016b). Developed by Geoffrey Rockwell and Stéfan Sinclair, both humanities professors, this website offers a powerful, open-source tool that processes text into a variety of visualizations on word frequencies, contexts, and networks. In keeping with the free and open principles of software development, Voyant Tools synthesizes software from existing open-source libraries, and the final product has affinities with older text exploration and analysis projects developed by Rockwell and Sinclair, Hyperpo and TAPoRware, respectively. Voyant also offers extensive documentation, including a statement of design principles, tutorials for major features, individual descriptions of each tool, and directions for how to export and reference work. The major principles included in the design statement are “scalability,” to facilitate large corpus sizes and processing speed, “ubiquity,” for quick and convenient integration, and “referenceability,” to encourage attribution and incorporation of the tool in scholarly work (Rockwell and Sinclair 2016a). In keeping with open-source principles, the project is “extensible,” allowing for the addition of new tools as well as the adaptation of existing ones (Rockwell and Sinclair 2016a). Overall, Voyant shows a concern not only for functionality and ease of use, but also for placing the tool within a larger critical conversation and developmental trajectory in textual analytical methods.

The first page of visualizations on Voyant Tools
Figure 2. The first page of visualizations on Voyant Tools. Source: www.voyant-tools.org.

However, the tool’s primary interface on the homepage obscures this documentation in order to encourage immediate experimentation. The link to the documentation is small and understated, tucked beneath the blank text box that takes up the center of the page, along with a bright blue button that reads “Reveal.” The design of the interface thus reinforces McGann, Rockwell, and Ramsay’s insistence on praxis as a hermeneutic. This prioritization of the text box prompts exploration, where users are invited to jump in without fully knowing how the tool functions. Additionally, a banner across the top of the page reads “see through your text,” heralding the mysterious results of computational text-analysis. In leveraging the user’s ignorance about how the tool works (through complex JavaScript code), the interface draws her attention toward the possibility of seeing text in a new way. The Voyant developers explain that the tool ideally functions within a larger hermeneutical process: “We feel strongly that text analysis tools can represent a significant contributor to digital research, whether they were used to help confirm hunches or to lead the researcher into completely unanticipated realms” (Rockwell and Sinclair 2016a). Through an interface that catapults the user onto a page of near-instant visualizations of the text, the intricate workings of the tool fade beneath its dazzling effects.

Voyant therefore resembles, in the words of Rockwell and Ramsay, a “‘telescope for the mind’ that presents texts in a new light” (Rockwell and Ramsay 2012). In doing so, however, the tool also subjects itself to accusations of what Dennis Tenen calls “blind instrumentalism”; Tenen asserts that “tool[s] can only serve as a vehicle for methodology. Insight resides in the logic within” (Tenen 2016). To clarify what he means, Tenen offers his own telescope metaphor, which resurrects the familiar problem of opacity as a barrier to critical thinking. Tenen supposes that a group of astronomers uses a telescope without fully understanding how it works. Due to their ignorance, they fail to notice that it is broken and that it can only reveal faulty images of the heavens, which they take as fact. He concludes: “To avoid receiving wondrous pictures from broken telescopes … we must learn to disassemble our instruments and to gain access to their innermost meaning-making apparatus” (Tenen 2016). According to Tenen, the user must understand the workings of the tool in order to learn from it. Those who do not understand how a tool functions remain disempowered, reduced to the motions of the tool. They resemble something like Hussey’s readers of digital texts—critically limited by the vacuous activities of clicking and swiping.

However, in the case of Voyant, the attention to technical inner workings actually precludes the subtle and embodied interaction with the tool. Tenen’s warning about “blind instrumentalism” might be answered with McGann’s point about the critical value of ignorance, which can actually propel the user toward more complex and insightful meditations. We may thus revise Tenen’s telescope metaphor: though a broken telescope might mislead the viewer as to the location of the stars, the process of using it could reveal something about the workings of light. In this sense, the significance of the telescope is not what it does to the viewer’s perspective, but how it engages the user in a process of discovery (Rockwell and Ramsay 2012). One of the most compelling benefits of Voyant is how it defamiliarizes habits of reading, particularly close-reading. Compared to the WWO interface, Voyant more directly facilitates reading as a meaning-making activity that relies on formal manipulations of text. As soon as the user uploads a text to the site, she can interact with the visualizations by moving, adding, or deleting elements as she pleases. Reading then becomes a modeling activity, which engages the user in a sustained act of formal experimentation—the digital equivalent of Pleasure and Instruction’s “reading out loud” as a pedagogical strategy. Here, Voyant assists close-reading by drawing attention to the formal elements of the text as a foundation for critical analysis.

Conclusions

Those who really want to learn about the inner workings of the tool can always refer to the extensive documentation. But for those who want to experiment right away, the obscurity of computational methods opens a space for critical interventions, in which experimentation becomes something akin to criticism. My examination of Women Writers Online and Voyant Tools suggests that the most effective tools can make productive use of the user’s unfamiliarity with technology, as long as these tools thoughtfully deploy their underlying technical processes to engage the formal qualities of the text at hand. For digital projects in English studies, there is a fine line between obscuring and harnessing the technical construction of these resources, which relies on the extent to which these projects use their interfaces to address textual form. In the case of WWO, offering access to the encoding models would enhance the already robust interface to engage the implicit formal qualities of the digitized text. Voyant, by contrast, builds its critical interventions directly into the deployment of a highly sensitive and easy-to-use interface. The differences between the two projects present a space for teachers to consider the effects of inhibitive and productive interfaces in the English classroom.

In particular, teachers need to be clear about the formal aspects of literature that they want to teach. In assessing whether the interface stifles or engages these aspects, they might look to the ways that it evokes technical modeling or active experimentation as methods for close-reading text. In some cases, they might approach technological opacity as an opportunity for learning about textual form. Accordingly, they ought to consider questions that might not immediately occur to an English instructor: What level of comfort or knowledge with technology is necessary for their students? Do students need to see (or gain some exposure to) the encoding/coding that underlies the reading surface? How might this exposure change traditional close-reading pedagogy? I offer two suggestions, seemingly contradictory. First, by seeing the code—having access to both the linguistic form of the text and its theoretical and technical underpinnings—the students gain purchase over the structures that determine meaning-making. This method relies on modeling. Second, by not seeing the code, students harness their own ignorance as a condition for learning, an ignorance that propels them toward the new and unforeseen. This method relies on experimentation, and, like modeling, it hinges on close attention to textual detail. After all, as Pleasure and Instruction reminds us, human beings can only consciously process one thing at a time. Reading poetry out loud works well as a pedagogical strategy because it forces the student to focus her attention on her present, unfolding path through the text. Why not engage digital interfaces to do the same?

Bibliography

“Display Conventions for the Women Writers Online Interface.” Women Writers Project, Women Writers Project, Northeastern University. Accessed Nov. 15 2018. https://www.wwp.northeastern.edu/wwo/help/textual_note.html.

Flanders, Julia. 2002. “Learning, Reading, and the Problem of Scale: Using Women Writers Online.” Pedagogy 2, no. 1: 49–59. Print.

Horrigan, John B. 2016. “Digital Readiness Gaps.” Pew Research Center. Accessed Nov. 15 2018. http://www.pewinternet.org/2016/09/20/digital-readiness-gaps/.

“How to Use the WWO Interface.” Women Writers Project, Women Writers Project, Northeastern University. Accessed Nov. 15 2018. https://www.wwp.northeastern.edu/wwo/help/wwo_interface.html.

Hussey, Mark. 2016. “Digital Woolf.” In A Companion to Virginia Woolf, edited by Jessica Berman, 263–275. Chichester, UK: John Wiley & Sons, Ltd. Print.

McGann, Jerome J. 2001. Radiant Textuality: Literature After the World Wide Web. New York: Palgrave. Print.

“Methodology for Transcription and Editing.” Women Writers Project, Women Writers Project, Northeastern University. Accessed Nov. 15 2018. wwp.northeastern.edu/about/methods/editorial_principles.html.

Rockwell, Geoffrey, and Stephen Ramsay. 2012. “Developing Things: Notes toward an Epistemology of Building in the Digital Humanities.” In Debates in the Digital Humanities, edited by Matthew Gold. Minneapolis: University of Minnesota Press. Accessed Nov. 15 2018. dhdebates.gc.cuny.edu/debates/part/3.

Sinclair, Stéfan, and Geoffrey Rockwell. 2016a. “About.” Accessed Nov. 15 2018. https://voyant-tools.org/docs/#!/guide/about.

Sinclair, Stéfan, and Geoffrey Rockwell. 2016b. Voyant Tools. voyant-tools.org.

Singer, Kate. 2013. “Digital Close Reading: TEI for Teaching Poetic Vocabularies.” The Journal of Interactive Technology and Pedagogy 3. Accessed Nov. 15 2018. https://jitp.commons.gc.cuny.edu/digital-close-reading-tei-for-teaching-poetic-vocabularies/.

Tenen, Dennis. 2016. “Blind Instrumentalism: On Tools and Methods.” In Debates in the Digital Humanities, edited by Matthew Gold. Minneapolis: University of Minnesota Press. Accessed Nov. 15 2018. dhdebates.gc.cuny.edu/debates/part/10.

“Visualizing Speakers in Drama by Gender.” Women Writers Project, Women Writers Project, Northeastern University. Accessed Nov. 15 2018. http://www.wwp.northeastern.edu/wwo/lab/speakers.html.

Wernimont, Jacqueline, and Julia Flanders. 2010. “Feminism in the Age of Digital Archives: The Women Writers Project.” Tulsa Studies in Women’s Literature 29, no. 2: 425–35. Print.

Women Writers Project. Women Writers Project, Northeastern University, 1999–2016. Accessed Nov. 15 2018. https://www.wwp.northeastern.edu.

“WWO Lab.” Women Writers Project, Women Writers Project, Northeastern University. Accessed Nov. 15 2018. http://www.wwp.northeastern.edu/wwo/lab.

About the Author

Filipa Calado is a doctoral student in English at the Graduate Center, CUNY. Her research examines queer modernist literature and theories of cognition and affect through a digital lens. As an English instructor at Hunter College, she incorporates social reading practices, particularly digital annotation, to engage affect in close reading.


Screenshot of University of Mary Washington Libraries Digital Collections homepage.

What Do You Do with 11,000 Blogs? Preserving, Archiving, and Maintaining UMW Blogs—A Case Study

Abstract

What do you do with 11,000 blogs on a platform that is over a decade old? That is the question that the Division of Teaching and Learning Technologies (DTLT) and the UMW Libraries are trying to answer. This essay outlines the challenges of maintaining a large WordPress multisite installation and offers potential solutions for preserving institutional digital history. Using a combination of data mining, personal outreach, and available web archiving tools, we show the importance of a systematic, collaborative approach to the challenges we didn’t expect to face in 2007 when UMW Blogs launched. Complicating matters is the increased awareness of digital privacy and the importance of maintaining ownership and control over one’s data online; the collaborative nature of a multisite and the life cycle of a student or even faculty member within an institution blur the lines of who owns or controls the data found on one of these sites. The answers may seem obvious, but as each test case emerges, the situation becomes more and more complex. As an increasing number of institutions are dealing with legacy digital platforms that house intellectual property and scholarship, we believe that this essay outlines one potential path forward for long-term sustainability and preservation.

As a leader in what is called the Digital Liberal Arts, we at the University of Mary Washington are facing the unique challenge of archiving our early digital output, namely, UMW Blogs. Started in 2007, UMW Blogs contains 11 years of digital history, learning, and archives. Although we are best known today as the birthplace of Domain of One’s Own, UMW Blogs was a test case for showing the viability of such a widely available online platform for faculty, staff, and students.

After three years in which Division of Teaching and Learning Technologies (DTLT) staff and a few UMW faculty experimented with blogs in and out of the classroom (Campbell 2009, 20), UMW Blogs launched in 2007. It provided the campus with a WordPress installation that allowed any student, faculty, or staff member to get their own subdomain (e.g. mygreatblog.umwblogs.org) and WordPress site, administered by DTLT. Since then, the 600 blogs of 2007 have grown to over 11,000 blogs and 13,000 users as of 2018! Each site has any number of themes, plugins, and widgets installed and running, creating a database that is exponentially larger and more cumbersome than the user numbers suggest at first glance.

The viability and popularity of a digital platform available to the UMW community convinced the administration that we should be providing faculty, students, and staff not only with a space on the web, but with their own web address, hosting capabilities, and “back-end” access to build on the web beyond a WordPress multisite installation. Domain of One’s Own was born, where anyone with a UMW NetID could claim their own domain name and server space on the web, and where they could install not just WordPress, but also platforms like Omeka, DokuWiki, or even just a hand-coded HTML website.

As a result, we now have two “competing” platforms—one legacy, one current—to administer and maintain.

Maintaining UMW Blogs today can be quite a challenge, and as its administrators we frequently alternate between idyllic bliss and mass panic. It’s not very heavily used (most users have moved to Domain of One’s Own instead), but when something does go wrong, it goes really wrong, bringing down every site on the system. And with a number of sites that haven’t been updated since the twenty-aughts, there are many that are poised to cause such problems: too many sites using too many outdated themes and plugins, leaving too many security vulnerabilities, and impacting the overall performance of the platform.

And while there was an initial expectation that sites would be left up on UMW Blogs forever, the web has changed, and our understanding of digital privacy and data ownership has evolved with it. We have an open, online platform featuring work by former faculty and students that is over a decade old, much of which the original creators can no longer access, let alone delete, even though it may be content they no longer want on the web. How do we balance preservation and privacy?

Of course, we can’t just pull the plug—well, okay, we could, but for many faculty, this would be unacceptable. Some of our faculty and students are still using UMW Blogs, and many of the sites no longer being maintained are important to our institution and its history—whether it’s an innovative (for its time) course website, an example of awesome student collaboration, or an important piece of institutional history. Former students, as well, may still be using content they have created on UMW Blogs in their job search. We want to ensure the UMW Blogs system works and that those important pieces of our institutional history and students’ intellectual property don’t become digital flotsam.

With that in mind, DTLT, in collaboration with UMW Libraries, has embarked on a major project to ensure the stability of our legacy system and the long-term preservation of UMW’s digital history. We are going to chronicle some of those efforts, both for the benefit of the UMW community and for those at other institutions who find themselves in a similar situation, or soon will.

Outline of the Problem

UMW Blogs contains some stellar content. A group of students (some of whom are now UMW staff) catalogued historical markers and other landmarks throughout the Fredericksburg/Spotsylvania area, mostly from the Civil War, providing important historical context. A student wrote love letters to his girlfriend at another university regularly for several months, leaving her coded messages and invitations to dinner dates (“don’t forget the coupon!”). Two colleges on campus hosted their Faculty Senate sites there. Student government leaders (and campaigns) hosted sites on UMW Blogs. And there are historical sites from many student clubs, activists, and research groups. And who can forget Ermahgerd Sperts, or possibly the most creatively unimaginative username: umwblogs.umwblogs.org.

While most faculty, students, and staff have migrated to Domain of One’s Own (DoOO), there are always those who remain on the platform they are most familiar with. At a public liberal arts, teaching-intensive institution like ours, many upper-division courses are taught only on a three-year rotation, meaning that course sites built in UMW Blogs remain inactive for two or three years until the course itself is once again offered. While the course sites could be (and often eventually are) migrated into DoOO, the way that faculty and students then interact with those sites inevitably shifts, causing some degree of anxiety for faculty members, who thus delay the migration process.

In other words, in the faculty’s mind, if it isn’t broke, don’t fix it. Except, of course, it does break. Often. Leaving their course sites down.

In addition to valuable contributions to UMW history, scholarship, and archives, UMW Blogs also contains about 700 sites that were last updated on the same date they were created. (“Hello, World!”… and nothing since.) A number of sites have “broken” since they were last maintained, mostly as a result of using themes and plugins that have not been updated by their developers to retain compatibility with upgrades to the WordPress core platform. And then there are sites that, while valuable to some at the time, have been neither updated nor visited in a long time. This leaves broken and vulnerable sites, compromising those who are currently using the platform.

One of the challenges we are facing in the process of archiving the sites is the ethos under which the project was created, of openness and experimentation. The original Terms of Service for UMW Blogs reads:

UMW Blogs is an intellectual and creative environment, owned and maintained by the University of Mary Washington’s Division of Teaching and Learning Technologies. Users of the system are expected to abide by all relevant copyright and intellectual property laws as well as by the University’s Network and Computer Use Policy.

Users are encouraged to use UMW Blogs to explore the boundaries of Web publication in support of teaching and learning at the University, with the understanding that UMW may decide to remove at any time content that is found to be in violation of community standards, University policy, or applicable federal or state laws.

As participants in a public Web space, users must also understand that the work they publish on UMW Blogs generally may be browsed or viewed by anyone on the Web. Some features are available to users who wish to protect content or their own identity. Information about protecting content and/or your identity within the system can be found at the following address:[1]

While the TOS capture the ethos and spirit of UMW Blogs and prompt users to think about privacy, they don’t prompt users to address their own IP and copyright. This oversight is partially a reflection of the approach to the Web as open. Nevertheless, it leaves us, now, wondering what we can actually do with student work, former faculty and staff work, group blogs, and long-term collaborative projects between faculty, staff, and students.

The intention was always that copyright would remain with the creator of the content (which was made explicit in the Domain of One’s Own Terms of Service). But as we archive sites, we have encountered a number of issues regarding whose permission we need to move these sites into the (public) archive, to which the original creators will no longer have access. This is particularly difficult for collaboratively created sites, where contributors to the site are not owners of the site.

There’s another related issue that has been weighing on our minds. Past members of DTLT (none of whom are still administering the platform) told users that their UMW Blogs sites would be hosted in perpetuity, but that presents a major data ownership and privacy issue. The internet is a different place than it was in 2007. According to Paul Mason, the entire internet in 2007 was smaller than Facebook is today (Mason 2015, 7)! And that’s to say nothing of the changing ways in which we view our personal data, even our public creative work, since GamerGate, Ferguson, and Cambridge Analytica. And as the birthplace of Domain of One’s Own, UMW (and DTLT in particular) has focused increasingly over the past decade on the ownership aspect of writing and working on the web—empowering students to make critical decisions about what they put on the web, what they don’t put on the web, and what they delete from the web.

We’ve also received a number of requests from alumni asking us to remove their blog from UMW Blogs, to remove a specific post they created on a faculty course site, or even to remove specific comments they left on a classmate’s blog as part of an assignment. We are well aware of the vulnerabilities that working in public can create, as well as the ways in which we as people change and grow, leaving behind aspects of the (digital) identity that we once shared with the world.

And so, beyond the need to streamline the platform, we think it’s important that we take the initiative to remove old content from our public platform, and to pass it along to former students and faculty so they can decide what should be public and where it should be hosted.

After everything is archived locally and before anything is deleted from the platform, DTLT will be reaching out to those former students, faculty, and staff, letting them know our plans, and providing them the opportunity (and documentation) to export their data and preserve it publicly or privately, in a place of their choosing. This not only helps those currently on the platform have a better experience, but it helps our former community members once again reflect critically on their public digital identity and take a bit more ownership over their data and what’s done with it.

As proponents of “digital minimalism,” we often tell our students and colleagues that what we delete is as important a part of curating our digital identity as what we publish. We want to encourage students (and faculty and staff) to think about how large a digital footprint they are leaving, and help devise strategies everyone can use to minimize traces of themselves online. And our freedom to delete increases our freedom to experiment. As the attention economy and algorithmically driven content discovery have radically changed the internet since the early days of UMW Blogs, it’s worth rethinking both what we as an institution hold onto, and what we as individuals decide to keep in public venues.

Another challenge was that, at the start of this project, we at UMW did not have any policies governing data storage, collection, and deletion. Alumni could keep their email addresses, the only time we ever deleted a course in the LMS was when we moved from one system to another, and we did not have an enterprise cloud-based shared digital storage solution. We were starting from scratch.

The Process, DTLT

We identified over 5000 blogs on the platform that were last updated in 2015 or earlier, are not administered by any current UMW community members, and have either not been visited at all in the last two years or have been visited fewer than 100 times in the entire period for which we have analytics. That means essentially half the platform is inactive and no longer providing benefit to users, but is also open to vulnerabilities or “bit rot,” which can cause problems for the active sites.

However, some of the inactive sites we identified are also important pieces of institutional history. After analyzing the metadata for all 11,333 sites in the UMW Blogs database, we identified a list of over 5000 blogs that meet all of the following criteria (a sketch of the filtering logic follows the list):

  • The blog has not been updated since Jan 1, 2016.
  • None of the blog administrators are current members of the UMW community.
  • The site has either not been visited at all in the last two years, or has logged fewer than 100 visits all-time.
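
To make these criteria concrete, the filtering pass can be expressed as a short script along the lines of the sketch below. It assumes a CSV export of the multisite metadata joined with our analytics figures; the file and column names (umw_blogs_metadata.csv, admin_emails, visits_last_2yrs, and so on) are hypothetical stand-ins, and this is a sketch of the logic rather than the script we actually ran.

```python
# A minimal sketch of the three-part filter described above, not the
# script we actually ran. File and column names are hypothetical.
import csv
from datetime import datetime

CUTOFF = datetime(2016, 1, 1)  # "has not been updated since Jan 1, 2016"

# Hypothetical roster of email addresses for current UMW community members.
with open("current_community_emails.txt") as f:
    current_community = {line.strip().lower() for line in f if line.strip()}

def meets_all_criteria(row):
    stale = datetime.strptime(row["last_updated"], "%Y-%m-%d") < CUTOFF
    admins = [e.strip().lower() for e in row["admin_emails"].split(";")]
    no_current_admin = not any(e in current_community for e in admins)
    low_traffic = (int(row["visits_last_2yrs"]) == 0
                   or int(row["visits_all_time"]) < 100)
    return stale and no_current_admin and low_traffic

with open("umw_blogs_metadata.csv", newline="") as f:
    candidates = [row["blog_url"] for row in csv.DictReader(f)
                  if meets_all_criteria(row)]

print(f"{len(candidates)} blogs meet all three criteria")
```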

We then went through the entire list to identify sites important to our institutional history, as well as course websites that are less than five years old. (Some courses are offered every three or four years, and having relatively recent course websites live can be useful for faculty and students.) These are sites that we either think should be kept on the platform, or—more likely—that we think would be good candidates for UMW Libraries’ new Digital Archive. The latter will create a flat-file archive (a website with no databases or dynamic content, only HTML and CSS code) that will be far more future-proof and less likely to just break one day.

Now, we didn’t visit all 5000+ blogs manually! Rather, we looked carefully at the metadata—site titles, the person(s) attached to the sites as administrators, the administrator’s email address, and the dates the sites were created and last updated. This told us if the site was created by a student or faculty member, and if the site was a course website, collaborative student project, personal blog, etc. We identified almost 300 sites from this collection which we did check manually, often consulting with each other about them, before deciding on the 62 of these 5000+ sites that were important to keep public or submit to the UMW Digital Archive (more on that process below).

In the end, we determined that of the 11,333 blogs on the UMW Blogs platform, 6012 were important to keep actively published on the web (including about 50 which would best serve the UMW Community by being frozen in time and preserved publicly before “bit rot” and broken plugins bring them down). The other 5321 blogs, many of which were important in their time, are ready to be removed from the platform.

To be clear, we’re not talking about just deleting them! We are working with our hosting company, Reclaim Hosting, to create a flat-file archive and a WordPress XML export of each of those blogs, which DTLT will retain for 2 years before permanently deleting them. We are also preparing to email the administrators of those sites to let them know our plans so they can download their content before we remove anything from the platform (or, worst-case scenario, ask us to email them the backup archive after we purge the platform). But ultimately, it is important for the health of the platform to streamline the database and focus on supporting the more recent and active sites.
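
As a rough illustration of what that backup step involves, the sketch below loops over a hypothetical list of candidate site URLs, pulls a flat-file mirror of each with wget, and requests a WordPress XML export with WP-CLI. The paths and file names are invented for the example, and it assumes WP-CLI is available alongside the multisite; the production backups are handled by Reclaim Hosting rather than by a script like this.

```python
# A simplified sketch of the per-site backup step: a flat-file mirror via
# wget and a WordPress XML (WXR) export via WP-CLI. Paths and the
# candidates.txt list are hypothetical.
import subprocess
from pathlib import Path
from urllib.parse import urlparse

ARCHIVE_ROOT = Path("/backups/umwblogs")   # hypothetical destination
WP_PATH = "/var/www/umwblogs"              # hypothetical WordPress install path

with open("candidates.txt") as f:          # one site URL per line
    sites = [line.strip() for line in f if line.strip()]

for url in sites:
    slug = urlparse(url).hostname.split(".")[0]   # e.g. "mygreatblog"
    flat_dir = ARCHIVE_ROOT / slug / "flat"
    xml_dir = ARCHIVE_ROOT / slug / "xml"
    xml_dir.mkdir(parents=True, exist_ok=True)

    # 1. Flat-file mirror: static HTML/CSS/assets, no database required.
    subprocess.run(
        ["wget", "--mirror", "--convert-links", "--adjust-extension",
         "--page-requisites", "--no-parent",
         f"--directory-prefix={flat_dir}", url],
        check=True)

    # 2. WordPress XML export of this subsite's posts, pages, and comments,
    #    assuming WP-CLI is installed on the server.
    subprocess.run(
        ["wp", "export", f"--path={WP_PATH}", f"--url={url}",
         f"--dir={xml_dir}"],
        check=True)
```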

Through this process, we also identified a number of faculty and staff “power users” of UMW Blogs—those people who had more than 10 sites on UMW Blogs or had created a course site on the platform within the last two semesters. Once that handful of faculty members was identified, we reached out to them to schedule one-on-one meetings with a member of DTLT to discuss the options for their UMW Blogs sites: deletion, personal archive, library archive, or migration to a personal subdomain.
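
The “power user” pass can be sketched in the same spirit, again over a hypothetical metadata export. The is_course_site flag stands in for the judgment call we made from site titles and slugs, and the twelve-month window is only an approximation of “the last two semesters.”

```python
# A rough sketch of the "power user" pass: administrators with more than
# 10 sites, or with a course site created in roughly the last two
# semesters. The is_course_site column is a hypothetical derived flag.
import csv
from collections import defaultdict
from datetime import datetime, timedelta

RECENT = datetime.now() - timedelta(days=365)   # ~ last two semesters

site_counts = defaultdict(int)
recent_course_admins = set()

with open("umw_blogs_metadata.csv", newline="") as f:
    for row in csv.DictReader(f):
        created = datetime.strptime(row["created"], "%Y-%m-%d")
        for email in row["admin_emails"].split(";"):
            email = email.strip().lower()
            site_counts[email] += 1
            if row["is_course_site"] == "yes" and created > RECENT:
                recent_course_admins.add(email)

power_users = ({e for e, n in site_counts.items() if n > 10}
               | recent_course_admins)
print(f"{len(power_users)} power users to contact for one-on-one meetings")
```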

This was, admittedly, a fraught process for some of the faculty; these sites had become important and significant resources, examples, and case-studies of the viability and ultimate success of working openly on the web. They were sometimes years in the making, informed by countless hours of student and faculty work. To come in and say, “These sites aren’t viable in this space anymore” is intimidating.

One advantage of targeting the “power users” first is that we had interacted frequently with these faculty members on a number of other projects, and thus had already developed relationships with them, not to mention an understanding of their values, their work, and their pedagogy. We decided collectively which DTLT team member would work with each individual faculty member based on past relationships and interactions. We weren’t cold-calling these faculty; we were approaching colleagues with whom we had previously collaborated. Thus, we knew better how to discuss the issues with each individual faculty member. While time-consuming, this approach let us build on our relationships and tailor each interaction to the specific needs of the faculty member, allowing us to better explain and recommend options for their UMW Blogs sites.

Explaining that our goal is, in fact, to preserve these websites in a more sustainable format, in order to celebrate and highlight their importance and significance to faculty, is key. We also want faculty to take more control over their data and their sites, understanding better how WordPress works and how the archival process will be of benefit to them. No technology, no matter how advanced, can survive this long without a lot of help, a lot of work, and some hard decisions about how we are going to invest our time, energy, and monetary resources.

We then worked with faculty to create a list of their sites on UMW Blogs, categorized by how they wanted each one preserved. Once that list was created and finalized, we passed the information along to the relevant people, including DTLT and UMW Archives staff, to make sure that all sites ended up in working order where they were supposed to be. When moving sites to Domain of One’s Own, we often had to replace themes and plugins, so that while a site might not look the way it did when initially created, we tried to ensure it would still retain its original functionality. The static library archive preserved the original link and function of the site in a static file.

The Process, UMW Libraries

UMW Libraries has been archiving the University’s web presence for several years now, primarily with established, automated web crawls and the occasional manual crawl to capture historical context during a special event, such as a university presidential inauguration. Our focus has been on archiving institutional sites, such as the main website, social media, UMW Athletics, or UMW News. Despite this effort, we were often missing the individual stories of the campus community.

We have a fantastic scrapbook collection in the University Archives. Stories from UMW (or MWC) students across the decades. Though students are still creating and donating scrapbooks, many are recording their college experience online, through Domain of One’s Own or UMW Blogs, rather than on paper. We also have detailed records of university business, such as meeting minutes, correspondence, and publications. The vast majority of this information is online today, with blogs or other platforms used to keep notes on committee work or to provide transparency on important campus issues, such as faculty governance or strategic planning. We must be proactive in not only preserving but providing access to these records for future students and researchers.

The UMW Archives appraisal process is an important step in beginning to archive this material. We not only need to make sure that the websites and digital projects we collect fit within our collection development policies, but we must also be confident in our abilities, through both technology and staff power, to preserve and provide access to the material we agree to accept. To help us with this process, we developed a set of criteria for appraisal:

  1. Scholarship that is new and impactful in its field.
  2. Highly innovative technical and/or creative aspects.
  3. Content that complements existing archival collections and subject areas of emphasis.
  4. Content that documents the history, administration, and/or culture of the University.
  5. Unique content that supports the research and curriculum needs of faculty.
  6. Content created, owned, or used by university departments, faculty, or students in carrying out university-related business, functions, or activities.
  7. Compatibility with SCUA’s preservation software.
  8. A faculty member’s statement of support for student-created websites.

This set of criteria will help us work through lists of current websites to determine what would be best suited for the UMW Archives. It is also published on the library website so that faculty, staff, and students can read through the list and determine if their website will be a good fit for the library’s collections. However, even if a UMW community member is unsure of where their website belongs, our hope is that the broad guidelines will encourage them to contact us and start a conversation. Even if a suggested website is not acquired by the archives, DTLT and UMW Archives staff will work with the creator to find other alternatives for migrating or archiving their content.

The lists of current websites that we are combing through and appraising do not contain the thousands of websites that DTLT started with on this project. For example, we removed from consideration sites that were created but never built out, don’t have any content, haven’t been accessed, etc. Other websites were also included because they were listed in previous university publications or suggested by a colleague. Our initial list of potential websites to archive is not all-inclusive, and it will be a continuous process as more URLs are recommended or discovered.

After websites are selected for archiving, the very important step of requesting permission follows. While the University Archives actively archives institutional websites, such as UMW Athletics or UMW Social Media, we feel strongly that we must receive permission before archiving individual blogs, websites, and other digital projects. DTLT and UMW Archives work together to reach out to the community to request permission from all creators and contributors of items that we want to archive. For those submitting archive requests, the copyright permission statement is published on the library’s website so that anyone can read and understand the terms before submission. Even if a faculty member recommends a website for archiving, the student still must provide permission before archiving takes place.

If permission is received to archive a website, the crawling can begin! UMW uses three tools for archiving websites: Preservica, Archive-It, and Webrecorder. Each web crawl is manually initiated by staff and student aides and then checked for quality control after the crawl is complete. The crawl creates a WARC file, which is uploaded to the library’s digital preservation system. A metadata record in the form of Dublin Core is created for each WARC file, which includes creator(s), contributor(s), and two to three subject headings. Library staff used “Descriptive Metadata for Web Archiving: Recommendations of the OCLC Research Library Partnership Web Archiving Metadata Working Group” to help determine metadata guidelines, in addition to local, unique needs (Dooley and Bowers 2018).
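
As a concrete illustration of what such a record contains, the sketch below writes a simple Dublin Core description for one archived site using Python’s standard library. Every value is invented for the example; in practice these records are created within the preservation system itself rather than by hand-rolled scripts.

```python
# A minimal sketch of a simple Dublin Core record for one archived site.
# All sample values are invented for illustration.
import xml.etree.ElementTree as ET

DC = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("dc", DC)

record = ET.Element("record")

def dc(field, value):
    # Append one dc:<field> element with the given text to the record.
    ET.SubElement(record, f"{{{DC}}}{field}").text = value

dc("title", "Example UMW Blogs Course Site (archived website)")
dc("creator", "Example Faculty Member")            # creator(s)
dc("contributor", "Example Student Contributor")   # contributor(s)
dc("date", "2019-05-16")                           # date of capture
dc("format", "application/warc")                   # format of the WARC file
dc("identifier", "https://example.umwblogs.org/")
# Two to three subject headings per record, per local practice.
dc("subject", "College teaching")
dc("subject", "Educational blogs")

ET.ElementTree(record).write("example-site-dc.xml",
                             encoding="utf-8", xml_declaration=True)
```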

The final component to the archiving process is making the archived websites accessible. Once a WARC file is created and metadata is applied, the archival item is published in Digital Collections, the library’s digital preservation and access platform. Users of the platform are able to locate archived websites through search functions that use both metadata and full-text. The websites render within the browser itself, so users can navigate the website as it existed at the time of capture.

Conclusion: Further Challenges, Looking Forward, and Planning Ahead

This is only the beginning of a long process of preserving and protecting our legacy platform, UMW Blogs. The platform was a launch pad for Domain of One’s Own and put UMW on the map for innovative digital learning. At the time, there was no precedent, no best practices, no road map, no rules. Now, we hope the lessons shared in this essay help schools trying to maintain their own legacy, open, digital learning platforms.

Moving forward, we will likely confront similar issues with Domain of One’s Own, particularly concerning what we should preserve in our library archives. We are developing a process for students, faculty, and staff to submit a site for preservation consideration. But given the ethos of DoOO—that the work done on users’ websites is theirs to do with as they like—we know there have already been some potentially important sites deleted, as is the prerogative of the user.

How, then, do you balance the imperative to save, preserve, and keep digital artifacts of (potential) historical significance with the need for agency, privacy, and the freedom of the student, staff, or faculty member to delete, let die, or let decay? These are the questions we are now collectively grappling with, and will continue to grapple with moving forward.

Notes

[1] As this project itself illustrates about preserving historic or significant materials that lived online, the original links to these policies are now broken and the information they contained is all but inaccessible.

Bibliography

Campbell, Gardner. 2009. “UMWeb 2.0: University of Mary Washington Webifies Its World.” University of Mary Washington Magazine, Fall/Winter 2017. https://archive.org/details/universityofmary33fwuniv.

Dooley, Jackie, and Kate Bowers. 2018. Descriptive Metadata for Web Archiving: Recommendations of the OCLC Research Library Partnership Web Archiving Metadata Working Group. Dublin, OH: OCLC Research. https://doi.org/10.25333/C3005C.

Mason, Paul. 2015. Postcapitalism: A Guide to Our Future. New York: Farrar, Straus, and Giroux.

About the Authors

Angie Kemp is the Digital Resources Librarian at the University of Mary Washington. She works in Special Collections and University Archives, focusing on maintaining and expanding the university’s digital archives. She also oversees the Digital Archiving Lab, where campus and community members go to collaborate on digital collection projects and preservation. Her research interests include ethics and privacy in digital archives, as well as the long-term sustainability of digital projects.

Lee Skallerup Bessette is a Learning Design Specialist at the Center for New Designs in Learning and Scholarship (CNDLS) at Georgetown University. Previously, she was an Instructional Technology Specialist with DTLT at UMW, working on digital literacy and Domain of One’s Own. Her research interests include the intersections of technology and pedagogy, affect, and staff labor issues. Her writing has appeared in Hybrid Pedagogy, Inside Higher Ed, ProfHacker, Women in Higher Education, and Popula. You can find her talking about everything on Twitter as @readywriting.

Kris Shaffer is a data scientist and Senior Computational Disinformation Analyst for New Knowledge. His book, Data versus Democracy: How Big Data Algorithms Shape Opinions and Alter the Course of History, will be published Spring 2019 by Apress. Kris also coauthored “The Tactics and Tropes of the Internet Research Agency,” a report prepared for the United States Senate Select Committee on Intelligence about Russian interference in the 2016 U.S. presidential election on social media. A former academic, Kris has worked as an instructional technologist at the University of Mary Washington and has taught courses in music theory and cognition, computer science, and digital studies at Yale University, the University of Colorado–Boulder, the University of Mary Washington, and Charleston Southern University. He holds a PhD from Yale University.
