David Lyon (2017) argued that we live in a surveillance culture, a way of living under continual watch “that everyday citizens comply with—willingly and wittingly, or not” (825). Lyon (2006) previously stressed that such a pervasively visible cultural existence extends beyond notions of the “surveillance state” and the “panopticon” to forms of seemingly “soft and subtle” surveillance that produce “docile bodies” (4). Drawing upon the work of Gary Marx (2003; 2015), Lyon (2017) argued that such “soft surveillance” is seemingly less invasive and may involve individuals willingly surrendering data, perhaps through “public displays of vulnerability” (832) that are common online via cookies, internet service providers (ISPs), and social media sites. Contemporary surveillance culture is therefore less out there and more everywhere, less spy guys and Big Brother and much more participatory and data-driven.
In higher education, scholars like Hyslop-Margison and Rochester (2016) and Collier and Ross (2020) have argued that surveillance has always existed through “data collection, assessment, and evaluation, shaping the intellectual work, and tracking the bodies and activities of students and teachers” (Collier and Ross 2020, 276). However, the COVID-19 pandemic has accelerated and intensified the ways that academic activity is surveilled via proprietary learning management systems and audio/video conferencing software that track clicks and log-ins while simultaneously hoarding student/user data (Atteneder and Collini-Nocker 2020). Responding to and potentially resisting such prevalent surveillance, no matter how soft, therefore requires “a careful, critical, and cultural analysis of surveillance situations” (Lyon 2017, 836). However, as Gilliard’s (2019) “Privacy’s not an abstraction” stressed, “precisely because ideas about privacy have been undermined by tech platforms like Facebook and Google, it is sometimes difficult to have these discussions with students” (para. 16). We will argue that social media news feeds are just the kind of surveillance situations that need critical attention in writing classrooms, in service of students’ critical digital literacies.
Critical Digital Literacies in the Age of Algorithmic Surveillance
Along with many other scholars writing about technology and classroom practice before us (Selber 2003; Selfe 1999; Takayoshi and Huot 2003; Vie 2008), we suggest that critical is a keyword for theory as well as for application in our networked, digital age, and one that does not emerge automatically from incorporating the latest digital technologies in classrooms. In fact, by incorporating technologies into our classrooms, we are often contributing to surveillance culture, as Collier and Ross (2020) note. A critical orientation, we argue, can help.
In “Critical Digital Pedagogy: a Definition,” Jesse Stommel (2014) defined critical pedagogy “as an approach to teaching and learning predicated on fostering agency and empowering learners (implicitly and explicitly critiquing oppressive power structures)” (para. 4). Critical digital pedagogy, he argued, stems from this foundation, but localizes the impact of instructor and student attention to the “nature and effects” of digital spaces and tools (Stommel 2014, para. 14). In adapting the aims of critical pedagogy to the digital, what emerges is a clear distinction between doing the digital in instrumental fashion (e.g., to develop X skill) and doing the digital critically (e.g., to transform one’s being through X). A critical digital literacies approach to surveillance might suggest:
a willingness to speculate that some of the surveillance roles we have come to accept could be otherwise, along with an acknowledgment that we are implicated in what Lyon terms ‘surveillance culture’ (2017) in education. What can we do with that knowledge, and what culture shifts can we collectively provoke? (Collier and Ross 2020, 276)
As Selber (2004) and Noble (2018) have argued, digital technologies and platforms are made by humans who have their own biases and intentions, and those same biases and intentions may become part of the architecture of the technology itself—whether intended or not, and whether visible or not. Other scholars, like Haas (1996) and O’Hara et al. (2002), therefore cautioned against perpetuating what is often called “The Technology Myth” by calling on teacher-scholars to look critically “at the technology itself” instead of through it (Haas 1996, xi). Without a critical perspective, students and instructors may fail to question the politics, ideologies, and rhetorical effects of their digital tools, spaces, and skills—what Selber (2004) defined as critical literacy in a digital age. We argue that there may be no better space to engage students in critical digital practice than the online spaces they visit daily, often multiple times per hour: social media news feeds.
Social Media News Feeds as a Space for Critical Digital Practice
In a report for Pew Research Center titled “Social Media Outpaces Print Newspapers in the U.S. as a News Source,” Elisa Shearer (2018) revealed that 18-to-29-year-olds are four times as likely as those aged 65 and older to go to social media for news. Social media applications, which are frequently accessed via mobile devices, are therefore incredibly popular with college-age students (Lutkewitte 2016) and should be seen for what they are: “technology gateways,” or the primary places where users practice digital literacies (Selfe and Hawisher 2004, 84). However, as Vie (2008) argued, even frequent users may still need to further develop “critical technological literacy skills” (10) since “comfort with technology does not imply … they can understand and critique technology’s societal effects” (12). In order to open up awareness and areas of resistance, we suggest students should be introduced to, and offered opportunities to interrogate, the ways in which their self-selected, or curricularly mandated, technologies surveil them. Here, we aim to focus their attention on the ways they are softly surveilled via algorithms operating behind the scenes of their social media platforms. Specifically, Gilliard (2019) cautioned that “the logic of digital platforms … treats people’s data as raw material to be extracted” and put to use for a variety of purposes—malicious, benign, and in-between. Moreover, Beck (2017) argued that it has become normative for social media applications, and the companies that control them, to employ algorithmic surveillance to track all user data and personalize experiences based on that data. Indeed, these seemingly invisible mechanisms further “soften” attitudes toward surveillance, which may result in users sharing personal details so publicly on social media (Marx 2015; Lyon 2017).
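To make this tracking mechanism concrete, consider a minimal sketch, in Python, of the kind of interaction logging described above. This is purely illustrative: the names (`InteractionLog`, `record`) and the weighting scheme are our own inventions for teaching purposes, not any platform’s actual code.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class InteractionLog:
    """Toy model of 'soft surveillance': every click, like, and watch
    is recorded and folded into a profile of inferred interests."""
    events: list = field(default_factory=list)
    interests: Counter = field(default_factory=Counter)

    def record(self, action: str, topic: str, weight: float = 1.0) -> None:
        # Each interaction is stored verbatim *and* aggregated into a
        # profile; the user typically sees neither the log nor the profile.
        self.events.append((action, topic, weight))
        self.interests[topic] += weight

log = InteractionLog()
log.record("like", "sneakers")
log.record("watch", "sneakers", weight=2.0)          # longer engagement weighs more
log.record("scroll_past", "local_news", weight=0.1)
print(log.interests.most_common())  # [('sneakers', 3.0), ('local_news', 0.1)]
```

Even this toy version suggests why such surveillance feels “soft”: no single recorded event seems invasive, yet the aggregated profile is revealing.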
One consequence of algorithmic surveillance on social media is what Pariser (2012) termed the “filter bubble.” Filter bubbles are created through algorithmic content curation, which echoes users’ pre-existing beliefs, tastes, and attitudes back to them in their own feeds, isolating them from diverse viewpoints and content (Nguyen et al. 2014, 677). For example, YouTube recommends videos we might like, Facebook feeds us advertisements for apparel that is just our style, and Google rank-orders search results—all based on our own user data. In many ways, the ideas and information we consume are “dictated and imposed on us” by algorithms that limit our access to information and constrain our agency (Frank et al. 2019, Synopsis section). After all, as Beck (2017) argued, these filter bubbles that are curated by algorithmic surveillance constitute an “invisible digital identity” for individuals (45). And as Hayles (1999) argued, our identities are hybridized and may be seen as “an amalgam, a collection of heterogeneous components, a material-informational entity whose boundaries undergo continuous construction and reconstruction” (3). This suggests that an individual’s online activity and interaction with other digital actors in online spaces, which results in an algorithmic curation of a unique filter bubble, is a material instantiation of their embodied identity(ies).
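Continuing the illustrative sketch above (again, invented names and a deliberately simplistic scoring rule, not any platform’s actual algorithm), the curation step can be imagined as ranking candidate posts by their fit with the inferred interest profile. Because highly ranked posts attract further engagement, which in turn raises those topics’ weights, the loop narrows what surfaces next—a filter bubble in miniature:

```python
def curate_feed(candidate_posts, interests, feed_size=3):
    """Rank candidate posts by how well their topic matches the
    user's inferred interest profile, keeping only the top few."""
    ranked = sorted(
        candidate_posts,
        key=lambda post: interests.get(post["topic"], 0.0),
        reverse=True,
    )
    return ranked[:feed_size]

posts = [
    {"id": 1, "topic": "sneakers"},
    {"id": 2, "topic": "local_news"},
    {"id": 3, "topic": "opposing_viewpoints"},  # weight 0.0: rarely surfaces
    {"id": 4, "topic": "sneakers"},
]

feed = curate_feed(posts, {"sneakers": 3.0, "local_news": 0.1})
print([post["id"] for post in feed])  # [1, 4, 2] -- the unengaged topic stays unseen
```

The design choice worth noticing is the feedback loop: content the user has never engaged with scores zero and is filtered out, so the user never gets the chance to engage with it.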
We therefore maintain that turning students’ attention to their own filter bubbles on social media, a space where they may have already developed an array of literacies, means they can attempt to reconcile the distinction between their digital literacies and critical digital literacies as part of reassembling their data with their body. Indeed, the difference between digital literacies and critical digital literacies is particularly problematic in social media spaces. After all, social media are themselves sites of converging roles and agencies, where users are both producer and consumer (Beck 2017) and, as Jenkins (2006) suggested, sites “where the power of the media producer and the power of the media consumer interact in unpredictable ways” (2). We therefore ask, as William Hart-Davidson did in his foreword to the 2017 edited collection Social Media/Social Writing: Publics, Presentations, and Pedagogies, “What if we took it [SM] seriously?” (xiii). What if instructors acted intentionally to shift students from being instrumental users and information consumers to thinking critically about social media? What opportunities for agency might be revealed through concerted and critical attention to how they are algorithmically surveilled and reconstituted?
As Rheingold (2012) suggested, students who know what the tools are doing and “know what to do with the tools at hand stand a better chance of resisting enclosure” (218). For us, a critical digital pedagogy that fosters critical digital literacies is the antidote to the “enclosure” Rheingold references and a way to more holistically and critically understand agency online. Noble’s (2018) term algorithmic oppression also offers insight into the deleterious effects of unchecked algorithmic curation where, particularly in the case of Google search, “technology ecosystems… are structuring narratives about Black women and girls” in ways that deepen inequality and reinforce harmful stereotypes (33). Jenkins (2006), too, noted that in networked systems “not all participants are created equal” and that corporations have more power than individual consumers (3).
How can students therefore develop the critical literacies to resist or subvert the market-driven forces that seek to disempower them and keep their algorithmic identities invisible? Beck (2017) suggested that writing classrooms are a valuable space to try to do so, as “[o]ften times writing courses provide students with the means to consider possibilities for positive change to policy, procedure, and values—all with the power to enact such change through writing” (38). In other words, working with students to trace the online footprint and activities that contribute to the curation of their filter bubbles may offer students the opportunity to look critically at their digital practices through their own digital practices. Though our interventions amidst corporate-controlled algorithmic agents will be imperfect, Hayles (1999) and Latour (2007) have nevertheless stressed that our informational lives are materially part of our identity, and that we do have opportunities for transforming our networked agency. Though “our lives, relationships, memories, fantasies, desires also flow across media channels” (Jenkins 2006, 17), creating data that gets funneled through algorithms for corporate or partisan profit, we can intervene. More importantly, perhaps, so can our students.
One place to begin is to reunite our digital fingerprints and our bodies through narrative, through storytelling. Hayles (1999) argued for “us[ing] the resources of narrative itself, particularly its resistance to various forms of abstraction and disembodiment” (22). We agree and have developed the Filter Bubble Narrative assignment sequence to put theory into practice. We use the term narrative in a capacious sense that recognizes the agency and positionality a writer has to arrange events or data, to tell a story, and the connective, reflective tissue that makes narrative a structure for meaning-making and future action. By investigating and storifying the effects of algorithmic curation and soft surveillance, we defragment our identity and construct a hybrid, a Haylesian posthuman assembled from a Latourian tracing. In short, through the Filter Bubble Narrative assignment sequence, we hope to offer students opportunities to act to create an embodied, expansive identity, one that is both designable and pre-designed as an interaction between humans and algorithms.
In order to encourage students to critically interrogate these filter bubbles and therefore how they’re algorithmically surveilled online, this webtext presents a scaffolded assignment, the Filter Bubble Narrative, as an example of how instructors and students might put soft surveillance under a microscope. However, unlike the hotly debated Kate Klonick assignment that involved gathering data from non-consenting research subjects conversing in public places (see Klonick’s New York Times Op-Ed “A ‘Creepy’ Assignment: Pay Attention to What Strangers Reveal in Public”), our assignment and its scaffolding invite students to investigate the technologies that they already use and that surveil them, “willingly and wittingly, or not” (Lyon 2017, 825). We think this practice is superior to “reproducing the conditions of privacy violations” that Hutchinson and Gilliard argue against and that are enacted in assignments that involve others, especially without their knowing consent (as cited in Gilliard 2019, para. 9). However, we recognize that some students may not use social media at all, and we do not support the mandatory creation of social media accounts for academic purposes. Therefore, alternative assignments should be made available, as needed.
The Filter Bubble Narrative Assignment Sequence
Taken together, the assignments in this sequence aim to develop students’ critical digital literacies surrounding surveillance by creating opportunities for students to pay attention to the invisible algorithms that surveil them and personalize the information and advertising they see on their social media feeds, ultimately creating filter bubbles. Students will also be encouraged to investigate opportunities for agency within their filter bubbles through narrative and technical interventions like disabling geolocation within apps, adjusting privacy settings, and seeking out divergent points of view, among other strategies.
The assignment sequence culminates in a multimodal writing assignment, the Filter Bubble Narrative (see Appendix A). The choice to call this project a filter bubble narrative is meant to create some intertextuality with existing first-year writing (FYW) courses that ask students to write literacy narratives, a common FYW narrative genre included in many of our colleagues’ courses and textbooks. Doing so will hopefully allow instructors to find familiar ground from which to intentionally modify more traditional assignments and to develop their critical digital pedagogies as well as their students’ critical digital literacies.
Given the widespread move to online and hybrid modes of instruction in higher education due to the COVID-19 pandemic, we intentionally designed our Filter Bubble unit for online delivery via discussion boards, though this is not strictly necessary. And though we outline a multi-week sequence of low-stakes assignments as scaffolding for the Filter Bubble Narrative, we also anticipate that instructors will modify the timeline and assignments to suit local teaching and learning contexts. Finally, in addition to fostering critical digital literacies, these assignments take into consideration the WPA’s (2014) Outcomes Statement for First-Year Writing, the guidelines Scott Warnock (2009) outlines in Teaching Writing Online, and a variety of scholarly voices that recognize that opportunities for multimodal composition are essential to developing twenty-first–century literacies (Alexander and Rhodes 2014; Cope, Kalantzis and the New London Group 2000; Palmeri 2012; Yeh 2018).
Scaffolding the filter bubble narrative
During the first week of the Filter Bubble unit, students first read Genesea M. Carter and Aurora Matzke’s (2017) chapter “The More Digital Technology the Better” in the open textbook Bad Ideas About Writing and then submit a low-stakes summary/response entry in their digital writing journals. Additionally, students watch the preview episode (5:12) of Crash Course Navigating Digital Information hosted by John Green on YouTube (CrashCourse 2018). This ten-video course was created in partnership with MediaWise, The Poynter Institute, and The Stanford History Education Group. Then, students engage in an asynchronous discussion board structured by the following questions:
(Q1.) John Green from Crash Course suggests that we each experience the internet a little differently, that content is “personalized and customized” for us. What do you make of that? How is the information that you consume online personalized for you? Do you see this personalization as a form of surveillance? Or not?
(Q2.) Co-authors Genesea M. Carter and Aurora Matzke define digital literacy as “students’ ability to understand and use digital devices and information streams effectively and ethically” (321). Let’s interrogate that definition a bit, making it more particular. What constitutes “effective” and/or “ethical” understanding and use?
After answering the prescribed questions, students conclude their post with their own question about the video or chapter for their classmates to answer, as replying to two or more students is a requirement for most discussion boards.
During the second week, students watch the social media episode (16:51) of the Crash Course Navigating Digital Information series (CrashCourse 2019). After watching, students submit a low-stakes mapping activity in their digital writing journals where they map what’s in their bubble by taking screenshots of the news stories, advertisements, and top-level posts they encounter in their social media feeds. Then, students engage in an asynchronous discussion board structured by the following questions:
(Q1.) Given what you found from investigating the kinds of news stories, advertisements, and top-level posts in your social media feeds, what parts of your identity are in your filter bubble? Where do you see your interests? For example, Jessica sees a lot of ads for ethically made children’s clothing, Rothy’s sustainably made shoes, and YouTube Master Classes about writing. It seems that her filter bubble is constructed in part from her identity as an environmentalist and writing professor. Joel, on the other hand, sees ads for Star Wars merchandise and solar panel incentive programs, suggesting his filter bubble is constructed from his identity as a Star Wars fan and homeowner who needs a new roof.
(Q2.) What parts of your identity, if any, are not represented in your filter bubble?
(Q3.) How do you feel about what’s there, what’s not, and how that personalization came to be? How is your identity represented similarly or differently across digital sites and physical places?
As mentioned previously, students conclude their post with their own question about the video or discussion board topic for their classmates to answer.
In the first half of the third week, students read the Filter Bubble Narrative assignment sheet (see Appendix A) and engage in a first thoughts discussion, a practice adapted from Ben Graydon at Daytona State College. Here, students respond to one or more of the following questions after reading the Filter Bubble Narrative assignment sheet:
(Q1.) Connect the writing task described in the project instructions with one or more of your past writing experiences. When have you written something like this in the past? How was this previous piece of writing similar or different?
(Q2.) Ask a question or questions about the project instructions. Is there anything that doesn’t make sense? That you would like your instructor and classmates to help you better understand?
(Q3.) Describe your current plans for this project. How are you going to get started (explain your ideas to a friend, make an outline, just start writing, etc.)? What previously completed class activities and content might you draw on as you compose this project? What upcoming activities might help you compose this project?
In the second half of the third week, students begin knitting together the story of their filter bubble. Additionally, they engage in an asynchronous discussion board structured by the following question:
(Q1.) What can you do to take a more active role in constructing your identity and “ethically” and “effectively” (Carter and Matzke 2017, 321) navigating your information feeds?
As mentioned previously, students conclude their post with their own question, but for this discussion board topic we offer this alternative:
(Q2.) If you’d like recommendations from your classmates about steps you can take within your apps and/or feeds and pages that might diversify or productively challenge your current information landscape, let us know. If you’d rather we not send you recommendations, that’s okay, too. Go ahead and ask any other topic-related question you’ve got.
The fourth week is spent composing a full-length draft of the Filter Bubble Narrative, which students submit to a peer review discussion board for peer feedback and to an assignment folder for instructor feedback at the beginning of the fifth week.
While peer review is in progress and the instructor reviews drafts during the fifth week, students also submit a low-stakes reflection in their digital writing journals that investigates how their ideas about digital literacy have changed (or not), especially in relation to the definition provided by Carter and Matzke (2017) about effective and ethical use of digital technologies (321), as well as what they’ve learned about themselves, about surveillance, and about writing multimodally.
Limitations & risks
We acknowledge that the Filter Bubble Narrative comes with certain limitations and risks. First, while we suggest that this assignment and its scaffolding may offer potential pathways for students to develop critical digital literacies that may result in further awareness of and even resistance to forms of soft surveillance, we are also aware that those practices may be ultimately out of reach. After all, as various scholars discussed above have noted (see Beck 2017; Gilliard 2019; Noble 2018), social media platforms frequently take action to purposefully obscure their very mechanisms for surveillance, which is part of the process of softening resistance (Lyon 2006; 2017; Marx 2003; 2015). Without careful critical attention to such processes, instructors and students may be misled into seeing this assignment as a transmission of the skills necessary to resist all forms of soft surveillance. While students may become more aware of and deliberate about how they perceive or interact with their filter bubble, this does not render the surveillors and their surveillance inert.
Second, some students may be unable or unwilling to draw on their own social media use for this assignment. As we mentioned in an earlier section, not all students engage with social media, and others may have broader concerns with privacy. After all, part of the assignment and its scaffolding, as described above, asks students to disclose information about their own social media use—information they may wish to keep private from their instructor and classmates. Students therefore should be reminded that they do not have to disclose any information they do not wish to and should be guided through alternative assignment designs (e.g., fictionalizing their filter bubble contents).
Conclusion
We’ve offered the Filter Bubble unit as one way to smooth the journey from an instructor’s critical digital pedagogy to students’ critical digital literacies. Instead of merely sketching this assignment for Journal of Interactive Technology and Pedagogy readers, we wanted to offer a student-directed deliverable, an assignment sheet (see Appendix A), as a way to recognize that “documents do things,” as Judith Enriquez (2020) argued in “The Documents We Teach By.” These things that documents do are many and varied. Our teaching materials are a material representation of our teaching and learning values and of our identities as critical digital pedagogues. And, perhaps most importantly, they have rhetorical effects on our students. Thus, it’s important that we offer student-centered instantiations of critical digital pedagogy along with scholarly-ish prose aimed at other teacher-scholars. Moreover, as students engage with this assignment, we hope to be able to offer information about its efficacy in regard to critical digital literacies. Further, student reflections about this assignment are needed and forthcoming, as are notes about alterations we’ll make based on student-instructor collaborations.
In closing, just as we must look at technologies instead of through them in order to perceive soft surveillance and engender critical digital literacies, we must do the same with our teaching documents (Enriquez 2020). We hope that our Filter Bubble Narrative deliverable is a teaching and learning document that instructors can critically look at in order to consider ways to work together with students to reassemble a richer and more critical understanding of online identities within our algorithmically curated social media news feeds. Beyond understanding, we also hope that teachers and students will act to mitigate soft surveillance and filter bubble effects and to become ethical agents with (and even developers of) algorithmic technologies.