Dear Dean Cobblepot,
I just received my grade for last week’s assignment for Professor Crane’s psychology course. The LMS reports that he spent 3 minutes grading me and that I earned a 64%. The LMS also shows that he spent an average of 10 minutes per assignment but I somehow only got 3 minutes. I also noticed that he spent 30 seconds looking at my discussion post, even though it was 500 words—meanwhile, he spent an average of three minutes reading per post. Finally, I saw that while grading my assignment (for all of 3 minutes) he also spent time in other browser tabs including LabDepot.com and SpiritHalloween.com. Given all of this, I’m concerned about how adequately Professor Crane is treating me and my work in his course. I have attached a full report of all his interactions with my data in the LMS for further clarification.
If that email feels uncomfortable or unfair, or makes you wonder whether a student can understand the idiosyncrasies of grading well enough to know that time-on-task does not equate with quality, then we should ask whether the reverse is equitable and fair. After all, what assumptions, misdirections, and conclusions are we drawing when we attribute meaning to our students via their data profiles (two separate and distinct things) in a learning management system (LMS)?
Many of us are worried and uncertain about the degree to which Facebook, Google, and numerous companies we don’t even know are wheeling and dealing in our data. These companies use this data to manipulate us by controlling what we see, what we have access to, and what those in our network experience while also, at times, ignoring how these tools can amplify the worst in us. However, if we are frustrated and angered by these practices, then we must also reflect on just how much we perpetuate those very same practices by ignoring or even encouraging the use of data gathering and tracking in our institutional systems in general and our digital learning environments more specifically.
Just as Facebook’s data tracking can supposedly help us see our friends, data tracking in digital learning environments supposedly helps us see what students are doing. But how accurate is it, and how often do we mistake the data for the person? Of course, we need some data to inform research, clarify our understanding of the world, and follow student progress, but that data is typically obtained with direct consent and with clarity about how it will be used. Can any instructor genuinely promise that the data their students generate within these LMSs and through third-party vendors such as e-textbook sites was obtained with clear consent and used in the transparent, specific ways that students agreed to?
When I started writing this piece, it was late 2019, and Instructure, known for its LMS, Canvas, was in the process of being sold to Bravo, a private equity group. People in the industry such as Phil Hill and Michael Feldstein saw the purchase as a data grab, with many wondering what kind of behavioral surplus of student data would feed the next iterations of edtech quackery. When I returned to this piece after seeing the call for this journal issue, I had the recent Dartmouth Medical School academic honesty scandal on my mind. As I finished writing in early June, I read that Dartmouth had dropped its cheating accusations against seventeen students. Those accusations were built on data logs from Canvas. Students were informed of the academic charges and given 48 hours to respond; while the institution had full access to their data logs, the students themselves were denied that access when trying to prove their innocence.
The power imbalances embedded in technology between students and instructors, and between instructors and institutions, were already concerning prior to COVID. But, like many things, the pandemic increased that imbalance through a form of disaster capitalism that had much of higher education more worried about preventing cheating (whatever we mean by that) than about caring for and supporting students during a time of intense stress and uncertainty. We frame the LMS as our “virtual classroom,” and yet it allows us glimpses into our students’ lives that would be unforgivable violations of privacy were we to do the same thing in a physical classroom.
I’m only slightly amazed at how unquestioningly I, like those administrators and educators at Dartmouth, took to using the LMS to watch, judge, and control my students, and how easy it was to justify my decision. The last couple of years have given me much to think about in terms of how my role as instructor and instructional designer helped to perpetuate some of these imbalances. I have uncritically encouraged the use of the learning management systems at colleges and universities and I have leveraged the LMS as a tool of power rather than one of learning, and now I am left wondering how well it can be used for learning without the widespread potential for its use as a tool of power.
I have routinely failed, yet still aspire to pass, Jesse Stommel’s four-word pedagogy test: “Start by trusting students.” I own that a good portion of that failure is the reproduction of harm that I inherited. Yet what the LMS has offered me and many others is a Faustian bargain that promises efficiency and productivity at the cost of respect and privacy. Such problems and limitations are often left to instructors to discover and advocate against, as institutions and administrators rarely commit to a large financial investment such as an LMS and then say, “By the way, there are problems.” They are too often pulled by the demands of internal and external entities to create proof of teaching and learning—the kind of proof that comes in the form of charts and numbers, the kind of proof that LMSs are nearly perfectly engineered to create.
The opportunities for privacy creep afforded by an LMS are hard to resist. For me, they manifested in a desire not just to verify my students’ statements, but to trap them. I could ask, “Who has done the reading that I put up on Blackboard?” and, as they raised their hands, follow with, “Are you sure that’s your answer?” I could then pull up a report and reveal “the truth” of their statements: “How could you read something if you haven’t accessed it on Blackboard?” The real truth is that there are many ways to answer that question, but my ego and the lure of “evidence” would keep me from seeing them. And that’s the trick of technology—it gives us data, or “hard facts,” that undermine and erase the messiness that is humans and learning. It presents a decontextualized “objective fact” that allows us to be productively punitive rather than intentionally curious about what prevents our students from doing the things we ask of them.
There are many ways we can use an LMS to check on and control our students. For example, we can keep information and knowledge from them until we decide they are ready to go forward; we can see how much time they spend in different spaces in the LMS, and watch every click of their mouse. With each of these, we can make myriad inferences about what they are or aren’t doing. Yet, often, when we’re at that point, we’re looking for “a reason” to explain something we believe to be true (e.g. plagiarism, cheating, not doing the reading, not working “hard” enough—whatever we mean by that).
So many of these features are easily accessed—simple buttons or sections labeled “Reports.” There are no secret levels to navigate toward or special requests needed to unlock them; they are part and parcel of the features we are encouraged to use in our LMSs. There are no prompts that ask us whether tracking students is something we should reconsider, nothing to make us pause and question our motives. We are compelled to produce reports about students’ work that we could not produce in the physical classroom, at least not without a significant invasion of privacy.
On the other hand, students have no option for privacy in these environments—like elsewhere in the technological landscape of higher education, they are not individuals with agency, they are data. They are stuck with how the instructor chooses to organize the course in the LMS, and they have no control over how they are represented in the data.
I wonder why institutions would willingly grant instructors a near-unquestioned authoritative power over students’ actions in their LMSs that they would never allow in the classroom. Can we even imagine a physical classroom where these things occur? An instructor looks over students’ shoulders to make sure they spend the appropriate amount of time on each page. Another puts a stopwatch on each student to track everything they do with the learning materials. A third hooks students up to eye-tracking devices to see exactly what each student sees. The level of tracking allowed, and how instructors or institutions can leverage it, should raise reasonable concerns, starting with how we think of the LMS itself. Thinking of an LMS as a “virtual classroom” obscures the level of surveillance and control an LMS affords us.
I would like to propose a new rule for all LMSs: equal transparency at every level of use. Whatever an instructor can see about a student, the student should be able to see about the instructor. But let us not stop there—students should be able to see, and instantly call up reports about, anyone within the institution who has come into contact with their profile or data.
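As a thought experiment, the rule above can be sketched as a symmetric access log: every read of a person’s data is itself recorded, and the log is readable by the person whose data was accessed. Everything in this sketch (class names, fields, the example viewers) is hypothetical illustration, not a real LMS API.

```python
# A minimal sketch of "equal transparency": every access to a user's
# data is logged, and any subject can call up the full record of who
# has viewed their data. All names here are hypothetical.

from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class AccessEvent:
    viewer: str         # who looked (instructor, admin, vendor process)
    subject: str        # whose data was looked at
    resource: str       # what was viewed, e.g. "discussion_post_42"
    timestamp: datetime


@dataclass
class TransparentAccessLog:
    events: list[AccessEvent] = field(default_factory=list)

    def record(self, viewer: str, subject: str, resource: str) -> None:
        # Every read of someone's data generates its own log entry.
        self.events.append(
            AccessEvent(viewer, subject, resource, datetime.now(timezone.utc))
        )

    def report_for(self, subject: str) -> list[AccessEvent]:
        # The symmetry: the subject can see every access to their data,
        # regardless of the viewer's role in the institution.
        return [e for e in self.events if e.subject == subject]
```

The design choice that matters is that `report_for` is available to the data’s subject, not only to administrators; the log watches the watchers as readily as it watches students.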
Why do I propose this radical transparency? Because we have to be better. If we are individually worried about the state of digital technology in the world at large and how much of our data is controlled by others, we have to show students there are other ways and that they deserve better; they deserve to be citizens in and not objects of the LMS.
Would instructors change their practices if students could track them? If students could see how much time instructors spend in the LMS, and even how much time an instructor spends on each student’s work, it’s not hard to imagine the concerns they would raise with instructors and the administration. If emails like the one that starts this piece became more common, would that change how instructors act? If being held to the same standards of scrutiny as students would change instructors’ behaviors, then we have to ask whether what we are doing is right. It’s easy to imagine, too, that faculty would start to game the system in ways eerily similar to what our students do when we fixate on making them jump through our hoops.
As a rule, this is not practical or even possible at most institutions with their current LMSs. But as a mental practice for instructors and administrators to think through when exploring the data and reporting features of an LMS, it has merit. There seems an inevitability to the data mining and hoarding that institutions and third-party vendors are conducting.
On one side, the tendrils of surveillance capitalism permeate nearly any digital software being sold to institutions, whether focused on applicants, students, alumni, faculty, or staff. On the other side, boards of directors, accrediting agencies, and public agencies demand more complex proof that an institution does what it claims.
Instructors, instructional designers, and to a degree, even administrators are limited in how much say or sway they have in this data grab. We can be more intentional in how we use it and more inclusive in letting students know that it can be used in ways beyond our individual control. We can find opportunities to push or at least make known the problems and limitations of using machines as oracles to divine the truth of our students’ minds.
All technologies have tradeoffs, and I certainly recognize that; yet I have to wonder how we can mitigate the tradeoff that renders students (and instructors, for that matter) as data in this evolving technological landscape. This ongoing conversation should encourage instructors and administrators to consider how an LMS acquires its data on students and whether it is in students’ long-term interest for institutions (or, more problematically, third-party vendors) to create, maintain, and use that data without any responsibility to students or any ability for students to control their own data.