Platforms Like Canvas Play Fast and Loose With Students’ Data
By Britt Paris, Rebecca Reynolds, and Catherine McGowan
Many universities have yet to reckon with the data justice implications of learning technologies—now, with online learning the norm, these practices deserve more scrutiny.

In 2018, Rutgers University made a move that hundreds of other universities before it had made: It switched its online learning platform from Sakai—a free, community-sourced system—to Canvas, which is owned by a company called Instructure.

The switch was significant: Now the university was paying hundreds of thousands of dollars a year for a product that didn’t have to be transparent about what it did with the information and data it mined from its users. Such systems constantly record users’ interactions with them—how long it takes a student to complete an assignment, for example, or a user’s deleted words, keystrokes, and IP address.
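To make this concrete, here is a minimal sketch, in Python, of the kind of event record such a platform might log for a single student action. Every field name here is hypothetical, invented for illustration rather than drawn from any vendor’s actual schema.

    # Hypothetical sketch of a single LMS interaction event.
    # All field names are invented for illustration; this is not
    # any vendor's real schema.
    interaction_event = {
        "user_id": "student_4821",             # persistent pseudonymous ID
        "ip_address": "192.0.2.44",            # where the student connected from
        "event_type": "assignment_edit",       # granular action being recorded
        "deleted_text": "my first draft",      # words the student erased
        "seconds_on_task": 1147,               # how long the assignment took
        "timestamp": "2020-04-02T14:31:07Z",   # when it happened
    }
    print(interaction_event["user_id"], interaction_event["seconds_on_task"])

Multiplied across every click in every course, records like this accumulate into detailed behavioral profiles, and the contracts we reviewed place few limits on what vendors may do with them.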

But students and professors rely on this technology. Private learning management systems (LMSs)—the online portals students use to access everything from their syllabi to their assignments, grades, and course schedules—have become increasingly compulsory, especially in higher education. And they have long been a subject of suspicion for privacy experts and others wary of relying too heavily and uncritically on tools that could potentially mine users’ data. “Students often provide the raw material by which ed tech is developed, improved, and instituted,” writes Chris Gilliard, a digital privacy scholar, for The Chronicle of Higher Education. “But their agency is for the most part not an issue.”

According to instructors, before the pandemic necessitated remote online learning, these technologies provided little educational benefit beyond archiving class materials. Instead, they made a reliable Internet connection a prerequisite for participation, reinforcing structural inequities. Still, educators have felt an unspoken pressure to adopt these systems. “I was mildly concerned that not using the LMS would get me into some kind of vague ‘trouble’ with administrators,” a Rutgers professor, who wishes to remain anonymous, said. “What ‘trouble’ that would be was never clear. It was a vague sense of dread/concern, but I did certainly feel that I was doing something that could be positioned as a ‘problem.’”


Contracts with ed tech vendors are often negotiated by university administrators and executives without faculty, student, or parent involvement. Given the lack of administrative transparency around this issue, it may be, as researcher Kyle Jones suggests, that “institutions are withholding information about [e-learning vendors’] data practices to keep student privacy concerns at bay, concerns that could potentially derail beneficial contracts with vendors.” Without transparency and accountability, it’s impossible to know.

As professors of library and information science and researchers on data justice and education technology, and as active members of the Rutgers branch of the American Association of University Professors–American Federation of Teachers, we wanted to provide answers to the questions our colleagues had. We felt it was important to penetrate the obfuscating ambiguity we see so often in private learning management systems, focusing on our own institution, Rutgers University, as a case study. But these systems are used by universities across the country—3,482 institutions as of fall 2020, according to EduTechnics. What we found, through records requests under the New Jersey Open Public Records Act, was that Rutgers pays high prices to ed tech companies, allowing them almost unfettered access to student and instructor data and metadata, while doing nothing to ensure that even paltry privacy terms are upheld.


We analyzed 10 vendors’ terms of service, alongside their data management plans, security and privacy statements, and contracts and payments, and found that they give university stakeholders no option but to accept vague terms and unclear expiration dates for the use of their data. Other pre-pandemic studies suggest that these problems with ed tech exist at institutions across the country. People at other universities who are concerned about their institutions’ relationships with corporate ed tech, and the negative implications those relationships carry for educational goals, have a right to know what information is being withheld from them.


And the situation has only worsened. As schools shut down to prevent the spread of Covid-19, universities oversaw, in a matter of months, the hasty adoption of technologies that play fast and loose with user privacy. Even before the pandemic, big ed tech had been locked in a shock-doctrine relationship with universities, exploiting the crises in higher education funding to dominate campuses across the country. As universities become ever more cash-strapped, they have turned to offering online degrees built on LMS technology—technology whose vendors, in their pursuit of profit, have arguably undermined universities’ ability to provide equitable, quality education.

Generally, student privacy is protected under the Family Educational Rights and Privacy Act (FERPA), a federal privacy regulation for K-12 and higher education that affords students or their guardians some control over the disclosure of personally identifiable information in education records. However, ed tech companies can access this information without user consent under a loophole called the “school official exception,” which allows schools and districts to disclose it to educational service providers. In these LMSs, student and instructor personal information, such as names and identification numbers, is frequently attached to classroom materials, as well as to user-generated content and assessments. Every assignment students submit and every video instructors upload through the system is collected and attached to metadata like time, IP address, and, in some instances, tracked cursor history. Additionally, FERPA offers no protection for end users’ metadata and de-identified data, which can easily be recombined to re-identify individual students.
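The re-identification risk is straightforward to demonstrate. Here is a toy sketch, in Python with entirely invented data, of how a “de-identified” activity log can be joined back to named individuals using nothing but metadata such as an IP address; every name and value below is made up for illustration.

    # Toy linkage attack on "de-identified" data. All values are invented.
    # A log stripped of names still carries quasi-identifiers:
    deidentified_log = [
        {"ip": "198.51.100.7", "submitted_at": "2020-04-02T23:59:01Z",
         "grade": 71},
    ]
    # A second data set, held elsewhere, maps the same metadata to people:
    network_records = [
        {"ip": "198.51.100.7", "name": "Jane Doe"},
    ]
    ip_to_name = {r["ip"]: r["name"] for r in network_records}
    for entry in deidentified_log:
        name = ip_to_name.get(entry["ip"])
        if name:
            # The "anonymous" grade record now has a name attached.
            print(name, entry["grade"])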

In the terms of service and contracts we analyzed, Rutgers’s ed tech vendors make only vague references to FERPA compliance, or none at all. Microsoft was the only one to explicitly cop to using the school official exception as a privacy workaround. Other platforms we explored defer FERPA compliance to an ed tech industry–branded student privacy pledge that intentionally skirts enforceable privacy requirements, leaving the potential for unfettered and unregulated surveillance.

Using the school official exception as justification, profit-driven LMSs typically force users to choose between accepting all their terms and not using the system at all. LMS use is effectively compulsory: With few other choices and little support for alternate systems, even professors who attempt to circumvent the system their school contracts with are eventually cudgeled into compliance, like the anonymous Rutgers professor who spoke to us for our research. For a long time, they told us, they had used a bespoke system of FTP sites, e-mail, and other technologies to run their class instead of Canvas. But when classes moved online in early 2020, they wanted to replicate the sense of community of a physical classroom, which platforms like Canvas allow for with built-in forums and chats. They also could no longer juggle their previous system with the sudden increase in their online workload—and the only system the school offered to streamline the process was Canvas. Begrudgingly, they said, and despite their objection to the terms of use, they switched over.


Rutgers could give students and instructors some control over their privacy by, for example, letting them opt out of mass data and metadata collection and sharing with third-party ed tech vendors. It notably does not. Canvas—one of the most widely used ed tech services in higher education—includes in its terms of service many problematic and ever-changing provisions around privacy, intellectual property, and data misuse. Indeed, Instructure’s purchase by the private equity firm Thoma Bravo in 2020 sparked widespread concern over the service’s privacy practices. Rutgers paid over $1 million in 2020 to Canvas’s Unizin partnership, which allows nebulous data analytics access to Canvas’s blended- and distance-learning LMS courseware, according to the terms of the contract.

As a result, precarious academic workers are surveilled by administrators. Faculty and adjunct instructors at Rutgers told us about colleagues there and at other schools who had been monitored by program directors; some adjuncts suspected they were being watched because they had “received warnings based on out-of-context interpretations of the metadata that Canvas makes available to instructors and admins.” Such uninformed “data-driven” monitoring is notorious for producing inaccurate inferences, disadvantaging instructors as well as students.
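Canvas does expose some of this metadata programmatically. Its public REST API documents a page-views endpoint that returns a user’s clickstream, and the sketch below shows roughly what such a query could look like; the domain, token, and user ID are placeholders, and the exact fields returned depend on an institution’s configuration.

    # Sketch of a query against Canvas's documented page-views endpoint.
    # The domain, token, and user ID are placeholders, not real values.
    import requests

    BASE_URL = "https://example.instructure.com"  # placeholder Canvas domain
    API_TOKEN = "REDACTED_TOKEN"                  # placeholder access token
    USER_ID = 12345                               # placeholder Canvas user ID

    response = requests.get(
        f"{BASE_URL}/api/v1/users/{USER_ID}/page_views",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    for view in response.json():
        # Each record can include a visited URL, a timestamp, and a
        # remote IP: the raw material for the monitoring described above.
        print(view.get("created_at"), view.get("url"), view.get("remote_ip"))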


Such practices among both tech vendors and institutions show the need for greater scrutiny. Research shows that such monitoring creates an antagonistic environment for students; chills engagement; has disproportionately negative outcomes for women, trans, and non-binary folks, and people of color; contributes to the entrenchment of unwarranted surveillance technology in education; and contributes little to student learning.

And Rutgers had dealt with privacy problems long before it switched to Canvas. In 2015, the university rolled out Verificient’s Proctortrack software, which uses facial recognition and other biometric data to monitor online testing, without any contract—and with no assurance that even the most basic review had been conducted to assess its privacy risks or its utility for learning. Students objected almost immediately. While Rutgers secured a contract in 2015, those objections were never meaningfully addressed and didn’t recede. In October 2020, Rutgers sent a system-wide e-mail informing students and instructors that Proctortrack was down for maintenance—in fact, its facial recognition and other biometric data had been hacked. The company has since come under fire from multiple institutions over bias, privacy, and security concerns. The summer 2020 uprising against racial injustice and policing intensified the backlash against biased surveillance technology and facial recognition, especially as AI and machine learning proliferate in public service sectors.

Many universities have yet to reckon with the data justice implications of learning technologies, whether because of a lack of knowledge among decision-makers or a lack of interest in a people-centered, structural critique. Indeed, the university paid even more per unit for Proctortrack services the year of the protests than in any year prior, according to the contracts we obtained—partly because the pandemic prompted an uptick in use, putting more student data at risk. While Rutgers “turned off” Proctortrack following the breach, student and faculty objections over privacy and surveillance in online test proctoring remain unaddressed.

The 2020 confluence of the pandemic, economic and governmental crises, and a movement for racial justice has forced open public debates about how broader systemic inequalities are perpetuated in institutions of all stripes, including education. This ought to prompt greater reflection on the persistent, compelling evidence of striking digital inequities among learners and instructors, in regard to device quality, networking bandwidth, technology skills, and home support. “Some of our students from low-income backgrounds struggle to get basic Internet access,” another Rutgers professor, who also wished to remain anonymous, said. One of their students parks his car in front of the now-closed local public library to Zoom into class sessions because his home does not have enough bandwidth for him and his younger siblings to do their school lessons at the same time. Others, they said, are forced to rely on their phones. “It’s ridiculous,” they said. “In my opinion, we should be supporting basic Internet access for our students instead of spending millions on ed tech deals.”

At a minimum, the university could reasonably be expected to demand constraints on ed tech companies’ use of data, along with more flexible terms of service that give users greater agency and more options. And universities already have resources that have long advocated for privacy rights and supported scholarly research and teaching in online, in-person, and hybrid formats: their libraries. Yet while Rutgers spent over a million dollars on Canvas and Unizin from 2016 to 2020, its libraries endured layoffs and budget cuts, even after hundreds of faculty, students, and staff signed a petition calling for the funds to be restored.

We need to shift power away from corporate vendors and administrators. We should push to end the surveillance of students and instructors. And we should strive toward solutions that facilitate the agency of students and instructors: shifting learning technology development and servicing to accountable on-site entities; establishing independent data privacy and protection boards that take an adversarial position in scrutinizing corporate contracts; adopting opt-in informed consent for all learning analytics platforms; and making FERPA a floor, not a ceiling, in developing social and technical mechanisms to protect privacy. Ultimately, this is part of a larger project of building ways for instructors and students to meaningfully participate in determining what kind of educational environments they want and need.



Britt Paris
Britt Paris is an assistant professor at Rutgers University and an affiliate at the Data & Society Research Institute. Her work focuses on developing a broader understanding of the social, political, economic, and historical forces that have shaped our current technological environment, so that we can envision new systems that might better support the future we want. She is also a member of Rutgers AAUP-AFT and produced this report in that capacity.


Rebecca Reynolds
Rebecca Reynolds is an associate professor at Rutgers University. She explores human learning and development in the context of socio-technical systems design for e-learning, and the systemic, social, and critical implications of such systems’ deployment. She is also a member of the Rutgers AAUP-AFT and a union rep for her academic department.


Catherine McGowan
Catherine McGowan is a PhD student at Rutgers University. Her work focuses on data policy and ethics through a critical analysis of artificial intelligence, surveillance technology, patents, and the implications for civic rights and democracy of the data and systems that construct the datafied citizen.

