Abstract

The COVID-19 pandemic made traditionally proctored in-person exams impossible. This article provides a summary of the arguments against institutional adoption of remote proctoring services with a focus on equity, an account of the decision to avoid remote proctoring on the University of Michigan–Dearborn campus, and conclusions and suggestions for other teaching and learning professionals who would like to take a similar approach. Remote proctoring services require access to technology that not all students are guaranteed to have, can constitute an invasion of privacy for students, and can discriminate against students of color and disabled students. Administrators and teaching and learning staff at the University of Michigan–Dearborn made the decision to avoid adopting remote proctoring technologies and to instead invest in instructional design staff and faculty development programming to help faculty transition to authentic assessments. Lessons learned and recommendations are provided for other educational developers and institutions seeking to resist remote proctoring on their campuses.

Keywords: remote proctoring, digital assessments, authentic assessments, faculty development


Introduction

One of the major challenges that higher education instructors, staff, and administrators faced in the spring 2020 transition to remote instruction was that of academic integrity and assessment. The COVID-19 pandemic made traditionally proctored in-person exams impossible because of the risk of gathering any number of students together in a room and because students at residential colleges had largely left campus for their permanent addresses. On the surface, technology provides an easy solution to this issue: several remote proctoring (also referred to as e-proctoring) tools are readily available. These include software services that conduct auditory or visual surveillance of test-takers as well as lockdown browsers that prevent students from using other applications or browser windows during an assessment (Coghlan et al., 2020; González-González et al., 2020). Many higher education institutions have already adopted remote proctoring tools to facilitate assessment during the COVID-19 crisis, while others are considering incorporating them in the future (Grajek, 2020). The University of Michigan–Dearborn (UM-Dearborn) leadership, with the support of the staff of the Hub for Teaching and Learning Resources (the Hub), made a different decision: to resist any new implementation of remote proctoring software and to invest in additional instructional design staff and programming to support instructors in implementing alternative assessments. We refer to this strategy as a “people-centered” approach rather than one focused on technological solutions. This article provides a summary of the arguments against institutional adoption of remote proctoring services with a focus on equity, an account of the decision to avoid remote proctoring on the UM-Dearborn campus, and conclusions and suggestions for other teaching and learning professionals who would like to take a similar approach.

Background on Remote Proctoring

Remote proctoring technologies have a range of uses and functionalities. Some options include a human proctor who monitors test-takers through a webcam (Atoum et al., 2017), and others require test-takers to record themselves using video conferencing tools (Weiner & Hurtz, 2017). Other early iterations of remote proctoring employ a “lockdown” browser or other application that allows the test-taker’s computer to be used only for testing (Foster & Layman, 2013). As remote proctoring technologies matured, new functionalities became available, such as the ability of the proctor to use the computer’s webcam to record the test-taker’s video and audio during an exam (González-González et al., 2020); in the event of suspicious behavior, the recording can be consulted. Some remote proctoring technologies monitor the test-taker’s behavior during an exam, others track only the test-taker’s computer activity to identify cheating behavior, and some systems combine these two approaches (Karim et al., 2014). Current remote proctoring technology has become more sophisticated, combining artificial intelligence, automated proctors, wearable cameras, environmental scans, keystroke tracking, gaze and eye tracking, and head movement detection (Coghlan et al., 2020; Cramp et al., 2019) with multi-biometric systems for authentication, such as facial recognition (Fenu et al., 2018).

In April 2020, the early days of the pandemic, an EDUCAUSE quick poll about assessment and proctoring found that of the 312 institutions that responded, over half (52%) already had a remote proctoring solution of some kind, and 23% were “planning or considering using them” (Grajek, 2020). In November 2020, The Washington Post (Harwell, 2020) estimated that during the widespread move to remote learning due to the COVID-19 pandemic, thousands of schools had spent millions of dollars on remote proctoring solutions, noting that one company charged some individual institutions approximately $500,000 for 1 year of the service. Despite this apparent widespread adoption, there were examples of institutions that had rejected the technology (Buccieri, 2020) or strongly discouraged it (Baruch College, 2020).

There are myriad concerns about remote proctoring technologies, including students’ lack of access to technology, test anxiety, privacy and security concerns, and accessibility needs, to name only a few (Langenfeld, 2020). Students were able to navigate some of these issues under normal circumstances, but many of the challenges have been intensified by the COVID-19 crisis, which has worsened mental health issues, increased inequalities, and brought financial uncertainty for many students. As the pandemic progressed, several articles in peer-reviewed journals and a slew of articles in the international popular press criticized remote proctoring for creating unjust testing environments, exacerbating test anxiety, and putting students’ data privacy at risk (Langenfeld, 2020; Logan, 2020; Swauger, 2020). Faculty and staff seeking to better understand remote proctoring technologies took to social media to discuss their functionality, but tensions flared when one company sued a University of British Columbia employee, claiming copyright infringement, after he criticized the company and linked to unprotected YouTube videos on social media (Chin, 2020). Additionally, students at several institutions began to speak out, calling for boycotts and bans of such technology (Valentine, 2020). With more than 60,000 students signing petitions to end automated remote proctoring on campuses across the United States, one publication stated that this technology was “arguably the most controversial tool of the pandemic at colleges” (Young, 2020).

A top concern about remote proctoring is the way it centers technological solutions rather than people-centered approaches. Swauger (2020) outlines several examples of how automated approaches to remote proctoring have created unfair and discriminatory conditions for students of color and disabled students. Coghlan et al. (2020) recognize some benefits of using remote proctoring technologies, such as the ability to schedule exams outside of regular class time, but note the disadvantages for students who do not have reliable access to technology as well as for students with disabilities. How should those of us who advocate for equitable assessments and pedagogies of care respond? What does the deployment of such technologies say about our identities as institutions and educators? Some have suggested that refusal of such technologies is the only way to counter their harms (Logan, 2020).

The Decision to Resist Remote Proctoring at UM-Dearborn

Soon after March 11, 2020, the day our campus closed in-person classes, the UM-Dearborn administration and staff were receiving frequent offers for limited-time free access to a number of educational technology services, including remote proctoring. The Office of the Provost and the deans jointly made the decision to strongly discourage remote proctoring. Though the university did have an agreement with ProctorU to provide remote proctoring services, only select courses (almost exclusively offered by the College of Engineering and Computer Science) were using ProctorU before the pandemic, and the service was paid for by the college’s online learning office. Since most colleges at the university do not have online learning budgets, other students using remote proctoring during COVID-19 would have had to pay out of pocket for exam proctoring. Thus, during the emergency shift to remote instruction, remote proctoring was strongly discouraged because students had not known from the beginning of their classes that they would need not only to pay for the proctoring tool but also to have a webcam and a strong internet connection. Requiring students to pay for a service they had not anticipated, and subjecting them to the stress of monitored exams, seemed unfair at a time when we needed to offer students flexibility. Even in the absence of the pandemic, equity is a major consideration for UM-Dearborn, a regional campus of just over 9,000 students, many of whom have not historically been well served by higher education. Thirty-nine percent of our students are transfer students, 44% are Pell Grant eligible, and 44% have caregiving responsibilities. Twenty-eight percent are students of color, and 47% of new students are first in their family to attend college (UM-Dearborn, 2020a). Dearborn borders Detroit, an early epicenter for COVID-19 casualties, and our students, faculty, and staff were experiencing illness, grief, and trauma at levels disproportionate to other areas of the State of Michigan.

The Office of the Provost took a strong position against remote proctoring during the emergency transition semester because of the inequity of added costs for students as well as the invasion of students’ privacy. This position was extended through winter 2020 in response to additional feedback from the university coordinator of digital education, who noted that tools like lockdown browsers might be difficult for students to install if their home devices did not meet the technical requirements. Subsequently, the Office of the Provost worked with the Hub to craft an April 1 email to all faculty with the subject “Guidance on Final Exams for Winter 2020” (Miteza et al., 2020), which included final exam strategies that did not require proctoring (UM-Dearborn, 2020b). Though the leadership’s position against remote proctoring is strong guidance and not direct policy, faculty (including faculty who strongly advocate for access to remote proctoring) have adopted the position as campus practice.

The Impact of the Decision

Communications from faculty, sometimes copied to the Hub or the Office of the Provost, expressed frustration with the decision to resist remote proctoring. Some assumed the primary issue was cost and chided the administration for not purchasing remote proctoring licenses. Others expressed distrust in students’ academic integrity and bemoaned what they believed to be widespread cheating. Still others rejected suggestions to redesign courses toward alternative assessments because they were teaching online for the first time and worried that alternative assessments would require additional labor. In contrast, some faculty who supported the decision expressed relief and agreement, along with disappointment that the reasons for the administration’s position had not been communicated more broadly. The consistent individual and group email responses from the Hub, the Office of Digital Education, and the Office of the Provost emphasized that the decision to avoid remote proctoring was based on concerns for student privacy and equity and that instructional designers from the Hub were available to support course redesigns, but these messages could have been better woven into a cohesive, campus-wide message.

The faculty in the Department of Natural Sciences, which includes the physical and biological sciences, were particularly challenged by the lack of remote proctoring options. Not only do classes such as introductory biology and chemistry often rely on timed, proctored exams for student assessment, but these courses are also among the largest on campus. Given these constraints, faculty were reluctant to take up more authentic assessments because of the potential grading burden in a high-enrollment course. Additionally, both students and instructors often view these courses as preparation for professional school admissions, making practice with high-stakes exams an important part of the course experience.

The staff at the Hub had several opportunities to engage with student feedback about the transition to remote learning in general and to remote assessment in particular. These included informal meetings with student leaders, mid-semester feedback sessions in individual courses, and a campus-wide survey of student experience during COVID-19 (Higher Education Data Sharing Consortium, 2020). A common theme in these student perspectives was that some instructors seemed to be on high alert about academic integrity issues and had made course design decisions that were intended to reduce cheating. Students were troubled by some of the efforts (e.g., offering exams during reduced time windows, configuring the exam so that only one question displays at a time, and preventing students from going back to previous questions) because they saw these practices as overly punitive, especially given all of the difficulties of remote learning during a pandemic. Students were never asked directly about remote proctoring in surveys or discussions, and students themselves did not raise this issue in any of the feedback that was collected. It is difficult to know how aware students are of the decision to not use remote proctoring software and whether this would affect their perspective on their instructors’ efforts to promote academic integrity.

Programming to Support Faculty in Implementing Alternative Assessments

In the wake of the decision to strongly discourage remote proctoring, the Hub staff noticed that faculty were reluctant to continue giving exams in a remote fashion because of academic integrity concerns. While some instructors were uncomfortable with the idea of open-book exams (a necessity because of the lack of proctoring), others were comfortable with open-book exams in principle but were concerned that students would collaborate with one another or copy and paste answers from websites such as Chegg. The Hub staff encouraged instructors to employ authentic assessments instead of exams (especially multiple-choice exams). Authentic assessments are those that ask students to apply their knowledge to “intellectually worthy tasks” (Wiggins, 1990) and that require “the same competencies, or combinations of knowledge, skills, and attitudes, that [students] need to apply in the criterion situation in professional life” (Gulikers et al., 2004). These assessments can include case studies, portfolios, reflections, or projects. Authentic assessments help motivate students because they connect the course material to real-world applications (Svinicki, 2004). They also do not pose the same academic integrity challenges as exams because each student’s submission is unique and difficult to copy from another source. Furthermore, these assessments can be beneficial for students under stress because they can help students avoid the test anxiety and cognitive overload that often accompany traditional, timed exams (Carter et al., 2008; Dodeen, 2008). Canvas support personnel were often contacted by faculty about academic integrity and proctoring for Canvas quizzes. Standard practice was to connect those faculty with an instructional designer to reinforce the authentic assessment strategies the team supported.

The Hub took several steps to support faculty during the transition to remote teaching and specifically to assist faculty in developing authentic assessments using a people-centered approach. This approach served as an alternative to technology-focused approaches (such as widespread adoption of remote proctoring) and included investment in personnel and faculty development. CARES Act funding made many of these steps possible, especially hiring two additional instructional designers on 2-year contracts, which increased the size of the Hub instructional design staff from two and a half to four and a half people. This expansion was imperative because the Hub was frequently providing consultations to faculty who were new to online teaching, many of whom were overwhelmed by moving online with little preparation and mourning the sudden loss of face-to-face pedagogies. Such individualized meetings would not have had the same reach without an expansion in staff. Other steps included hosting a virtual guest speaker who is an expert in authentic assessments and specializes in the STEM disciplines (Andersen, 2020). During summer 2020, the Hub provided several people-centered faculty development programs, including #DigPINS, a small online connected learning cohort experience that focuses on building an online community among an international cohort of schools (Bali & Caines, 2018).

All faculty were invited to participate in hands-on weekly online design challenges during summer 2020 to develop their digital teaching dexterity. These weekly challenges focused on course design and instructor presence, student interaction, and access and equity and culminated in a week dedicated to authentic assessments. This final week exposed faculty participants to readings exploring authentic assessments, why they are important, and what they look like in different disciplines. Participants engaged in two online discussions, and to complete the challenge, they submitted a deliverable related to authentic assessments. Choices for deliverable submissions included authentic assessment descriptions, reflections comparing traditional assessments to authentic assessments, a pitch-your-own-deliverable option, and finally two advanced options that invited the participants to dig deeper into concepts such as “ungrading” (Blum, 2020), gamification, contract grading, specifications grading, and project-based learning. Most participants chose to do either an assessment description or reflection. Assessment descriptions included case studies, essays, self-assessments, real-world data reflections, specifications grading (Nilson, 2015), and project-based learning. Examples of authentic assessments that instructors developed through this program included an assignment in which students demonstrated statistical analyses using COVID-19 incidence rates and an economics project in which students estimated the fiscal health of their own municipalities in light of the pandemic. In their reflections, some faculty noted that they did not plan to transition entirely to authentic assessments but saw their value (especially in the remote teaching context).

Another major step was earmarking CARES Act funding to hire “graders” who could help faculty in high-enrollment courses grade and provide feedback on redesigned authentic assessments, which take longer to grade than traditional exams. The additional assistance of the graders provided more bandwidth to instructors, who normally do not have access to teaching assistants/graduate student instructors (UM-Dearborn is a primarily undergraduate campus with few graduate students). The grader program was aligned with the overall people-centered approach because it provided quality assistance to faculty who were working hard to redesign aspects of their courses and also provided employment to part-time lecturers who filled the grader roles.

Lessons Learned and Recommendations

One of the key lessons learned from our campus decision to limit remote proctoring is that it is critical to communicate to faculty both a clear rationale for the decision and clear alternatives. We believe that we were largely successful in communicating the rationale for the decision (including the potential privacy and equity concerns mentioned above) and made a strong effort in providing alternative assessment approaches and support to faculty. Our recommendation to other institutions or groups that would like to resist remote proctoring is to develop a comprehensive communication strategy that details the harms of remote proctoring, the alternatives available, and (crucially) the support that will be available to instructors who need to adjust their course designs and assessments. In our experience, leaders such as educational developers, instructional designers, academic technologists, department chairs, and deans will need to be prepared to extensively support faculty during the transition. Many instructors have been using proctored exams for their entire careers and see them as essential to the teaching and learning process. Even as teaching and learning professionals may be eager to see instructors drop high-stakes exams altogether, it is important to recognize that changing a long-held teaching practice can feel destabilizing and will likely bring some unforeseen challenges and resistance from instructors (Smith, 2020). Monetary resources saved by refusing remote proctoring services can be redirected toward ongoing faculty development on assessment and other teaching-related topics. We believe our investment in people, rather than remote proctoring services, will be more robust to the inevitable changes in learning formats and technology that are to come.

On a practical note, instructors and staff across campus were responsive to the argument that inequities in technology access make it difficult to require students to use any remote proctoring services. UM-Dearborn could not guarantee that students would have the appropriate hardware and operating systems to use a remote proctoring system and did not feel comfortable requiring students to install any particular technology on their personal computers. For other institutions under similar conditions, the result may be that implementing remote proctoring at scale is just as logistically thorny as not doing so. If other institutions are considering resisting remote proctoring on a campus where there is significant pressure from faculty or administrators to use it, unpacking the potential difficulties that would result from its use may help to clarify that adopting remote proctoring is not the “easy solution.”

While communication with faculty was mostly successful, we suspect that we did not communicate the decision about remote proctoring and its rationale effectively to students and therefore missed an opportunity to build trust among students, instructors, and the administration. Institutions that want to resist remote proctoring have an opportunity to develop a shared critical digital literacy between instructors and students by discussing the ethical problems associated with remote proctoring and building a shared understanding of academic integrity in the digital age.

We hope that our experiences and lessons learned during this process are useful to other educational developers, but we also recognize that each campus is unique and that the dynamics that affected our institution’s experience with the transition to remote teaching are not universal. We invite further writing and discussion of strategies for limiting the use of remote proctoring in different contexts with the goal of developing a robust, people-centered toolkit for supporting remote assessment in a diversity of campus contexts.

Acknowledgments

We acknowledge the students, faculty, and staff at UM-Dearborn who shared their experiences with the Hub for Teaching and Learning Resources staff to inform this article. We acknowledge Antonios Koumpias and Marie Waung for generously sharing examples of authentic assessments.

Biographies

Sarah Silverman is an Instructional Designer at the Hub for Teaching and Learning Resources at the University of Michigan–Dearborn. Her areas of interest include STEM teaching and learning, Universal Design for Learning and accessibility, and culturally responsive pedagogy. Sarah earned her PhD in Entomology at the University of California, Davis and has worked in teaching and learning programs at the University of California, Davis and the University of Wisconsin–Madison.

Autumm Caines is an Instructional Designer at the University of Michigan–Dearborn in the Hub for Teaching and Learning Resources. She holds an MA in Educational Technology from The Ohio State University. In the Open, Autumm is a Co-director of Virtually Connecting, and she also helps to organize and facilitate Open/Connected online events for the purposes of faculty development and her own practice in digital stewardship, most recently with the tags #DigCiz, #DigPINS, and #EthicalEdTech.

Christopher Casey is the Coordinator of Digital Education at the University of Michigan–Dearborn. He earned a BS in Computer Science from the University of Michigan–Dearborn in 2002. He has been working in the digital education arena since 1999, when, as a student employee, he helped develop some of the University of Michigan–Dearborn’s first online courses.

Belen Garcia has a PhD in Learning Design and Technology with a focus on engineering education from Purdue University, where she taught courses in the College of Education and the School of Languages and Cultures. She currently works at the University of Michigan–Dearborn as an Instructional Designer. Her interests are centered on the integration of pedagogy for online learning and the use of emergent technologies for learning STEM or foreign languages.

Jessica Riviere is an Instructional Designer at the University of Michigan–Dearborn and has previously worked at The Ohio State University and Vanderbilt University, where she earned a PhD in German Literature. Her areas of research focus include graduate seminar pedagogy and graduate curriculum in the 21st century.

Alfonso Sintjago is an Instructional Designer at the Hub for Teaching and Learning Resources at the University of Michigan–Dearborn. His areas of interest include open education, MOOCs, international development, and gamification. Alfonso received his PhD from the University of Minnesota in Comparative and International Development Education.

Carla Vecchiola is the Director of the Hub for Teaching and Learning Resources and a Lecturer in History at the University of Michigan–Dearborn. Carla received her PhD from the University of Michigan’s American Culture program.

References