This is the size of survey education researchers should be aiming for, though I would still urge them to consider surveying non-students and aspiring students as well, in order to understand the barriers. And developers of online learning, at least in Britain, should be happy with the result that "nearly seven-in-ten students surveyed rate the quality of online and digital learning as either ‘best imaginable’, ‘excellent’, or ‘good’ (68% of both FE and HE students)." There was similar praise for online learning support. The research (a 12 page PDF and another 12 page PDF) also looks at how things could be improved, and the results aren't surprising: get the basics right, make learning more interactive, provide recordings of events, you know, the usual.
This article is presented as a dialogue between the two authors, a fresh, light approach to discussing their research which I really enjoyed. The research itself is fairly basic - as I read it, they increased engagement in practical activities by making those activities part of the overall assessment for the course grade. "Only 54 per cent of our students said they would have completed the activities if they were not assessed." The suggestion is also that this resulted in better learning.
I think it was more than just student backlash. After all, the story begins with a description of an education conference boycott in response to an announcement that it would be sponsored by Proctorio, a test surveillance company. Still, I think this is now the right trend, and the story continues with a list of institutions dropping the product. As Duke University's academic integrity guide states, "Proctoring services essentially bring strangers into students’ homes or dorm rooms—in places students may not be comfortable exposing. These violations of privacy perpetuate inequity through the use of surveillance technologies."
I'm actually pretty sceptical about the reliability of this report from McKinsey, mostly because the primary source data - teacher reports about how far behind their classes are - doesn't strike me as the best indicator of 'learning loss'. But I fear measuring 'where we are in the curriculum' will become a proxy for how much students learned or didn't learn during the pandemic. One place where I agree with the survey is in the assertion that we'll see the effects play out for years to come. No matter what you think about learning loss, I think we can all agree that you can't take someone out of in-person learning for a year and then just put them back without seeing some sort of impact. I can imagine, for example, students chafing at the restrictions of an in-person classroom after having experienced the freedom of the online alternative.
My first response was, "What are exit tickets?" proving that nobody knows everything, even in their own area of specialization. Exit tickets are "informal assessment tools teachers can use to assess students' understanding at the end of a class." In this post, Eric Sheninger looks at digital exit tickets. He suggests using GoSoapBox, a web-based clicker tool that aggregates responses on the screen. It lets students work on problems and share their results anonymously, and allows the instructor to flag areas of concern.
This newsletter is sent only at the request of subscribers. If you would like to unsubscribe, click here.
Know a friend who might enjoy this newsletter? Feel free to forward OLDaily to your colleagues. If you received this issue from a friend and would like a free subscription of your own, you can join our mailing list. Click here to subscribe.
Copyright 2021 Stephen Downes. Contact: firstname.lastname@example.org. This work is licensed under a Creative Commons License.