If you noticed that my newsletter was a little short yesterday, it's because I wrote a long post called 'Educational Research in Learning Technology'. I discuss the nature (and weaknesses) of research in our field. I am broadly sympathetic with the arguments offered by Philip J. Kerr in this recent post, but I have disagreements around the edges, enough that I think more discussion is warranted. Kerr begins with a discussion of systematic reviews of research and comments that they "did not paint a very pretty picture of the current state of AIEd research." This motivated his post on "things that, collectively, researchers need to do to improve the value of their work" drawn from "observations mostly, but not exclusively, from the authors of systematic reviews, and mostly come from reviews of general edtech research."
According to this article, "very few research articles on AI in education have been written by actual educators (8.9%), with the majority of authors coming from computer science and STEM backgrounds." I can see why that would be a problem. As the authors note, "This raises the question of how much reflection has occurred about appropriate pedagogical applications of AI." On the other hand, most educators have very little background in artificial intelligence. One wonders whether their role would be to simply ensure that AI does education the way it has always been done. Or maybe the results of this systematic review are misleading (it wouldn't be the first time for a systematic review) and that educators actually are involved in AI in education. After all, how could they not be?
I have not been the target of spies and spyware, at least, not to my knowledge, though common prudence dictates that I be aware and watchful for such things. After all, it would take just one disgruntled official somewhere to cause some secret agency to come along to try to discredit my research. So I'm thankful for the work of Citizen Lab, the University of Toronto’s digital surveillance and human-rights watchdog. This article profiles their work from the perspective of two recent high-profile cases.
As noted on Metafilter, "opensyllabus.org scraped 6 million syllabi and put them into a searchable database." Though I can't recommend the searches that Metafilter members immediately conducted. Nonetheless, browsing through this resource made me wonder what Tony Hirst or Matthias Melcher might do with it. Now to be clear: there's no way to actually view the syllabi, which is unfortunate and not very open (here's a sample). You can only see the list of books assigned in (or associated with) the syllabi (which essentially makes the site a great big Amazon affiliate engine). Here's some background on how the project was put together.
In the wake of recent news of Clearview AI being used in New York schools we read that "more than 600 law enforcement agencies have adopted facial recognition software from a company called Clearview AI in just the past year... in the name of security and public safety." Tim Stahmer points to an important observation by Edward Snowden on this topic: "Ask yourself: at every point in history, who suffers the most from unjustified surveillance? It is not the privileged, but the vulnerable. Surveillance is not about safety, it’s about power. It’s about control." The presumption here - and one that is borne out in observation, I think - is that the privileged are not surveilled, but the poor are.
A similar point was made in a terrific interview with Desmond Cole by CBC's local news. Among other things, Cole notes that armed police are never brought into rich white private schools, even though drug dealing and everything else may happen there, only the poor schools with minority populations. I strongly recommend listening to this interview.
This short post raises a good issue: how do independent researchers obtain ethical review for their work? Being outside institutions, they do not have access to institutional review boards (IRBs). According to this post, the British Educational Research Association (BERA) revised its Ethical Guidelines (48 page PDF) and added some case studies to provide guidance, but these don't really address the issue directly, in my view. Nor does this post address the issue, suggesting instead ways to educate independent researchers on ethics, while saying maybe publishers could consider their ethical review requirements, and, of course, that more research should be conducted. But I don't want to be too critical; I don't see anyone else talking about this problem. See also this from BERA.
Maren Deepwell links to slides from two recent keynotes (one of which was done with Martin Hawksey). Though there's some overlap, they're quite different, and the slides are definitely worth a look (I was interested to note that 'Content Management Systems and VLEs' remained the number one concern of ALT members over the years). Deepwell highlights common themes from the two events (paraphrased): more tech-critical voices, widespread digital adoption, negativity about pace and productivity, and a drive to resist surveillance capitalism.
I have traveled to hundreds of conferences over the years, and I assume the invitations will continue (though perhaps less frequently now that I will no longer allow myself to be crammed into a discount airline seat). And people are becoming more aware of the environmental impact of air travel to conferences. Which is fair enough, and I certainly agree with the intent. And that is essentially the scope of Bryan Alexander's post here. But we need a more nuanced analysis. We need as a society to make it possible to work, live and travel without destroying the environment. I drive an electric car, but I cannot convince NRC to install charging stations (I've been trying for years now). We should be able to heat our homes and offices, to move about the country, and even to fly using clean renewable energy. That's what should be fixed. Here are my thoughts on this, expanded.
This newsletter is sent only at the request of subscribers. If you would like to unsubscribe, click here.
Know a friend who might enjoy this newsletter? Feel free to forward OLDaily to your colleagues. If you received this issue from a friend and would like a free subscription of your own, you can join our mailing list. Click here to subscribe.
Copyright 2020 Stephen Downes Contact: firstname.lastname@example.org This work is licensed under a Creative Commons License.