The argument in the story is in the first line: "Coursera's market debut is proof that the future of education is online." It is of course no such thing. The markets follow fads and rumours, are fickle and irrational, and have a history of making billion-dollar mistakes. Just ask the people at GameStop. Still, Coursera's chief executive Jeff Maggioncalda has a point when he says "we all will not go back to offices, we all will not go back to campuses. And so, online work and online learning - among other things - are here to stay." True. But they were here to stay before Coursera's IPO, and indeed, before Coursera ever came to market.
The Australian government has released a consultation paper (17 page PDF) as part of its consultation on an international education strategy. This has been a major sector for Australia, but institutions are being warned that "international students will not return en masse until 2022." The sector saw a 22 percent drop in new enrollments in 2020. The proposed priorities would put "students at the centre" (but really, who doesn't these days?) and seek to broaden the base of international education through new modalities and diversification. But the paper doesn't really answer the question: if you're not actually traveling to Australia, why study at an Australian institution? And "as borders remain closed, there are suggestions that students originally looking to study in Australia are moving towards other countries such as the United Kingdom and Canada." I don't doubt that Australia could provide a first-class learning experience online, but for high-paying online students, marketing matters.
One way to get the credit for developing something is to simply make the claim that you've developed it. Then, if you have the right title (at MIT Press, say) and the right platform (a scholarly publishing website, say) then you can get the credit. Thus we read here the claim "When we developed the concept at Cell Press during my time there, we called it 'Master Classes'", and expect credit to be duly awarded, as it is in some of the comments. But of course, the concept has been around since forever - we've been doing it at NRC as long as I've been there - and the idea of exchanging skills with informal in-house presentations, while good, is far from original. I wouldn't complain, but at some institutions claiming credit for work done elsewhere has been elevated to an art form.
My Pocket recommender served up Robert Epstein's The Empty Brain today, a paper I first documented here in 2016. If you haven't read it you absolutely should, because it's the most thorough refutation of the idea of 'the brain as a computer' that I've read. As I reread this paper, along with the Arthur Schopenhauer paper (see below), I started thinking about the major objection to neural network theory, namely, that neural networks are not able to perform logical tasks on their own, such as mathematical reasoning, grammatical construction, inference - you know, what Chomsky called Plato's problem. We can represent that challenge by asking whether neural networks are 'Turing complete', in other words, computationally universal. In the past, the answer to that question has been "no" - as Schopenhauer might say, you need a will as well as a representation. But when I searched Google today I found this paper (36 page PDF), which shows two popular neural network architectures "to be Turing complete exclusively based on their capacity to compute and access internal dense representations of the data," and even more importantly, "neither ... requires access to an external memory to become Turing complete."
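To make the 'logical tasks' point concrete: a single artificial neuron can compute NAND, and since NAND is a logically universal gate, networks of such neurons can in principle compute any Boolean function. This is only an illustrative sketch of circuit-level universality, not the Turing-completeness result of the paper cited above (which concerns modern architectures and their internal representations); the weights and function names here are my own hand-picked illustration, not drawn from the paper.

```python
# A single artificial neuron computing NAND, a logically universal gate.
# The weights (-2, -2) and bias (3) are illustrative values chosen by hand,
# not learned from data.

def step(x):
    """Threshold activation: fire (1) if the input is positive, else 0."""
    return 1 if x > 0 else 0

def nand(a, b):
    """One neuron: output = step(bias + w1*a + w2*b)."""
    return step(3 - 2 * a - 2 * b)

# NAND truth table: only the input (1, 1) yields 0.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", nand(a, b))
```

Wiring NAND gates together yields AND, OR, NOT and so on, which is why this tiny unit hints at how sub-symbolic components can, in aggregate, do symbolic work.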
Arthur Schopenhauer was in what we might call the 'third generation' of German philosophy, taught by second-generation thinkers (notably Fichte) who in turn studied the work of Immanuel Kant. What sort of picture of the mind must he have had, working as he did well before the age of computing and automation? His view of intelligence, at once both prescient and enormously influential, saw it composed of "will and representation". Or as we might say today: algorithm and model. There is a direct link from Schopenhauer through Wittgenstein to contemporary thought, wherein we might say, "science has its foundation in the systematic form of its representation." At the start of the last century, the only understanding we had of mind was of (something like) 'logic machines', and when we actually built some, it felt like a huge breakthrough in our understanding of human intellect. It was a completely understandable turn, it had a great outcome, but it was, in my view, wrong.
Doug Peterson raises a great question regarding the long-term sustainability of remote learning. The curriculum, he observes, is changing. Just as math today no longer looks like the math we learned as students, so also computer programming has changed. In particular, we've gone from the days when we wrote programs on a computer and got results on a screen or printout. Today's computing involves working with robots or drones using languages like Logo on tools like Lego or micro:bit. And it's not just computing: the new math curriculum may use these sorts of tools. So when the student is at home, what does the parent do? "Unless Mom and Dad have the ability to run out to an educational store and buy it," a robot or manipulatives, "you’re left with a Plan B." What does Plan B even look like? Nobody is to blame, writes Peterson, but there does need to be a plan.
Martin Weller points to the key findings on learning technology in the age of Covid from the ALT 2020 Annual Survey. There's also discussion of the results from Maren Deepwell and Helen O'Sullivan along with two additional summaries, one on trends and the other on what it means to be a learning technologist. The full survey, when it's released, will be available here (presumably). Findings include the observation that "87% of Members feel Learning Technology is more positively perceived" and that "58% of respondents felt the changes were sustainable." Weller comments that "many learning technologists have felt some sense of vindication over the past year" and finds "the finding that many felt this approach was sustainable... perhaps contrary to the view I’d formed viewing online discussion." Yeah. Don't depend on Twitter for your views on learning technology.
You might remember Elluminate, the videoconferencing system of choice back before Blackboard acquired it and turned it into Blackboard Collaborate. Looking back after the Year of Zoom, this seems like a missed opportunity for Blackboard. So what's on the roadmap for what they call their virtual classroom? There isn't much. The big feature mentioned here is the 'chat mention', which is basically the use of the @ symbol to get someone's attention using chat. There's also a 'gallery view', already familiar to users of other systems such as Shindig. To be fair, given that there's already video and breakout rooms and the usual extras for a virtual classroom, Blackboard probably saw no need to extend its subscriber base.
This newsletter is sent only at the request of subscribers. If you would like to unsubscribe, click here.
Know a friend who might enjoy this newsletter? Feel free to forward OLDaily to your colleagues. If you received this issue from a friend and would like a free subscription of your own, you can join our mailing list. Click here to subscribe.
Copyright 2021 Stephen Downes Contact: firstname.lastname@example.org
This work is licensed under a Creative Commons License.