Stephen Downes

Knowledge, Learning, Community

David Wiley considers the impact of tools like ChatGPT on instructional design, where this is "the process of leveraging what we understand about how people learn to create experiences that maximize the likelihood that the people who participate in those experiences will learn." But what is not instructional design, he says, is "the creation of accurate descriptions and explanations of facts, theories, and models." He appeals to the well-worn distinction between 'informational resources' and 'educational resources' (by contrast, I have long argued that what makes something a learning resource is how you use it, but I digress). To him, it's not an educational resource unless you add (at a minimum) practice and feedback. Not surprisingly, while he agrees that AIs will make it a lot easier to create informational resources, some sort of special instructional designer skill will still be required; specifically, "instructional design expertise will be reflected in the output of these systems in proportion to the degree that instructional design expertise is embedded in the prompts fed into the systems." Given that nothing else in AI is "proportional to the input," I don't see why instructional design should be the exception. I think we'll find that, to the AI, the distinction between information and education is meaningless; it's all just content.

[Direct link]

Stephen Downes, Casselman, Canada
stephen@downes.ca

Creative Commons License.

Copyright 2023
Last Updated: Jan 24, 2023 3:20 p.m.