The title is provocative, but maybe a bit overstated. Here's the argument: why not have students analyze AI-generated writing (instead of writing their own essays)? Because "this approach becomes the dominant mode, displacing rather than supplementing the generative work students need to do themselves." You can only get so far studying what others have written; you have to write for yourself to really understand it. Couros decomposes the original suggestion, identifying the assumptions it rests on (for example: that students are able to analyze writing, that students don't need to generate their own). But even more important is the risk that students won't develop sufficient critical thinking skills. "Critical media literacy isn't just a nice academic skill. It's a survival capacity. And we're proposing to develop it by removing the very experiences that might allow students to understand, at a visceral level, what synthetic content lacks." But... is that the skill people really need? We need better standards than "two legs good, zero legs bad." I think what we really need (and have never really been taught well) is the means to distinguish between what can be trusted and what can't (no matter who or what created it).

