Stephen Downes

Knowledge, Learning, Community
I need to reserve some time for a longer response to Paul Capon's response to my criticisms of the CCL report. But for now, as a reaction to the methodology employed by the report (and endorsed by various EU and OECD organizations) I offer, as an outline, this post. It is a summary - an all-too-brief and possibly incomplete summary - of the sorts of errors these organizations make. My difficulty - and I readily confess it here - is that in such matters the scientific converges with the political. It is difficult to criticize such a report without seeming political even if the criticisms are exclusively scientific; and this is compounded if the bases for the methodology employed in the report are in fact political.

Anyhow, the criticisms, which are worth listing:
  • Believing that their expertise is more valuable than the knowledge of their subjects, and that they can avoid bias.
  • Seeking to mandate ideal behaviour or organisational structure.
  • Assuming an experiment will scale, or replicate in a different context.
  • Focusing on efficiency, not effectiveness; thinking of stability, rather than resilience.
  • Using outcome-based targets for anything other than ordered systems in equilibrium states.
  • Using the wrong model of science for evidence. Evidence-based policy is all well and good, but if your model of evidence requires provable outcomes in advance, then your science model is locked into the last century.
  • Using hindsight rather than foresight.
The basis for these points (and you should read them in full in the original post) is "structured linear methods being applied to non-linear complex situations." As Dave Snowden comments in this post, very capable and competent professionals very easily fall back into these traps because they "have been trained for years in a particular approach to change."

[Direct link]

Stephen Downes, Casselman, Canada

Creative Commons License.

Copyright 2021
Last Updated: Mar 30, 2021 02:53 a.m.