Robot Teachers, Racist Algorithms, and Disaster Pedagogy


This is Audrey Watters speaking to a class of students about the ed-tech industry, and especially about the use of algorithms to surveil students and predict outcomes. She covers some recent events, including Ofqual's ill-conceived decision to adjust student grades based on the historical performance of the school they attend. But it makes me think: isn't this what we do anyway? The algorithms magnify and weaponize the bias, but the bias is there nonetheless. Even without AI, elite institutions somehow manage to admit more students from private upper-class schools, no matter what their grades. Even without AI, exam proctors will regard darker-skinned students with more suspicion. We need to look at AI in a way Watters unfortunately doesn't usually consider: as a way to expose and redress bias and prejudice, rather than merely magnify it. Because simply going back to the way things were is not an option for so many people.



Stephen Downes, Casselman, Canada
stephen@downes.ca

Copyright 2024

Creative Commons License.
