November 21, 2022

Five thoughtful ways to approach artificial intelligence in schools

By Greg Thompson, Kalervo Gulson and Teresa Swist

The use of artificial intelligence in schools is the best example we have right now of what we call a sociotechnical controversy. Partly as a result of political interest in using policy and assessment to steer the work being done in schools, partly due to technological advances, and partly due to the need for digital learning during COVID lockdowns, controversies have emerged regarding edtech and adaptive technologies in schools.

An emerging challenge for research has been how to approach these controversies, which require technical expertise, domain expertise and assessment expertise to fully grasp the complexity of the problem. That’s what we term a ‘sociotechnical controversy’: an issue or problem where one set of expertise is not enough to fully grasp, and therefore respond to, the issue at hand. A sociotechnical problem requires new methods of engagement, because:

  1. No one set of expertise or experience is able to fully address the multi-faceted aspects of a sociotechnical controversy.
  2. We need to create opportunities for people to come together, to agree and disagree, to share their experience and to understand the limits of what is possible in a given situation. 
  3. We have to be interested in learning from the past to try to understand what is on the horizon, what should be done and who needs to be made aware of that possible future. In other words, we want to be proactive rather than reactive in the policy space.

We are particularly interested in two phenomena that seem common in Australian education. The first concerns policy, and the ways that governments and government authorities make policy decisions and impose them on schools with little time for consideration, little resourcing devoted to professional preparation, and little awareness of possible unintended consequences. Second, there tends to be a reactive rather than proactive posture with regard to emerging technologies and their potential impacts on schools and classrooms.

This particularly pertains to artificial intelligence (AI) in education, where sides tend to be drawn between those who proselytise about the benefits of education technology and those worried about robots replacing teachers. In our minds, the problem of AI in education could be usefully addressed through a focus on the controversy in 2018 regarding the use of automated essay scoring technology, which uses AI, to mark NAPLAN writing assessments. We focused on this example because it crystallised much about how AI in schools is understood, how it is likely to be used in the future, and the impacts it could have on the profession.

On July 26, 2022, 19 academic and non-academic stakeholders, including psychometricians, policy scholars, teachers, union leaders, system leaders, and computer scientists, gathered at the University of Sydney to discuss the use of Automated Essay Scoring (AES) in education, especially in primary and secondary schooling. The combined expertise of this group spanned: digital assessment, innovation, teaching, psychometrics, policy, assessment, privatisation, learning analytics, data science, automated essay feedback, participatory methodologies, and emerging technologies (including artificial intelligence and machine learning). The workshop adopted a technical democracy approach, which aimed not at consensus but at productive dialogue through tension. We collectively experimented with AES tools and, importantly, heard from the profession regarding the challenges they knew AES would pose for schools and teachers. Our view was that, as AI and tools such as AES are not going away and are already widely used in countries like the United States, planning for their future use is essential. The group also reinforced that any use of AI in schools should be rolled out in such a way as to place those making decisions in schools, professional educators, at the centre of the process. AI and AES will only be of benefit when they support the profession rather than seek to replace it.

Ultimately, we suggested five key recommendations.

  1. Time and resources have to be devoted to helping professionals understand, and scrutinise, the AI tools being used in their context. 
  2. There needs to be equity in infrastructure and institutional resourcing so that all institutions have the opportunity to engage with the AI tools they see as necessary. We cannot expect individual schools and teachers to overcome inequitable infrastructure such as funding, availability of internet, and access to computational hardware.
  3. Systems that are thinking of using AI tools in schools must prioritise Professional Learning opportunities well in advance of the rollout of any AI tools. This should not come on top of the workload of an already time-poor profession.
  4. Opportunities need to be created to enable all stakeholders to participate in decision-making regarding AI in schools. It should never be something that is done to schools, but rather supports the work they are doing.
  5. Policy frameworks and communities need to be created that guide how to procure AI tools, when to use AI, how to use it, and why schools might choose not to use AI in particular circumstances.

From working with diverse stakeholders, it became clear that the introduction of AES in education should always work to reprofessionalise teaching and must be informed by multiple stakeholder expertise. These discussions should not only include policymakers and ministers across state, territory, and national jurisdictions but must recognise and incorporate the expertise of educators in classrooms and schools. A cooperative process would ensure that diverse stakeholder expertise is integrated across education sector innovation and reforms, such as future AES developments. Educators, policymakers, and EdTech companies must work together to frame the use of AES in schools, as it is likely that AES will be adopted over time. There is an opportunity for Australia to lead the way in the collective development of AES guidance, policy, and regulation.

Link to whitepaper & policy brief: https://www.sydney.edu.au/arts/our-research/centres-institutes-and-groups/sydney-social-sciences-and-humanities-advanced-research-centre/research.html

Greg Thompson is a professor in the Faculty of Creative Industries, Education & Social Justice at the Queensland University of Technology. His research focuses on the philosophy of education and educational theory. He is also interested in education policy, and the philosophy/sociology of education assessment and measurement with a focus on large-scale testing and learning analytics/big data.

Kalervo Gulson is an Australian Research Council Future Fellow (2019-2022). His current research program looks at education governance and policy futures and the life and computing sciences. It investigates whether new knowledge, methods and technologies from life and computing sciences, with a specific focus on Artificial Intelligence, will substantively alter education policy and governance.

Teresa Swist is Senior Research Officer with the Education Futures Studio and Research Associate for the Developing Teachers’ Interdisciplinary Expertise project at the University of Sydney. Her research interests span participatory methodologies, knowledge practices, and emerging technologies. She has a particular interest in how people with diverse expertise can generate ideas, tools, and processes for collective learning and socio-technical change.

