ALT Members' views on Learning Analytics

ALT was approached by a writer for ManagerSeminare (a German monthly magazine on professional training and development) asking for comment on Learning Analytics. Given that the questions asked for the views of ALT Members, we put them to the ALT-MEMBERS JiscMail list for discussion. These are the resulting reflections, gathered 21st-28th March 2014. The editorial process has made only minor corrections to spelling and grammar. The editors' notes are in italics.

What is your definition of learning analytics and how would you distinguish learning analytics from educational data mining?

A number of definitions were put forward:

Learning analytics (LA) is data generated by engagement in online learning activity, usually relating to how/when activities are accessed, scores, etc. Educational data mining (EDM) is the extraction of data for analysis, possibly for commercial purposes or to identify trends for research purposes.

Learning analytics is the intelligent use of data about learner behaviour. It overlaps with educational data mining but can also include data about an individual learner’s behaviour. Data mining assumes using bulk data.

From the EDM community: “Educational Data Mining is an emerging discipline, concerned with developing methods for exploring the unique types of data that come from educational settings, and using those methods to better understand students, and the settings which they learn in.”

There were several comments on the value in making a distinction:

Not sure how much you want to focus on agreed definitions in contrast to interpreted ones, but this one is from the Society for Learning Analytics Research (SoLAR): “SoLAR defines learning analytics as the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.”

I don’t see, or perhaps don’t want there to be, a big distinction between the two. LA has more origins in learning, EDM in computer science and knowledge discovery. Both are converging and should converge with LA knowing more about EDM and vice versa.

The final reflection was:

If we stick to the agreed definitions, EDM is much more about the development of models and techniques (some of which are borrowed from completely different domains) and experimenting with/fitting them in the educational context. In my view this lends itself more to exploration, R&D and academic pursuit.

LA is much more about the application of these techniques with something specific in mind (i.e. the improvement of learning and teaching, the learning environment or the student experience). In fact I don’t think that LA is much different from the traditional quality enhancement process, just bringing in more empirical data and using more sophisticated techniques borrowed from EDM, thanks to the increased availability of tracking data of some sort, which is a lot more precise than self-reported metrics (surveys) or subjective teacher/student reflections.

How can we use learning analytics to adapt and improve learning processes?

A number of the comments on this question asked us to remember the learner:

It is important that analytics be seen as a way of informing learners as well as instructors/managers, and also as a way of influencing learning design. Well-designed analytics tools, for example, can help a student to work more effectively and better understand their progress.

Which was responded to with:

I agree with the above. I think it’s useful to consider them as one of a variety of the feedforward mechanisms we can use.

Other suggestions were:

By identifying single points of design failure and success contexts.

Use data to inform students, teachers, designers, researchers about learning by learners.

Could you give an example?

Examples were:

Our Kaltura integration [kaltura.com] gives the course tutor an opportunity to see how students are using their video resources (drop-out points etc.), helping to inform learning design.

Analysis of failure in online course completion rates showed high failure at the start and in submitting the first project. This led to a new ‘#getting started’ module.

Using information on students’ misconceptions in algebra for a subsequent lesson addressing those misconceptions.

[It’s worth remembering that whilst the term ‘learning analytics’ is relatively new, there are a number of well-established instruments and techniques for improving learning design. What LA has done is focus the attention of faculty on exploring increased use of data-informed insight, helped by improved data collection and handling that make analysing things like video drop-out points easier.]
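[As a purely illustrative sketch of the kind of drop-out analysis mentioned above, the short Python script below counts where viewers stop watching a video. The data, field names and video lengths are invented for the example; this is not the Kaltura API, just an indication of how simple such an analysis can be once viewing data is available.]

```python
# Illustrative sketch only: counts where viewers stop watching a video,
# assuming a hypothetical export of viewing events, one tuple per
# (student_id, video_id, furthest_second_reached).

from collections import defaultdict

# Hypothetical viewing records
viewing_events = [
    ("s1", "intro", 45),
    ("s2", "intro", 620),
    ("s3", "intro", 50),
    ("s4", "intro", 615),
]

VIDEO_LENGTH = {"intro": 620}  # seconds, assumed known per video
BUCKET = 60                    # group drop-outs into one-minute buckets


def drop_out_profile(events, video_id):
    """Count how many viewers stopped watching in each one-minute bucket."""
    counts = defaultdict(int)
    for student, video, furthest in events:
        if video != video_id or furthest >= VIDEO_LENGTH[video_id]:
            continue  # ignore other videos and viewers who finished
        counts[furthest // BUCKET] += 1
    return dict(sorted(counts.items()))


if __name__ == "__main__":
    profile = drop_out_profile(viewing_events, "intro")
    for bucket, n in profile.items():
        print(f"{n} viewer(s) stopped between minute {bucket} and {bucket + 1}")
```

[In practice a tutor would read spikes in such a profile alongside the video content at those points to judge whether the drop-out reflects a design problem or a natural stopping point.]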

Based on the possibilities of “big data” – what kind of predictions and recommendations are possible?

There were some questions about the relationship of Big Data and Learning Analytics:

In Learning Analytics, Big Data is a bit of a red herring. The majority of data being dealt with has neither the volume nor the velocity to really be ‘Big Data’. More importantly, it is perhaps necessary for Learning Analytics to distance itself from Big Data. Big Data is more often interested in correlation rather than causation. So Amazon doesn’t care why you like fly fishing and grunge music as long as you end up at the checkout with both. For Learning Analytics to be effective and useful we should never forget to care. Analytics by its nature will generally churn out a number, but as educators we should care what that number means and be ready to interpret what it means for the learner and the learning environment.

To me, the Big Data hype seems to be more about management information. No particular focus on learning, unlike LA and EDM. However, perhaps some techniques can be utilized for useful LA/EDM applications.

A number of further questions were triggered:

This is about scale: if the data of a single ‘learning opportunity’ or an individual course is considered, this is not big at all. However, when one starts to aggregate to institutional level or sector level it is a completely different story. One could argue that learning is bound to its context and therefore aggregation is meaningless, but this is what we need to work on, and the evidence comes from the relatively big data in MOOCs. However, would institutions be willing to share data? What are the ethical implications? How useful would it be?

This and the following comment hint at a shortfall in skills or tangible solutions:

Empower and train educators and designers to use the techniques.

And what kind of disadvantages or risks do you see (e.g. for employees)?

Comments on caring and understanding resurfaced:

Users of online environments generally understand that they leave an electronic footprint and that their activity is tracked but should they feel that information is used maliciously or shared inappropriately there is a real risk of disenfranchising them.

There is a real risk of forgetting the human element by focussing solely on online activity. This data can be indicative at best; it’s not ‘true’ or the sole source of truth about an individual.

Which was agreed with:

Absolutely agree with above, learning is a complex, social, sometimes unconscious event – try designing an algorithm for that!

I agree with the comment about the risk of forgetting the human element. Clearly it can be of some use to gather and analyse data about how and when students interact with online resources etc. But there is a huge danger in thinking that we can understand how people really learn just by analysing data. Human learning, human achievement and human capability are complex non-linear systems (properly so called) and during the last 70 years scientists have increasingly come to realise that complex non-linear systems cannot be understood by reductionist measurement. We need less data and more understanding of the big picture.

Big data is not the same as the big picture

Concerns were also raised about using the data to find the answers you want:

Although I understand the criticism of Big Data as described above, I do see history repeating itself with a ‘clash of the paradigms’. The ‘human’ element to me seems the ‘interpretive paradigm’. Valid and worthwhile, but let’s not disregard some advantages of the positivist, more quantitative paradigm as well. I think the main risk lies in ‘data fishing’. You’re bound to find interesting things in large datasets.

The challenge is to measure what is valuable rather than valuing what is measurable.

Reflections

Terry Loane shared the following reflections on responses to the questions:

For me one of the most important current questions in education is to ask to what extent we can really understand human learning, achievement and capability through gathering and analysing quantitative data. I certainly agree with the TitanPad comment that:

Big data is not the same as the big picture.

I found the following comment particularly interesting:

Although I understand the criticism of Big Data as described above, I do see history repeating itself with a ‘clash of the paradigms’. The ‘human’ element to me seems the ‘interpretive paradigm’. Valid and worthwhile, but let’s not disregard some advantages of the positivist, more quantitative paradigm as well. I think the main risk lies in ‘data fishing’. You’re bound to find interesting things in large datasets.

I would suggest that the world should have moved on from the particular paradigm clash referred to here, the conflict between a positivist/reductionist/quantitative paradigm and an interpretive/humanist/soft/qualitative paradigm. Integrating these two approaches is what systems thinking and complexity science are all about! Yet we have seen a ‘resurgent positivism’, a very shallow data-driven world view, emerge in educational discourse in recent years. I recommend this article to anyone who is interested in this: http://files.eric.ed.gov/fulltext/EJ832194.pdf

Let me finish with the words of the great scientist Werner Heisenberg (whose ‘uncertainty principle’ was one of the key nails in the coffin of the idea that you can understand the world through measurement):

The positivists have a simple solution: the world must be divided into that which we can say clearly and the rest, which we had better pass over in silence. But can any one conceive of a more pointless philosophy, seeing that what we can say clearly amounts to next to nothing? If we omitted all that is unclear we would probably be left with completely uninteresting and trivial tautologies.

If you enjoyed reading this article we invite you to join the Association for Learning Technology (ALT) as an individual member, and to encourage your own organisation to join ALT as an organisational or sponsoring member.
