Week 5: Learning Analytics – Introducing the concept

“Super Crazy Dashboard” by Aaron Parecki. Creative Commons licence CC BY

The e-Learning Ecologies MOOC: Week 5 Question

Make a post introducing a recursive feedback concept. Define the concept and provide at least 1 example of the concept in practice. Be sure to add links or other references and images or other media to illustrate your point. If possible, select a concept that nobody has addressed yet so we get a well-balanced view of recursive feedback. Recursive feedback concepts might include:

Formative assessment; Continuous assessment; Criterion-referenced (versus norm-referenced) assessment; Intelligent tutors; Educational data mining; Learning analytics; Dashboards and mashups; Quizzes; Computer adaptive testing; Diagnostic testing; Peer review; Automated writing evaluations; Or suggest a concept in need of definition!

My Response

Week 5 of the e-Learning Ecologies MOOC takes us to the fourth of the seven “e-affordances”. This time we are looking at the notion of recursive feedback – feedback that is timely and can include continuous, machine-mediated human assessment from multiple perspectives (peers, self, teacher, parents, invited experts etc.). The concept I would like to examine, one that could potentially play a very powerful role in recursive feedback approaches, is learning analytics (LA).

The notion of learning analytics (LA) is a relatively new one, though it may have its early roots in “business intelligence”, a term coined by H.P. Luhn in 1958 (Cooper, 2012b). Like many new terms, it has attracted a number of attempts at definition. The problem with defining any “buzz word”, as we have seen with digital literacy, is that over-use and “band-wagon jumping” reduce the specificity of the word or phrase to an “empty shell” definition (Baume, 2012) – that is, such definitions need to be “filled up”, or fully developed, before they can be applied to policy, strategy and practice. The gamut of LA definitions currently includes:

…is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs (SOLAR, 2011:4).

Analytics is the use of data, statistical analysis, and explanatory and predictive models to gain insights and act on complex issues (Bichsel, 2012:6).

Analytics is the process of developing actionable insights through problem definition and the application of statistical models and analysis against existing and/or simulated future data (Cooper, 2012a:3).

Cooper’s (2012a) definition sees analytics as three activities: 1) data provision, 2) interpretation and visualisation, and 3) actions based on insights.
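Cooper’s three activities can be illustrated with a minimal, hypothetical sketch – all names, data and thresholds here are invented for illustration, not drawn from any real VLE:

```python
# 1) Data provision: raw VLE login counts per student (invented data).
raw_logs = [
    {"student": "A", "logins": 14},
    {"student": "B", "logins": 2},
]

def interpret(logs, threshold=5):
    # 2) Interpretation: turn raw counts into a simple at-risk flag.
    return {row["student"]: row["logins"] < threshold for row in logs}

def act(flags):
    # 3) Action: generate a follow-up task for each flagged student.
    return [f"Tutor to contact student {s}" for s, at_risk in flags.items() if at_risk]

print(act(interpret(raw_logs)))  # → ['Tutor to contact student B']
```

Even a toy pipeline like this makes the point that the value lies in the third step: data and visualisation matter only insofar as they lead to an intervention.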

SOLAR’s “Open Learning Analytics” (2011) proposal associates learning analytics with the kind of “big data” used in “business intelligence” – the intersection of data and insight. When applied to the education sector, these analytics fall into two broad “schools of thought”: learning analytics (course and departmental levels) and academic analytics (institutional, regional, national and international levels).

Greller & Drachsler (2012) suggest that learning analytics is made up of “soft” (e.g. society) and “hard” (e.g. data) “critical dimensions”. However, they warn against technological (e.g. biometrics), ethical (e.g. data surveillance), legal (e.g. privacy), social (e.g. exploitation of data for commercial purposes) and human (e.g. competencies in making sense of data) issues and constraints that could undermine the LA project. Buckingham Shum (2012:6), meanwhile, argues that “data is not neutral” and is “infused with human judgment”: it would be naive of us to think that the data we hold is wholly accurate, correct or complete – what Miller & Mork (2013) would describe as “data fragmentation”. However, Pardo & Siemens (2014:443) suggest that a way forward in dealing with privacy and ethical issues is to look to other areas, such as medical research, with a view to “analyzing the possibility of translating some of the policies used in those fields to learning analytics”.

We currently use the rudimentary learning analytics tools within our virtual learning environment (VLE) to monitor student engagement with the course. This is especially useful with our first-year students as they transition from school or Further Education (FE) to Higher Education (HE), and it provides tutors with an “early warning system” that enables them to offer proactive interventions. The flip-side to this is the institutional pressure to maintain healthy student attrition and retention rates, which, in itself, is linked to student fees. As the 2011 Horizon Report in Higher Education (Johnson et al., 2011) notes, learning analytics has the “considerable potential to enhance teaching, learning, and assessment”.
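An “early warning system” of the kind described above could, in principle, be as simple as a rule over basic activity records. The sketch below is purely hypothetical – the field names, dates and thresholds are assumptions for illustration and do not reflect any particular VLE:

```python
from datetime import date

# Invented VLE activity records: last login date and assignment submissions.
students = [
    {"name": "Asha", "last_login": date(2015, 2, 16), "submissions": 3},
    {"name": "Ben",  "last_login": date(2015, 1, 20), "submissions": 0},
]

def early_warnings(records, today, max_idle_days=14, min_submissions=1):
    """Flag students whose engagement falls below simple thresholds."""
    flagged = []
    for r in records:
        idle_days = (today - r["last_login"]).days
        if idle_days > max_idle_days or r["submissions"] < min_submissions:
            flagged.append(r["name"])
    return flagged

print(early_warnings(students, today=date(2015, 2, 20)))  # → ['Ben']
```

In practice, of course, real systems combine many more signals than this, but even crude threshold rules like these can prompt a tutor to make contact before a student disengages entirely.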


References
Baume, D. (2012). “Digital Literacy and Fluency: SEDA initiatives supporting an enlightened approach to Academic Development in the field”. Educational Developments, 13(2), pp. 6-10. Available at: http://www.seda.ac.uk/resources/files/publications_129_Ed%20Devs%2013.2%20v3%20%28FINAL%29.pdf [Accessed 20.2.2015].

Bichsel, J. (2012). Analytics in Higher Education: Benefits, Barriers, Progress and Recommendations (Research Report). Louisville, CO: EDUCAUSE. Available at: http://net.educause.edu/ir/library/pdf/ERS1207/ers1207.pdf [Accessed 20.2.2015].

Buckingham Shum, S. (2012). Learning Analytics: UNESCO IITE Policy Briefing (Draft). Knowledge Media Institute, The Open University, UK: Milton Keynes. Available at: http://people.kmi.open.ac.uk/sbs/wp-content/uploads/2012/10/UNESCOIITE-LearningAnalytics.v4.pdf [Accessed 20.2.2015].

Cooper, A. (2012a). “What is Analytics? Definition and Essential Characteristics”. Analytics Series, 1(5). Bristol: JISC CETIS. Available at: http://publications.cetis.ac.uk/2012/521 [Accessed 20.2.2015].

Cooper, A. (2012b). “A Brief History of Analytics”. Analytics Series, 1(9). Bristol: JISC CETIS. Available at: http://publications.cetis.ac.uk/2012/529 [Accessed 20.2.2015].

Greller, W. & Drachsler, H. (2012). “Translating Learning into Numbers: A Generic Framework for Learning Analytics”. Educational Technology & Society, 15(3), pp. 42-57. Available at: http://www.ifets.info/journals/15_3/4.pdf [Accessed 20.2.2015].

Johnson, L., Smith, R., Willis, H., Levine, A. & Haywood, K. (2011). The 2011 Horizon Report: Higher Education Edition. The New Media Consortium: Austin, Texas. Available at: http://redarchive.nmc.org/publications/horizon-report-2011-higher-ed-edition [Accessed 20.2.2015].

Miller, H.G. & Mork, P. (2013). “From Data to Decisions: A Value Chain for Big Data”. IT Professional, 15(1), pp. 57-59. Available at: http://dx.doi.org/10.1109/MITP.2013.11 [Accessed 20.2.2015].

Pardo, A. & Siemens, G. (2014). “Ethical and Privacy Principles for Learning Analytics”. British Journal of Educational Technology, 45(3), pp. 438-450. Available at: http://dx.doi.org/10.1111/bjet.12152 [Accessed 20.2.2015].

SOLAR. (2011). Open Learning Analytics: An Integrated & Modularized Platform. SOLAR. Available at: http://solaresearch.org/OpenLearningAnalytics.pdf [Accessed 20.2.2015].