[Presentation] Learning Analytics: Threats and opportunities

Update 10/02/2015: A recording (slides/audio) is available here

Today I’ve been invited to ALT’s White Rose Learning Technologists SIG to talk about Learning Analytics and educational mining of Twitter. The Learning Analytics part of this was something I was somewhat reluctant to do as I have recently realised how much I don’t know about this area. Another factor was my fear that learning analytics is being eroded by ‘counts’ rather than actually caring about the learner. This appears to be a shared concern and I was fortunate to recently see a talk given by Dragan Gasevic which addressed this. A recording of his session is currently here and Dragan’s slides are on Slideshare, both of which I heavily reference in my own talk. I’ve put my own slides online and a recording may be available soon. Here are some notes/reflections:

Absence of theory

Amazon cares not a whit *why* people who buy german chocolate also buy cake pans as long as they get to the checkout buying both – Mike Caulfield – Short Notes on the Absence of Theory

I was fortunate to be in the bar room when ‘absence of theory’ was being discussed at MRI13. The thing that hit me hardest was the reflection that throughout my career there has been an absence of theory. Like many other learning technologists I jumped into this area from another discipline, in my case structural engineering. Consequently I started in this area with more knowledge of the plastic analysis of portal frames than educational theory. Being a curious person has taken me down numerous avenues, and often along the way I’ve been lucky to work and learn with some of the best. For example I was fortunate to work with Professors David Nicol and Jim Boyle at the University of Strathclyde, Jim arguably being responsible for importing Peer Instruction to the UK. So while I have some theory I don’t have enough, and whilst I have connections to some of the best people in LA my job isn’t aligned. But enough about me: without theory the danger is you have data but no actual insight into what it means.

Visualizations

Graphs can be a powerful way to represent relationships between data, but they are also a very abstract concept, which means that they run the danger of meaning something only to the creator of the graph … Everything looks like a graph, but almost nothing should ever be drawn as one. – Ben Fry in ‘Visualizing Data’

The consequence is ‘every chart is a lie’, a representation of data defined by its creator. One option here is to turn the learner into the creator. With modern web browsers it’s becoming even easier for someone to become the explorer of their own data. Dashboards, which appear to have the same appeal as Marmite, can also be personalised to give more meaning to the learner. Even with personalisation and customisation there is a danger of misinterpretation, which Dragan highlighted with Corrin, L., & de Barba, P. (2014), ‘Exploring students’ interpretation of feedback delivered through learning analytics dashboards’.


Ethics and privacy

The worlds of privacy and analytics intersect …not always happily – Stephen Downes

I was browsing some slide decks by Doug Clow as part of the LACE Project and he captured the sentiment nicely, highlighting that there needs to be transparency when using learning analytics. He contextualised this around guidance and support rather than surveillance and control. Given the varying degrees of apathy I see around data and privacy this is a conversation always worth having. There is a clear outline of ethical considerations in the Analytics for Education chapter penned by my co-authors Sheila MacNeill and Lorna Campbell:

Ethical Issues

As institutional managers, administrators and researchers are well aware, any practice involving data collection and reuse has inherent legal and ethical implications. Most institutions have clear guidelines and policies in place governing the collection and use of research data; however it is less common for institutions to have legal and ethical guidelines on the use of data gathered from internal systems (Prinsloo & Slade, 2013). As is often the case, the development of legal frameworks has not kept pace with the development of new technologies.
The Cetis Analytics Series paper on Legal, Risk and Ethical Aspects of Analytics in Higher Education (Kay, Korn, & Oppenheim, 2012) outlines a set of common principles that have universal application:

  • Clarity – open definition of purpose, scope and boundaries, even if that is broad and in some respects open-ended.
  • Comfort and care – consideration for both the interests and the feelings of the data subject and vigilance regarding exceptional cases.
  • Choice and consent – informed individual opportunity to opt-out or opt-in.
  • Consequence and complaint – recognition that there may be unforeseen consequences and therefore provision of mechanisms for redress. (p. 6)

In short, it is fundamental that institutions are aware of the legal and ethical implications of any activity requiring data collection before undertaking any form of data analysis activity.

Opportunity

At best analytics can help start a conversation. People have to be willing to take the conversation on – Roberts, G. Analytics are not relationships

My biggest fear is that Learning Analytics just becomes ‘computer says no’. I’m reassured that there are many people working very hard to make sure this doesn’t happen, but in the glitz and glamour of ‘big data’, prediction algorithms and dashboards there is a danger that we start caring about the wrong thing. For me the biggest opportunity is when analytics are used as feedback, helping to inform the conversation.


Join the conversation

2 comments
  • acooper667

    Martin – thanks for that nice readable slice of angst!
    I share your concerns and, as someone who has been de-facto promoting learning analytics, I somewhat-fatalistically expect to utter “that’s not what I meant…” more frequently as time goes on.
    I think it is important that we don’t just stick together and grumble (not that I’m saying you are, maybe I am ;-), but get out there and engage with those who we think are leading us to misadventure. I suggest this should involve active promotion of positive ideas in addition to critique. In this respect, I think emphasising the role of analytics in assessment and feedback has potential because there is established academic practice to build on, because we know this is an issue for student satisfaction, and because it has a clear link to “student success” (a term I dislike because it is wholly under-specified, frequently used in rhetoric…). When it comes to predictive models, I suggest we try to turn the tables and emphasise the models over prediction, i.e. to ask about the factors that seem to be influential and to ask “how can we improve?”, “what are we doing wrong that means that students like this are not succeeding?”. These beg more questions and stimulate challenge to the status quo; analytics is then expanding possibilities rather than homing in (and potentially objectifying the student and neglecting context).
    Cheers, Adam

  • Martin Hawksey

    Hi Adam – thanks for your thoughts. I agree proactively highlighting positive approaches to analytics is important, which is why I follow the LACE Project with interest ;). My grumbles are perhaps more directed at myself as I wrestle with the opportunities.
    cheers,
    Martin

