By @mhawksey

#LAK13: Recipes in capturing and analyzing data – Canvas Network Discussion Activity Data

In my last post I looked at the data available around a course hashtag on Twitter. For this next post I want to start looking at what’s available from the Canvas Network platform, which is being used to host Learning Analytics and Knowledge (LAK13). Sizing up what was available, I came across the Canvas LMS API documentation, which provides a similar method of accessing data to the Twitter API. I wasn’t sure whether this extended to Canvas Network, and because the authentication method it uses (OAuth2) isn’t possible with my dev tools of choice (mainly Google Apps Script), I looked for something else.
Whilst browsing the discussion page for the course I noticed that my browser was auto-detecting a feed.

An excerpt of this feed is below:

<feed xmlns="http://www.w3.org/2005/Atom">
  <link href="https://learn.canvas.net/courses/33/discussion_topics"/>
  <title>Learning Analytics and Knowledge Discussion Feed</title>
  <updated>2013-02-20T12:39:02+00:00</updated>
  <entry>
    <title>Discussion: Week 8 Discussion Forum</title>
    <id>tag:canvas.instructure.com,2013-02-01:/discussion_topics/discussion_topic_580</id>
    <updated>2013-02-01T21:15:20+00:00</updated>
    <published>2013-02-01T21:15:02+00:00</published>
    <author>
      <name>George Siemens</name>
    </author>
  </entry>
  <entry>
    <title>Discussion: Week 3 Discussion Forum</title>
    <id>tag:canvas.instructure.com,2013-02-01:/discussion_topics/discussion_topic_575</id>
    <updated>2013-02-01T21:15:56+00:00</updated>
    <published>2013-02-01T21:13:24+00:00</published>
    <author>
      <name>George Siemens</name>
    </author>
  </entry>
</feed>

Looking at the raw feed, I could see it wasn’t limited (feeds often only contain the last 10 entries) and that it contained the entire content of each message.

Update: Below I use Google Apps Script to extract the data. Since then I’ve created a ‘no code’ solution that only uses existing spreadsheet formulas. There is an accompanying presentation (the webinar should eventually appear here).

Looking for an easy way to consume this I first turned to the importFeed formula in Google Spreadsheets, but unfortunately it only returned the last 20 results. A trick I’ve used in the past is to put feeds through Yahoo Pipes to get JSON/CSV to work with, but as working with dates this way isn’t straightforward I opted for some Google Apps Script, which would create a custom formula to fetch the feed from Canvas Network and enter the results into a sheet. The code for the main part of this is below:

function getCanvasDiscussions(url) {
  // Fetch the discussion feed from Canvas Network
  var response = UrlFetchApp.fetch(url);
  if (response.getResponseCode() == 200) {
    // Parse the Atom feed and grab the <entry> elements
    var d = Xml.parse(response.getContentText()).feed.entry;
    // The first row becomes the column headings in the sheet
    var output = [['published','updated','title','author','link','id','content']];
    for (var i in d) {
      output.push([getDateFromIso(d[i].published.Text), getDateFromIso(d[i].updated.Text),
                   d[i].title.Text, d[i].author.name.Text, d[i].link.href,
                   d[i].id.Text, d[i].content.Text]);
    }
    // Returning a 2D array lets the custom formula fill a range of cells
    return output;
  }
}

getDateFromIso is a helper function I use quite often and is available in this Stack Overflow answer. Adding the above code to a Google Sheet (via Tools > Script editor…) lets me use a custom formula to fetch the data.
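For completeness, here’s a minimal sketch of the kind of helper I mean (not the exact code from the linked answer): it assumes the timestamps are in UTC, as they are in the feed excerpt above, whereas the Stack Overflow version handles arbitrary timezone offsets.

function getDateFromIso(isoString) {
  // Minimal sketch of an ISO 8601 date helper (not the exact code from the
  // linked answer). Assumes UTC timestamps like "2013-02-01T21:15:02+00:00".
  var m = isoString.match(/(\d{4})-(\d{2})-(\d{2})T(\d{2}):(\d{2}):(\d{2})/);
  // Months are zero-based in JavaScript, hence the -1
  return new Date(Date.UTC(m[1], m[2] - 1, m[3], m[4], m[5], m[6]));
}

With both functions saved, the custom formula can then be entered in a cell, for example =getCanvasDiscussions("https://learn.canvas.net/courses/33/discussion_topics") (substituting whichever feed URL your browser detected), and the returned array fills the sheet.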
Below is a quick look at the data returned (here it is published in a table). The columns give the published and updated dates, title, author, link, id and post content. As it goes this isn’t too bad. The big thing missing is that, while we can see which topic a message is in, the reply threading is lost.

Even with just this, as with the #lak13 Twitter dashboard from last week, I can quickly add some formulas to process the data and get an overview of what is going on (for the live view visit this spreadsheet – File > Make a copy if you want to edit).
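The live spreadsheet does this with formulas, but to give a flavour of the sort of overview I mean, here’s a rough Apps Script sketch that tallies posts per author from the fetched data (the sheet name ‘Data’ is my assumption; adjust it to wherever the formula output lives):

function countPostsByAuthor() {
  // Rough sketch only: the live spreadsheet uses formulas rather than this.
  // Assumes the getCanvasDiscussions() output is in a sheet named 'Data',
  // with the author name in column D (as per the header row above).
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Data');
  var rows = sheet.getDataRange().getValues();
  var counts = {};
  for (var r = 1; r < rows.length; r++) { // skip the header row
    var author = rows[r][3];
    counts[author] = (counts[author] || 0) + 1;
  }
  // Return a 2D array so this could also be used as a custom formula
  var output = [['author', 'posts']];
  for (var name in counts) {
    output.push([name, counts[name]]);
  }
  return output;
}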

This obviously isn’t a complicated analytic and it wouldn’t surprise me if the course tutors have something similar on the backend of Canvas Network. As a student it’s useful for me to see how I’m doing compared to others on the course and get a sense of who else is contributing. [Adam Cooper has a great post on How to do Analytics Right… with some tips he picked up from John Campbell, who is behind Purdue’s Signals Project, which fits in nicely here.]

Summary

So with a bit of ken and a couple of lines of code I can see how the #lak13 discussions are going. Again I’ve avoided any deep analytics, such as analysing what has been said, to whom, and at what time, but hopefully now that I’ve highlighted and freed the data you can do something else with it. Not being able to extract the conversation thread is a little disappointing, as it would have been nice to fire up SNAPP or NodeXL, but I’ll have to save those for another day 😉
