Core area 2: Teaching, learning and/or assessment processes

Candidates should demonstrate their understanding of and engagement with teaching, learning and assessment processes. ‘Engagement’ may include using understanding to inform the development, adaptation or application of technology. Note that your learners are the people with whom you work. For teaching staff this will typically be students. For many learning technologists this may be students or the staff that you support and train. This should include evidence of:

a) An understanding of teaching, learning and/or assessment processes

Statements here might relate to areas such as teaching experience, learning design, curriculum development, work-based assessment, the creation and execution of a programme of training and so on. Evidence might include being on the register of the Higher Education Academy, a PGCE award, having completed a SEDA-approved course, extracts from your Institute for Learning (IfL) portfolio or having undertaken relevant sections of the Certified E-Learning Professional courses. Commentaries from peers on your approach would also provide suitable evidence. Other possibilities include teaching experience, reflective statements that analyse experience in terms of learning theory, pedagogic approaches, sociological theories, or a comparable, recognised perspective. In relation to learning design, a report, specification or reflective statement might be provided that clearly elaborates the principles that informed the design process. In any collection of evidence there should be some consideration of how technology is changing approaches to teaching and learning and/or the roles of learners, teachers and support staff.

For the last 10 years I’ve been interested in developments in open courses and open educational practices. The origins of this interest lie in the work of George Siemens, Stephen Downes, Dave Cormier, Bonnie Stewart and others in the design and implementation of Massive Open Online Courses (MOOCs). I’ve been particularly interested in the technology used to support open courses and how this impacts the learning design and, ultimately, the learner experience. I’ve been drawn especially to open courses with a connectivist underpinning, because in these the distinction between learner and educator often becomes blurred.

My knowledge of the design of connectivist MOOCs was used to support the course team who developed ALT’s Open Course in Technology Enhanced Learning (ocTEL). As part of my role I was responsible for providing guidance on the infrastructure used to support the delivery of the course. A parallel interest, which has developed as a result of my involvement in open course design, is Learning Analytics: the use of data to gain actionable insight into learning and the environments in which it occurs. This has manifested itself in the development of TAGSExplorer, mentioned in the previous section, a free visualisation tool designed to support learners and educators interested in using Twitter, and specifically the hashtag communities used in education.

More recently my knowledge of and interest in both open courses and learning analytics has been applied in support of the Blended Learning Essentials (BLE) courses developed for FutureLearn by the University of Leeds and UCL. As part of this I have been responsible for producing a series of course evaluation reports. These reports provide the course design team with quantitative analysis of the learner activity data generated during the courses, as well as of the pre- and post-course surveys. Because this work has been funded by Ufi, it includes analysis of some top-level indicators, such as the number of learner registrations and the conversion to active participants, but the reports also provide detailed analysis of learner engagement with each course step/activity. The reports have enabled the course team to identify important learner transition points, for example the last step/activity a learner completes, as well as overall engagement patterns that help the course team allocate resources.
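To give a flavour of this kind of analysis, the sketch below computes a registration-to-active conversion rate and a simple drop-off profile from a step activity export. It is a minimal illustration in Python/pandas, not an extract from the actual reports: the file names and columns (step-activity.csv, enrolments.csv, learner_id, step, last_completed_at) are assumptions standing in for the real FutureLearn export schema.

```python
import pandas as pd

# Hypothetical exports; real FutureLearn data files differ in detail,
# so treat these file and column names as illustrative only.
activity = pd.read_csv("step-activity.csv",
                       parse_dates=["first_visited_at", "last_completed_at"])
enrolments = pd.read_csv("enrolments.csv")

# Top-level indicator: conversion from registration to active participation
# (here "active" means the learner visited at least one step).
registered = enrolments["learner_id"].nunique()
active = activity["learner_id"].nunique()
print(f"Conversion to active: {active / registered:.1%}")

# Engagement with each step: how many learners completed it.
step_completions = (activity.dropna(subset=["last_completed_at"])
                            .groupby("step")["learner_id"]
                            .nunique()
                            .sort_index())
print(step_completions)

# Transition points: the last step each learner completed, which highlights
# where learners tend to drop out of the course.
last_step = (activity.dropna(subset=["last_completed_at"])
                     .sort_values("last_completed_at")
                     .groupby("learner_id")["step"]
                     .last())
print(last_step.value_counts().sort_index())
```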

A key design decision when developing the reports for BLE was to make it easy to replicate the analysis and to provide a degree of transparency in how the data was used. When designing the process I was aware of work Dr Tony Hirst at the Open University was doing using Jupyter Notebooks for the computational analysis of FutureLearn course data. At the time I had a conceptual understanding of Jupyter Notebooks but had never used them or written the code to create them; given their affordances, I decided to use the project as an opportunity to learn. The approach I adopted was action learning: building on the notebook created by Tony, using internet searches to learn how to extend the existing analysis, and contacting Tony when I occasionally got stuck. Like Tony, I’ve reshared my work openly in case it is of benefit to others.
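One practical way a notebook supports that replication is to keep run-specific values in a single cell at the top, so the same analysis can be re-run for each new presentation of the course. The snippet below is a sketch of that pattern, not an extract from the actual notebooks; the course slug, run number and directory layout are invented for illustration.

```python
import pandas as pd

# Run-specific parameters live in one place; re-running the notebook for a
# new course run only means editing this cell. (Values are hypothetical.)
COURSE = "blended-learning-essentials"
RUN = 3
DATA_DIR = f"data/{COURSE}-run-{RUN}"

# Every later cell reads from DATA_DIR, so the provenance of the data
# feeding each chart and table is visible in the notebook itself.
activity = pd.read_csv(f"{DATA_DIR}/step-activity.csv")
print(f"{COURSE} run {RUN}: {len(activity)} activity records")
```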

Evidence

Reflection

On a practical level the work I’ve carried out providing course evaluation reports was an opportunity to learn about Jupyter Notebooks. I’m under no illusion that what I’ve gained in this area is anything more than surface knowledge, but at the same time I believe it was the pragmatic choice, which ultimately meant the course evaluation could be delivered in a timely manner. I am aware that in some ways it would have been better to spend more time developing the analysis and to take a more iterative approach to its development, in particular giving the course team more opportunity to provide feedback during the development phases.

On reflection I’m also aware that my reporting relied purely on quantitative analysis of the data available. Qualitative analysis was conducted as part of the BLE project, but this was done separately, and there was perhaps a missed opportunity to use it to inform some of the quantitative work I was leading on. This said, the work has had a positive impact, providing the course team with information they can use in both the implementation and support of the BLE courses.

b) An understanding of your target learners

Statements should show how you have found out about learners’ needs and the context for their studies, and how you have developed approaches that reflect this. Evidence might include a description of how assistive technologies have been used to support disabled students, how learner feedback has influenced the design of an e-portfolio, how the needs of work-based learners or overseas students have shaped the curriculum, or records of conversations with product analysts, marketing departments or course teams and the resulting plans for your design. Evidence of changed practice, rather than simply the recognition that this is an important area, is required.

In the previous section I talked about my interest in open courses and the technology used to support them. I’ve also mentioned that the nature of these courses means the distinction between learner and teacher often becomes blurred: there are times when you learn from others and times when others learn from you. The development of my TAGSExplorer Twitter visualisation tool is a prime example of this. The tool grew out of my personal experience as a learner on an open course, trying to understand where I was situated within the distributed community of other learners. TAGSExplorer was developed so I could see and explore the course learner community and make new connections, both to other learners and to the knowledge they shared.
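At its core this means turning an archive of hashtag tweets into a conversation graph. The sketch below shows the general idea in Python with networkx; it is not the TAGSExplorer source, and the input file and column names (tags-archive.csv, from_user, text) are assumptions based on a typical TAGS archive export.

```python
import re
import networkx as nx
import pandas as pd

# A hypothetical CSV export of a TAGS hashtag archive: one row per tweet.
tweets = pd.read_csv("tags-archive.csv")

G = nx.DiGraph()
for _, tweet in tweets.iterrows():
    sender = str(tweet["from_user"]).lower()
    G.add_node(sender)
    # Add an edge for each @mention, capturing who is talking to whom
    # within the hashtag community.
    for mention in re.findall(r"@(\w+)", str(tweet["text"])):
        G.add_edge(sender, mention.lower())

# A simple centrality ranking suggests who sits at the heart of the
# conversation -- the kind of question TAGSExplorer helps a learner answer.
top = sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1])[:10]
for user, score in top:
    print(f"@{user}: {score:.3f}")
```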

TAGSExplorer continues to develop, albeit at a slow pace, based on feedback I receive either directly or via the TAGS Forums, and on observations of how it is used. A recent example was feedback from Alan Levine, who was looking for a way to highlight particular nodes in the conversation. Through a series of exchanges over Twitter DM I was able to identify Alan’s requirements and propose a solution we both felt would work. Following the implementation of this feature I was able, thanks to a blog post by Alan, to follow supplemental feedback from learners and users on how the feature could be improved.

Evidence

Reflection

As TAGSExplorer is an unfunded personal project, I’ve been limited in the amount of time I’ve been able to dedicate to its development. Having also completed studies in Human Computer Interaction (HCI) and usability testing, I’m very aware that there are a number of techniques I could be applying, such as cognitive walkthroughs, heuristic evaluation or even just focus groups, which could be used to improve the functionality of the tool. That said, one of the benefits of adopting the philosophy of an open practitioner is that it creates opportunities where people, aware of my interests, are happy to direct me to resources or provide feedback I can use to develop both the TAGSExplorer tool and my knowledge in this area.
