A2: Teach and/or support learning.
A core way in which I support learning is through my development of, and training in, the Virtual Learning Environment (Moodle): ensuring its basic setup is consistent across programmes and modules, and that academic staff have the knowledge and support to make courses as appropriate, accessible and beneficial as possible to their teaching and to their students’ learning. I regularly deliver Moodle training sessions to academic staff, from new starters to long-established professors, often with a wide range of technological competence in the same room. Below, I outline my experience of developing and implementing a new course template, embedded as the starting point for every taught course, following a systematic and thematic Moodle upgrade in July 2019.
My initial approach included extensive research: an audit of existing Moodle use, a case study of label-led module redesign (Evidence 3), academic feedback on draft templates, the TEL user forum, programme meetings and informal conversations, as well as a mapping of requirements for quality and monitoring purposes (K1, K2, K3, K6).
Designing training for implementation was integral to the design of the template itself. I knew from previous software training sessions that I would have to justify such a significant change, and the need for mandatory training, at a time when academics had little time to spare; it was therefore key that the template was rooted in solid teaching and learning principles, that it improved measurably on the existing setup, and that academic representation and review were integrated into every stage of its development (D2v, K4, K5, K6).
The template was built around the RASE model (Resources, Activities, Support and Evaluation; Churchill et al., 2013), as a lack of consistency and of conscious design anchoring these elements was common across existing modules. The RASE model provides a student-centred, learning outcome-driven method for structuring modules encompassing any number or type of resources, building on numerous theoretical frameworks and principles (Churchill et al., 2013) (A4, K3, V3, V4).
I developed an Assessment Overview section to address concerns raised in programme boards about students’ inability to easily reconcile a module’s teaching and learning with the expected and assessed learning outcomes (Evidence 4), whilst providing clarity and reassurance for students who were new to Higher Education or who required clear and consistent instruction, again embracing the RASE methodology (A3, A4, K3, V1, V2).
Training sessions comprised 90 minutes in a computer lab. The first 45 minutes provided a bespoke introduction centred on solutions to issues highlighted in each programme’s National Student Survey results, programme board and programme review comments, both positive and negative. I always attempted to present these with “group empathy”, ensuring adequate opportunity to re-orientate my approach if a particular group was not in the emotional or receptive space I had anticipated (Mortiboys, 2012). This was consistently effective: in all sessions staff were engaged and responsive, and appeared reassured. I was grateful for the preparatory research conducted; even those who were not especially interested could see why proficiency with the new setup would be productive for their teaching and their students’ learning (Evidence 5, a testimonial from a programme director on the content and productivity of my session) (K3, K5, V1, V2, V3).
The latter 45 minutes offered a hands-on opportunity for staff to start populating the new template and importing content, with support on hand. As long as the time was spent on the template, I did not direct this further, giving space for practical autonomy where preferred, but I would offer suggestions (e.g. “Start by completing the contact details in ‘About Your Module’”) for those who were cautious about initial engagement (Boud, 1988). This went well, with all attendees appearing to achieve confidence in the basics (evidenced by timely, good-quality population of the templates), and, where issues arose, I was able to support them appropriately (D2v, D2vi, K3, V2).
In future I would change my approach to the few colleagues we did not see in workshops. Some were reluctant to undergo training at all, insisting that a brief one-to-one would suffice. In those meetings I was not able to workshop the template effectively, to fully embed the RASE model in my explanation, or to give supported space to practise, and, invariably, these academics raised support queries before the start of semester with issues that did not arise after full sessions. It was also harder to discuss negative issues empathetically with an individual than with a full programme group (K3). Next time, I would run cross-programme wash-up sessions to ensure peer engagement, or create preliminary learning activities prior to a short, practical session.
I am in the process of creating a case study on this work, which I hope to present to the Moodle user group for peer review and discussion (D2vi). Such extensive training across a variety of learners provided a template for future delivery initiatives (such as Panopto training), helping me to shape and hone my delivery and approach and ensuring, above all, that I focus on the desired outcomes when designing, delivering and evaluating teaching and learning (K2, K3, K5, V1, V2, V4).
References
Boud, D. (1988). Developing Student Autonomy in Learning (2nd ed.). Kogan Page.
Churchill, D., King, M., Webster, B. & Fox, B. (2013). Integrating learning design, interactivity, and technology. In H. Carter, M. Gosper & J. Hedberg (Eds.), Electric Dreams: Proceedings ascilite 2013 Sydney (pp. 139–143).
Churchill, D., King, M. & Fox, B. (2013). Learning design for science education in the 21st century. Zbornik Instituta za pedagoska istrazivanja, 45, 404–421. doi:10.2298/ZIPI1302404C
Mortiboys, A. (2012). Teaching with Emotional Intelligence (2nd ed.). Routledge.