How can we design and evaluate great online learning activities? This SEDA project explored how a new tool, the e-Design Assessment Tool (eDAT), could be used to help design online learning and easily evaluate its effectiveness.
There is increasing evidence that highly effective online learning includes interaction between students and tutors, as well as ongoing feedback on learning (Croxton, 2014; Hattie, 2012). But how do we know how much interaction or feedback is needed? Do we need a constant flow of activities, or just one per week? Can we compare the number of interactive and feedback activities with attainment or retention data to tell us how effective our online learning is? These were the questions the project explored.
Online learning designs are complex and difficult to analyse (Walmsley-Smith, Machin, & Walton, 2019). The eDAT simplifies the analysis of online learning activities so that they can be quantified and compared with other data, or even other courses. The eDAT asks just two questions (the most important questions!) about each activity in a course:
- Does the activity include interaction? Is the interaction with peers, the tutor, or perhaps with AI or a simulation?
- Does the activity include feedback opportunity? Is the feedback from the tutor, peers, self-feedback (e.g., using a model answer), or automated computer feedback?
For the project, we invited tutors from a range of distance learning courses to try out the eDAT. We found that distance learning courses differ in how they are organised: some had a variety of activities each week, some were self-paced, and some were structured around topics. Some designs included many activities, others very few. However, with some modifications, the tutors were able to use the eDAT to review and reflect on their learning designs.
We had hoped to compare retention data with the activity ratios, but this was not readily available at module level, so we focussed on attainment data. In our sample, the courses with the lowest ratio of interaction and feedback activities also had the lowest pass rates. Modules with high pass rates had either a high ratio of interaction activities or a high ratio of feedback activities, but not both.
As an academic development activity, the use of the eDAT was also effective. Tutors using the tool were quickly able to see areas that needed more interaction, or to include additional opportunities for feedback. The tool stimulated valuable team discussions, and several action plans were drawn up. It is now being used as part of our academic development support for distance learning tutors and is included in our PgCHPE.
The eDAT is available to all interested learning designers and academic developers and we encourage you to explore it for yourself by clicking this link.
Croxton, R. A. (2014). The Role of Interactivity in Student Satisfaction and Persistence in Online Learning. Journal of Online Learning and Teaching, 10(2), 314–325.
Hattie, J. (2012). Visible Learning for Teachers: Maximizing Impact on Learning. London: Routledge.
Walmsley-Smith, H., Machin, L., & Walton, G. (2019). The E-Design Assessment Tool: An evidence-informed approach towards a consistent terminology for quantifying online distance learning activities. Research in Learning Technology, 27. https://doi.org/10.25304/rlt.v27.2106
Helen Walmsley Smith, E-Learning Development Officer, Staffordshire University