Learning analytics is something every good education provider takes seriously, and it is a frequent topic of discussion among our team and clients. With student engagement and retention an ongoing challenge for universities and colleges, mastering learning analytics is key.
Before we get into the specifics and your 10-step blueprint, we recommend checking out the blogs we’ve previously written on the topic:
Moodle Analytics and Reporting: what you should be measuring for student success.
Moodle Activities you should be using for better engagement and course experience.
In-built Analytics Models in Moodle
Moodle has two built-in analytics models:
1. One for identifying students at risk of dropping out, and
2. One for discovering courses with no teaching activity (i.e. abandoned courses).
The ‘Students at risk of dropping out’ model sounds especially promising but, like many analytics tools in the world of eLearning, it has its limitations. It is most effective with fully online courses, or blended learning courses that have a substantial online component.
Your 10 Step Blueprint for Predictive Analytics
A high-level, practical 10-step approach to an initial feasibility assessment, and your first iteration towards predictive analytics, looks something like this:
1. Set up a secure test environment with a copy of your production Moodle application, files and database.
2. Before enabling and running any scheduled tasks, ensure your test environment cannot send email and is otherwise locked down (perhaps even consider anonymising user data!)
3. Create an ad-hoc report to review what percentage of your courses have start and end dates set, what the average duration is, and how many run for less than 12 months.
4. If course start/end dates are not set, use the CLI script to guess and apply them, or set them in bulk based on other course metadata surfaced by your report.
5. Spot-check (or run a report on) the number of course sections in applicable courses, to confirm that content generally progresses chronologically across sections (e.g. week-by-week or topic-by-topic).
6. Ensure course activity is required in the final quarter of course delivery, for example submission of a final assessment or exam via an assignment or quiz activity.
7. Consider how many predictions you want throughout course delivery; this determines which time-splitting method you’ll configure.
8. Apply the Analytics settings as required.
See https://docs.moodle.org/405/en/Analytics_settings, and consider whether your site is big enough to warrant the high-performance Python backend for training the machine learning model.
9. Review the user roles/capabilities that control which users can manage the models and list the insights.
10. View the insights on students predicted to drop out, validate against actual student outcomes, and decide on next steps to deploy to production.
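Step 3’s ad-hoc report boils down to a simple calculation over course start and end dates. As a rough, hypothetical sketch in Python (this is not Moodle code; the `Course` records simply stand in for rows of Moodle’s course table, where `startdate`/`enddate` are Unix timestamps and 0 means unset):

```python
from dataclasses import dataclass

SECONDS_PER_DAY = 86400

@dataclass
class Course:
    """Minimal stand-in for a row of Moodle's course table.
    startdate/enddate are Unix timestamps; Moodle stores 0 when unset."""
    id: int
    startdate: int
    enddate: int

def summarise_course_dates(courses):
    """Report the share of courses with both dates set, the average
    duration of those that do, and how many run under 12 months."""
    dated = [c for c in courses
             if c.startdate and c.enddate and c.enddate > c.startdate]
    durations = [(c.enddate - c.startdate) / SECONDS_PER_DAY for c in dated]
    return {
        "total": len(courses),
        "with_dates": len(dated),
        "pct_with_dates": round(100 * len(dated) / len(courses), 1) if courses else 0.0,
        "avg_duration_days": round(sum(durations) / len(durations), 1) if durations else 0.0,
        "under_12_months": sum(1 for d in durations if d < 365),
    }
```

In practice you would run the equivalent query against your database or build it with Moodle’s reporting tools; the sketch just shows the figures worth collecting for the feasibility assessment.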
Some potential limitations (which can still be overcome) mainly centre around:
- Courses running for no more than 12 months
- Content in courses being split up and structured progressively across course sections
- Courses having fixed start and end dates set
- Activities in courses primarily using Moodle core activities (rather than third-party plugins or external activities, such as activity tools integrated via LTI)
Steps 3-7 of our Analytics Blueprint above should help you assess how far the default models will immediately assist you, and whether you want to delve deeper, on your own or with Moodle experts, to understand and overcome any limitations that exist.
These limitations stem mainly from assumptions built into the default machine learning models for predicting, and preventing, students dropping out:
Generally, the models assume each student intake is enrolled into its own course for that group of students, rather than a single “always open” course accumulating multiple intakes over a long period of time.
Learners are also expected to come into a fresh course that has a start, be required to keep engaging throughout, and still be active towards the end: there should be consistent progress and participation.
If course completion is a traditional written test in a “bricks-and-mortar” exam hall (and doesn’t require activity in Moodle in the last quarter of the course), it is difficult for any model to tell whether a student is still engaged or has already dropped out. Step #10 points to the need to validate analytics against delivery practice to assess the actual level of participation.
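To make the time-splitting idea in step 7 concrete: Moodle’s “quarters” method divides each course’s delivery period into four equal analysis intervals, with a prediction calculated at the end of each. A minimal sketch of that interval arithmetic (illustrative only, not Moodle’s actual implementation):

```python
from datetime import datetime, timedelta

def analysis_intervals(start, end, parts=4):
    """Split a course delivery period into `parts` equal intervals,
    in the spirit of Moodle's 'quarters' time-splitting method:
    one prediction is calculated at the end of each interval."""
    step = (end - start) / parts
    return [(start + i * step, start + (i + 1) * step) for i in range(parts)]

# A 12-week course yields prediction points at weeks 3, 6, 9 and 12.
start = datetime(2025, 1, 6)
end = start + timedelta(weeks=12)
intervals = analysis_intervals(start, end)
```

If the final interval contains no required Moodle activity (the offline-exam scenario above), the last prediction has little signal to work with, which is why step 6 of the blueprint matters.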
Whether you:
- have had mixed success with your analytics tools,
- are looking for more tools,
- want to validate the analytics and reporting you’ve developed, or
- are embarking on a learner outreach campaign (to provide additional support or “interventions” for students who you believe are at risk of dropping out),
we recommend you speak with a team of Moodle experts who work with clients in the education industry on a day-to-day basis. At Catalyst IT Australia, we have helped hundreds of clients – universities, colleges and registered training organisations – achieve their e-Learning goals and are always happy to share the lessons we learn with the wider Moodle community!
You may also like: Getting access to your data in Moodle.
