4 Ways To Perform A Post-Course Evaluation Analysis

The 4 Keys To Effective eLearning Evaluation

So, you've started your training portal. You added (wrote, imported, or bought) your learning content. You conducted your training courses. You evaluated your learners' performance. But is it time to pat your instructors on the back and declare the mission accomplished?

Sort of. There's just one more thing you need to do, especially if you plan to repeat the training for other learners (e.g. your next batch of hires). You should, if you aren't doing so already, also practice eLearning evaluation: assessing your course material and making any necessary adjustments to improve it for future learners.

In this article, we will examine the benefits of evaluating eLearning courses after a training program is complete. While there's no definitive eLearning evaluation checklist, we will describe four specific ways to go about it.

Benefits Of Post-Training Course Evaluation

When you were building your training material, you probably had some process for evaluating its quality and iterating on it until you were satisfied enough to include it in your course.

Whether that involved rewriting some original material, adapting imported content to fit with the course's goals, or adding some custom touches to generic bought content, your instructors probably spent a fair amount of time getting your training content right.

So, you might be asking, why should you go through your content again, now that the material has been deployed and the training sessions are complete?

There are several reasons for adding an eLearning evaluation step at the end of your courses:

You probably still missed lots of things. Editing is difficult, and a content writer who edits their own material will have several blind spots: things they consider fine but are not. (This is why newspapers and publishing houses employ dedicated editors separate from the authors.)

Instructors, by definition, should know more than learners do, which makes it difficult for them to recognize when a piece of content might be too difficult for learners.

Some piece of content might look fine to everybody involved, only to be proven problematic in actual training. Even the best content writers and editors can't fully anticipate how a piece will work out in practice without learner feedback and a thorough post-course eLearning evaluation.

Given the above, we suggest the following 4 methods that will help you gather actionable feedback and insight to improve your course:

1. Create A Questionnaire

The most valuable eLearning evaluation insight will come from directly asking your users to evaluate the course they've just finished, and a questionnaire (or survey, if you wish) is the perfect tool for the job.

Since learners might not be very eloquent or focused in their feedback, try to guide them as much as possible by asking the appropriate questions.

These include asking them to identify the most and least useful items in the course, whether they feel the course helped them achieve their learning objectives, how applicable their new knowledge is in their everyday job, and so on.

On top of those course-focused questions, ask them to evaluate their own participation and overall performance. Finally, include a generic comment box so that they can contribute any other criticism or suggestion that they might have.
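To make the question types above concrete, here is a minimal sketch of how such a questionnaire and its responses could be modeled and summarized outside any particular platform. All field names and the data layout are illustrative assumptions, not a TalentLMS API:

```python
# Hypothetical sketch of a post-course questionnaire. The question ids,
# structure, and response format below are invented for illustration.

QUESTIONS = [
    {"id": "most_useful", "text": "Which part of the course was most useful?", "type": "text"},
    {"id": "objectives_met", "text": "Did the course help you achieve your learning objectives?", "type": "rating"},
    {"id": "job_applicability", "text": "How applicable is your new knowledge to your everyday job?", "type": "rating"},
    {"id": "self_assessment", "text": "Rate your own participation and overall performance.", "type": "rating"},
    {"id": "comments", "text": "Any other criticism or suggestions?", "type": "text"},
]

def summarize(responses):
    """Average each 1-5 rating question across all learner responses."""
    rating_ids = [q["id"] for q in QUESTIONS if q["type"] == "rating"]
    return {
        qid: round(sum(r[qid] for r in responses) / len(responses), 2)
        for qid in rating_ids
    }

responses = [
    {"objectives_met": 4, "job_applicability": 5, "self_assessment": 3},
    {"objectives_met": 2, "job_applicability": 4, "self_assessment": 4},
]
print(summarize(responses))
# → {'objectives_met': 3.0, 'job_applicability': 4.5, 'self_assessment': 3.5}
```

Free-text answers (the "text" questions) don't average, of course; those are the ones you read through by hand for the criticism and suggestions mentioned above.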

For those eager to put this into practice, we will see below how TalentLMS Survey functionality can be easily leveraged to create both learner feedback questionnaires and trainer feedback surveys.

2. Create A Trainer Feedback Survey

Insight from your trainers is also invaluable for a full eLearning evaluation. After all, they are the ones who run the course and witness firsthand (when correcting trainee work, during instructor-led training sessions, etc.) how effective each chunk of material is.

A trainer feedback survey should try to get instructors to evaluate each lesson and each part of the training program that they were involved with -- and perhaps even those that were handled by other trainers.

Use a simple grading scale (e.g. 1-5), and try to keep your questions focused. This will make it easier to evaluate the feedback that you'll get. That said, add a general comment box at the end of the survey to give trainers the chance to address things that you haven't anticipated asking about.

Trainer feedback surveys and learner questionnaires (which are just especially focused surveys) can be easily constructed in TalentLMS, through the "Add Survey" option within a course unit.

If you're familiar with creating Tests in TalentLMS, Surveys are more or less the same in their implementation, but geared toward gathering feedback rather than testing a learner's skills.

TalentLMS Survey functionality allows you to use the platform's familiar WYSIWYG editor to edit your questions, add multiple choice answers (for targeted questions and/or grading) and generic comment boxes, and re-arrange them to get the right ordering.

3. Create A Trainee Evaluation Form

If you follow our advice above, you'll have your trainees evaluate their own performance in your course's questionnaire. Since self-assessment can't be trusted completely, though, it makes sense to also have your instructors evaluate your trainees and their performance during the course.

Again, you don't need to go into much qualitative detail -- just create a simple trainee evaluation form that assesses a learner's performance in terms of participation, test and in-person training performance, cooperation, knowledge retention, effort, etc.

In TalentLMS, a "trainee evaluation form" can be implemented and maintained as part of a learner's profile information, as the platform allows you to add custom fields to user profiles (drop-down lists, text fields, checkboxes and so on).

4. Create A Training Evaluation Report

Last, but not least, you will need to get the actual figures that characterize your training course and compile them in a comprehensive training evaluation report.

This should include things like learner attendance records, all test and quiz scores, homework grades, final grades, certificates awarded, total hours spent, and so on.

Checking individual learners against their group averages will hint at material that doesn't work well for some types of learners (e.g. because they lack some foundational knowledge that is not explained at the start of the course).

Checking for anomalies in group results (e.g. most users failing a test) will likewise indicate problematic material.
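The two checks above are simple arithmetic once you have the scores exported. Here is a hedged sketch, with an invented data layout (learner → test → score out of 100) and arbitrary thresholds standing in for real report data:

```python
# Hypothetical sketch: flagging anomalies in exported test scores.
# The data layout, margin, and pass mark are illustrative assumptions.
from statistics import mean

scores = {  # learner -> {test: score out of 100}
    "alice": {"unit_1": 85, "unit_2": 40},
    "bob":   {"unit_1": 78, "unit_2": 35},
    "carol": {"unit_1": 30, "unit_2": 45},
}

def group_averages(scores):
    tests = {t for results in scores.values() for t in results}
    return {t: mean(results[t] for results in scores.values()) for t in tests}

def flag_learners(scores, margin=25):
    """Learners scoring well below the group average on a test may be
    missing foundational knowledge the course assumes."""
    avgs = group_averages(scores)
    return sorted(
        (learner, test)
        for learner, results in scores.items()
        for test, score in results.items()
        if avgs[test] - score > margin
    )

def flag_tests(scores, pass_mark=50):
    """Tests that most learners fail likely indicate problematic material."""
    return sorted(
        t for t in group_averages(scores)
        if sum(results[t] < pass_mark for results in scores.values()) > len(scores) / 2
    )

print(flag_learners(scores))  # → [('carol', 'unit_1')]  (far below group average)
print(flag_tests(scores))     # → ['unit_2']  (most learners failed it)
```

The first check surfaces the individual-vs-group gaps; the second surfaces group-wide failures. Tuning the margin and pass mark to your own grading scale is left as a judgment call.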

In TalentLMS, training evaluation reports are the realm of the Reports engine, which tracks all the aforementioned metrics, aggregated at the course or individual learner level. It lets you dig in with custom filtering and query options, accumulate the data you're after, and view it (or export it) in an easily digestible format.


In this article, we looked at how to evaluate online courses, and how TalentLMS tools can help you monitor, assess, and revise your courses.

This, of course, needs to be an iterative process, to ensure that your material adjusts over time to changing business environments, different learner profiles, and market and/or technological changes.

And just as you should adjust your course material, it also makes sense to adjust your course evaluation criteria over time to match different training goals.

eBook Release: TalentLMS
An award-winning LMS for those looking to build online courses for any purpose in a few easy clicks, even with zero experience.