7 Learning Analytics Challenges That Hinder Training Evaluation (And Solutions)

Learning Analytics Challenges In Training Evaluation

Training evaluation is used to determine whether training solutions are effective. It answers the question "Was the training worth the effort, time, and investment?" As L&D professionals, we know this is a question stakeholders often ask.


Evaluating training and proving its worth isn't child's play. You need to show stakeholders the numbers, and leveraging learning analytics is how you arrive at those numbers. Here's what learning analytics and training evaluation mean.

Learning Analytics Training Evaluation
The Society for Learning Analytics Research (SoLAR) defines learning analytics as "the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs" [1]. Training evaluation is the process of assessing whether training solutions are successful and aligned with the organization's goals.

Learning analytics and training evaluation are interrelated because both serve the purpose of improving organizational performance. And if you don't enable learning analytics, training evaluation isn't going to be effective.

Leveraging learning analytics in training evaluation is not without its fair share of challenges. Here are 7 challenges, along with relevant solutions.

1. Ambiguous Questions Are Used

You must be familiar with the expression "Data is only as good as the questions asked" and this is true of leveraging learning analytics for training evaluation as well.

If you ask ambiguous questions, you won't receive valid answers, and it becomes hard to determine what data to collect.

Solution

Meet with stakeholders to determine the appropriate questions to ask. Understand what stakeholders need to know and the decisions they need to make. This can help you frame the right questions.

Remember, including ambiguous questions in training evaluation is similar to having a training solution with weak or no learning objectives.

2. Training Evaluation Data Isn't Used To Make Informed Decisions

Often, information gathered from training evaluation is either used on a limited basis (e.g., to answer questions such as how many people passed or how many people completed the training) or not used at all.

Solution

Save time and money by asking only those questions whose answers will actually be used. While some information from training evaluation can be considered "nice to know", the emphasis should be on well-informed decisions and actions. For clarity on those decisions, talk to stakeholders and treat evaluation as a means to informed decision making.

3. Training Solutions Aren't Designed To Enable Effective Data Capture

Your training solution, be it classroom-based or technology-based, should be designed to enable the capture of learning data (information that provides insights on how participants learn).

For example, if your training solution aims to teach a 10-step procedure and the only information that's captured at the end of the training program is whether learners have completed going through the 10 steps, it doesn't say anything about the quality of the training or whether learners have actually understood the 10-step procedure.

Solution

Consider using online assessments in classroom training. This can eliminate the burden on the instructor(s) to evaluate training.

If you're using eLearning or microlearning, configure elements to capture data. Understand the LMS protocol used in your organization. For instance, SCORM version 1.2 may offer limited details compared to the potential level of detail available when using SCORM 2004 or xAPI.
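To illustrate the difference in granularity, here is a minimal sketch of what an xAPI statement might look like. Where SCORM 1.2 essentially reports a completion flag and a score, xAPI can record who did what, to which activity, with what result. The email address, verb, and activity URL below are illustrative assumptions, not values from a real system.

```python
import json

def build_xapi_statement(actor_email, verb_id, verb_name, activity_id,
                         success, score_scaled):
    """Assemble a minimal xAPI statement dict.

    xAPI captures far richer detail than a SCORM 1.2 completion flag:
    the actor, the verb (what they did), the activity, and the result.
    """
    return {
        "actor": {"mbox": f"mailto:{actor_email}", "objectType": "Agent"},
        "verb": {
            "id": verb_id,  # e.g. "http://adlnet.gov/expapi/verbs/completed"
            "display": {"en-US": verb_name},
        },
        "object": {"id": activity_id, "objectType": "Activity"},
        "result": {"success": success, "score": {"scaled": score_scaled}},
    }

# Hypothetical learner finishing one step of a 10-step procedure
stmt = build_xapi_statement(
    "learner@example.com",
    "http://adlnet.gov/expapi/verbs/completed",
    "completed",
    "https://example.com/courses/procedure/step-7",
    True,
    0.85,
)
print(json.dumps(stmt, indent=2))
```

In a real deployment, this statement would be sent to a Learning Record Store (LRS); the point here is simply the level of detail a single learning event can carry.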

4. A Repository Isn't Available For Data Capture

According to research from Lighthouse [2], most measurement and analysis is still manual. If the current training evaluation system is paper based, it'll be difficult to keep track of data.

Solution

It is essential to capture data in a repository that supports learning analytics. For example, a microlearning video that is hosted within a system such as a private YouTube channel that is optimized for data capture would be a better choice, compared to hosting it on a simple shared server with no analytics capabilities.
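As a toy illustration of why a queryable repository beats a simple file share, here is a sketch that captures learning events in an in-memory SQLite table. The schema, learner emails, and event names are illustrative assumptions, not a real product's data model.

```python
import sqlite3

# A toy learning-record repository: events land in a queryable store
# instead of scattered files or paper forms.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE learning_events (
        learner  TEXT NOT NULL,
        course   TEXT NOT NULL,
        event    TEXT NOT NULL,   -- e.g. 'started', 'completed'
        score    REAL,            -- NULL for non-assessment events
        recorded TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")

events = [
    ("ann@example.com", "safety-101", "started", None),
    ("ann@example.com", "safety-101", "completed", 0.9),
    ("bob@example.com", "safety-101", "started", None),
]
conn.executemany(
    "INSERT INTO learning_events (learner, course, event, score)"
    " VALUES (?, ?, ?, ?)",
    events,
)

# Because the data lives in a repository, analytics queries are trivial:
completions = conn.execute(
    "SELECT COUNT(*) FROM learning_events WHERE event = 'completed'"
).fetchone()[0]
print(completions)
```

The same question ("how many completions?") against a folder of paper forms or videos on a shared server would require manual counting.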

5. Capabilities To Analyze Data Are Limited

Setting parameters and requirements for data analysis can be quite a challenge. If learning analytics isn't enabled in your organization's LMS, then again, the capabilities to analyze data are limited.

Solution

Check the native analytics capabilities available within the repository, such as an LMS. Does it support built-in plugins or third party tools for learning analytics? Does the data need to be exported to a different learning analytics software for analysis? If yes, get that infrastructure in place. Also, ensure your organization has the expertise for conducting analysis.
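When data does need to be exported for analysis, even a small script can go beyond what standard LMS reports show. The sketch below analyzes a hypothetical CSV export; the column names and values are assumptions, since real LMS export formats vary.

```python
import csv
import io
import statistics

# Hypothetical CSV export from an LMS; columns are assumptions.
export = io.StringIO("""learner,course,score,completed
ann,safety-101,92,yes
bob,safety-101,58,yes
cam,safety-101,74,no
""")

rows = list(csv.DictReader(export))
scores = [int(r["score"]) for r in rows]

mean_score = statistics.mean(scores)
completion_rate = sum(r["completed"] == "yes" for r in rows) / len(rows)

print(f"mean score: {mean_score:.1f}")
print(f"completion rate: {completion_rate:.0%}")
```

In practice the export would come from the LMS itself, and the analysis could feed a dashboard or a dedicated analytics tool; this only shows that the infrastructure question ("can we get the data out and analyze it?") has a concrete answer.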

6. Accuracy Of Data Isn't Ensured

If the data that's entered is invalid or incorrect, it affects the reliability of learning analytics reports and may result in skewed analysis. For instance, if learners complete a training evaluation without actually completing the entire training, then the feedback could be incorrect.

Solution

To ensure the reliability and validity of results from learning analytics, data needs to be clean. Have a QA process to check if data is reliable. It’s a good practice to encourage learners to share honest feedback after completion of training.
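A QA process like this can be partly automated. The sketch below flags two of the issues described above: an evaluation submitted before the training was actually completed, and an out-of-range score. The field names and valid score range are illustrative assumptions.

```python
from datetime import datetime

def qa_check(record):
    """Return a list of data-quality issues found in one evaluation record.

    Field names ('completion_time', 'evaluation_time', 'score') and the
    0-100 score range are assumptions for this sketch.
    """
    issues = []
    if record["evaluation_time"] < record["completion_time"]:
        issues.append("evaluation submitted before training was completed")
    if not 0 <= record["score"] <= 100:
        issues.append(f"score {record['score']} outside 0-100 range")
    return issues

record = {
    "completion_time": datetime(2024, 3, 1, 14, 0),
    "evaluation_time": datetime(2024, 3, 1, 13, 0),  # earlier than completion!
    "score": 105,
}
for issue in qa_check(record):
    print("FLAG:", issue)
```

Records that trip these checks would be excluded or investigated before they skew the analysis.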

7. Reports Aren't Generated From The Repository

Often, the standard reports generated by the repository may not be sufficient to yield reliable results on training evaluation.

Solution

Learning analytics is much more than simple tracking of course completions and passing scores. If learning analytics tools aren't integrated with the repository, you need to export data to analytics software and generate custom reports.
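A custom report can be as simple as a short script over exported records. The sketch below computes pass rates per course, a breakdown a standard report may not offer; the courses, scores, and pass mark are illustrative assumptions.

```python
from collections import defaultdict

# Hypothetical exported records; data and threshold are assumptions.
records = [
    {"course": "safety-101", "score": 92},
    {"course": "safety-101", "score": 58},
    {"course": "gdpr-basics", "score": 81},
    {"course": "gdpr-basics", "score": 77},
]

PASS_MARK = 70
by_course = defaultdict(lambda: {"taken": 0, "passed": 0})
for r in records:
    stats = by_course[r["course"]]
    stats["taken"] += 1
    stats["passed"] += r["score"] >= PASS_MARK  # bool adds as 0 or 1

for course, s in sorted(by_course.items()):
    rate = s["passed"] / s["taken"]
    print(f"{course}: {s['passed']}/{s['taken']} passed ({rate:.0%})")
```

The same grouping logic extends naturally to departments, regions, or delivery formats, whatever breakdown the stakeholders' questions call for.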

Concluding Remarks

Have a plan in place to decide how and when learning analytics reports will be accessed and distributed. Most importantly, evaluate how the information gained from the reports is going to be utilized to improve training solutions.

By successfully tackling these challenges, you can make effective use of learning analytics for training evaluation. So, are you ready to get started? Download the eBook Leveraging Learning Analytics To Maximize Training Effectiveness - Practical Insights And Ideas for an incredible in-depth view of the topic. And for more insights on this, watch the webinar and discover how to quickly plan for learning analytics to ensure utilization.

References:

[1] What is Learning Analytics?

[2] Executive Summary: 2016/2017 Learning Analytics Trends Report
