Improve Performance And Compliance Training With New Killer Tools For Learning Data Analysis

Summary: With L&D departments increasingly being challenged on what they are achieving and required to take a more evidence-based approach, learning professionals find themselves needing new tools for support with learning data analysis. The Learning Analytics Canvas provides just that support, across both performance and compliance training. Here’s how it works.

Learning Data Analysis Tools To Enhance Performance & Compliance Training

How can learning data analysis tools enhance your L&D strategy? The Learning Analytics Canvas (LAC; see diagram below) is a free planning tool and checklist created by Learning Pool’s data scientists that can be used by anybody embarking on a project or program that will involve learning data, regardless of their level of data maturity. It is based on the Business Model Canvas (Osterwalder, Pigneur et al., 2010), which will be familiar to many. Before we work through an example project with the LAC, which is really the easiest way to understand it, there are some core principles to explain.

eBook Release: Data And Learning: Adding Learning Analytics To Your Organization
Discover how data and learning analytics can help your organization overcome its biggest challenges cost-effectively.

6 Principles Of The Learning Analytics Canvas

1. Be Goal-Driven In Your Use Of Data

All experts agree that defining goals at the outset is key to success.

2. Think From The Start About The Why, What, Who, And When Of The Evaluation Effort

Why are you doing it, what are your KPIs, who is it for, and what are the time constraints?

3. Depending On The Nature Of The Project, Your Data Goals Will Probably Focus On One Or More Areas:

  • Engagement
  • Knowledge (retention)
  • Behavior
  • Organizational change

These four goal areas will typically “staircase”. In other words, if you want to know whether behavior change occurred, you would first look back at whether learners engaged, then at how much of what was learned was retained, and then at whether that retained knowledge translated into a change in behavior, establishing a causal chain.

4. The LAC References Three Dimensions Of Data:

  • Anonymous data
  • Group data
  • Individual data

In the real world, we don’t always have “perfect” data to work with. But depending on how your goals have been defined, anonymous data could be perfectly serviceable for what you want to do. Conversely, the data you do have may constrain the scope and scale of what you can achieve, and therefore your goals. It’s a feature of this type of planning that the contents of the boxes interact!

The Learning Analytics Canvas

5. Look To Assemble A Portfolio Of Evidence

Concentrate on assembling a body of hard facts, rather than hunting for a ‘smoking gun’, i.e. a single intervention that can be credited on its own with achieving a performance improvement.

6. Consider The "When"

In the when box, we consider three different time stages or measurement points:

  • Pre
  • Post
  • Post-post

You might learn different things about your learning intervention at each of these stages with the help of learning data analysis tools.

Putting The Learning Analytics Canvas & Learning Analysis Tools To Work

Say, as a starting point, I want a quick look at whether my recently introduced performance support materials are more or less effective than what I had before (why). I might begin with some Google Analytics data or similar.

Anonymous Data

This will likely be anonymous data, not tied to any particular individual or team; telling me about the volume of usage of the new materials, how long people engage with them and so on (who). In terms of time stage (when), I could easily compare pre-deployment with post-deployment of the new materials to draw an initial comparison—have our engagement figures changed? (engagement)—and continue to monitor changes in usage and dwell time as the months and years wear on (post-post). Note that I don't need to know anything about the individuals concerned at this stage in order to make inferences about whether we have moved the needle or not on engagement, but already I can start to make decisions and take action based on what the data is telling me.
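
To make that pre/post comparison concrete, here is a minimal sketch in Python using pandas. The CSV file, its columns (visit_date, sessions, avg_dwell_seconds), and the deployment date are all hypothetical stand-ins for whatever your analytics tool exports.

```python
import pandas as pd

# Hypothetical export from a web analytics tool: one row per day with
# session counts and average dwell time for the performance support pages.
usage = pd.read_csv("support_usage.csv", parse_dates=["visit_date"])

# Assumed go-live date for the new materials.
DEPLOYMENT_DATE = pd.Timestamp("2023-01-15")

# Label each day as pre- or post-deployment, then compare engagement.
usage["stage"] = usage["visit_date"].apply(
    lambda d: "pre" if d < DEPLOYMENT_DATE else "post"
)
summary = usage.groupby("stage")[["sessions", "avg_dwell_seconds"]].mean()
print(summary)  # has engagement moved since the new materials went live?
```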

Group Data

The analysis would be better if I could have group data (who) identifying different teams or populations, through cohort analysis. And for more of a gold-standard analysis, I might set up a control group of people who did not receive the new materials but stuck with the old stuff. Widening the lens, we could look at how the data breaks down between different functional, geographic, or customer groups, for instance, enabling a different scale of decision-making.
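
Here is a sketch of what that cohort breakdown might look like, assuming a hypothetical session-level export that already carries group data (team, region) and a flag for whether the team received the new materials or sat in the control group.

```python
import pandas as pd

# Hypothetical per-session records enriched with group data.
# Expected columns: team, region, received_new_materials (bool), dwell_seconds.
sessions = pd.read_csv("support_sessions.csv")

# Compare teams that received the new materials against the control group
# that stayed with the old ones.
by_condition = sessions.groupby("received_new_materials")["dwell_seconds"].agg(
    ["count", "mean"]
)
print(by_condition)

# Widening the lens: engagement by region within each condition.
by_region = (
    sessions.groupby(["region", "received_new_materials"])["dwell_seconds"]
    .mean()
    .unstack()
)
print(by_region)
```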

Individualized Data

Then again, let’s say I was able to get individualized data tied to particular individuals about whom the organization will hold other data, some of which I may have access to for analytical purposes. This will allow me to go deeper. Group analysis becomes easier at this point: team, function, and role data is likely to be available if we know who an individual is.

It is when we work with individual data that things really open up. A useful addition to your tech toolbox at this point, as you move along the path to more sophisticated use of learning data, is xAPI together with a suitable Learning Record Store (LRS). xAPI statements can be used not only to describe and analyze activity but also to trigger actions within a system.
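
To make that concrete, here is a minimal sketch of recording an activity as an xAPI statement in a Learning Record Store. The LRS endpoint, credentials, learner, and resource IRI are placeholders (the verb IRI comes from the standard ADL vocabulary), so treat this as an illustration rather than a drop-in integration.

```python
import requests

# Placeholder LRS endpoint and credentials; substitute your own.
LRS_ENDPOINT = "https://lrs.example.com/xapi/statements"
AUTH = ("lrs_key", "lrs_secret")

# A minimal xAPI statement: "<actor> experienced <performance support resource>".
statement = {
    "actor": {"mbox": "mailto:learner@example.com", "name": "Example Learner"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/experienced",
        "display": {"en-US": "experienced"},
    },
    "object": {
        "id": "https://example.com/resources/returns-procedure",
        "definition": {"name": {"en-US": "Returns procedure job aid"}},
    },
}

response = requests.post(
    LRS_ENDPOINT,
    json=statement,
    auth=AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
response.raise_for_status()  # the LRS returns the new statement ID on success
```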

Knowledge Retention

So far we have only talked about performance support at the engagement stage. Moving on a bit in our ambitions, we might begin to look at knowledge (retention) and think about that box of our Canvas.

Even in a context of performance support, there might well be an instance where I want to know whether the just-in-time resources are having any lasting effect. But in the case of compliance training it could be critical to know whether we are bringing about an improvement in retention. To give a practical example, if tested at a certain point after the first engagement, how well can the learner carry out a particular procedure in accordance with regulatory requirements?

We are venturing into the territory of learning transfer here, but the point about this approach is that you don’t necessarily have to invoke the whole machinery of a four-step or seven-step or 15-step evaluation process if what you want to find out is something quite specific.

We might just need to find out whether our performance or compliance training materials are not only engaging but also impactful, in the limited sense of being memorable. In terms of measurement stage, comparing pre and post data is of course useful (when). As far as post data goes, some L&D departments use a 12-12-12 model for retention: test 12 hours after the intervention, then again after 12 days, and then after 12 months. We can do this at group or individual level, and even anonymized survey data could be useful at this point, because fundamentally the object of our inquiry is the intervention itself: the content, the resource, the experience.
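
If assessment scores are captured at each of those measurement points, the comparison itself is straightforward. Here is a minimal sketch, assuming a hypothetical results table with one row per learner per test sitting.

```python
import pandas as pd

# Hypothetical assessment results: one row per learner per test sitting.
# Columns: learner_id, stage, score (0-100). The stages follow the 12-12-12
# idea: a pre-test, then tests 12 hours, 12 days, and 12 months afterwards.
results = pd.read_csv("retention_scores.csv")

stage_order = ["pre", "post_12h", "post_12d", "post_12m"]
retention = results.groupby("stage")["score"].mean().reindex(stage_order)
print(retention)  # how much of the initial gain survives at 12 days and 12 months?
```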

Behavior

When we move on to the next stage, behavior, that locus changes. This is where we start to require additional sources of data: just knowing whether someone accessed a particular piece of eLearning, or scored well on an assessment, won’t tell us much. Self-assessment can be useful for seeing whether behavior change has occurred, and it becomes more useful still when triangulated against other measures: 360-degree surveys, for instance; a salesperson’s self-assessment of their confidence levels set against how much they actually sold over time; or, in the case of a programming team, “hard” measures such as the quantity and quality of code commits set against how supervisors and peers rate each other.
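
One simple way to triangulate is to set the self-reported measure against the harder one and see whether they move together. Here is a minimal sketch, assuming a hypothetical table of quarterly self-assessed confidence scores and sales figures.

```python
import pandas as pd

# Hypothetical quarterly records: salesperson_id, quarter,
# self_confidence (1-5 self-assessment), sales_value.
quarterly = pd.read_csv("sales_confidence.csv")

# Does self-reported confidence move with actual sales performance?
correlation = quarterly["self_confidence"].corr(quarterly["sales_value"])
print(f"Correlation between self-assessed confidence and sales: {correlation:.2f}")
```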

Portfolio Of Evidence

This is where we begin to talk about assembling a portfolio of evidence. Where certainty might be difficult to achieve, given the many factors we mentioned earlier that tend to bedevil learning analytics at these later and most important stages, we can still assemble a portfolio of evidence that says, on balance of probability, it seems like this intervention has had this effect because of this evidence.

Organizational Change

If our goal is to look at organizational change (what), the locus of inquiry must shift again, from individual behaviors to group behaviors, aggregated into how multiple sets of behaviors in multiple parts of the organization work to change the behavior and culture of the organization.

This could be evidenced by sales numbers, retention figures, statistics on diversity, or whichever Key Performance Indicator is most relevant. The LAC encourages users to get to this measure early: identifying the KPI that would most readily demonstrate the why is the second thing they do when completing the Canvas. This helps keep a “true north” for the exercise, saying: “This is where we are aiming.”

The difficulties of achieving certainty are compounded when looking at the entire organization, so this is where we really do need to take a portfolio-of-evidence approach and where that approach comes into its own.

Conclusion

From the very simple beginnings we have described here, with modest ambitions, running through to a complex, multidimensional organization-wide analysis effort, the structure of our LAC still manages to contain the important points we need to think about. The Canvas is broad enough to contain whatever type of picture you want to paint of your aspirations for learning data. We hope you find it useful as you continue on your data analytics journey!

How can learning data analysis tools help you tackle your biggest business challenges? Download the eBook Data And Learning: Adding Learning Analytics To Your Organization to tap into the power of Big Data for your L&D program. You can also join the webinar to learn how to leverage analytics to shape your strategy.