Stop Counting Courses And Start Counting Impact
In Learning and Development, there's one metric I've seen time and time again on leadership dashboards and monthly reports: completion rates. And every time, I think the same thing: "That's not impact, that's activity."
It's easy to see why this metric became the go-to. It's clean. It's trackable. It gives L&D teams something to show for their efforts. But let's not confuse visibility with value. Just because a learner has completed a course doesn't mean they've understood it, applied it, or changed a single thing in how they work.
In my book IMPACT: How to Turn Learning Into Results, I talk about the difference between learning activity and performance improvement. Completion rates sit firmly in the first camp. They measure motion, not momentum. Progress bars, modules ticked off, certificates downloaded—none of that tells you if someone is doing their job better as a result.
The Comfort Of Completion Metrics
Let's be honest: tracking completions is comfortable. It's the low-hanging fruit of L&D data. Most LMS platforms are built to report on it. Stakeholders understand it. It makes for easy charts in a boardroom.
But if your job is to build capability, drive performance, and support the business in achieving results, then focusing on completions is like judging a gym's effectiveness based on how many people swipe their access card. It tells you who showed up but not who got stronger.
Why Completion ≠ Competence
Here's a question every L&D leader should be asking: "What happened after the course was completed?"
Did the learner:
- Apply the knowledge in real-life scenarios?
- Improve a key behaviour?
- Increase productivity, quality, or sales results?
- Influence business KPIs?
If the answer is "we don't know," then we're not measuring learning, we're measuring logistics.
The completion rate assumes that learning is a one-time event. But we know real learning happens in layers. It takes practice, feedback, reinforcement, and reflection. Clicking through a 20-minute course on "having difficult conversations" doesn't mean the learner is now equipped to handle them in real life.
In fact, learners often game the system. They skip videos, fast-forward to the quiz, or pass with minimal effort. That's not their fault; it's a design issue. But it makes the completion metric even less reliable as a proxy for learning or impact.
What Gets Measured Drives Behaviour
The danger here isn't just bad data; it's bad decisions. When L&D teams are evaluated on completion rates, they naturally build strategies that drive completions. More nudges. More mandatory courses. More clicks.
But if we flipped the focus to impact—improvements in behaviour, performance, or outcomes—L&D would start designing for transfer, not just attendance. And that's where the real ROI starts to show.
LMSs Were Built For Delivery, Not For Impact
Most LMS platforms were never built to track performance; they were built to deliver content. They're excellent at hosting courses, tracking completions, and managing compliance training. But when it comes to telling you whether that content made a difference in someone's job performance? They fall flat.
That's not a criticism; it's a reality. The LMS is just one tool in the L&D toolkit. But when organisations rely solely on LMS data to evaluate success, they end up with a skewed view of reality. It's like using a speedometer to measure the quality of your journey. Sure, it tells you how fast you were going, but not whether you ended up where you needed to be.
If you want to measure impact, you need to look beyond the LMS. You need to track behavioural change, on-the-job performance, and business outcomes. That requires conversations with managers, pulse surveys, performance data, and yes, sometimes even going out and asking people what's actually changed.
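To make this concrete, here is a minimal sketch of what "looking beyond the LMS" can mean in practice: joining an LMS completion export with performance data owned by the business, and comparing how completers and non-completers actually changed. Every name and number below is hypothetical, invented purely for illustration; a real analysis would use your own exports and metrics.

```python
# Hypothetical example: join LMS completion records with on-the-job
# performance data to see whether completers actually improved.
# All names and scores are illustrative, not from a real system.

# LMS export: who completed the course
completions = {"alice": True, "bob": False, "carol": True, "dan": False}

# Business-owned performance data: quality scores before and after the course
scores = {
    "alice": {"before": 62, "after": 74},
    "bob":   {"before": 58, "after": 60},
    "carol": {"before": 70, "after": 71},
    "dan":   {"before": 65, "after": 64},
}

def avg_change(people):
    """Average before-to-after score change for a group of people."""
    deltas = [scores[p]["after"] - scores[p]["before"] for p in people]
    return sum(deltas) / len(deltas)

completers = [p for p, done in completions.items() if done]
others = [p for p, done in completions.items() if not done]

print(f"Completers:     {avg_change(completers):+.1f}")
print(f"Non-completers: {avg_change(others):+.1f}")
```

The point isn't the code; it's the join. Completion data only becomes evidence of impact once it's connected to a performance measure the business already trusts.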
ROI Starts With The Right Questions
Most learning leaders don't have an ROI problem; they have a question problem. They're asking things like:
- How many people completed the course?
- What was the average score?
- Did they like the training?
But they should be asking:
- What did learners do differently after the training?
- How did that impact their individual performance?
- Did that improvement influence a team or business metric?
This is the heart of the IMPACT approach: shifting from input-focused questions to outcome-focused ones. Because ROI in L&D isn't about how much learning you delivered. It's about how much of that learning translated into real, measurable value.
For example, let's say you run a time management course. It's got great engagement and a 95% completion rate. But did it reduce missed deadlines? Did it improve team efficiency? Did it shorten project timelines?
Until you connect learning to outcomes, you'll be stuck reporting outputs that sound good but mean very little.
The Problem With Proxy Metrics
Completion rates are what we call "proxy metrics." They're stand-ins: indicators that are easy to measure but don't directly prove what you care about.
Think about this: if someone completes a training module on sales objection handling, what do you really want to know?
You want to know if they can now overcome objections more effectively, close more deals, and increase revenue. But the LMS will only tell you: "Yes, they watched the course." That's like giving someone a pilot's licence just because they read the manual.
Proxy metrics can be useful if they're part of a broader dashboard of measures. But if they're the only data point, they give a false sense of success. And that's where L&D gets into trouble: chasing numbers that look good in isolation but don't hold up under scrutiny.
Shifting The Conversation: From Completions To Contributions
To prove the value of learning, we need to shift the conversation inside organisations from what people completed to what people contributed as a result of the learning.
Did your leadership programme result in fewer escalations and better team performance? Did your customer service module lead to higher satisfaction scores or reduced complaints? Did your sales enablement training generate more qualified leads or shorten the sales cycle?
These are the measures the business actually cares about. No one in the C-suite is losing sleep over whether 87% of staff completed a module on communication. They're focused on outcomes: productivity, profit, retention, and risk reduction.
When L&D starts talking in those terms, when we speak the language of the business, we stop being seen as a cost centre and start being seen as a performance partner.
That's the opportunity. But to get there, we need to stop hiding behind completion rates and start showing our work in a way that matters.
The IMPACT Mindset For Measuring ROI
In my IMPACT model, I outline how to transition from measuring activity to demonstrating tangible business results. It's about aligning learning with business goals and proving contribution at every stage.
Here's how that mindset looks in practice:
- Intended outcome – What business result is this training meant to support?
- Meaningful measures – What metrics actually reflect change (beyond completions)?
- Performance focus – How does this improve individual or team performance?
- Application tracking – What behaviours are being applied on the job?
- Change evidence – What proof do we have that something improved?
- Tangible ROI – Can we link that improvement to real value (time, money, quality)?
This framework gives you a different lens. One that focuses less on learning logistics and more on business results. You can still track completions if you want to, but now they become a footnote, not the headline.
How To Start Measuring What Matters
If you want to move beyond completions and start measuring impact, here's where to start:
- Have a business conversation first. What problem are you solving? What's the cost of not solving it?
- Agree on success metrics up front. Define what good looks like before you build anything.
- Involve managers. They see behaviour change more than anyone, so get them into the feedback loop.
- Track application. Post-training reflection, self-assessments, or action plans can help show transfer.
- Collect stories and stats. Qualitative evidence and quantitative data make a powerful combination.
- Use your LMS as a distribution tool, not an impact engine. Let it support delivery, but don't expect it to tell the whole story.
The good news? You don't need a PhD in data science to measure impact. You need curiosity, business alignment, and a commitment to stop settling for surface-level stats.
What Happens When You Stop Chasing Completions
When you stop chasing completions as your primary measure of success, something powerful happens: you start designing learning that actually works.
Instead of asking, "How can we get more people to finish this course?", you start asking, "How can we make sure this learning solves the business problem?"
The design changes. The follow-up changes. The conversations change. And most importantly, the results change.
Teams begin to feel the difference. Learning becomes less about ticking boxes and more about solving real problems. Learners stop feeling like passive consumers and start becoming active participants in their own growth. Managers get involved because they can finally see the link between development and performance.
That's the shift. That's what learning with impact looks like.
A Final Word To L&D Leaders
If you're still reporting completion rates as your primary measure of success, you're not alone. For years, the industry has pushed these numbers because they were easy to track and made us look busy.
But busy doesn't mean effective. The future of Learning and Development is performance-focused, outcomes-driven, and impact-obsessed. And that starts with us. As learning leaders, we have to raise the bar on ourselves, our platforms, and our measurement standards.
Let's stop asking, "Did they complete it?", and start asking, "Did it make a difference?" That one question changes everything.
If you're ready to stop counting courses and start counting impact, you're already ahead of most. Now's the time to take the next step and build a learning culture that delivers more than knowledge. One that delivers results.