How To Measure The Impact Of Learning In Real Time

Summary: Discover how to evaluate the real impact of eLearning and how holistic data analysis can help you prove ROI.

The Big Picture: Evaluating The Impact Of Learning In Your Organization

This article is part of a series on tracking the right metrics using the right methodology to visually display and quantify the investment in learning and prove ROI. What you are about to read is a fable. The company, AshCom, is fictional, but the learning challenges faced by Kathryn, AshCom’s CLO, and her team are real and commonly shared by learning teams in large organizations. It is our hope that you will be able to connect with the characters, their challenges, and the solutions they discover. We also invite you to read the first eBook in the series.

eBook Release: The Learning Scorecard: How To Measure L&D Impact And Prove ROI
This eBook introduces a system to track the right metrics using the right methodology to visually display and quantify the investment in learning and prove ROI.

Updated Learning Experiences

Three months had passed since the AshCom learning team released the updated learning experiences for preventative maintenance. The team understood the stakes in getting it right. AshCom was losing hundreds of thousands of dollars a year because preventative maintenance was not being performed properly. Scheduled uptime on machines was too low, and unscheduled downtime meant that machines were breaking down too often. Together, these two factors meant that production goals were almost never met. The problem also showed up in higher-than-planned scrap rates, which meant a significant amount of waste in the system. And machines that were not maintained had much shorter life cycles than those that were properly cared for.

Kurtis, the CFO of AshCom, brought this to Kathryn’s attention several months earlier. The executives believed that the people running the machines were not properly trained on preventative maintenance, which is what led to an entirely new set of learning experiences.

Kurtis gave Kathryn another assignment in tandem with the new learning experiences. He wanted her to create a system for measuring the impact of learning. Particularly important to him was getting clear return-on-learning metrics. In other words, he wanted to understand whether there was money to be made or saved by investing in learning.

The Learning Scorecard In Action

Kathryn and Amy, her trusted consultant, used The Learning Scorecard system to build their performance measurement system. It scored and weighted numerous data points at five different levels to show how learning was changing behavior and improving AshCom’s financials. When the preventative maintenance program was released, Kathryn and the learning team watched the metrics on The Learning Scorecard’s dashboard daily. Kathryn liked keeping score, so it was not unusual for her to check the dashboard multiple times a day.
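
The article does not spell out AshCom's five levels or how they are weighted. Purely as an illustration of the scoring idea, a weighted roll-up across five levels might look like the sketch below; the level names, weights, and scores are hypothetical, not AshCom's actual configuration.

    # Illustrative sketch of a weighted, multi-level scorecard roll-up.
    # Level names, weights, and scores below are hypothetical.

    LEVEL_WEIGHTS = {
        "reaction": 0.10,      # learner feedback
        "learning": 0.15,      # assessment results
        "behavior": 0.25,      # on-the-job application
        "performance": 0.30,   # operational metrics (scrap rate, downtime, uptime)
        "financial": 0.20,     # dollar impact
    }

    def scorecard_score(level_scores):
        """Combine per-level scores (0-100) into one weighted dashboard value."""
        return sum(LEVEL_WEIGHTS[level] * score for level, score in level_scores.items())

    print(scorecard_score({
        "reaction": 82, "learning": 88, "behavior": 75,
        "performance": 79, "financial": 71,
    }))  # -> 78.05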

She determined that 12 weeks would be sufficient time for getting a sense of how the preventative maintenance learning experiences were performing and how well The Learning Scorecard was tracking that performance. This review would also help determine when and how to bring more of AshCom’s learning experiences into The Learning Scorecard.

Kathryn called a meeting with Kurtis. She also invited Amy to join, since Amy had been integral in setting up the system and knew how multiple other organizations used it to measure their learning performance. Kathryn asked both of them to join her in her office the following Tuesday at noon. She would provide lunch.

Kurtis was not much for small talk, so he dove right in as soon as they were seated and had opened their lunches. “I’ve been watching results on my own dashboard,” he said, “and I have to say that I am impressed. Of course, my dashboard mostly shows improvement in performance results. I don’t see all of the other metrics you are tracking.”

“Just the way we designed it,” said Amy. “We didn’t assume you wanted all the details behind performance given all the other metrics you need to track.”

“I love numbers,” said Kurtis, “but it can get a little overwhelming sometimes. I need to focus on the key performance metrics across the entire company.”

“Let me suggest an agenda,” said Kathryn. “Let’s begin with a review of what you are seeing at the company performance level. After that, we can cover what we are seeing. And then we should talk about how we move forward integrating other learning experiences into The Learning Scorecard. Am I missing anything you want to cover?”

“Sounds right to me,” said Kurtis. Amy nodded.

Bad Metrics On The Decline

Kurtis continued, “I can tell you that I like what I see. Bad metrics like scrap rate and unscheduled machine downtime have dropped by 20%. This is incredible. Bad metrics going down means that good metrics are going up. Our scheduled machine uptime is up and so is our production rate per hour. I could see this on my dashboard. I’ll admit that I check it a lot.”

“So do I,” said Kathryn. “I can’t help myself.”

Kurtis laughed. “My kind of person. I am also tracking energy use because properly maintained machines are more energy-efficient. I am seeing slight improvement there.”

“I would imagine your energy costs are substantial,” said Amy.

“It accounts for millions of dollars per year,” replied Kurtis, “so even small improvements matter. I suspect this will only improve over time. I am also tracking machine life cycles, meaning the amount of time a machine can perform before it must be replaced. I won’t know those numbers for at least another year, but I am impressed we can track that.”

“Are you happy with the ROI numbers?” asked Kathryn.

“Oh yes,” said Kurtis. “We spent about $150,000 building the learning experiences for the preventative maintenance program and getting The Learning Scorecard in place for these modules. That includes time for our internal team, external vendors, software, and people like Amy. In the quarter since the learning experiences were released, I can see a return of approximately $75,000.”

“The learning hasn’t paid for itself yet?” asked Kathryn.

“I didn’t expect it to,” said Kurtis. “The savings from reduced scrap and reduced downtime will carry forward. So will the revenue produced from increased production per hour. If I look at this on an annual basis, I anticipate the return will be $250,000. Seventy-five thousand dollars a quarter is remarkable.”

Calculating Learning ROI

Kurtis went to the whiteboard and wrote:

ROI = (NET BENEFIT / COST OF INVESTMENT) x 100

ROI = ($250,000 − $150,000) / $150,000 x 100 ≈ 67%
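
The same arithmetic can be written out in a few lines of code. This is only a sketch of the calculation Kurtis put on the whiteboard, with an added payback-period figure implied by his numbers; the function and variable names are illustrative, not part of The Learning Scorecard itself.

    # Sketch of the whiteboard math using Kurtis's figures.

    def learning_roi(annual_benefit, investment):
        """ROI as a percentage: net benefit divided by the cost of the investment."""
        return (annual_benefit - investment) / investment * 100

    def payback_months(annual_benefit, investment):
        """Months until cumulative benefit covers the up-front investment."""
        return investment / (annual_benefit / 12)

    annual_benefit = 250_000   # projected annual savings plus added revenue
    investment = 150_000       # internal team, vendors, software, consulting

    print(f"ROI: {learning_roi(annual_benefit, investment):.0f}%")             # ~67%
    print(f"Payback: {payback_months(annual_benefit, investment):.1f} months")  # ~7.2 months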

“How does this compare with the ROI from other areas?” asked Kathryn.

“Let’s use equipment purchases as a comparable,” replied Kurtis. “When we buy a new machine, we are usually hoping it pays for itself in the first three years. That means a single machine returns about 33% of its cost each year. In three years, it has paid for itself, and after that we are into profit.”

“We are doing well by comparison!” said Kathryn.

“Very well,” said Kurtis. “Preventative maintenance learning will more than pay for itself in its first year. Of course, that assumes these performance numbers stay where they are. It depends on how long the impact of learning lasts.”

“We have some reinforcement activities lined up,” said Kathryn, “but we will also be tracking performance. If it starts to dip, we will know it and can respond immediately.”

Real-Time Data

“That’s part of what is remarkable about this scorecard system,” said Amy. “The learning team doesn’t have to wait to be told performance is dropping or that a learning experience isn’t working. They will know it in real time.”
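
The article does not describe how the dashboard works under the hood. As a rough sketch of the real-time idea, a check like the following could compare each incoming metric against its baseline and flag meaningful dips; the metric names and the 5% threshold are invented for illustration.

    # Toy dip-detection check on dashboard metrics (names and threshold are invented).

    BASELINES = {"scheduled_uptime_pct": 92.0, "production_per_hour": 140.0}
    ALERT_DROP_PCT = 5.0  # flag anything more than 5% below baseline

    def check_for_dips(latest):
        alerts = []
        for metric, baseline in BASELINES.items():
            value = latest.get(metric)
            if value is not None and value < baseline * (1 - ALERT_DROP_PCT / 100):
                alerts.append(f"{metric} dipped to {value} (baseline {baseline})")
        return alerts

    print(check_for_dips({"scheduled_uptime_pct": 86.5, "production_per_hour": 141.0}))
    # -> ['scheduled_uptime_pct dipped to 86.5 (baseline 92.0)']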

“We’ve talked about what I’m seeing,” said Kurtis. “Kathryn, what are you seeing in your dashboard?”

“Your news was so good that I hesitate to tell you mine,” said Kathryn. “Not everything is working the way we want it to. Most of our learning experiences got high marks from both the learning team that built them and the learners who went through them. But some of the experiences didn’t work as well as we hoped. Learners didn’t react well to them.”

Amy jumped in. “In other cases, what we were trying to teach was either too simple to be interesting or too complex to be understood.”

“We need to make some adjustments in some of our content and delivery methods,” said Kathryn.

“Can you give me an example?” asked Kurtis.

“One that comes to mind,” replied Kathryn, “was a module on the purpose of preventative maintenance. We gave an overview of everything that needed to be maintained on a specific piece of equipment. It was too much to take in all at once. We need to shorten that module and make it simpler. We learned this from the feedback we tracked.”

“I like hearing there are areas where your team can improve,” said Kurtis. “I can’t see into all that data, but I’m glad that you can. What I’m hearing is that we could expect our ROI to improve?”

Kathryn laughed. “I can’t predict that. But I can tell you that we see several areas where we can serve learners better. I suspect that will only improve performance, but the scorecard will show us.”

“Is it fair to say that The Learning Scorecard was a success while acknowledging we have areas where we can improve?” asked Amy.

“Yes,” said Kurtis.

“Same answer,” said Kathryn.

“Good,” said Amy. “Then let’s move on to integrating more of AshCom’s learning experiences into The Learning Scorecard. I’ve done this with several companies among my other consulting clients, and I need to give you a warning.”

“I’m all ears,” said Kurtis.

“It will take time,” said Amy. “My clients sometimes think they can just push a button, everything will go into The Learning Scorecard, and they’ll have what they want the next day.”

“This isn’t how it works?” said Kurtis, smiling. “I do know what you mean. In the companies I’ve worked for, we have frequently changed accounting or management software. It always looks easy on the vendors’ websites, but it is always serious work to get our data integrated into the new software.”

“Kathryn’s learning team will have the same challenge,” said Amy.

Kathryn replied, “If we have to do this in stages, how do we decide the order of integration?”

Rethinking The Integration Process

“Glad you asked,” replied Amy. “I suggest that the process look something like this,” she said as she walked to the whiteboard.

  1. All new or rebuilt projects
  2. Existing high-priority/high-impact learning
  3. Existing medium-priority/medium-impact learning
  4. Existing low-priority/low-impact learning

“A few caveats,” continued Amy. “You will need some input from Kurtis or other members of the executive team to help set priorities. Their buy-in will be key.”

“I think that would be appreciated by the executives,” said Kurtis.

“I’ve worked with a few clients on the same process,” said Amy, “and there may be some things that you decide do not need to be included in The Learning Scorecard. At a few of those clients, the very simple compliance courses were not added because they were essentially check-the-box requirements.”

“It still might be good to know learner reaction,” said Kathryn.

“I don’t disagree,” said Amy. “All I am saying is that leaving them out might be a good option for a handful of learning experiences.”

“Duly noted. It sounds like our next order of business is to create a schedule and a timeline. I can start working on that and run it past you and the executives,” said Kathryn, looking at Kurtis. “We will need to decide who on my team can dedicate a portion of their time to the integration.”

“Or maybe we can look at an additional position to help with this,” said Kurtis. “Given the performance and the ROI, it won’t be hard for me to support that.”

Kathryn and Amy looked at each other, both a little surprised. Amy said, “You really see the value in this, don’t you?”

“I do,” said Kurtis, “and so does the rest of the executive team. I’ve been updating them weekly on the progress I’m seeing on my dashboard, and you have their attention.”

“I’m glad to hear you say that,” said Kathryn. “There is something I want our learning team to begin to incorporate at some point. We need to explore how to bring Augmented Reality and Virtual Reality into our learning. My guess is that it will take some time and extra budget for us to begin that process.”

Kurtis thought for a moment. “If you can figure out a way to start with some small projects, and you can measure their performance and ROI, I would welcome that conversation.”

“We just need to find the opportunity,” said Kathryn. “And when we do, you will be the first to know.”

“I look forward to that,” said Kurtis. “I want to congratulate both of you. I know we gave you a significant challenge. You have risen to it, and I am grateful for all your work.”

“I appreciate it,” said Kathryn. “This has been stressful for me personally and for my entire team. Amy has been my rock. She was direct with me when I needed it, but she was also encouraging by telling me that we could get this done. And so, we did.”

“And so you did,” said Kurtis as he stood up to leave. “Stay on this path.”

“We will,” said Kathryn. “And thank you for all your support through this process.” With that, Kurtis and Amy left. Kathryn leaned back and took a moment. She was in a good place.

Conclusion

Download the eBook The Learning Scorecard: How To Measure L&D Impact And Prove ROI to delve into the data and discover which key metrics your L&D team should consider. You can also join the webinar to discover a completely new approach to measuring ROI.

Dear Reader, if you would like to see a demo of MindSpring’s Learning Scorecard, please click here to schedule a time convenient for you, and the learning experts at MindSpring will be happy to walk you through it.

MindSpring
MindSpring is an award-winning agency focused on delivering engaging and transformative digital content. We create digital experiences using exceptional creativity, the best of learning science, and innovative technology. (Previously Inno-Versity)