Building A Solid ROI Framework With The Learning Scorecard

Summary: How do you create a successful ROI framework that's sustainable and measures real business impact? Read on to learn more.

Creating a Sustainable Framework Using Measurable Metrics

This article is part of a series on tracking the right metrics using the right methodology to visually display and quantify the investment in learning and prove ROI. What you are about to read is a fable. The company, AshCom, is fictional, but the learning challenges faced by Kathryn, AshCom’s CLO, and her team are real and commonly shared by learning teams in large organizations. It is our hope that you will be able to connect with the characters, their challenges, and the solutions they discover. We also invite you to read the first eBook in the series.

eBook Release: The Learning Scorecard: How To Measure L&D Impact And Prove ROI
This eBook introduces a system to track the right metrics using the right methodology to visually display and quantify the investment in learning and prove ROI.

Benchmarking Best Practices

Amy arrived five minutes early for her meeting with Kathryn, the Chief Learning Officer at AshCom. Kathryn had already been in the office for some time, as was clear from the freshly brewed coffee and the danish neatly arranged on her conference table. “I told you I would have something good to eat,” said Kathryn as she welcomed Amy with a smile.

Amy knew that Kathryn was under pressure. The financial strain on AshCom from the first quarterly loss in its history was making its way into every corner of the company. Not only was the finance team looking for ways to save money, but it had also adopted the motto, “Defend the Spend,” which meant every leader with a budget would need to clarify what they were spending and how it was helping the company.

Kathryn was feeling the strain. She knew that showing a return on investment (ROI) for learning was not easily done. She also knew that ROI was neither a high priority nor a specialty for her team. Soon, she would be expected to give a presentation on the financial costs and benefits of her team’s work, and she was not confident in her ability to do so.

This prompted her to contact Amy, a learning consultant to Kathryn and her team. Amy had done work for dozens of companies around the United States and had proven to be an excellent source for benchmarking best practices from some of the largest companies in the world.

In their first meeting, Kathryn laid out AshCom’s financial challenges and her own angst about “Defend the Spend.” Amy did not downplay her concern, but she made two things clear: first, this was a problem that Kathryn owned; second, there was hope, and Amy promised to work through the challenge with her.

What Are The Scope Limitations?

Once they each poured their coffee and chose a danish, Amy started right in. “I want to begin by reminding you of scope limitations. We are not trying to create an entire ROI system for all learning experiences that AshCom offers. We are going to focus only on what you’ve been asked to do, which is to build a return-on-investment system for a single set of experiences your team is creating on preventative maintenance.”

“That’s a good reminder,” said Kathryn. “Over the past few nights, I’ve allowed this to snowball into something bigger and more complex than it needs to be. I haven’t slept well. Our CFO, Kurtis, brought me this challenge, and he was wise to limit it at the beginning. But he also said that over time, he wants our ability to judge the impact of learning to apply to everything.”

“I know that,” said Amy, “but that isn’t what you are being asked to do right now. And right now is what matters. If we get this right, we can increase the scope and bring in other learning experiences. We need to take the ‘one bite at a time’ approach.”

“It is just that I feel we are far behind other departments,” said Kathryn. “Our finance, operations, and sales teams have everything measured. They have several dashboards that they can bring up in real time that show exactly how their areas are performing. It feels like I will never catch up on this kind of reporting. And I am afraid of what might happen to my team if I don’t and our company continues to lose money.”

“Kathryn,” said Amy, “I want you to know that I hear you. I understand the anxiety. And I don’t mean to be harsh, but we need to stop admiring the problem. We need to get to work on solving it. In my time working as a consultant with you, I’ve not seen you like this. So, let’s acknowledge that and put together a plan to move forward. I promise we will get you where you need to be.”

Kathryn breathed a deep sigh of relief. After a moment, she said, “OK, I hear you too. I am grateful for your listening ear and for your willingness to push me to move forward. So, what does moving forward actually look like?”

Creating The ROI Framework

“I’ve given this a lot of thought,” said Amy, “and I’d like to suggest a framework for taking this one bite at a time. Mind if I use your whiteboard?”

“Of course not,” said Kathryn.

Amy rose from her seat, grabbed a black marker from the ledge of the whiteboard, and wrote on the left side of the board:

Must-Haves
1. What to Measure
2. How to Measure
3. How to Make Visible

On the right side, she wrote:

Future Steps
1. Predictability
2. Bringing It All Together

Amy continued, “I want to caution you to leave the future steps for the future. I know the temptation will be strong to think about how we bring every learning experience into one system, but that is a challenge we will handle later.”

Kathryn smiled and nodded. “You know me so well.”

“I do,” replied Amy. “I appreciate your strategic mind, but we have to focus on the core needs for building a system that can tell us about the ROI on the preventative maintenance program that your team is building for Kurtis.”

Kirkpatrick’s Model

Kathryn continued to study the whiteboard. Lost in thought, she mumbled, “I’m not sure I know what you mean by ‘predictability.’”

Amy shook her head.

Kathryn now laughed out loud. “You mean I have to wait to find out what you mean by the terms on the whiteboard?”

“Yes,” replied Amy. “I promise you that we will get to that. For now, the things on the left side are all we care about. And we are going to take them in the order I have them written on the board.”

“So, we start with what to measure,” said Kathryn. “Don’t we already know that from Kirkpatrick’s Model?”

“Yes,” said Amy, “but we are going to go back to the basics of Kirkpatrick. And I have something I want to add to his model that I think will be helpful.”

“I’m all ears,” said Kathryn.

“Kirkpatrick’s Model,” said Amy, “is something I’ve seen in every organization I’ve worked with on ROI. So just a quick refresher.”

In the middle of the whiteboard, Amy drew a triangle with four levels.

4 Levels

  • Level 1 = Learner Reaction
  • Level 2 = Learning
  • Level 3 = Behavior
  • Level 4 = Results/Organizational Performance

Amy continued, “Level One means we will be measuring the reaction of the learners to the learning experience. We will want to know if they found it useful and relevant to their jobs, and whether they found it engaging. We will also want to know if they would recommend it to their colleagues, which is one of the most important factors.”

“We do that sometimes,” said Kathryn. “Actually, I should say we do that most of the time with learning experiences. But I will admit that we don’t look at the data very often.”

“That needs to change,” said Amy. “The second level examines whether learners gained what you wanted them to gain. Your team had objectives for increases in knowledge or skills. Sometimes you are looking for changes in attitude or deeper commitment. Did the learners exit the learning experience achieving what you set out to do when you built your learning objectives for each experience?”

“We have this information,” said Kathryn, “but as in Level One, we don’t go through it very carefully. And I’m not sure how what learners gain will help with ROI.”

“Stay tuned,” said Amy, “because all levels are part of a whole picture. You can’t just start at Level Four. We need to build to it. Level Three is attempting to determine whether behavior actually changed.”

“I think this is the most difficult level,” said Kathryn. “A lot of times, we are asked to build learning experiences, but we don’t actually see the results. We either don’t have a line of sight to what changed in their behavior because that isn’t shared with us, or the behavior changes themselves are not actually being measured by someone else.”

“That is a common problem,” said Amy, “and we are going to address that when we get to the next conversation on how to measure. For now, it is enough for us to agree that to get to ROI, we need to measure actual behavioral changes. Agreed?”

“Yes,” said Kathryn.

“Level Four,” said Amy, “is what Kurtis and his finance team are trying to get from you. This is the ROI: the direct results that will tell us whether the company’s performance improved because of the learning experiences their teams completed.”

“We have even less insight into Level Four than we do into Level Three,” said Kathryn.

“I’m not surprised by that,” said Amy. “If you don’t have good data for Level Three, it will be nearly impossible to say anything meaningful about Level Four.”

“I feel my anxiety rising again,” said Kathryn.

“I’m sure,” said Amy, “but remember that we aren’t doing any of this yet. We are simply agreeing on the levels we are going to seek to measure.”

“Got it,” replied Kathryn.
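For teams that track these levels in their own tooling, the four levels Amy walked through can be sketched as a small lookup table. This is an illustrative sketch only; the example metrics are placeholder assumptions, not a prescribed metric set from Kirkpatrick’s Model.

```python
# Illustrative sketch: Kirkpatrick's four levels as a simple lookup table.
# The metric names are placeholders, not an official metric set.
KIRKPATRICK_LEVELS = {
    1: ("Learner Reaction", ["relevance rating", "engagement rating", "would recommend"]),
    2: ("Learning", ["pre/post assessment delta", "skill demonstration"]),
    3: ("Behavior", ["observed on-the-job behavior change"]),
    4: ("Results/Organizational Performance", ["business KPI movement", "ROI"]),
}

def level_name(level: int) -> str:
    """Return the name of a Kirkpatrick level."""
    return KIRKPATRICK_LEVELS[level][0]

def example_metrics(level: int) -> list[str]:
    """Return the placeholder metrics attached to a level."""
    return KIRKPATRICK_LEVELS[level][1]
```

A structure like this makes the point of the conversation concrete: each level carries its own small set of metrics, and ROI (Level Four) sits on top of the data gathered at the levels below it.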

The Learning Team

“I want to add one more level,” said Amy. “I have done this with several other large organizations that have a learning team, and it has proved to be helpful. There is a data point we are failing to capture in Kirkpatrick’s Model.” Amy walked to the triangle on the whiteboard and wrote: “Learning Team.”

She then renumbered all the others so that the Learning Team was on Level One and Results was on Level Five.

“That’s new,” said Kathryn. “I’ve not seen anyone do that before.”

“The reason I’m adding it is that you have an incredibly talented and insightful team,” said Amy. “They are the experts in learning science and Instructional Design. Some of them are what I call learning artists. They know more about learning than anyone else at AshCom and yet nowhere in Kirkpatrick’s Model are they asked to give their insights on what they’ve built.”

Kathryn sat quietly for a moment. “I suppose my initial reaction is that they might all give themselves an A+ on everything they build. But if I think about it for a moment, I know them. They are often their own harshest critics. They can usually tell if something is going to work well or not.”

“And that’s why I’m adding it to Kirkpatrick’s Model,” said Amy. “It will help us understand what is happening in learning. I don’t want us to get ahead of ourselves, but this level will come into play in a big way when we get to our discussion on predictability.”

“I thought you said we couldn’t talk about that yet,” said Kathryn with a smirk.

“I said you couldn’t,” replied Amy. “I didn’t say I couldn’t.” Kathryn laughed out loud. “In all seriousness, this will prove to be useful, so we need it at the beginning for us to agree on what we are going to measure.”

“From my work with other companies,” continued Amy, “I can give you a few overall principles when deciding what to measure for each of these levels.”

“And this is why I’m grateful for you,” said Kathryn. “Your experience is invaluable.”

Demonstrating Impact

“One common mistake is trying to measure too many things,” said Amy. “Call it ‘metrics explosion.’ There might be a lot of individual metrics you would like to see, but tracking them all will only cause confusion. I’d suggest that for each level, you spend some time thinking about which metrics are key to demonstrating the impact you want to see. What is really going to move the needle?”

“How many is too many?” asked Kathryn.

“I think the range is between four and eight metrics for each category,” replied Amy, “but the fewer, the better. If each level had six metrics, you would end up with thirty total.”
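Amy’s arithmetic is easy to check: with the extended five-level model and six metrics per level, the total lands at thirty. A quick sketch, using the numbers from the conversation:

```python
# Quick check of the metric-count arithmetic from the conversation.
levels = 5             # Kirkpatrick's four levels plus the added Learning Team level
metrics_per_level = 6  # within Amy's suggested range of four to eight

total_metrics = levels * metrics_per_level
print(total_metrics)   # 30, which is why "the fewer, the better"
```

Even at the low end of the range (four metrics per level), five levels still yield twenty metrics, which is why trimming each level’s list matters so much.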

“Given that our focus will be on preventative maintenance,” said Kathryn, “I’m guessing many of the metrics will already be set by our operations team.”

“That will be true when you get to Level Four: Behavior and Level Five: Results,” said Amy. “For the first three levels, it is most likely that you will be using similar metrics no matter what learning experience you are examining. For the learner reaction level, for instance, you will want to know whether the experience was relevant, useful, and something they would recommend to others. You will find the same dynamic for Levels One and Three.”

“So how do we begin?” asked Kathryn.

“I have a surprise for you,” replied Amy.

“Who doesn’t like surprises?” said Kathryn smiling.

“For other clients trying to figure out ROI, I use a product called The Learning Scorecard,” said Amy. “It was developed by a company called MindSpring. They spent years working through the challenge of ROI and learning. They built a model that gives users some options for deciding what they want to measure at each level.”

“I like the sound of that,” replied Kathryn.

“But that’s not even the best part,” said Amy. “I’ve used The Learning Scorecard to measure the ROI for other manufacturers and their preventative maintenance programs.”

“And it took you this long to tell me about it?” asked Kathryn, leaning back in her chair.

“It is best to think through each step before you dive into the tool,” said Amy. “Later today, I will send you a link to see what some other companies have done when deciding what to measure. And I want to remind you that we still have to talk through how we are going to measure our metrics and how we are going to make the results visible to the financial and operations people.”

“That will be the topic for our next conversation?” asked Kathryn.

“Yes,” replied Amy. “Next time, we will just focus on how we measure. That will help us begin to develop a model. Are you available next Tuesday morning at 8 a.m.?”

“I am,” said Kathryn. “It is kind of nice to get an earlier start while the office is still quiet.”

“Great,” said Amy. “I’ll bring breakfast for both of us.”

Conclusion

Download the eBook The Learning Scorecard: How To Measure L&D Impact And Prove ROI to delve into the data and discover which key metrics your L&D team should consider. You can also join the webinar to discover a completely new approach to measuring ROI.

Dear Reader, if you would like to see a demo of MindSpring’s Learning Scorecard, please click here to schedule a time convenient for you, and the learning experts at MindSpring will be happy to walk you through it.