Learning Development Gets A Bad Rap And We're Partly To Blame

Why Learning Development Gets A Bad Rap

In May 2017, Forbes published the article Learning Development Not Valued By Organizations. While hyperbolic headlines written to drive clicks seem to be the norm these days, the article identifies real challenges preventing learning development practitioners from shifting their brand from cost center to be tolerated to value-adding contributor worthy of additional investment.

The article’s title references findings from The Challenges of Global L&D Survey conducted by The Open University Business School. The survey, which asked 200 senior leaders across the industry to comment on the state of learning development, produced sobering findings:

  • Two-fifths of international organizations don’t have a global strategy for learning.
  • Half of Learning and Development decision-makers think learning is not seen as important in their organization.
  • 42% claim they lack direction from the top.
  • In nearly half of the organizations surveyed, the learning architecture is ‘decades’ out-of-date.

In discussing the genesis behind these views, Bernd Vogel, director of Henley Centre for Leadership at Henley Business School, stated:

“L&D is often seen as a ‘token’ activity and that is the underlying philosophy that top managers have about it. The L&D function needs to be seen as a senior partner. L&D people often lack confidence and clarity about what they contribute to the business and it’s not always about retention and investment.”

Data from additional industry reports supports Vogel’s comment. Only one-third of learning professionals surveyed in ATD’s 2016 Evaluating Learning White Paper indicated their organization assesses learning impact at Kirkpatrick’s fourth level (business results), while even fewer, 15%, measure ROI.

Brandon Hall’s 2015 State of Learning Study produced even bleaker results: 19% of the learning groups surveyed conduct Level 4 measures, while an alarming 35% indicated either “little or no” connection between their work and corporate goals. Additionally, Brandon Hall noted the trend has not improved over time. In 2009, 37% of respondents agreed with the statement “Our learning evaluation efforts help us meet our organization’s business goals.” In 2015, the number dropped to 36%.

Meanwhile, senior leaders have made no secret about their desire to better connect learning initiatives to corporate strategy. LinkedIn’s 2017 State of the Industry Report reveals executives identified business impact and ROI as the two most important, yet lacking, measures in the field of learning development. A mere 8% said they can see the business impact of learning initiatives, and a paltry 4% correlate learning development in their organization with a tangible ROI. In other words, learning professionals rarely view their work as adding measurable value to their organization’s goals, and the leaders for whom they work feel even worse about the prospect.

No sane practitioner strives to under-deliver or fail to meet the expectations of leadership, so why does this disconnect persist? The answer may spring from two root causes:

  1. A lack of the requisite knowledge and skills to conduct higher-level measurement.
  2. Little support in the form of a framework to institute repeatable practices.

Studying To Become A Learning Expert

The number of colleges and universities offering degrees in Instructional Design, eLearning development, and learning technology has grown in recent years, yet many practitioners enter the field via a less direct route. An article published by Learning Solutions Magazine in 2015 references yet another ATD industry survey, this one of 1,100 Instructional Designers, and reveals that less than half (46%) of practitioners hold an Instructional Design or related degree. Let that sink in for a minute. Less than half of all learning professionals surveyed received extended formal education in the area in which they work. Does that mean the industry should create some bar to entry in response? Absolutely not; brilliant learning professionals come from a variety of backgrounds, many of which add value. However, that doesn’t mean we should ignore this dynamic either.

It goes without saying that degree programs, particularly those at the master’s or doctorate level, should require proficiency in statistical analysis, but what of the 54% of learning professionals who start as Subject Matter Experts or who studied English, communications, HR management, graphic design, business administration, or another discipline? Since Instructional Design incorporates a wide array of competencies, the certification and continuing education programs upon which these individuals rely take a survey approach, providing students with exposure to a huge array of topics without going into great depth on any one.

Seeking Solutions

And this approach may be necessary, as the broad roles and responsibilities of Instructional Designers require exposure to many disciplines. Still, if the learning profession wants to make headway in demonstrating value via hard metrics and ROI, it needs to offer additional training in its arguably most complex domain. Measurement incorporates regression, confidence, relevance, repeatability, and other difficult concepts that require in-depth study to execute at the highest levels. Continuing education through certification programs, self-study, and industry events can provide a base from which to start, but they must demonstrate a degree of rigor. If learning professionals demand better training and support in measurement and evaluation, capable solution providers will oblige and offer the industry an opportunity to better convey value.
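To make the regression-and-confidence idea concrete, here is a minimal sketch of the kind of analysis Level 4 measurement demands: fitting a least-squares line between post-training assessment scores and a downstream business metric, and checking how strongly the two move together. All names and data here are hypothetical, invented for illustration; a real study would need a proper sample, controls, and significance testing.

```python
# Hypothetical sketch: does post-training assessment performance track a
# business metric (here, quarterly sales per rep, in $1,000s)?

def linear_fit(xs, ys):
    """Return (slope, intercept) of the least-squares line y = slope*x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two samples (-1 to 1)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

# Invented data: six learners' assessment scores and subsequent sales.
scores = [62, 70, 75, 81, 88, 93]
sales = [110, 118, 121, 130, 141, 149]

slope, intercept = linear_fit(scores, sales)
r = pearson_r(scores, sales)
print(f"sales ~ {slope:.2f} * score + {intercept:.2f}, r = {r:.3f}")
```

Even a toy example like this shows why the domain demands depth: a high correlation on six data points proves nothing about causation, and deciding what sample, controls, and confidence level a claim requires is exactly the skill most survey-style programs never teach.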

A lack of framework further complicates measurement initiatives and exacerbates the already significant challenges. Learning professionals who implement a repeatable process should focus on asking the right questions at the right time; linking their work to organization-specific efficiency, compliance, and sales metrics takes a huge step in demonstrating the value leadership craves. Establishing this structure takes time and expense, anathema to most businesses, yet without it, the status quo reigns.

Additionally, existing practices may discourage good execution if learning professionals misapply them. For instance, the still relevant and actionable ADDIE model denotes a linear process with evaluation bringing up the rear. Nevertheless, measurement should permeate all phases of Instructional Design. A framework connecting objectives to observable, quantifiable behaviors is a must. Capturing the raw cost of development as well as opportunity cost is a must. Instituting a measurement plan to check progress at defined intervals and document learning decay is a must. Communicating these results to leadership is a must. Should we apply these practices to every project, every deliverable? Unless we have access to the talents of a full-time measurement specialist, for the sake of expediency and practicality, probably not. We should, however, take these steps for the initiatives most directly aligned with organizational goals, financial or otherwise.
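The cost-capture step above can be sketched with the standard ROI formula, ROI % = (net benefit ÷ total cost) × 100, where total cost includes both the raw cost of development and the opportunity cost of learner time. The function name and all dollar figures below are hypothetical, chosen only to illustrate the arithmetic.

```python
# Hedged sketch of the ROI arithmetic: count development, delivery, and
# opportunity costs against the measured benefit of the program.

def training_roi(development_cost, delivery_cost, opportunity_cost, benefit):
    """Return ROI as a percentage: (benefit - total cost) / total cost * 100."""
    total_cost = development_cost + delivery_cost + opportunity_cost
    net_benefit = benefit - total_cost
    return (net_benefit / total_cost) * 100

# Invented program: $40k to build, $10k to deliver, $25k of learner time
# away from the job, and $120k in measured benefit.
roi = training_roi(40_000, 10_000, 25_000, 120_000)
print(f"ROI: {roi:.1f}%")  # 60.0% on these invented figures
```

The arithmetic is trivial; the hard, and most often skipped, work is defending the benefit figure, which is why the measurement plan and decay tracking described above matter.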

Let us move with urgency. If fewer than 1 in 10 business leaders see learning development as adding value to core strategy, and fewer than 1 in 20 believe we can articulate this value, something needs to change. Near the conclusion of the Forbes article, Penny Asher, director of executive education at Open University Business School, succinctly summarized both the danger of the status quo and the potential for those willing to adapt:

“We are experiencing a fundamental shift that will affect every L&D department. Tighter margins and the increasing expectations of candidates and employees mean there has never been such pressure to get it right; those that do so stand to make great global gains.”

