Is L&D Hallucinating Its Value In The Age Of AI?

Summary: AI can generate learning content at scale. That puts pressure on L&D to prove business value in outcomes: faster time-to-competence, better decisions, and stronger execution—not more courses.

Moving From Content Output To Capability Stewardship

AI isn't just changing how we produce learning. It's changing what learning work is worth. For years, many L&D teams have been funded and evaluated on visible outputs: courses launched, completions, content libraries, learning journeys, and activity dashboards. That model was already under pressure. Now, generative AI can produce a large portion of those outputs in minutes. When content becomes cheap and fast, L&D faces a credibility test: If we can generate 10x more assets, will the business perform 10x better? If not, the function risks "hallucinating" its own value—mistaking content velocity for capability lift and stewardship.

This isn't about whether AI will "replace" L&D. It's about whether L&D will finally be measured on what leaders actually care about: better performance, better decisions, and faster execution in critical roles.


When AI Does Your "Value Proposition" In Seconds

Let's be honest about where AI is already strong:

  1. Turning SME knowledge into polished drafts (decks, scripts, modules).
  2. Generating quiz questions, scenarios, role plays, and job aids.
  3. Summarizing policies into microlearning and knowledge checks.
  4. Translating and localizing content at speed.
  5. Creating "personalized" learning pathways based on tags and skills.

If a general-purpose model can draft 70–80% of what many teams publish, content output can no longer be the center of the L&D value proposition. That output will still matter, but it won't be differentiating. And it raises a tougher question: which parts of capability building and stewardship is L&D uniquely positioned to own that AI cannot commoditize?

A useful way to frame it for executives is this:

AI improves learning efficiency. It does not automatically improve learning value.

Evidence from field experiments shows that generative AI can boost productivity for certain tasks, often helping less experienced workers most. But productivity gains aren't the same as stronger judgment, better leadership decisions, or safer and more ethical execution.

The New Hallucination: "AI-Powered" Equals "Strategic"

Under pressure to demonstrate innovation, many learning teams are racing to "AI-enable" the learning stack:

  1. AI search in the LMS/LXP.
  2. AI content libraries.
  3. AI coaching bots.
  4. Autogenerated learning paths.
  5. AI skills inference and taxonomies.

Some of these features are genuinely useful. The risk is what they can mask, because dashboards improve quickly:

  1. More enrollments and completions.
  2. More content consumption.
  3. More "engagement signals" (clicks, likes, time-in-platform).
  4. More stakeholder confidence ("We're future-ready—look, we have AI!").

But at the executive level, those are leading indicators at best. The real questions are harder.

Four Executive Questions That Cut Through The AI Glow

  1. Are managers having better performance conversations?
  2. Are decisions improving in measurable ways?
  3. Are we reducing time-to-competence in critical roles?
  4. Can we name capabilities we now "own" as a competitive advantage?

If the honest answer is "we don't know," the AI layer hasn't made L&D more strategic. It's made the appearance of value more convincing.

Why Content Is The Wrong Center Of Gravity

Most organizations don't have a content problem. They have a transfer and execution problem. Decades of research on training transfer have shown that what happens after training—manager support, opportunity to apply, job context, incentives, and reinforcement—heavily determines whether learning becomes performance. So if L&D uses AI to produce more content without changing the conditions for transfer, the likely outcome is:

  1. More learning "supply."
  2. The same performance friction.
  3. More noise in the ecosystem.
  4. Greater skepticism from leaders who need results, not libraries.

AI can make the "content engine" faster. It doesn't solve the organizational conditions that make learning stick.

The Human Capability Gap AI Can't Close

AI is excellent at scaling information and drafts. It is far weaker at the capabilities that determine whether strategy gets executed in real workplaces:

  1. Judgment under uncertainty
    When no prompt has the full context.
  2. Trade-offs and prioritization
    Especially across competing stakeholders.
  3. Ethical reasoning and accountability
    What should be done, not just what can be done.
  4. Leadership courage
    Holding the line in high-pressure moments.
  5. Trust-building
    Relational capital that enables execution.

These aren't "soft skills." They are operational capabilities. When they're weak, organizations become brittle: highly informed, poorly prepared. And here's the L&D risk: if the function remains centered on content production—now faster with AI—it can look more productive while the enterprise becomes more fragile.

What CLOs Need To Own: Capability Stewardship

To stay defensible, L&D leadership has to shift from "learning supply" to capability stewardship. That means owning a small number of high-stakes questions with the business.

The Four Stewardship Questions

  1. Which capabilities will decide performance over the next three to five years?
  2. What evidence will prove those capabilities are strengthening?
  3. Where is performance breaking down beyond what courses can fix (work design, decision rights, manager habits, incentives)?
  4. How will we use AI to remove friction so humans spend more time in high-effort practice?

This is not a rebrand. It's an operating model shift.

What This Looks Like In Practice (Without Vendor Hype)

When capability stewardship is real, the work changes. It looks less like "AI-powered courses" and more like performance-aligned design:

1) Practice Environments, Not Content Catalogs

High-impact capability is built through practice, feedback, and reflection—especially when the context is complex. Research on simulation-based training supports its value for developing leadership and decision-making capabilities. AI can help here, but not as an "answer engine." Use it as a sparring partner:

  1. Role-play difficult conversations.
  2. Pressure-test decisions.
  3. Surface risks and counterarguments.
  4. Generate scenario variations for practice repetition.

2) Performance Data → Workflow Redesign

Use AI to detect patterns in performance friction (tickets, quality issues, time delays, customer sentiment, manager behaviors). Then partner with the business to redesign workflows. That might mean fewer courses and more:

  1. In-the-flow prompts.
  2. Decision checklists.
  3. Manager routines.
  4. Communities of practice.
  5. Practice loops tied to real work.

3) Manager Enablement As The Primary "Learning Platform"

If transfer is the problem, managers are part of the solution. Equip them with:

  1. Short observation guides.
  2. Coaching prompts.
  3. Rubrics for "good" performance.
  4. Quick practice routines in team meetings.
  5. Ethical guidance for using AI with their teams.

This is where many learning strategies succeed or fail—because the manager shapes the environment where capability either strengthens or decays.

4) AI Supervision As A Learnable Capability

As AI becomes embedded in workflows, people must learn to supervise it:

  1. Detect when outputs are unreliable.
  2. Validate against policy and context.
  3. Escalate risks.
  4. Document decisions.
  5. Maintain accountability.

Recent reporting and surveys suggest many workers spend significant time correcting weak AI outputs—often because they lack training and clear guardrails. (This is exactly where L&D can create value.)

A Simple Scorecard L&D Leaders Can Defend

If you want executives to fund capability, measure capability. Here are outcome categories most senior leaders recognize immediately:

  1. Time-to-competence
    In critical roles.
  2. Error, rework, or quality incidents
    Tied to decision-making and execution.
  3. Manager effectiveness lift
    Coaching frequency/quality, performance conversation quality.
  4. Customer outcomes
    Where relevant (sentiment, resolution time, escalations).
  5. Bench readiness
    Internal mobility, succession, readiness-to-promote.

The goal isn't perfect attribution. It's credible stewardship: clear outcomes, measurable movement, and a transparent story of contribution.

From Tool Adoption To Ecosystem Resilience

A quieter hallucination is emerging: "If we pick the right AI platform, we'll be future-proof." In reality:

  1. The market will consolidate.
  2. Tools will change quickly.
  3. Regulation and ethical expectations will tighten.
  4. Budgets will keep shrinking.

Resilience is the strategy:

  1. Design learning data models and processes that are portable and tool-agnostic.
  2. Keep ownership of core capability frameworks and success measures.
  3. Build internal AI literacy so you can evaluate, switch, and orchestrate tools rather than becoming dependent on one vendor.

A function anchored on clear human capabilities can move across tools without losing itself. A function anchored on a platform is one procurement decision away from irrelevance.

The Leadership Challenge For L&D In The AI Era

AI is forcing a choice. L&D can double down on the illusion—more content, more features, more "AI-powered" labels—or use this moment to tell the truth:

  1. Information access is not capability.
  2. Content output is not performance change.
  3. Automation doesn't eliminate the need for judgment; it increases it.

The L&D functions that win in the age of AI won't be the ones producing the most assets. They'll be the ones that can clearly define—and measurably grow—the human capabilities that no model can replace. If your learning strategy can't name the few capabilities it is strengthening—and show evidence they're improving—AI will make you faster at producing work the business no longer funds.