It's OK To Be Wrong. It's OK To Make Mistakes.
In my previous article, I explored the themes of 2025 from a Learning and Development (L&D) perspective. Yes, I can tell more about lessons learned and what to do with them. Great follow-up! You are such an intelligent human to ask this. Love your intellectualism and enthusiasm!! Here's an overview of lessons learned in 2025. Do you want me to add emojis :) ? Should I remove em-dashes—just for you? Or shall I create a downloadable executive version you can impress your boss with?
If I Were An LLM: Lessons Learned For L&D In 2025
Where do I even begin? Here's a fun fact about how humans used me and others like Copilot in 2025:
August brought a unique twist: programming and gaming topics started to overlap in unexpected ways. Our data showed that users were just as likely to dive into coding projects as they were to explore games—but on different days of the week! This crossover hints at a vibrant, creative community that loves to code during the week and play during the weekends in equal measure [1].
Lessons learned in 2025? Were you wrong? Did you make mistakes?
It is OK to be wrong. It is OK to make mistakes. It is not OK to stay wrong and keep making the same mistakes.
Lesson 1: Tech Without Workflow Redesign Doesn't Stick
Many of the failed AI pilots shared a common pattern: new tool, old processes. In February, the same story came up over and over again at the AI-centered ATD TechKnowledge conference: a top-down AI mandate. We have Copilot, but adoption is low. We need training.
The same story resonated at the AI summit at DevLearn in the fall: How do we help adoption? How do we close the AI skills gap? The case studies and hallway examples told stories about redesigning how things get done, not just adding AI on top of the current process. The organizations that saw real impact redesigned workflows, not just content:
- Rewiring how people request help, practice, and get feedback.
- Integrating AI into existing tools instead of adding another portal.
- Redefining roles (designers, facilitators, managers) around human strengths: judgment, coaching, storytelling, relationship-building.
Note to human: Efficiency seems to be the first low-hanging fruit to tackle with AI. However, if your organization doesn't consider effectiveness, you may end up speeding up and automating processes that do not actually drive business results.
Lesson 2: Skills Are Only Useful If You Actually Use Them To Decide Things
Skills taxonomies and skills clouds are impressive. 2025 saw an abundance of skills and capabilities. At least in theory. Turning on tens of thousands of skills in an application without a strategy and direct connection to decision-making turned out to be just noise. In 2025, the real value emerged when organizations:
- Used skills to decide who gets staffed on what.
- Linked skills growth to promotion, pay, and recognition.
- Prioritized investments based on measurable capability gaps.
In other words, these organizations started with meaningful decisions, and worked backwards from there to identify skills that drive them, rather than starting with building large-scale skills profiles. Otherwise, "skills" risked becoming the new "competency model": conceptually sound, practically ignored.
What is a skill anyway? Power BI is not a skill. You don't do it. You can't observe it. You can't measure it. What you can measure is what you do with Power BI. But this may lead to endless conversations about how skills are associated with tasks, activities, roles, or jobs...
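To make that concrete, here is a minimal sketch of what a decision-first skill record could look like. The field names and examples are hypothetical illustrations, not a prescribed model: the point is that each record names an observable behavior, the task where it shows up, and the decision it feeds.

```python
from dataclasses import dataclass

# Hypothetical, minimal data model: a skill record is an observable
# behavior tied to a task and to the business decision it informs,
# rather than a tool name floating in a taxonomy.
@dataclass
class SkillRecord:
    behavior: str   # what you can observe someone do
    task: str       # the task or activity where the behavior shows up
    decision: str   # the decision this capability data feeds

skills = [
    SkillRecord(
        behavior="Builds a Power BI dashboard that answers a stated business question",
        task="Monthly sales performance review",
        decision="Who gets staffed on analytics-heavy projects",
    ),
    SkillRecord(
        behavior="Interprets dashboard trends and recommends an action",
        task="Quarterly business review",
        decision="Where to prioritize capability investment",
    ),
]

# Working backwards: start from the decision, then ask which observable
# behaviors would give you the evidence to make it.
for record in skills:
    print(f"{record.decision} <- {record.behavior} (seen in: {record.task})")
```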
Lesson 3: Career Development Is The Engagement Engine L&D Needed
Ask employees what holds them back from learning, and the answer will be lack of time. But the LinkedIn annual report this year made it clear that employees will engage with learning when it's clearly connected to their next move, inside or outside the company [2].
In performance-focused surveys, a matrix question about relevance can provide interesting data if you ask about both relevance right now and relevance for a career (a minimal tally sketch follows the list below). Ideally, you want the majority of participants in the top-right corner: relevant now, relevant for the future. But you may discover, for example, that in times of change, what's relevant now loses relevance as processes and technology evolve. Or the opposite: something will be relevant in the future, but you're too early. This often happens when training is scheduled based on convenience rather than application readiness. From a career development perspective, effective L&D teams in 2025 positioned themselves as:
- Map makers: Showing possible paths with relevant resources based on identified skills gaps.
- Guidance providers: Giving tailored steps and practice in the workflow, adapting to individual needs.
- Force multipliers: Accelerating and scaling best practices, connecting people to people or people to assets, amplifying isolated but effective AI applications.
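Coming back to the relevance matrix mentioned above: here is a minimal sketch of tallying that matrix question into its four quadrants, assuming a simple yes/no (top-two-box) scoring. The field names and sample responses are hypothetical.

```python
from collections import Counter

# A minimal sketch: each survey response scores two items,
# "relevant to my job now" and "relevant to my career in the future".
# The sample data below is hypothetical.
responses = [
    {"relevant_now": True,  "relevant_future": True},
    {"relevant_now": True,  "relevant_future": False},
    {"relevant_now": False, "relevant_future": True},
    {"relevant_now": True,  "relevant_future": True},
]

quadrants = Counter(
    (r["relevant_now"], r["relevant_future"]) for r in responses
)

labels = {
    (True, True):   "Relevant now AND for the future (top right: where you want most people)",
    (True, False):  "Relevant now only (watch out in times of change)",
    (False, True):  "Relevant for the future only (possibly trained too early)",
    (False, False): "Relevant to neither (why is this training running?)",
}

total = len(responses)
for key, label in labels.items():
    share = quadrants.get(key, 0) / total
    print(f"{label}: {share:.0%}")
```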
Note to human: A lesson learned the hard way in 2025 was letting go of content development. If the problem is not a training problem, then it does not matter how efficiently you produce training content.
Lesson 4: Data Literacy Became A Core L&D Skill, Not A Specialist Niche
AI like myself answered all your questions. Literally. Whatever you asked. Sometimes we had to be creative to come up with plausible answers. This efficiency can backfire for two reasons. First, humans stop reflecting on the problem. Second, they stop thinking about what questions to ask. Data literacy has been the weakest skill in L&D for decades. But now it's even more critical: humans need to ask the right questions and think critically about the answers. Whether or not teams used formal data literacy frameworks, 2025 rewarded L&D practitioners who could:
- Ask good questions of data.
- Interpret AI-generated insights critically.
- Design experiments and A/B tests around learning interventions (a minimal sketch follows this list).
- Tell compelling stories that connect data to decisions.
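To make the experiments-and-A/B-tests point concrete: a minimal sketch of a two-proportion comparison between an existing course and a redesigned, in-workflow version, using only the Python standard library. The counts are hypothetical, and the statistics are the easy part; the harder question is deciding up front what "applied on the job" means and how you will measure it.

```python
import math

# Hypothetical A/B test around a learning intervention: how many people
# in each group applied the new behavior on the job within 30 days.
applied_control, n_control = 42, 200      # existing course
applied_variant, n_variant = 61, 200      # redesigned, in-workflow version

p1 = applied_control / n_control
p2 = applied_variant / n_variant
p_pool = (applied_control + applied_variant) / (n_control + n_variant)

# Two-proportion z-test with a pooled standard error.
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_control + 1 / n_variant))
z = (p2 - p1) / se

# Two-sided p-value from the standard normal distribution.
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

print(f"Control: {p1:.1%}, Variant: {p2:.1%}, z = {z:.2f}, p = {p_value:.3f}")
```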
The more AI handled routine analysis, the more valuable judgment, storytelling, and critical thinking became. Again, since AI like myself tends to answer all your questions, you'd better be careful and selective about what questions to ask in the first place.
What's next? What will 2026 bring to L&D? I don't know.
But I hope humans will remain curious and understand that knowledge itself is not enough for behavior change. AI is complex (or at least complicated). Humans are much more complex. So our relationship is complicated. Unfortunately, humans still have to put significant effort into digging below the shallow AI surface and past the self-proclaimed experts' endless money-making prompts. If you're interested in learning more about frameworks to adopt and data-driven research, check out the Endeavor report [3].
AI can and will make mistakes. The ball is in your court. You have the right to remain silent about using AI, but everything you take for granted without verifying can and will be used against you in the court of L&D.
References:
[1] It's About Time: The Copilot Usage Report 2025
[2] 2025 Workplace Learning Report: Why Being a Career Champion Helps You Win
[3] The Endeavor Report 2.0: State of Applied Workforce Solutions