Learning Conference Insights

Summary: What did you miss at DevLearn?

Trends, Insights, And Nuggets Of Wisdom

The Learning Guild's DevLearn conference in Las Vegas, Nevada is one of the main yearly gatherings for learning professionals curious about the intersection of technology, tools, data, and methodology, and about meeting other Learning and Development (L&D) humans. This is a summary of the conference insights I gathered from sessions, the vendor expo, demos, and hallway conversations, as an attendee, a speaker, and, generally, a human.

Why Would Someone Go To A Learning Conference At All?

It's 2022. The metaverse(s) is/are upon us. We have Artificial Intelligence (AI) creating pictures and videos from text descriptions. We have virtual meetings all day long, literally from our living room. Why would anyone want to get out of their comfort zone, go through the pain of airport delays with jetlag, and stay in a hotel (that is also a loud casino), just to see thousands of other people (who they don't know at all) in person?

1. Because Your Brain Needs A Break

Your brain needs a break from trying to figure out whether, at this time of the day, you're Zsolt who's living at home or Zsolt who's working from home. Regardless of where you are on the introvert-extravert scale, making human connections, learning new things, reflecting on what you've learned, and even just lurking to see what questions or problems other companies are facing are all much easier when you take your brain out of its routine place.

2. Because You Have A Chance To Meet Industry Leaders

In today's world of constant online presence, a yearly conference shouldn't be the only occasion when you learn from others. Brushing your teeth intensely for one week doesn't make up for a year-long lack of hygiene. When you go to a learning conference, you have a chance to see the people you follow online, ask them questions, and hang out with them in the hallways.

3. Because You Learn From Other Participants

One of the most practical aspects of a conference session is learning how other participants think about the same challenge, how they solved it, and what did and didn't work. Over the last 15 years of attending and speaking at learning conferences, I've connected with thousands of people who were working on the same challenges I was. This is not networking for selling; it is networking for growth.

4. Because You Can Test Out And Experience Over 100 Technology And Tool Vendors In A Single Place

The expo is hard to describe if you have not had the chance to attend one of these learning conferences. Imagine a warehouse filled with booths (growing in size every year), each with a team showcasing their products or services. You can try Augmented Reality (AR), Virtual Reality (VR), games, authoring tools, video, xAPI, Learning Record Stores (LRSs)/Learning Management Systems (LMSs)/Learning Experience Platforms (LXPs), etc.

Insights And Takeaways From DevLearn 2022

This year, I had four sessions at DevLearn. The first two were actually at pre-conference events. Conferences often add pre-conference or co-located events so that those planning to attend the learning conference itself can do a more focused deep dive into a topic. This can be a workshop, certification, user conference, or similar day-long event.

I am a risk taker and tinkerer when it comes to technology and skills. I believe that those of us who have both technical and learning backgrounds are responsible for experimenting and pushing the boundaries of what tech (and tech companies) can do. Therefore, my sessions tend to be more advanced and experiential. I do run into challenges, but it is worth pushing both the tools we use and the vendors who own those tools to evolve and grow.

1. Tools Are Tools

They come and go. They also evolve. For example, I view authoring tools like Articulate 360, Adobe Captivate, Lectora, Adapt, Evolve, etc. as tools to support and enable learning design. What you do with these tools is more important than the tool itself. You can't blame the tool if you don't have the skills to use it well. On the other hand, we should select a tool for specific projects and purposes, not the other way around.

Start with the business problem, identify the players who can influence the outcome (directly or indirectly), list the key performance indicators that are measured, and then, finally, the behaviors needed to move those, along with the current barriers (why people are not doing what they're supposed to be doing). This backward-chain design is the only way to get a chance to identify learning or training problems. If you start with content and your favorite authoring tool to make the content engaging, you may have a job today, but within five years, AI will redefine what "rapid" means.
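
To make this backward chain concrete, here is a minimal sketch of how you might capture it before touching any authoring tool. The structure, field names, and example values are my own illustration of the steps above, not a standard or any tool's API:

```python
from dataclasses import dataclass, field

# A back-of-the-napkin structure for the backward-chain design described above:
# start from the business problem and only decide on a (training or non-training)
# solution once the whole chain is filled in.
@dataclass
class BackwardChainDesign:
    business_problem: str                                # what the business is trying to fix
    players: list[str] = field(default_factory=list)     # who can influence the outcome
    kpis: list[str] = field(default_factory=list)        # what the business actually measures
    behaviors: list[str] = field(default_factory=list)   # what people need to do differently
    barriers: list[str] = field(default_factory=list)    # why they are not doing it today

# Hypothetical example, purely for illustration
design = BackwardChainDesign(
    business_problem="Warehouse picking errors are driving up return costs",
    players=["pickers", "shift supervisors", "inventory system admins"],
    kpis=["picking error rate", "cost of returns"],
    behaviors=["scan every bin before picking", "flag mislabeled bins"],
    barriers=["scanners are slow", "no feedback loop on errors"],
)
# Only after this is filled in do you ask whether a course, a job aid,
# a process change, or a tool fix is what moves the KPIs.
```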

2. If You're New To xAPI, I Recommend Digging In

Every learning designer at this point probably knows SCORM and its limitations for tracking data in the LMS. For the last two decades, we taught businesses that we are really good at reporting how many people completed the courses, how much time they spent on them, and what their scores were. We've also been proudly presenting our "Level 1" results showing that people loved the experience.

But the thing is, none of these really matter. Before I get stoned, let me rephrase: if you only measure these "vanity" metrics, you can build a great PowerPoint presentation with charts for your learning department at the end of the year, but these metrics are not the ones the business is interested in. In fact, they can backfire. Have you heard slogans like "doing more with less" when the training budget is cut and you have to "work smarter, not harder," blah blah blah? Or that we need to move faster and create content rapidly?

One of the reasons for this is that we are considered a cost center. If I were a business stakeholder and saw a large number of hours on the PowerPoint as "training delivered," my reaction would be: so what? This is the investment we made. Where's the result? Well, we can't track anything else because of SCORM...but now you can. xAPI (I agree, not the most intuitive name) allows you to capture and store more granular data about who did what, where, how, under what circumstances, why, and what the results were. And it's not restricted to what you do in a course. What does that mean? You can capture information from a game, a VR headset, an offline activity, a project, or even from a coffee maker (so I've heard).
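
To make "who did what, with what result" concrete, here is a minimal sketch of sending an xAPI statement to a Learning Record Store in Python. The LRS URL, credentials, and activity are placeholders for illustration, not any specific vendor's endpoint:

```python
import requests

# Placeholder LRS endpoint and credentials (assumptions for this sketch)
LRS_ENDPOINT = "https://lrs.example.com/xapi/statements"
LRS_AUTH = ("lrs_key", "lrs_secret")

# An xAPI statement is actor (who) + verb (did what) + object (to what),
# optionally with a result, context, and more.
statement = {
    "actor": {
        "name": "Pat Learner",
        "mbox": "mailto:pat.learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.com/activities/forklift-vr-scenario",
        "definition": {"name": {"en-US": "Forklift VR safety scenario"}},
    },
    "result": {
        "success": True,
        "score": {"scaled": 0.85},
    },
}

response = requests.post(
    LRS_ENDPOINT,
    json=statement,
    auth=LRS_AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
response.raise_for_status()
print("Stored statement ID(s):", response.json())
```

The same statement shape works whether the activity happened in a course, a VR headset, a game, or an offline observation; only the verb and object change.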

Before you get carried away with capturing all kinds of data, though, here are my two cents that I shared at the xAPI user conference: your stakeholders don't necessarily know what they want to "track" as far as activities go. You may need to help them by showing the value of what we can provide. And so, you may want to start with these simple questions to open the discussion:

  • What decisions are you making today without data?
  • What decisions are you not making due to a lack of data?
  • What risks are you taking today due to a lack of data?
  • What information that you don't have today would make your decision-making more accurate?

The value of what we do is not in the course content. It is in the impact on the job. But we haven't done a good job of advocating for it so far. It's time to measure the right thing! Along with xAPI, I also strongly recommend Will Thalheimer's Learning-Transfer Evaluation Model (LTEM) and his book on performance-focused smile sheets. By replacing vanity metrics, we can provide more actionable data for the business.

3. How To Research, Assess, Select, And Implement New Technology In Your Organization

The panel discussion on this topic included eight of the Guild Masters (Mark Lassoff, Chad Udell, Dr. Jane Bozarth, Julie Dirksen, Karl Kapp, Clark Quinn, Megan Torrance, Ron Price), facilitated by David Kelly, CEO of the Learning Guild. These are leaders I’ve been following and learning from for years because you never stop learning. If you don't know these learning professionals, you should. Here are some takeaways from the session:

  • Don’t “trust” your vendors
    I second that, not because vendors are dishonest, but because we often stop at the what: can your platform do X? Yes. Go deeper: ask for data, examples, and results. Explore the how, and what it takes to get that result.
  • Ask questions
    "What unexpected results did you encounter in the previous implementation?" This is a funny question, but it's powerful. First, you can learn who actually implemented something and whether they know the details. Second, it opens up the dialogue on mistakes to avoid or ideas to explore.
  • Know what problem you’re solving for
    Write it down! It’s easy to get lost in bells and whistles. Writing it down also helps you to define scope.
  • Don’t expect magic from technology
    Unless your problem is really just technology (I've never seen that before), there's a lot more to consider, including solving it without tech. Explore how people solve it today. You'll need that for two reasons. First, for change management: get everyone involved and create your digital adoption strategy based on the existing workaround. Second, you may learn that the problem will not be solved by tech at all, or that existing tech can already solve it.
  • There was a question about mistakes when selecting tech
    The answers included shiny object syndrome, jumping on the bandwagon, some executive's bad decision, and building something when you don't have the expertise and capacity. My two cents on this (I have many examples): buying an expensive piece of tech and then mandating its use for everything (stuff it was not designed for) because you define its value as price divided by usage. Horrible idea. You not only slow down performance but also ruin the technology's reputation within the organization.
  • Pilot. Pilot. Pilot. With real users
    Don't fall for demo data or dummy results that show that "in theory" it works well. Find real users, real context, and real circumstances, and do a controlled pilot before you roll it out. You'll have a chance to see how it fits the culture, iron out any kinks, and learn a lot about digital adoption challenges.
  • Define success
    I don't know how many times I've seen this mistake: falling in love with implementing the new shiny thing, without defining how to measure and evaluate the success of the implementation. No, the success of any tech implementation is not that it launched. That is a milestone in the project plan. If you do not have a measurement and evaluation plan, you may expect one thing while the business expects another.

My two cents on this: don't overpromise to solve world hunger using new technology. Don't even start with the technology. Find a clear pain point. Solve that. Show the result. Others will come and ask two things: first, how did you do that? This is when tech can be mentioned. Second, what else could we solve for? This is when they're hooked.

What's In The Expo?

Roaming the expo floor is one of my favorite activities. For one, I love "people watching", as in observing interactions. You learn a lot about vendors by observing their nonverbal communication with potential customers (and among themselves) from a distance. Of course, it doesn't hurt that some of them have raffles or draft Guinness in a branded shot glass :) But more importantly, here are some of the takeaways:

1. It Was Good To See More Practical VR Implementations

The push for online in 2020 definitely had an impact on VR solutions. While not many are touting a metaverse presence yet (the fact that the definition of the metaverse(s) is pretty vague is a whole different story), the solutions showcased were more practical in nature. However, don't expect to create all your learning solutions in VR any time soon.

VR is a great match for risky operations (robots, machinery, surgery, etc.) and for soft skills, like empathy. However, it still takes skill and time to do it right. One of the challenges of designing for VR is that the more realistic you get (which is the big advantage of VR), the more critical people get about the experience. For example, in a text scenario, they can imagine anything they want. If the text is accompanied by a 2D image, you have to decide what to show. If you're a global company, the same office, for example, may look very different from one location to another. And then when you get to VR, you have to decide on the avatar, the environment, how objects look and interact, etc.

2. More 360 Images

More and more 360 images are being embedded in solutions. Even if you don't go all the way to VR, there were good examples of using 360 images to explore spaces: warehouses, buildings, etc.

3. LMS Vs. LXP Vs. Skills Development Platforms

The good thing is, skills are hot. I mean, approaching learning from the skills perspective rather than the content perspective is definitely a good shift. Just by walking around and listening to conversations, the phrase "we're not an LMS" came through loud and clear. In fact, in the Guild Masters panel discussion, the statement that "LMSs are going away" caused a bit of a heated debate. Eventually, we settled on "the LMS as we know it" is going away. Not that the technology suddenly dies; rather, it evolves into something new.

Before you choose your LMS/LXP (or whatever three letters they have), make sure you write down your requirements. It is easy to get lost in shiny new features. As for skills, don't get too excited about 10,000 skills. You will first need to figure out what skills you need in your organization for the tasks performed, how to measure them, what competency level they require, etc. Most skills management apps use various sources to collect those skills and keep them up to date. One of these sources is job descriptions. Here's my challenge: look at your own job description and compare it to what you really do in your role. A lot of job descriptions are a laundry list of stuff you may or may not ever have to do in that role, and that's the data the AI was trained on.

4. Games And Gamification

If you know me, you know I've been speaking about games and gamification for years at learning conferences. There was a time when all you needed was a warm body and the word "gamification" to be invited into all kinds of cool clubs. Not anymore. Most learning professionals have realized that it is not as simple as throwing game mechanics on top of content. Where do you start? Start playing games, but not just for the sake of playing. We call it deliberate play. Take something you like. Use a framework to dissect the game. This is the only way to understand the dynamics between elements. Then you can apply these to your design. My ATD@Work publication Game Thinking: From Content to Action was sold out at DevLearn. Check it out if you're interested in learning how to move from content to action.

Finally, One Last Thought About Any Learning Conference

Once it's over, you'll most likely feel tired in a good way, full of energy and motivation. Make a plan! Write down what you're going to apply from the conference and how, and share it with someone! Because, as a speaker, my measure of success is not the number of people in my session, or even the ratings. It is the impact after the conference, when people apply what they learned. Now, go and build something!