Optimizing Workflow Performance
In any industry with diverse workflows and processes that involve many steps and many roles tasked with execution, there is broad consensus that "one-size-fits-all" training is not a sustainable solution. Training of that nature cannot be targeted enough to reach the point of work where actual performance takes place. The logical solution is to enable access to performance support within the workflow. But I offer this question: what happens when our performance support solutions carry the same "one-size-fits-all" characteristics?
Performance Support At Point Of Work
Certainly, performance support is more granular and aligned with workflows, and we design those assets to be task-centric and role-specific. Those designs match our current best-practice design efforts, and we are slowly finding ways and technologies to inject those assets into workflows at moments of need. I am suggesting they are still not granular enough. By aligning only to task and role, we assume that every individual worker in these many workflows has the same issue at the same place/step within the targeted deficient processes. I also suggest that moments of need are based on individual needs, not just on the tasks themselves. Certainly there is overlap where needs coincide; however, the requirements for designing performance support solutions must align with individual needs. It sounds like a high volume of data points will be required to build solutions that dynamic, doesn't it?
Consider, as an example, a workflow built on a 17-step process. With our current level of analytics and discovery, we establish that the process is not optimized. The challenge at the granular, worker-performance level prompts us to do enough discovery to identify what is breaking down and why. Now we need to identify the individual workers' moments of need. Are they always satisfied by task-based performance support? Is every worker represented by the same patterns of deficiency?
Today, performance support is designed and injected into the workflow to satisfy moments of need. What our current level of analytics does not do is measure individual performance deficiencies. If there are 40 worker roles involved in this 17-step process, how do we identify individual deficiencies? Without knowing these details and uncovering these data points, we are destined to build "one-size-fits-all" performance support. True, that is better than training assets locked away in the LMS, but methinks we cannot optimize the solution set without considering the additional granularity of the individual's moments of need.
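To make that data requirement concrete, here is a minimal sketch in Python of what step-level, per-worker visibility might look like. Everything in it, the `StepEvent` fields, the `deficient_steps` helper, and the 20% assist threshold, is a hypothetical illustration under my own assumptions, not a description of any particular analytics product.

```python
# Hypothetical sketch: step-level events attributed to individual workers,
# then rolled up to flag the steps where each worker struggles.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class StepEvent:
    worker_id: str       # the individual, not just the role
    role: str            # one of the ~40 roles in the example
    step: int            # 1..17 in the example process
    week: int            # weeks since the worker started (coarse time bucket)
    completed: bool      # did the worker complete the step unaided?
    assists: int         # how many times support was requested at this step
    duration_sec: float  # time spent on the step

def deficient_steps(events, max_assist_rate=0.2):
    """Flag, per worker, the steps where assisted or failed attempts exceed a threshold."""
    attempts = defaultdict(int)
    assisted = defaultdict(int)
    for e in events:
        key = (e.worker_id, e.step)
        attempts[key] += 1
        if e.assists > 0 or not e.completed:
            assisted[key] += 1
    flagged = defaultdict(list)
    for (worker, step), n in attempts.items():
        if assisted[(worker, step)] / n > max_assist_rate:
            flagged[worker].append(step)
    return dict(flagged)
```

Even a toy model like this makes the point: the unit of analysis is the individual worker at a specific step, not the role or the task in the abstract.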
Required Data Points For Granular Performance Support
To reach the necessary level of granularity in performance support, we need several things:
- Data providing step-level visibility into individual worker performance.
- Visibility into patterns within that data across a worker's on-the-job learning curve.
- Recognition of variability in microlearning and guided-support needs within the workflow.
- All of the above data must be attributable to the individual workers tasked with executing the work.
These requirements imply a significant volume of data points informing the level of granularity necessary to give individual workers support suited to their discrete moments of need. On data volume alone, they are out of reach for our current design methodologies.
Purpose-built Artificial Intelligence (AI) represents a bridge between high data volumes and actionable design attributes. Digging into the data gives us visibility into the things listed above, and at the individual level. The result is prescriptive performance support: data-driven workflow instructions delivered to the individual workers, in the workflow, at the moment of need.
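Continuing the hypothetical sketch from earlier, the prescriptive step could be as simple as mapping a worker's flagged deficiencies to the lightest support asset that worker has not yet exhausted. The asset catalog, role name, and escalation logic below are assumptions for illustration only.

```python
# Hypothetical sketch: turn per-worker deficiencies into a prescription of
# which support asset to surface, for which worker, at which step.
SUPPORT_ASSETS = {
    # (role, step) -> candidate micro-assets, ordered from lightest to deepest
    ("claims_processor", 9): ["checklist_step9", "walkthrough_step9", "coach_session_step9"],
}

def prescribe(worker_id, role, step, flagged, history):
    """Pick the lightest asset this worker has not yet used for a flagged step."""
    if step not in flagged.get(worker_id, []):
        return None                      # no intervention needed at this moment
    candidates = SUPPORT_ASSETS.get((role, step), [])
    already_seen = history.get(worker_id, set())
    for asset in candidates:
        if asset not in already_seen:
            return asset                 # escalate only as far as this worker needs
    return candidates[-1] if candidates else None
```

The design point is that two workers in the same role, at the same step, can receive different assets because the prescription follows the individual's history, not the role's.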
What I find exciting about gaining access to design criteria at such a granular level is the ability to monitor the development of learning and performance curves. Optimizing performance support is only one aspect of improvement. By using the same prescriptive performance support assets within process-level training, delivered through the same technology that supports the actual workflows, we can significantly reduce onboarding training and accelerate productivity at the same time.
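A rough sketch of what "monitoring the learning curve" could mean, again built on the hypothetical `StepEvent` data above: track an individual's assist rate week over week, and let a flattening curve signal that support, and onboarding time, can taper off.

```python
# Hypothetical sketch: a worker's assist rate by week as a simple learning curve.
from statistics import mean

def assist_rate_by_week(events, worker_id):
    """Share of a worker's step attempts that needed support, grouped by week."""
    weeks = {}
    for e in events:
        if e.worker_id != worker_id:
            continue
        weeks.setdefault(e.week, []).append(1 if e.assists > 0 else 0)
    return {week: mean(flags) for week, flags in sorted(weeks.items())}
```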
Prescriptive Performance Support
Given that 85% of digital transformations fail, I doubt deployment failure is the cause every time; rather, it is effective implementation at the point of work that ultimately enables adoption. Visibility into performance at the granular, individual level enables the optimization and fine-tuning of prescriptive performance support, which is essential for sustainable transformation. The common denominator, as I see it, is integrating a technological environment, based on purpose-built AI, that can optimize workflow procedures with prescriptive assets at the individual worker level.
A couple of years ago I spoke at a guild conference on the "7 Right Things for Impactful Intentional Design". An infographic was provided and is now downloadable here. That conference was well before I started thinking about the "prescriptive" evolution of performance support; however, all seven things remain valid. The only difference now is that they relate to an individual's moments of need. Note that the key "right thing" is access.
Thanks for reading! As always, I welcome thoughts and comments from any of my readers. Take good care!