The Best Behind The Scenes Design Tips: Should We Use Game Shows In Training?
The following two statements have exactly the same validity for Learning and Development:
"Game shows appeal to all learning styles—they allow visual learners to see the question and surrounding information; auditory learners to hear the question and discuss answers, and kinesthetic learners to ring in, cheer, and participate."
"The proof of the devil is in the details of the pudding eating."
The second statement is mine, and it does not make any sense. The first one is an actual quote, and there are other baffling statements in the same article[1]. For the record, there is no significant body of research evidence showing that matching instruction to someone's "learning style" leads to better learning outcomes. (If you do have evidence, collect your $3,000 here[2].)
Putting learning styles aside, I am sure at some point in your life you've been through a training session where you played a game inspired by Jeopardy, Who Wants to be a Millionaire, Family Feud, or another well-known game show. In this article, we'll explore the design decisions behind creating a "game show" experience for a new-hire course.
Let’s Go Behind The Scenes (Case Study)
Business Problem
In a seven-week instructor-led new hire program, our design challenge was to create a blended, engaging review/assessment activity that could be repeated throughout the program with different topics.
Competition
Game shows are normally competitive. Competition brings drama to the screen. While competition is a common game element to create engagement, the truth is, in the workplace, there’s more cooperation between people than competition. Therefore, we decided our target audience, customer service new hires, should not play against each other.
Our solution was based on the game show Who Wants to Be a Millionaire? with some modification to make it more learning-friendly.
Measurement Of Success
The original game show is all about winning money.
When customer service agents hit the floor, their success will be measured in customer satisfaction, not money. Therefore, we wanted to reflect the same in our design.
Social Element
The original game show is a single-player activity (although one of the lifelines is polling the audience). When you have over 20 people in a classroom, you can't have only one person "playing." Therefore, we designed the game-based assessment to serve one question to each person.
Have you been in a classroom where everyone has one designated question to answer, and they go in order? You know what happens: people pay attention only when their own question is coming up. Otherwise, they just write down the correct answers.
To address this issue, we designed the game to randomly pick the order of participants. At the beginning of the game, the facilitator copy-pastes the names into a window, and the application randomly assigns each question to one of them. This way, nobody knows when they are up. As a minimum viable product, a simple copy-paste of names worked well without any integration.
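The random assignment described above can be sketched in a few lines. This is a minimal illustration, not the game's actual code; it assumes the facilitator pastes names as plain text, one per line.

```python
import random

def assign_questions(pasted_names, num_questions):
    """Randomly pair each question number with a participant,
    so nobody knows in advance when they are up."""
    names = [n.strip() for n in pasted_names.splitlines() if n.strip()]
    order = names.copy()
    random.shuffle(order)
    # Cycle through the shuffled roster if there are more questions than people.
    return [(q, order[q % len(order)]) for q in range(num_questions)]

roster = "Alice\nBob\nCarol"
for question_num, name in assign_questions(roster, 3):
    print(f"Question {question_num + 1}: {name}")
```

Because the names are just pasted text, this needs no integration with any roster system, which is what made it viable as an MVP.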
Heart Of The Design: Model-View-Controller
One of the very first design decisions was to approach the game with a model-view-controller (MVC) architecture in mind. MVC is a common design pattern in software applications. In layman's terms, it separates the information architecture into three parts:
- Model
This is where the data lives. In our case, the questions were in an external XML file.
- View
This is the user interface, what the user interacts with.
- Controller
This is the logic behind what the user can and can't do. Think of it as the code, the engine behind the game show.
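A minimal sketch of the Model side might look like this. The article does not show the real XML schema, so the element names and attributes below are hypothetical; the point is only that question content lives outside the code.

```python
import xml.etree.ElementTree as ET
from io import StringIO

# Hypothetical question file; the actual schema used in the course is not shown.
QUESTIONS_XML = """
<questions>
  <question type="truefalse" answer="true">
    <text>Customer satisfaction is our primary success metric.</text>
  </question>
  <question type="multiplechoice" answer="b">
    <text>Which system records payments?</text>
    <choice id="a">CRM</choice>
    <choice id="b">Billing</choice>
  </question>
</questions>
"""

def load_questions(source):
    """Model: parse questions from XML so content can change
    without touching the Controller or View code."""
    root = ET.parse(source).getroot()
    questions = []
    for q in root.findall("question"):
        questions.append({
            "type": q.get("type"),
            "answer": q.get("answer"),
            "text": q.findtext("text"),
            "choices": {c.get("id"): c.text for c in q.findall("choice")},
        })
    return questions

questions = load_questions(StringIO(QUESTIONS_XML))
```

In production, `source` would be a path to the external XML file rather than an in-memory string.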
Why MVC?
Why choose MVC? By separating the data from the logic and the user interface, we were able to build the game show once and replicate it any number of times afterward just by changing the questions. After one game show was completed, we simply made a copy and edited the XML file with the new questions. It took less than 20 minutes to create a new instance of the game show.
There’s nothing worse than time and resources poured into a game that includes outdated answers. The MVC model allowed us to update the questions any time without breaking the game or republishing any code.
Separating the logic (Controller) allowed us to make changes to the gameplay without the risk of breaking the user interface. For example, we could change whether the questions from the text file were randomized or, in one case, make the timer run slower.
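Those gameplay knobs can be sketched as controller-level settings. The names below are illustrative, not taken from the actual game:

```python
import random
from dataclasses import dataclass

@dataclass
class GameSettings:
    """Controller-level options; changing these never touches the View."""
    randomize_questions: bool = True
    timer_seconds: int = 30  # raise this when the timer needs to run slower

def prepare_round(questions, settings, rng=random):
    """Apply the controller settings to the question list from the Model."""
    ordered = list(questions)
    if settings.randomize_questions:
        rng.shuffle(ordered)
    return ordered
```

Because the View only ever receives the prepared list and the timer value, tweaks like these carry no risk of breaking the interface.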
Designing The Questions
We built the engine to be flexible about the types of questions it could use. Besides traditional True/False and multiple-choice questions, we wanted to make sure actual performance could be assessed as well. The manual assessment function allowed the Facilitator to observe and assess real performance. For example, a new hire entered a payment in the billing system; the Facilitator observed the process (in front of the whole class) and scored it pass or no pass in the game.
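The mix of auto-scored and facilitator-scored items could be handled with a single scoring function. This is a hypothetical helper; the article does not show the game's actual scoring code.

```python
def score_question(question, response=None, facilitator_pass=None):
    """Score one question. Auto-scored types compare the learner's
    response to the stored answer; 'manual' items are graded by the
    Facilitator's live observation."""
    if question["type"] == "manual":
        # The Facilitator watched the task performed (e.g. entering a
        # payment in the billing system) and marked pass or no pass.
        return bool(facilitator_pass)
    return response == question["answer"]

# Auto-scored item vs. observed performance item:
print(score_question({"type": "truefalse", "answer": "true"}, response="true"))
print(score_question({"type": "manual"}, facilitator_pass=True))
```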
You Build It, They Will Facilitate It. Or Not?
"You build it, they will facilitate it." Don't make that mistake! If Facilitators are not comfortable using anything new, the results will be inconsistent at best. We certified every Facilitator on this and the other interactive elements in the new hire program to make sure we would get consistent, valid data after the program launched.
Lesson Learned Tips
- If you include a mini-game or game-based assessment in an ILT, introduce the game to the class first. Play a couple of rounds with them, so they understand the mechanics before you let them do it on their own.
- Be consistent in how and when you use the game in class.
- Always debrief afterward! The learning moments came from discussing failures or confusion after the games.
- We didn't include any sound effects in the game show prototype. What we learned later is that we didn't need them: with good Facilitators and enthusiastic groups, the class made up its own "sound effects," which became part of their routine.
Anything I Would Change Today?
Yes! I would add an xAPI component to the game show. I'd track data across classes using the Context object, including its instructor and team properties. The collected data might give you insights into correlations between topics, questions, individual learners, teams, and their Facilitators.
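To make that concrete, an xAPI statement for one answered question might be assembled like this. The Context object's instructor and team properties are defined in the xAPI specification; the verb and activity IRIs, email addresses, and team name below are placeholders.

```python
import json

def build_statement(learner_email, verb_id, question_id, success,
                    instructor_email, team_name):
    """Sketch of an xAPI statement capturing who answered which
    question, under which Facilitator and team (Context object)."""
    return {
        "actor": {"objectType": "Agent", "mbox": f"mailto:{learner_email}"},
        "verb": {"id": verb_id, "display": {"en-US": "answered"}},
        "object": {"objectType": "Activity", "id": question_id},
        "result": {"success": success},
        "context": {
            # xAPI Context: instructor is an Agent, team is a Group.
            "instructor": {"objectType": "Agent",
                           "mbox": f"mailto:{instructor_email}"},
            "team": {"objectType": "Group", "name": team_name},
        },
    }

stmt = build_statement("pat@example.com",
                       "http://adlnet.gov/expapi/verbs/answered",
                       "https://example.com/gameshow/question/12",
                       True, "facilitator@example.com", "Cohort A")
print(json.dumps(stmt, indent=2))
```

Sending these statements to a Learning Record Store would let you slice results by class, Facilitator, or team across program cohorts.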
As a closing note, for assessing more complex decision-making skills, you should go beyond question-based game show templates. Research shows serious game-based assessments "prove valid alternatives to traditional psychometric measures"[3]. Combine that with AI and machine learning to capture and analyze data points, extend it beyond classroom learning to support continuous development, and there you have it! Now, that's a pudding you want to taste.
Conclusion
Using game show templates in training is not necessarily an ineffective instructional strategy in a blended course. Games can support assessment and review of knowledge and skills. However, the devil is in the details, and the proof of the pudding is in the eating! You won't know until you try it!
References:
1. Using game shows as an instructional tool
2. Learning Styles Challenge — Year Eight — Now at $5,000
3. Playing Around