A Case Study On Evaluating An eLearning Course

Evaluating An eLearning Course: Case Study For Instructional Designers And Project Managers

Continually evaluating and analyzing your data ensures that your content remains relevant and informative.

1. Evaluate Your Course Objectives Before You Start Designing

Surveying your audience before you write your content is critical to making sure you address their learning needs. Interviewing prospective learners and conducting focus groups are 2 ways to gather this information. Design questions to find out:

It’s helpful to have a list of proposed learning objectives that you can review with your audience to see whether they match their learning needs. Be prepared to get widely different opinions that you will need to synthesize into your final learning objectives and course design.

2. Survey Your Learners After the Course Content Is Available

Once you’ve prototyped (or even launched) your course, you can gather more specific feedback about your design by adding a survey at the end. Most learners are willing to give input on how to make the course better. It’s helpful to ask questions about the User Experience and to have learners identify what they felt were the most valuable takeaways. You can also gather input about the content (whether anything should be added or eliminated). Using this information to further refine your course makes it more likely to remain valuable and current.
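
For example, if your survey tool can export responses as a CSV file, a short script can summarize the results for you. The sketch below is a minimal illustration in Python, assuming a hypothetical export named survey_responses.csv with question, rating (1 to 5), and comment columns; adapt the column names and scoring to whatever your survey tool actually produces.

import csv
from collections import defaultdict

# Minimal sketch: summarize end-of-course survey ratings from a hypothetical
# CSV export with columns: question, rating (1-5), comment.
def summarize_survey(path="survey_responses.csv"):
    ratings = defaultdict(list)   # question -> list of numeric ratings
    comments = []                 # free-text suggestions for improving the course

    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            try:
                ratings[row["question"]].append(int(row["rating"]))
            except (KeyError, ValueError):
                continue  # skip malformed rows instead of failing the whole run
            if row.get("comment", "").strip():
                comments.append(row["comment"].strip())

    for question, values in ratings.items():
        average = sum(values) / len(values)
        print(f"{question}: average {average:.1f} from {len(values)} responses")
    print(f"{len(comments)} free-text comments collected for review")

if __name__ == "__main__":
    summarize_survey()

Averages by question point you to the parts of the User Experience learners value most, while the free-text comments feed directly into the next round of content revisions.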

3. Analyze Data from Your Website or LMS

In addition to an evaluation at the end of the course, you can analyze the data provided by your web analytics platform or your LMS. Both allow you to track user activity. You can gather information such as:

This information can often tell you whether your course is too long, too easy, or too difficult. Making adjustments to the content keeps your course targeted and effective.
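
As a concrete illustration, most LMSs and web analytics tools can export activity data as a CSV file. The sketch below is a minimal example in Python, assuming a hypothetical export named lms_activity.csv with learner_id, module, completed, and minutes_spent columns; the actual field names, export format, and sensible thresholds depend entirely on your platform and course.

import csv
from collections import defaultdict

# Minimal sketch: flag modules that may be too long or too difficult, based on a
# hypothetical LMS export with columns: learner_id, module, completed, minutes_spent.
COMPLETION_THRESHOLD = 0.75   # flag modules finished by fewer than 75% of learners
TIME_THRESHOLD_MINUTES = 30   # flag modules with a high average time on task

def analyze_activity(path="lms_activity.csv"):
    completions = defaultdict(list)  # module -> list of 0/1 completion flags
    minutes = defaultdict(list)      # module -> list of minutes spent

    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            module = row["module"]
            completions[module].append(1 if row["completed"].strip().lower() == "yes" else 0)
            minutes[module].append(float(row["minutes_spent"]))

    for module in completions:
        completion_rate = sum(completions[module]) / len(completions[module])
        average_minutes = sum(minutes[module]) / len(minutes[module])
        flags = []
        if completion_rate < COMPLETION_THRESHOLD:
            flags.append("low completion, possibly too difficult or too long")
        if average_minutes > TIME_THRESHOLD_MINUTES:
            flags.append("high time on task, consider splitting the module")
        status = "; ".join(flags) if flags else "no issues flagged"
        print(f"{module}: {completion_rate:.0%} completed, {average_minutes:.0f} min average ({status})")

if __name__ == "__main__":
    analyze_activity()

Even a simple report like this makes it obvious which modules deserve a closer look before the next content revision.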

Case Study

Our client, the California Victim Compensation Board (CalVCB), wanted to make sure they created eLearning courses that addressed the needs of their audience. Using data collected from learners, we were able to refine the courses to address feedback gathered before, during, and after course completion.

The Scenario

CalVCB provides compensation for victims of violent crime who are injured or threatened with injury. Among the crimes covered are domestic violence, child abuse, sexual and physical assault, homicide, robbery, and vehicular manslaughter. CalVCB wanted to create 5 online courses to increase 1) awareness of the program among advocates and 2) access to compensation benefits for victims.

Solution

CalVCB already had a strong culture of sensitivity toward their audience. Together, we identified the importance of relying on user input and data (not just conventional thinking) as the foundation for our work. Our approach included gathering user information and data throughout the entire process of product development and iteration. We did this in 3 ways: focus groups, usage analytics, and user feedback.

Key Benefits

Here are more details about each piece of the three-pronged approach.

Focus Groups

We took a user-centered approach that included extensive dialog with target audience members to gather input on course Instructional Design and style. Here are some of the key findings based on these focus group conversations:

Usage Analytics

Because we hosted the courses, we were able to collect certain learner data. Here are some of the findings of our data tracking:

User Feedback

User evaluations helped validate what was working and identify areas for further improvement in the courses. We added evaluation questions at the end of each course so we could gather learner feedback. Here are the results of the initial course evaluations:

By taking learner feedback and data into account throughout the design and implementation process, we were able to create courses that were optimized for the specific needs of the CalVCB audience.
