Expanding eLearning Participation: Case Study From A SaaS Platform

Summary: What are the three Instructional Design rules, and which metrics signal success? What works when students have complete control and engage only where they want to? This article shares findings from a recent case study of a SaaS platform for graduate education, discussing both what works for eLearning participation and how to measure it.

How To Expand eLearning Participation  

Instructional Design is both an art and a science, in that we need to be both creative and data driven. We have found that creating without a clear data stream of what students want and expect leads to less-than-stellar engagement. This is a quick overview of some recent case study results, shared with the aim of inspiring new designs in graduate education.

Who are we? DoctoralNet’s SaaS technology-enhanced learning platform aims to fill the gaps created when graduate students moved to a 24/7, off-campus, online lifestyle. Our purpose is to fill the gap between what universities can offer the distance student and what the Ivy Leagues supply as a matter of course: structured thesis guidance, regular academic socialization, and a sense of community, this time delivered through technology. What makes our business hard is that, for the most part, the students subscribed to our “training” come of their own free will.

This article answers: What are the lessons for Instructional Design when you cannot “make” the recipient do anything?

Our 3 “What Works” Rules

Three rules guide our development work. Over long strategic conversations, poring over data, we have come to realize that eLearning participation rates hinge on hitting the sweet spot at the middle of these three “rules”:

  1. Be Interesting Or Innovative.
    Innovation and fun attract. The fresh idea wins support. This leaves us always watching the sports, fitness, and self-help worlds for ideas we can incorporate. The winners here for us are the 365 emails and the 30-day writing challenge (discussed in detail in the next section).
  2. Be Useful.
    The first cadre of services we developed was aimed at usefulness, and they still provide the backbone of our offering. The three that top our list are: a) maps through the complicated thesis process, b) automations©, interactive technology that mimics the most common mentor conversations, and c) milestones, which give students a manageable accountability system through which to mark progress.
  3. Be Available.
    We realize that online events are our friends – they allow our professors to have the interaction they crave, while the technology gives students a middle ground through which to move from their campus experience to our online portal. It is tough to be in all places at all times, because students meet up across such a wide range of online spaces. We have our hands in live webinars, asynchronous communities, Twitter, Facebook, and LinkedIn.

How We Measure: The Metrics That Work 

Finding quick and easy measurements has not been an easy road. It is sobering to recall how many measurements left us no wiser after discussing the data than we were before. Just recently we settled on two metrics that consistently help research and development.

Opt-in rates tell us whether the overall instructional design model has appeal. Surprisingly, the two recent winners are: 1) the 365 emails and 2) a 30-day writing challenge. The 365 “push notification” emails act in place of on-campus casual discussions, enlightening and enlivening the graduate journey as they motivate people toward persistence and success. We know they are successful because they enjoy an almost 50% open rate. For the 30 Day Doctoral Writing Challenge, we used five-minute videos, sent in daily emails, as the instructional format. Hundreds signed up to participate, and they continue to do so; 40% finished the challenge.
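
As a rough illustration of how two such opt-in numbers could be computed, the sketch below derives an open rate and a completion rate from raw counts. The function names and figures are hypothetical, chosen only to mirror the percentages above; this is not DoctoralNet’s actual analytics code.

    # Minimal sketch of the two opt-in metrics discussed above (Python).
    # All function names and figures here are illustrative assumptions.

    def open_rate(delivered: int, opened: int) -> float:
        """Share of delivered emails that were opened (0-1)."""
        return opened / delivered if delivered else 0.0

    def completion_rate(signed_up: int, finished: int) -> float:
        """Share of challenge sign-ups who finished all 30 days (0-1)."""
        return finished / signed_up if signed_up else 0.0

    # Hypothetical figures in the spirit of the case study.
    print(f"365 email open rate:          {open_rate(10_000, 4_900):.0%}")      # ~49%
    print(f"Writing challenge completion: {completion_rate(500, 200):.0%}")     # 40%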

The tougher metric, both to pin down and to win through to, is ongoing eLearning participation over time. Our subscribers also have their campuses and the rest of their academic lives as part of their support systems. This means our training comes into play when they are confused, in need of extra support, at such a distance from campus that they use us as a stand-in, or have fallen off their university’s radar and don’t want to come back without results. Capturing usage data is not easy when the user comes and goes. Still, we are beginning to see metrics similar to what other SaaS solutions report: 1) catch them quickly and you have a chance to really hook their ongoing participation, or 2) when they don’t give you serious attention at the beginning of the relationship, keep inbound messages in their space to catch them later. Obviously the latter requires much more work.
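
To make the “catch them quickly versus catch them later” pattern concrete, here is one way such a split could be computed from session timestamps. Everything in this sketch (the field names, the sample data, and the seven-day “early” threshold) is a hypothetical assumption for illustration, not a description of our platform’s actual tracking.

    # Illustrative sketch: group subscribers by how quickly they first engaged,
    # then compare their average ongoing participation (Python).
    from datetime import datetime, timedelta
    from statistics import mean

    EARLY_WINDOW = timedelta(days=7)  # assumed "caught quickly" threshold

    def sessions_per_month(signup: datetime, sessions: list) -> float:
        """Average sessions per 30-day period since signup."""
        months = max((max(sessions) - signup) / timedelta(days=30), 1.0)
        return len(sessions) / months

    def split_by_first_engagement(users: dict) -> dict:
        """Bucket users by whether their first session fell inside EARLY_WINDOW."""
        groups = {"early": [], "late": []}
        for record in users.values():
            first = min(record["sessions"])
            key = "early" if first - record["signup"] <= EARLY_WINDOW else "late"
            groups[key].append(sessions_per_month(record["signup"], record["sessions"]))
        return groups

    # Two hypothetical subscribers.
    users = {
        "u1": {"signup": datetime(2015, 1, 1),
               "sessions": [datetime(2015, 1, 3), datetime(2015, 2, 10), datetime(2015, 3, 5)]},
        "u2": {"signup": datetime(2015, 1, 1),
               "sessions": [datetime(2015, 4, 20), datetime(2015, 5, 2)]},
    }
    for group, rates in split_by_first_engagement(users).items():
        print(group, f"{mean(rates):.2f} sessions/month" if rates else "no users")

Splitting subscribers this way lets us compare ongoing participation for early versus late engagers, which is exactly the comparison the two patterns above describe.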

In Conclusion 

Teaching is, and we hope always will be, a personal experience. Most students lurk online before they show up to engage. Still, we realize that in classrooms they are quiet before they raise their hands, or have that private conversation with their teachers after class. While the format has changed, there are ties across learning environments, even in the SaaS world. Like many, we are just learning the translation points.

Honoring the user’s choice to engage or not requires the best of educators. In our experience, it requires more evaluation, more surveys of needs and desires, and more watching of engagement than the classroom ever did.

Originally published on September 25, 2015