Learning Analytics & Instructional Design

If you're an Instructional Designer (ID), or anyone even loosely connected to the L&D industry, it's been just about impossible to avoid mention of "learning analytics".

You're never too far from someone bringing it up during discussions of learning design, but usually without any explicit explanation of what learning analytics actually is. Although there's a learning curve when it comes to implementing learning analytics, understanding it is actually quite simple. So simple, in fact, that we'll demystify it within the span of a single article.



It's All About that Database 

Despite a fancy name like “learning analytics”, what it’s all really about is data.

In this case, it’s any and all data that captures the interactions between your learners and the learning systems.

The idea is to turn that data into insights that inform better course design and, ultimately, support learner success.

Of course, drop-out rates and other crude data captures have been used to flag problem areas for as long as we’ve had student registrations.

What's new today isn't the analysis itself.

It's the sheer volume of learning data. Everything a student does is recordable and measurable. It's a treasure trove of potential insights.

How to Mine for Gold 

Raw data is about as useful as a ton of dirt with an ounce of gold in it, so we need to refine it. This is where you either have to get a bona fide data scientist on the job, or you can turn to a number of tried-and-tested tools made for learning analytics. Two relevant tools to highlight are xAPI and the LRS.

xAPI and the LRS

SCORM has been holding the e-learning standards banner for years, but in 2011 the custodians of SCORM realized things had to change. xAPI is the result of that realization, and it brings the foundation that SCORM built into the modern age.

How is it different from SCORM?

First of all, instead of leaving it to you to build data collection frameworks from the ground up, xAPI provides a consistent methodology for collecting learning data. The name xAPI comes in two parts: "API" stands for Application Programming Interface, a standard way for pieces of software to communicate, and the "x" stands for "Experience", so called because the format is designed to capture the actions of learners.
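
To make that concrete, here is a rough sketch of what a single xAPI statement looks like: an "actor, verb, object" record of one learning experience. The learner name, email address and course URL below are invented for illustration; only the verb IRI is a standard one published by ADL.

```typescript
// Minimal sketch of a single xAPI statement: an "actor - verb - object"
// record of one learning experience. Learner and course IDs are invented.
interface XapiStatement {
  actor: { name: string; mbox: string };
  verb: { id: string; display: Record<string, string> };
  object: { id: string; definition?: { name: Record<string, string> } };
  timestamp?: string;
}

const completedModule: XapiStatement = {
  actor: { name: "Ada Learner", mbox: "mailto:ada.learner@example.com" },
  verb: {
    id: "http://adlnet.gov/expapi/verbs/completed", // a standard ADL verb
    display: { "en-US": "completed" },
  },
  object: {
    id: "https://example.com/courses/onboarding/module-3",
    definition: { name: { "en-US": "Onboarding - Module 3" } },
  },
  timestamp: new Date().toISOString(),
};
```

Because every statement follows this same basic shape, any xAPI-aware tool can make sense of the data that any other xAPI-aware tool produces.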

The LRS, or Learning Record Store, is where all of this collected data lives, giving you one standard store of learning records that can be shared across systems and organizations.
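
To sketch how that hand-off typically works (reusing the XapiStatement shape from the example above), a statement is POSTed to the LRS's statements endpoint over HTTP. The endpoint URL and credentials here are placeholders; your LRS will supply the real ones.

```typescript
// Rough sketch: pushing a statement to an LRS over the standard xAPI REST
// interface. The endpoint and credentials are placeholders.
async function sendToLrs(statement: XapiStatement): Promise<void> {
  const response = await fetch("https://lrs.example.com/xapi/statements", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-Experience-API-Version": "1.0.3",
      // Many LRSs accept HTTP Basic auth; this key/secret pair is invented.
      Authorization: "Basic " + btoa("lrs-key:lrs-secret"),
    },
    body: JSON.stringify(statement),
  });
  if (!response.ok) {
    throw new Error(`LRS rejected the statement: ${response.status}`);
  }
}
```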

xAPI and the LRS are both far more advanced and modern than SCORM, especially when it comes to learning on multiple devices such as smartphones and tablets, which weren't even a consideration when SCORM first saw the light of day.


What Can I Do With xAPI?

So what can you actually do with xAPI? It provides the perfect launching pad to get the analytics ball rolling.
In general, there are three clear applications for the technology: analyzing the past, monitoring the present, and predicting the future.


First, the past: when you have a wealth of historical learning data, you can use it to figure out where the failure points have been for previous learners.

Is there a common killer in your course?

Do learners that tend to fail have something in common?

Second, the present: for learners who are currently working through the course, you can keep a close eye on how they are faring. If you have well-defined "red flag" conditions, you can detect all sorts of important situations as they develop.

Is there a lack of engagement?

Are learners not spending enough time on something? 
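
As a hedged illustration of what such a "red flag" check could look like, the function below counts each enrolled learner's recent statements pulled from the LRS and flags anyone falling under an arbitrary weekly activity threshold. The threshold and record shape are assumptions, not anything prescribed by xAPI.

```typescript
// Illustrative "red flag" check: flag learners with little recorded activity
// in the last 7 days. The threshold and record shape are assumptions.
type ActivityRecord = { actor: { mbox: string }; timestamp?: string };

const MIN_STATEMENTS_PER_WEEK = 5; // arbitrary engagement threshold

function flagDisengagedLearners(
  roster: string[],             // "mailto:" identifiers of enrolled learners
  statements: ActivityRecord[], // statements retrieved from the LRS
): string[] {
  const weekAgo = Date.now() - 7 * 24 * 60 * 60 * 1000;
  const counts = new Map<string, number>(
    roster.map((mbox): [string, number] => [mbox, 0]),
  );

  for (const s of statements) {
    if (!s.timestamp || Date.parse(s.timestamp) < weekAgo) continue;
    if (counts.has(s.actor.mbox)) {
      counts.set(s.actor.mbox, (counts.get(s.actor.mbox) ?? 0) + 1);
    }
  }

  // Anyone below the threshold, including learners with no recent activity
  // at all, gets surfaced for a closer look.
  return [...counts.entries()]
    .filter(([, count]) => count < MIN_STATEMENTS_PER_WEEK)
    .map(([mbox]) => mbox);
}
```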

Third, the future: it's possible to construct predictive models from historical and current learning data, which can help you identify at-risk groups very early in the learning process. This is a cutting-edge application of analytics systems built on technologies such as xAPI, and some care has to be taken with its interpretation and application. In the right hands, though, it's one of the strongest demonstrations of what the technology can do.
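
A full predictive model is beyond the scope of this article, but as a rough sketch, the usual first step is to turn each learner's statement history into a handful of features that a model (or a data scientist) can work with. The features chosen here are illustrative only.

```typescript
// Sketch of feature extraction for an at-risk model. Which features matter,
// and how they are weighted, is something a data scientist would validate.
type ScoredStatement = {
  actor: { mbox: string };
  result?: { score?: { scaled?: number } }; // xAPI scaled score, -1..1
  timestamp?: string;
};

interface LearnerFeatures {
  statementCount: number;
  averageScore: number | null;
  daysSinceLastActivity: number | null;
}

function extractFeatures(statements: ScoredStatement[]): LearnerFeatures {
  const scores = statements
    .map((s) => s.result?.score?.scaled)
    .filter((x): x is number => typeof x === "number");

  const timestamps = statements
    .map((s) => (s.timestamp ? Date.parse(s.timestamp) : NaN))
    .filter((t) => !Number.isNaN(t));

  return {
    statementCount: statements.length,
    averageScore: scores.length
      ? scores.reduce((a, b) => a + b, 0) / scores.length
      : null,
    daysSinceLastActivity: timestamps.length
      ? Math.floor((Date.now() - Math.max(...timestamps)) / 86_400_000)
      : null,
  };
}
```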

The Nuts and Bolts

The overall process of setting up and running learning analytics can be summed up in a few simple steps.

You need to know what data you want and then plan how you’re going to get it. For example, will you get it from a mobile application? Social media?

As mentioned earlier, xAPI has much of this baked in already. Then you need to store that data somewhere, in a format that's universally usable. Luckily, the LRS we mentioned before takes care of that problem too. So far, so good!

You can't just throw analyses at raw data; it needs to be cleaned and prepared first. Some of the information will turn out to be useless, and in some cases the wrong information will have been captured.
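
As an illustrative sketch of that cleaning step, the filter below drops statements missing the fields a later analysis would rely on and removes duplicates by statement id. The exact rules will always depend on your own data.

```typescript
// Illustrative cleaning pass: drop statements missing key fields and remove
// duplicates by statement id. Real cleaning rules depend on your own data.
type RawStatement = {
  id?: string;
  actor?: { mbox?: string };
  verb?: { id?: string };
  object?: { id?: string };
  timestamp?: string;
};

function cleanStatements(raw: RawStatement[]): RawStatement[] {
  const seen = new Set<string>();

  return raw.filter((s) => {
    // Discard records that can't be attributed or interpreted.
    if (!s.actor?.mbox || !s.verb?.id || !s.object?.id || !s.timestamp) {
      return false;
    }
    // Discard exact duplicates (same statement id seen twice).
    if (s.id) {
      if (seen.has(s.id)) return false;
      seen.add(s.id);
    }
    return true;
  });
}
```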

Once the data is clean, you can apply the relevant analytical method: building a model to explain failure, say, or building profiles of your learner population.
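
To give a flavor of the analysis step, here is a deliberately simple descriptive example: completion rates per activity, computed straight from cleaned statements. A failure model or learner profiling would build on the same kind of aggregation.

```typescript
// Simple descriptive analysis: completion rate per activity, assuming one
// "completed" statement per learner per activity. The verb IRI is the ADL one.
const COMPLETED = "http://adlnet.gov/expapi/verbs/completed";

function completionRates(
  statements: {
    actor: { mbox: string };
    verb: { id: string };
    object: { id: string };
  }[],
  enrolledCount: number,
): Map<string, number> {
  const completions = new Map<string, Set<string>>();

  for (const s of statements) {
    if (s.verb.id !== COMPLETED) continue;
    const learners = completions.get(s.object.id) ?? new Set<string>();
    learners.add(s.actor.mbox);
    completions.set(s.object.id, learners);
  }

  // Fraction of enrolled learners who completed each activity.
  return new Map(
    [...completions.entries()].map(([activity, learners]): [string, number] => [
      activity,
      learners.size / enrolledCount,
    ]),
  );
}
```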

Once you have the results of your analysis, you can build the lessons you've learnt back into the revised design and start the whole process over again, making it more accurate and useful with each pass!

For a more detailed look at the xAPI standard, check out Learning Locker's resource 'What is xAPI'.


A Bright Future for Designers

Instructional designers are absolutely essential when it comes to interpreting analytics data once it has been collected.

They should insist on being included throughout the entire process, especially since each iteration's results will feed into subsequent redesigns.

If IDs do not weave themselves into the learning analytics movement, their role in the process may diminish radically.

It's a point that learning designer Melissa Milloway echoed in our interview, published here.

Fortunately, designers can harness analytics while staying close to their learning theory origins, as is the case with Kirkpatrick’s Model of Evaluation. 

Kirkpatrick & xAPI 

We've previously written about Kirkpatrick's evaluation model in instructional design here.

But as popular as this model of evaluation has become, implementing its four levels (1: Reaction, 2: Learning, 3: Behavior and 4: Results) has been nearly impossible with manual means alone.

With the arrival of xAPI and learning analytics, designers now have far greater insight into levels 3 and 4 of the model. The old challenge was: how do we know whether learning has stuck if we can't reliably measure it?

xAPI provides a tighter evaluation framework that, with the right design, can gather that data far more effectively. One-on-one observations and coaching checklists, previously unmeasurable, can now provide the fine-grained data needed to evaluate learning transfer.
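
As a hedged example, a coaching observation could be captured as an xAPI statement along these lines, with the observer recorded in the statement's context. The verb IRI and checklist URL are invented for illustration; a real project would pick verbs from an agreed registry.

```typescript
// Illustrative Level 3 (Behavior) statement: a coach records that a learner
// demonstrated a skill on the job. The verb IRI and checklist URL are
// invented; the result and context fields are standard xAPI properties.
const observation = {
  actor: { name: "Ada Learner", mbox: "mailto:ada.learner@example.com" },
  verb: {
    id: "https://example.com/xapi/verbs/demonstrated",
    display: { "en-US": "demonstrated" },
  },
  object: {
    id: "https://example.com/checklists/customer-call-handling/item-4",
    definition: { name: { "en-US": "Handles objections calmly" } },
  },
  result: { success: true },
  context: {
    instructor: { name: "Coach Okafor", mbox: "mailto:coach@example.com" },
  },
  timestamp: new Date().toISOString(),
};
```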

Better data, and better filtering of that data, say with an LRS, means more meaningful results - and thus the means to prove whether learning has stuck, fully realizing Kirkpatrick's Model.

Find an example of Kirkpatrick's Model and xAPI working in perfect harmony here.


