It’s been said that data is the “new oil”. Today, more than half of the world’s top 10 companies are data-driven platforms such as Google, Facebook and Tencent, which shows just how highly the market values data. In fact, we’ve become such a data-driven generation that in 2020 we created 2.5 quintillion (a quintillion is 10¹⁸) bytes of data every single day! This massive volume of data, however, will not prove useful unless it’s analysed. Think of a business: you have many functions, such as marketing and manufacturing, that are all churning out data.

The ‘Learning’ function within an organisation is no different: it generates a lot of data too. ‘Learning Analytics’ is a tool that helps you collect and store that learning data in order to generate meaningful insights; insights that can be used for decision making and for planning the next steps in individual and team learning and development. A research project conducted by the University of Edinburgh (2016-18) showed how learning analytics provides accurate and relevant ways to assess critical issues such as learner experience and retention, to identify indicators of current skills acquisition, and to support personalised and adaptive learning. These kinds of insights are being applied successfully in both academic and corporate settings.


What is Learning Analytics?

According to SoLAR (the Society for Learning Analytics Research), Learning Analytics can be defined as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.”

This has been hard to do with traditional classroom teaching methods, but it has gained impetus with the rise of online learning. Because elearning makes data collection and analysis much easier, learning analytics becomes far more effective at assessing learning outcomes. Big data and learning analytics can help you capture powerful feedback on learning experiences, feedback that translates into learning solutions that are more engaging and effective. Learning analytics has the potential to drive a cycle of continuous improvement that transforms the way you design, develop and deliver learning solutions.

Learning analytics opens up a world of opportunities for the L&D professional. It helps determine what happened during an elearning session, why it happened and what patterns recur. But what kind of data do you need in order to draw these inferences and make decisions that improve learning?


What kind of data should my learning analytics tool be picking up?

There are three key categories of data you should be monitoring to make effective decisions. These are:

1. User Data

User data is data about the users, or ‘learners’, enrolled in your organisation’s elearning. It describes the learner in the context of the organisation: for example, the user’s role, location (in the case of a multi-regional organisation), division (sales, operations, finance, human resources and so on) and level of seniority (junior, mid-level, executive).

Collecting user data lets you segment learners as part of the analysis, rather than keeping all the information under a single umbrella. When data is ungrouped, the insights that matter to each group become indistinct because information from different groups overlaps; grouped data brings to light the trends and patterns that are relevant to each user group.

How is user data useful? The learning experiences of users who share a role, department, division or geography can be studied to determine which elements of the learning programme are falling short.
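
To make this concrete, here is a minimal sketch of that kind of segmentation, assuming your learning platform can export per-learner records. The field names (division, level, completed, score) and the use of pandas are illustrative assumptions, not features of any particular tool.

    import pandas as pd

    # Hypothetical export of user data joined with course results.
    # Column names are illustrative, not tied to any particular platform.
    records = pd.DataFrame([
        {"learner": "a01", "division": "sales",   "level": "junior",    "completed": True,  "score": 72},
        {"learner": "a02", "division": "sales",   "level": "mid-level", "completed": True,  "score": 85},
        {"learner": "a03", "division": "finance", "level": "junior",    "completed": False, "score": None},
        {"learner": "a04", "division": "finance", "level": "executive", "completed": True,  "score": 91},
    ])

    # Completion rate and average score per division, rather than one
    # figure for the whole organisation.
    by_division = records.groupby("division").agg(
        learners=("learner", "nunique"),
        completion_rate=("completed", "mean"),
        average_score=("score", "mean"),
    )
    print(by_division)

Grouping by level or location instead is a one-line change, which is exactly why capturing user data up front pays off.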


2. Engagement Data

Engagement data comes from all the elements that reflect how a learner interacts with the content. This means examining aspects such as:

  • How long the learner has the module open

  • How many unique users started a course

  • How many unique users completed a course

  • How many users visited a course more than once


Some practical implications of engagement data are listed below, followed by a short sketch of how the underlying metrics might be computed:

  • Did learners complete the course within the allocated time? If most learners take longer, the course may not be very engaging; if most take less time, the course may cover material that the majority of learners are already familiar with.

  • Did learners open media files where they were supposed to, or did they skip them altogether?

  • Do learners stop certain media files before they finish? This may indicate that the content is too long or not useful.

  • Do certain design elements engage learners better than others? What mix of media is ideal for the type of content being taught? For example: text + video + multiple-choice quiz; or video training + simulated scenario-based testing; or video training + pop quiz via mobile learning + social media learning bites; or other media mixes.
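
As a minimal sketch of how those engagement counts might be derived, assuming you can export raw tracking events from your LMS (the event format and verb names below are hypothetical, loosely modelled on xAPI-style statements):

    from collections import defaultdict
    from datetime import datetime

    # Hypothetical tracking events exported from an LMS.
    events = [
        {"user": "a01", "course": "gdpr-101", "verb": "started",   "time": "2023-05-02T09:00"},
        {"user": "a01", "course": "gdpr-101", "verb": "completed", "time": "2023-05-02T09:34"},
        {"user": "a02", "course": "gdpr-101", "verb": "started",   "time": "2023-05-02T10:10"},
        {"user": "a02", "course": "gdpr-101", "verb": "started",   "time": "2023-05-03T08:05"},
    ]

    started = {e["user"] for e in events if e["verb"] == "started"}
    completed = {e["user"] for e in events if e["verb"] == "completed"}

    # Users who opened the course more than once.
    visits = defaultdict(int)
    for e in events:
        if e["verb"] == "started":
            visits[e["user"]] += 1
    repeat_visitors = {u for u, n in visits.items() if n > 1}

    print(f"unique starters:   {len(started)}")
    print(f"unique completers: {len(completed)}")
    print(f"repeat visitors:   {len(repeat_visitors)}")
    print(f"completion rate:   {len(completed) / len(started):.0%}")

    # Time from first opening the course to completing it, per completer.
    fmt = "%Y-%m-%dT%H:%M"
    for user in completed:
        times = sorted(datetime.strptime(e["time"], fmt) for e in events if e["user"] == user)
        print(f"{user} took {times[-1] - times[0]} from first start to completion")

Unusually long or short completion times, or a large gap between starters and completers, are exactly the signals the questions above are probing.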


3. Performance Data

Performance data characterises how well learners recalled or applied the content; in other words, it shows how the learning impacts performance.

Performance data can be used to analyse whether the instructional design was solid, whether the course authors accurately understood the needs of the audience, and whether the content was written in an easy-to-understand and memorable way. The L&D professional can use this data both to assess the effectiveness of the solution and as a baseline against which to compare actual performance in the field.


Here are some examples of the kinds of performance data that help a learning analytics tool provide meaningful information (a short sketch of how some of these could be summarised follows the list):

  • Was a question answered correctly on the first try?

  • Results on “apply” questions, such as scenarios, indicate whether new skills are likely to transfer to the job.

  • Results on a “recall” question provide insight into whether content is presented clearly and in a memorable way.

  • Learners’ confidence at the beginning of the course provides a baseline against which its effectiveness can be judged.

  • Learners’ confidence at the end of the course reflects the quality and accuracy of the content that was conveyed.

  • Was the content relevant to the role?
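
Here is a minimal sketch of how a couple of these could be summarised, assuming you can export per-attempt question results and before/after confidence ratings; the record format is hypothetical.

    # Hypothetical assessment results: one record per answer attempt.
    attempts = [
        {"user": "a01", "question": "q1-recall", "attempt": 1, "correct": True},
        {"user": "a02", "question": "q1-recall", "attempt": 1, "correct": False},
        {"user": "a02", "question": "q1-recall", "attempt": 2, "correct": True},
        {"user": "a01", "question": "q2-apply",  "attempt": 1, "correct": False},
        {"user": "a02", "question": "q2-apply",  "attempt": 1, "correct": True},
    ]

    # Self-reported confidence (1-5) captured before and after the course.
    confidence = {"a01": (2, 4), "a02": (3, 3)}

    # Share of learners answering each question correctly on the first try.
    for q in sorted({a["question"] for a in attempts}):
        first_tries = [a for a in attempts if a["question"] == q and a["attempt"] == 1]
        rate = sum(a["correct"] for a in first_tries) / len(first_tries)
        print(f"{q}: {rate:.0%} correct on first try")

    # Average shift between beginning and ending confidence.
    shifts = [after - before for before, after in confidence.values()]
    print(f"average confidence shift: {sum(shifts) / len(shifts):+.1f}")

A low first-try rate on “recall” questions points at how the content is presented; a low rate on “apply” questions points at whether the skill will transfer to the job.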


Learning Analytics at the helm of L&D

Accurate and relevant data collection is an important part of learning analytics. An effective learning analytics tool makes it possible to collect data on the development, participation and outcomes of learning. Using that data to measure the effectiveness of the learning, gauge its business impact and provide actionable information to improve it enables the L&D department to meet stakeholders’ expectations as well as its own goals and objectives.
