
For many years, corporate learning has been delivered in the same way: classroom training and eLearning courses. These traditional methods don’t deliver valuable data on learning. In classroom training, learning is often measured with a survey that gathers participants’ feedback on and satisfaction with the instructor or the course. An LMS measures only participation, time spent on a course, and learners’ progress through a module, and its reports become a record of who completed what. What’s missing in corporate learning today is the ability to measure the different components of a learning and development program at a much more granular level. Below are three areas that must be part of an L&D analytics and measurement model to benefit the learner, instructional designer, program manager, frontline manager, and senior management.

1) Content: Is It the Right Content, Is It Effective, and How Do We Know?

Karl Kapp, author of The Gamification of Learning and Instruction, and Robyn Defelice found that one hour of learning material takes, on average, 90 to 240 hours to create. Another study, by the Chapman Alliance, found that an hour of eLearning costs approximately $10,000 to produce. There’s no doubt that developing quality content takes a lot of time and money, and if instructional designers don’t have the data to justify those costs, they’re wasting time and money and potentially putting their jobs at risk, because they can’t get buy-in from senior management on the ROI of the work they’re producing. Too many instructional designers can’t prove that the content they’re writing is effective, or even that the right content is reaching the right audience.

As instructional designers build microlearning content libraries, capturing historical data at a granular level is valuable for determining the following (a rough sketch of how these metrics might be computed appears after the list):

  • Proficiency Increase: Which authors’ microlearning courses demonstrated measurable increases in proficiency?
    Identifying courses with poor results lets teams stop enrolling new learners in them, or adjust those courses to improve their quality.
  • Engagement: Which topics and individual questions saw the highest learner engagement and which caused learners to disengage?
    Uncovering the level of engagement clearly shows which topics are of interest, which are too complicated, and, most importantly, the topic areas where learners are not engaging and therefore not learning the skills they need for the job.
  • Relevance: Is the content relevant to the skills that the learner needs most to perform on the job?
    Measuring relevance helps uncover where there may be opportunities to reuse content to further build additional skills.
  • Quality: Did the content lead to changes in behavior, and was it widely understood?
    Find out whether questions are too hard or too easy, whether additional clarity is needed (e.g., if people consistently answer a question incorrectly over time), and fine-tune content authoring and investments accordingly.
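
As a rough sketch of how these metrics could be derived, under assumed, hypothetical data shapes (this is not Qstream’s implementation or API), the example below rolls question-level response records up into a per-course proficiency lift and an answers-per-learner engagement figure.

```python
from collections import defaultdict
from dataclasses import dataclass
from datetime import datetime

# Hypothetical record shape: one row per question answered by a learner.
@dataclass
class Response:
    course_id: str
    learner_id: str
    correct: bool
    answered_at: datetime

def course_metrics(responses: list[Response]) -> dict[str, dict[str, float]]:
    """Roll question-level responses up into per-course metrics:
    - proficiency lift: each learner's accuracy on later answers minus
      accuracy on earlier answers (a crude proxy for improvement)
    - engagement: average number of answers submitted per learner
    """
    by_course = defaultdict(list)
    for r in responses:
        by_course[r.course_id].append(r)

    metrics = {}
    for course_id, rows in by_course.items():
        by_learner = defaultdict(list)
        for r in rows:
            by_learner[r.learner_id].append(r)

        lifts, answer_counts = [], []
        for answers in by_learner.values():
            answers.sort(key=lambda r: r.answered_at)
            answer_counts.append(len(answers))
            if len(answers) < 2:
                continue  # need both an "early" and a "late" answer
            half = len(answers) // 2
            early = sum(r.correct for r in answers[:half]) / half
            late = sum(r.correct for r in answers[half:]) / (len(answers) - half)
            lifts.append(late - early)

        metrics[course_id] = {
            "proficiency_lift": sum(lifts) / len(lifts) if lifts else 0.0,
            "answers_per_learner": sum(answer_counts) / len(answer_counts),
        }
    return metrics
```

In practice these figures would come from the platform’s reporting exports; the split into “early” and “late” answers here is just a simple stand-in for a proper before-and-after proficiency measure.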


Creating rich content libraries is particularly beneficial for large organizations, including life sciences companies. In large pharmaceutical and medical device companies, for example, sales reps in different countries sell the same products. Because these industries tend to be siloed, marketing departments risk producing duplicative content for the same product, and the risk is the same for the training department’s curriculum. When marketing and training departments have access to the same content libraries and data, sales reps get a consistent learner experience and exposure to the best quality content. The result is sales reps communicating accurate information to physicians about products and how those products meet patients’ medical needs.

2) Engagement: Competing for Screen Views in a Digitally Distracted World

Measuring learner engagement shows who is invested in training and, later, what impact that engagement had on their performance. For learning to happen in the first place, both employees and their managers must be engaged in the learning program. Detailed data on how frequently learners interact with a microlearning platform signals their commitment to improving their professional development, meeting their goals, and contributing to business performance. Corresponding analytics visualize how managers are engaging with their teams through coaching tied to specific information. For managers who aren’t engaging with their team in the learning process or providing coaching support, it’s easy to raise the question of whether there is a problem on that team or in that region. Engagement at all levels is displayed in real time so business leaders can take action and address the situation before it becomes a risk to the business.

Having easily accessible, real-time data is particularly important in large organizations such as life sciences companies, given how commercial teams operate. Microlearning lets sales reps engage in learning activities on their mobile devices between visits, making learning easily accessible and more frequent. Sales managers don’t have to wait until they are on the road with a rep to provide coaching support when they can coach using real-time data. If managers see that their reps aren’t participating in learning activities, they can prompt them to start engaging with the learning content, as sketched below.
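
As one illustration (a hypothetical sketch, not a Qstream feature or API), the snippet below flags the reps on a team who have had no learning activity within a configurable window, so a manager knows whom to follow up with.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical inputs: the reps on a team and the timestamp of each rep's
# most recent answer in the microlearning platform (missing if no activity).
def flag_disengaged_reps(team: list[str],
                         last_activity: dict[str, datetime],
                         inactive_days: int = 14,
                         now: Optional[datetime] = None) -> list[str]:
    """Return reps whose most recent learning activity is older than
    `inactive_days`, or who have no recorded activity at all."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=inactive_days)
    return [rep for rep in team
            if rep not in last_activity or last_activity[rep] < cutoff]

# Example: r2 has been inactive for three weeks, r3 has never answered anything.
activity = {"r1": datetime.now(timezone.utc) - timedelta(days=2),
            "r2": datetime.now(timezone.utc) - timedelta(days=21)}
print(flag_disengaged_reps(["r1", "r2", "r3"], activity))  # ['r2', 'r3']
```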

3) Proficiency: Going from “Did Someone Attend” to “Did They Understand”

Proficiency helps learning leaders understand how capable each employee is relative to the skills expected in the role. Delivering learning as microlearning assessments identifies employees’ knowledge and skill gaps, and following up with consistent reinforcement through spaced learning helps people recall information over time. All of the data on each individual’s responses to questions is gathered and can be displayed at the level of a specific question or aggregated for a higher-level view by team, region, or the organization as a whole.

Data on individuals’ responses to specific questions makes visible the skills gaps that exist on a team, an indicator that people may be behaving incorrectly on the job. Comprehensive data shows managers the exact coaching actions needed to close a performance gap. With access to proficiency data, further analysis can combine learning data with financial, HCM, and CRM data. For example, pharmaceutical companies can measure proficiency on the product information for a specific drug and relate it to how much each rep sold in that product line. If a sales rep has low proficiency on a particular product, a granular coaching track can be laid out to deliver the additional training and coaching needed to improve their knowledge of the disease state and how the product can benefit the patient.
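
To make that concrete, here is a minimal sketch assuming proficiency scores exported from the learning platform and sales figures exported from a CRM, both keyed by rep and product. The column names and numbers are hypothetical, not Qstream’s actual schema. It joins the two data sets, checks how proficiency tracks with revenue, and flags reps who fall below a coaching threshold.

```python
import pandas as pd

# Hypothetical exports: per-rep proficiency on a product (share of questions
# answered correctly) and per-rep revenue for the same product from the CRM.
proficiency = pd.DataFrame({
    "rep_id":      ["r1", "r2", "r3", "r4"],
    "product":     ["DrugA"] * 4,
    "proficiency": [0.92, 0.58, 0.81, 0.66],
})
sales = pd.DataFrame({
    "rep_id":  ["r1", "r2", "r3", "r4"],
    "product": ["DrugA"] * 4,
    "revenue": [310_000, 140_000, 255_000, 170_000],
})

combined = proficiency.merge(sales, on=["rep_id", "product"])

# How strongly does product proficiency track with revenue for this product?
print(combined["proficiency"].corr(combined["revenue"]))

# Reps below the proficiency threshold are routed to an additional coaching track.
needs_coaching = combined[combined["proficiency"] < 0.70]
print(needs_coaching[["rep_id", "product", "proficiency"]])
```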

Learning Data to Show an ROI on Learning Investments

Demonstrating this level of ROI is not possible through an LMS. New learning metrics uncover learner gaps at a granular level and give managers a mandate to close those skills gaps: proof that learning is happening and that people’s proficiency is improving.

Do you need the data to quantify and demonstrate the value of your learning programs? Request a demo of Qstream’s microlearning platform or read our eBook for more information.
