Learning Analytics

From Order-Takers to Data-Driven Performance Consultants

Any successful training or performance improvement intervention starts with understanding the business challenge to solve and the performance gap to close.

PwC's Annual Global CEO Survey showed that 73 percent of CEOs have concerns about the skill sets of their employees. They expect their learning departments to analyze skill gaps in today's and tomorrow's workforce. This analysis requires learning professionals to adopt a consultative approach and embrace an analytical mindset, and it is changing the roles of many learning professionals from order-takers to data-driven performance consultants.

The evidence is there. Learning organizations that successfully implement analytics practices uncover hidden value and produce better business results. Check out the Chief Learning Officer study of 467 organizations: "Unleashing the Power of Performance Analytics: Driving performance at the intersection of learning and business."

But for many, even though the goal of adopting a learning analytics practice is clear, the path there is neither easy nor straightforward. Analytics can appear to be out of reach and outside the scope of the learning function. Aside from the LMS, L&D typically lives in a data-poor environment, and 97 percent of L&D leaders want to improve the way they gather and analyze data (Dixon and Overton, Towards Maturity, 2017). So, let's start with the data.

There Are Three Types of Data to Consider

  1. Learning data
  2. Talent data
  3. Performance data

Learning data typically reside in the LMS and include information about offerings and curricula, classes offered and delivered, course completions, efficiency measures (class size, utilization, costs), and learning evaluations (satisfaction, pass rates, and sometimes learning transfer).
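As a simple illustration, here is a minimal sketch (in Python with pandas) of computing a few of these efficiency and evaluation measures from an LMS export. The file name and column names are hypothetical placeholders, not any particular LMS schema.

```python
import pandas as pd

# Hypothetical LMS export; file name and columns are placeholders, not a real LMS schema.
# Assumed columns: course_id, class_id, employee_id, status, cost
lms = pd.read_csv("lms_export.csv")

# Completion rate per course
completion_rate = (
    lms.assign(completed=lms["status"].eq("completed"))
       .groupby("course_id")["completed"]
       .mean()
)

# Average class size per course (unique learners per delivered class)
class_size = lms.groupby(["course_id", "class_id"])["employee_id"].nunique()
avg_class_size = class_size.groupby("course_id").mean()

# Cost per learner per course
cost_per_learner = (
    lms.groupby("course_id")["cost"].sum()
    / lms.groupby("course_id")["employee_id"].nunique()
)

summary = pd.DataFrame({
    "completion_rate": completion_rate,
    "avg_class_size": avg_class_size,
    "cost_per_learner": cost_per_learner,
})
print(summary.round(2))
```

Even this basic view answers the first questions stakeholders usually ask: who completed what, at what cost, and in classes of what size.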

Talent data are harder to get, although the LMS can provide some:

  • Employees' jobs, locations, and supervisors.
  • Levels of experience (tenure), job grade, certification.
  • Employee morale, attrition.

Performance data is the jewel of the analytics crown, and accessing it is not always as hard as it seems if you ask the right questions. Here are some examples (a brief sketch of putting such data to work follows the list):

  • Where are the activities or accomplishments of employees stored? Tickets from a call center, repair orders from service technicians, proposals submitted or signed contracts from the sales team, incidents or policy violations, etc.
  • What are the measures of success for employees? Customer satisfaction ratings, the ability to resolve issues the first time (no comebacks), the average revenue per transaction, the closure time of new tickets, etc.
  • How can we partner with Operations or IT to ask these questions and get access to the performance data that matter?
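To make this concrete, here is a minimal, hedged sketch of joining LMS completion records (learning data) with call-center ticket records (performance data) to compare first-contact resolution before and after training. The file names, column names, and course code are assumptions for illustration only, not a prescribed schema.

```python
import pandas as pd

# Hypothetical extracts; file names and columns are illustrative assumptions.
# lms_completions.csv: employee_id, course_id, completion_date
# call_center_tickets.csv: employee_id, closed_date, resolved_first_contact (0/1)
completions = pd.read_csv("lms_completions.csv", parse_dates=["completion_date"])
tickets = pd.read_csv("call_center_tickets.csv", parse_dates=["closed_date"])

# Keep only the course we want to evaluate (hypothetical course code)
trained = completions.loc[completions["course_id"] == "CUST-101",
                          ["employee_id", "completion_date"]]

# Join performance data (tickets) to learning data (completions)
merged = tickets.merge(trained, on="employee_id", how="left")

# Flag whether each ticket was handled before or after the agent completed training
merged["after_training"] = merged["completion_date"].notna() & (
    merged["closed_date"] >= merged["completion_date"]
)

# Compare first-contact resolution rates before vs. after training
fcr = merged.groupby("after_training")["resolved_first_contact"].mean()
print(fcr.round(3))
```

The point is not the specific metric; it is that once Operations or IT shares even one performance extract, a simple join against learning data starts the conversation about impact.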

And if nothing else works, we can complement learning data with our own data collection, for instance:

  • Conduct a survey of key stakeholders.
  • Interview learners and their supervisors.
  • Observe employees in their workplace.
  • Implement new types of learning transfer evaluation surveys (a brief summary sketch follows below).
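For that last item, here is a minimal sketch of summarizing a simple learning transfer survey. The item names and the 1-to-5 scale are assumptions; adapt them to your own instrument.

```python
import pandas as pd

# Hypothetical survey export: one row per respondent, Likert items scored 1-5.
# Column names ("applied_on_job", "manager_support", "confidence") are placeholders.
survey = pd.read_csv("transfer_survey.csv")
items = ["applied_on_job", "manager_support", "confidence"]

# Mean, spread, and response count per item
summary = survey[items].agg(["mean", "std", "count"]).T.round(2)
print(summary)

# Share of respondents reporting strong transfer (a rating of 4 or 5) on each item
strong_transfer = survey[items].ge(4).mean().round(2)
print(strong_transfer)
```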

In our next blog post, we will use Gartner's analytics maturity model to show you how to establish learning and performance analytics best practices.

What best practices have you discovered? Do you have other ideas? Or are you interested in an expert partner that can deliver data-driven, tailored solutions? Start a conversation in the comments below or connect with us at @RaytheonRPS using the hashtags #learninganalytics and #performanceanalytics.

To learn more, visit RPS.com.