In a previous blog, we discussed how learning organizations at leading companies have used performance analytics to focus learning and development efforts and help drive overall business value. Now we would like to show you the steps and techniques needed to perform these analytics, turning the seemingly endless torrent of data into valuable, actionable information.
The Process Steps
The figure below shows the four steps involved in unleashing performance analytics.
Step 1: Data Collection
The first step is collecting data from various sources. It is important to include data from all relevant sources. Be sure to include every view of your current performance:
- Product or service performance
- Workforce performance
- Learning events
This is a wide range of data, and it is often spread across a widely dispersed set of databases. Be sure to consult the right experts and get precise definitions of all data fields.
Step 2: First Pass Analysis
Perform an initial analysis to look for interesting aspects in the data. The analysis consists of roughly two phases: descriptive analysis and inferential analysis. Descriptive analysis summarizes the data and reveals whether trends or changes are occurring; inferential analysis then tests whether those trends are significant, or builds models to predict the future.
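The two phases can be sketched in a few lines of Python. This is a minimal illustration, not the company's actual analysis; the repair-time samples are invented for the example.

```python
# Descriptive and inferential phases on two samples of average repair
# hours. All numbers below are hypothetical, for illustration only.
from math import sqrt
from statistics import mean, stdev

before = [4.2, 3.9, 4.5, 4.1, 4.4, 4.0, 4.3]  # avg repair hours, last quarter
after = [3.6, 3.8, 3.5, 3.9, 3.7, 3.4, 3.6]   # avg repair hours, this quarter

# Descriptive phase: summarize each period.
print(f"before: mean={mean(before):.2f} sd={stdev(before):.2f}")
print(f"after:  mean={mean(after):.2f} sd={stdev(after):.2f}")

def t_stat(a, b):
    """Inferential phase: a two-sample t statistic to gauge whether the
    difference in means is more than noise (|t| well above 2 suggests
    a real shift, not random variation)."""
    se = sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
    return (mean(a) - mean(b)) / se

print(f"t = {t_stat(before, after):.2f}")
```

In practice the descriptive summaries come first, and only trends that look interesting are carried forward into significance testing.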
The analysis is performed by combining elements from different data sets together. For example, we might look at combinations of metrics from product or service performance, workforce performance and learning events.
Often termed “data mining,” this step searches for valuable nuggets of information in a mountain of meaningless noise. These nuggets might be interesting trends, unusual outliers, or unexpected correlations.
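A sketch of what combining data sets and mining them for correlations can look like, assuming a shared technician id joins workforce performance to learning records; the field names and values are invented for illustration.

```python
# Two hypothetical data sets keyed by technician id.
performance = {101: 5.1, 102: 3.2, 103: 4.8, 104: 2.9, 105: 4.5}  # hrs/repair
training_hrs = {101: 2, 102: 12, 103: 4, 104: 14, 105: 5}         # hrs trained

# Join on the shared key, like a database merge on technician id.
joined = [(training_hrs[t], performance[t]) for t in performance]

# Pearson correlation: do more training hours track with faster repairs?
n = len(joined)
mx = sum(x for x, _ in joined) / n
my = sum(y for _, y in joined) / n
cov = sum((x - mx) * (y - my) for x, y in joined)
sx = sum((x - mx) ** 2 for x, _ in joined) ** 0.5
sy = sum((y - my) ** 2 for _, y in joined) ** 0.5
r = cov / (sx * sy)

print(f"correlation of training hours vs repair time: r = {r:.2f}")
```

In this toy data, a strongly negative r would be exactly the kind of “nugget”: technicians with more training hours take less time per repair.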
Step 3: Diagnostic
Once the first pass analysis has yielded a set of interesting observations, we need to diagnose them further to determine whether they hold valuable new insights.
We begin this step by refining our definition of the problem or situation we wish to solve or improve. Next, we identify root causes, validating insights with subject matter experts. Root cause analysis assumes that events are interrelated: actions in one area trigger actions in another, and so on. By tracing these actions back, you can discover where the problem started and why.
Step 4: Learning Intervention
We are now ready to define a learning intervention that builds the skills and expertise the analysis indicates we need. We can then define the learning and performance objectives for the event and determine the best learning modalities. Would new technologies be effective?
Step 4.5: Follow-up Evaluations
Before we are done, we should design the methods we will use to evaluate the event. What influence should we observe on key performance indices after this learning is delivered? Be sure to consider the full range of KPIs:
- Customer satisfaction KPIs
- Business KPIs
- Cost efficiency KPIs
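A follow-up evaluation can be as simple as comparing each KPI before and after the learning event. The sketch below assumes hypothetical KPI names and values; the point is the structure, not the numbers.

```python
# Hypothetical baseline and post-training KPI readings.
baseline = {"customer_satisfaction": 72.0,
            "first_visit_fix_pct": 61.0,
            "cost_per_repair": 140.0}
post = {"customer_satisfaction": 78.0,
        "first_visit_fix_pct": 68.0,
        "cost_per_repair": 126.0}

def pct_change(before, after):
    """Percentage change of a KPI from its baseline."""
    return (after - before) / before * 100

# Report the movement of each KPI after the learning event.
for kpi in baseline:
    delta = pct_change(baseline[kpi], post[kpi])
    print(f"{kpi}: {baseline[kpi]} -> {post[kpi]} ({delta:+.1f}%)")
```

Designing this comparison before the event runs is what makes the evaluation credible: the KPIs and measurement windows are fixed in advance rather than picked after the fact.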
Let’s look at an example of a world class organization that used analytics to drive business results and see how they approached the task.
This company is a global technology leader. One division is a service organization with approximately 4,000 technicians who handle 800,000 repairs each year. The initial objective was to optimize learning in order to improve the first-visit repair percentage and shorten repair time. They faced an additional challenge: during the coming year, they predicted onboarding some 30% more new hires.
They performed the first pass analysis, which yielded a number of observations. One key observation is shown in the graph below, where performance data has been combined with HR data to plot seniority versus average time to repair. Two remarkable insights were noted.
First, there are distinct segments in the workforce, each needing learning targeted to its unique needs. Second, some technicians, even some inexperienced ones, can accomplish the repairs in far less time. This was a revelation: it called for a whole new perspective on how they trained the workforce, and forced them to consider how to make better learning investment decisions.
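The segmentation behind a graph like this can be sketched by grouping technicians into seniority bands and comparing repair times. The records and band cutoffs below are invented for illustration, not the company's data.

```python
# Hypothetical (years of seniority, avg hours per repair) records.
records = [
    (0.5, 5.2), (0.8, 3.1), (1.5, 4.6),
    (3.0, 3.9), (4.5, 3.5), (6.0, 3.3),
    (8.0, 3.2), (0.3, 5.8), (2.5, 4.1),
]

def band(years):
    """Assign a seniority band; the cutoffs are illustrative."""
    if years < 1:
        return "new hire"
    if years < 5:
        return "mid"
    return "senior"

# Group repair times by seniority band.
groups = {}
for years, hrs in records:
    groups.setdefault(band(years), []).append(hrs)

for name, times in groups.items():
    print(f"{name}: avg {sum(times) / len(times):.2f} hrs, "
          f"fastest {min(times):.1f} hrs")
```

Note that in this toy data one new hire beats the senior-band average, mirroring the second insight: some inexperienced technicians are already among the fastest, and learning plans should account for them.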
So what did they do? They applied a range of new learning technologies, and redesigned the learning events, better targeting the events to the various groups of learners.
What were their results? The onboarding training was redesigned, reducing its duration from 20 weeks to 6 weeks. Other training was optimized, cutting costs 17%, while training completions rose 167%.