Reporting 101: Lessons Learned

This post was kindly contributed by Business Intelligence Notes for SAS® BI Users - go there to comment and to read the full post.

Wow – it’s a very exciting time to have an analytics career!  You may have read the Preparing Yourself for Analytics Job Tsunami post at BeyeNetwork last week, which basically said we are headed for a shortage of analytics or data-minded people.  Oh … music to my little analytic-trained, big-data-lovin’ ears!  The article was pointing to the McKinsey & Company thoughts from last May about the arrival of big data and what it would mean.  I liked this particular point:


“There will be a shortage of talent necessary for organizations to take advantage of big data. By 2018, the United States alone could face a shortage of 140,000 to 190,000 people with deep analytical skills as well as 1.5 million managers and analysts with the know-how to use the analysis of big data to make effective decisions.”

McKinsey Global Institute May 2011


Analysts with the Know-How to use the Analysis

My favorite moment with customers is when I show them their data as a chart or graph and they say something like “This is what I’ve been trying to say!” or “No wonder this situation is happening.” Suddenly, the issue they have been having with employee productivity, another department, or customer satisfaction is clear as day on a chart. And I know I’ve really done my job when they want me to leave their office so they can start building the PowerPoint slide package to explain to their manager or team.

In some cases, it’s easier than others to find the perfect chart or way to visualize the issue. Sometimes you’ll find soft skills are the most challenging part of being an analyst. In a way you become a counselor to the customer, because you have to draw them out so you have some clues to what the problem is. One case I recall is a customer service team lead whose manager thought the team was performing poorly. The manager was using a metric referred to as Fix Response Time (FRT), which basically measures how many trouble tickets were resolved in the set time frame. For instance, for tickets considered to be major severity, the team had 30 days to resolve the issue. The target was 90% – so 90% of the major tickets resolved in 30 days. The problem was that the manager’s manager had handed the metric down to him, and he didn’t know how to use it effectively.
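To make the metric concrete, here is a minimal sketch of how an FRT percentage like this could be computed. The ticket dates, the function name, and the data layout are all hypothetical – the real calculation lived in SAS reports – but the logic matches the description above: count the major-severity tickets closed within 30 days and compare the fraction to the 90% target.

```python
from datetime import date

# Hypothetical major-severity tickets: (opened, resolved) date pairs.
tickets = [
    (date(2011, 6, 1),  date(2011, 6, 20)),  # 19 days -> on time
    (date(2011, 6, 3),  date(2011, 7, 15)),  # 42 days -> late
    (date(2011, 6, 10), date(2011, 7, 5)),   # 25 days -> on time
    (date(2011, 6, 12), date(2011, 8, 1)),   # 50 days -> late
]

SLA_DAYS = 30   # major-severity tickets must be resolved within 30 days
TARGET = 0.90   # 90% of tickets must meet the SLA

def frt(tickets, sla_days=SLA_DAYS):
    """Return the fraction of tickets resolved within the SLA window."""
    on_time = sum(1 for opened, resolved in tickets
                  if (resolved - opened).days <= sla_days)
    return on_time / len(tickets)

score = frt(tickets)
print(f"FRT: {score:.0%} (target {TARGET:.0%}) -> "
      f"{'PASS' if score >= TARGET else 'MISS'}")
# With these sample dates, 2 of 4 tickets make the window -> FRT: 50% -> MISS
```

Note that the metric is a single blended number: it says nothing about *where* in the process the time was spent, which is exactly what tripped everyone up in the story that follows.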

Getting Know-How Education at the School of Hard Knocks

The team lead, Drew, had just been brought into the role; he was full of energy and wanted his team to be the best. He set up Lunch-n-Learn sessions, secured pay raises, and even started a program where the older engineers mentored the younger ones. In short, he was really good! His manager, Biff, was not new to his role, but he had never had any metrics training; all he knew was that 90% was the target and that he could not find a team lead who could meet the 90% target set by his boss. It was discouraging when the monthly metrics report was published and Drew’s team was not hitting the 90% target. Biff would coach, scold, and cajole Drew.

And even worse, both were suspicious that I was the enemy, since I was creating the chart packages. Obviously I was not counting the tickets correctly! <Ouch> Here’s an example of how the chart would look; you can see the performance was nowhere near the target.

[Chart: SAS report of Fix Response Time vs. the 90% target]

More Analysis Know-How

It’s easy to see why managers get upset with data and reports – performance ratings and merit increases are determined with them, and customers review the data to determine if contracts are awarded. Drew knew his team was performing, but how could he prove it? Plus another clue: the customers were not complaining. If the FRT was so poor, surely the customer satisfaction data would also be poor.

Together we started looking at the numbers more closely. One thing I did, which I encourage you to do, is listen to your customer. As we were discussing the issues, Drew commented that “when the tickets escalate to Engineering, it takes them forever.” When I built the reports, I used the requirements they gave me. I had no idea of the underlying process. What would happen if we categorized the tickets by Customer Service and Engineering? Maybe we just had a process problem?
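The split we tried can be sketched the same way. Again the data and names are made up – in the real reports the phase times came from ticket state-change timestamps – but the idea is to compute FRT twice: once on total elapsed time (the number Biff saw) and once on the Customer Service portion alone (Drew’s team on its own).

```python
# Hypothetical tickets: days spent in each phase before resolution.
tickets = [
    {"cs_days": 5,  "eng_days": 0},    # handled entirely by Customer Service
    {"cs_days": 8,  "eng_days": 40},   # escalated; Engineering held it 40 days
    {"cs_days": 12, "eng_days": 0},
    {"cs_days": 6,  "eng_days": 55},
]

SLA_DAYS = 30

def frt_by_phase(tickets, sla_days=SLA_DAYS):
    """FRT two ways: total elapsed time vs. Customer Service time only."""
    n = len(tickets)
    total_ok = sum(t["cs_days"] + t["eng_days"] <= sla_days for t in tickets)
    cs_ok = sum(t["cs_days"] <= sla_days for t in tickets)
    return total_ok / n, cs_ok / n

overall, cs_only = frt_by_phase(tickets)
print(f"Overall FRT: {overall:.0%}")           # blended number the report showed
print(f"Customer Service FRT: {cs_only:.0%}")  # the team's own performance
# In this sample, the blended FRT is 50% while the CS-only FRT is 100%:
# every miss is caused by time spent in the Engineering escalation.
```

Splitting one blended metric along the process boundary is the whole trick here: the same tickets produce a failing number and a passing number, depending on whose clock you measure.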

To make a long story short, I showed him the chart below with the Customer Service and Engineering FRT divided into categories: his team was performing. And chart 2 showed that when they slipped below the line, it was because the incoming tickets had increased to the point where the team could not handle the volume. “Ahh!” he replied, and promptly took the chart I handed him so he could run to Biff’s office. His team was performing; the metrics were giving the wrong message.

[Charts: FRT split into Customer Service and Engineering categories; ticket volume vs. FRT over time]

Turns out there was a process issue. Engineering would issue a new release every 90 days, which caused them to miss the 30-day deadline. Additionally, if the customer service team did not do a good job explaining the problem in email, the engineering staff would put the ticket in invalid status. The invalid status might not be discovered for days, sometimes months. The happy ending: Drew and Biff were able to work with the Engineering managers to set up a new process that improved the FRT.

Lessons Learned

From this simple experience I learned a lot! It shows how a manager lacking even a basic understanding of analytics can harm the company and, ultimately, the customer. We were not using our data to be competitive and resolve process issues; instead, it was a stick for beating lower managers.

Here are my basic lessons:

  • You cannot always build reports based on requirements alone; you have to understand the underlying data and how it’s collected. You cannot defend your data if you don’t.
  • Don’t assume the report consumer knows anything about analysis, metrics, charts, etc.
  • Listen to your customer for clues about what you are not measuring.
  • Sometimes you are the enemy.

Any lessons you would add?
