It has been said that “you can’t argue with the data!” While that is not universally true, most people would agree that data is helpful in many situations, especially in business. The problem is that we do not always have the required data at hand when we need to make a decision. What we can establish up front is a well-informed hypothesis: what must be true for us to consider the decision successful? Whether we are launching a new product or service offering, demonstrating operational process improvement, instituting an organizational incentive plan, making a channel platform decision, or deciding whether to use a new service provider, we must define what future success looks like and compare the eventual reality to that expectation. Data is often a key contributor to this process, but is it the driver or the result?
To illustrate, let’s take a case study from a client project AJC worked on last year. The problem statement was that Inside Sales was unable to keep up with exponential growth in incoming clients to be onboarded to the company’s service platform. A good problem to have, for sure, but a 7x increase in one month was straining existing processes and causing extreme stress for staff. After dealing with it for several months, the client asked AJC to help them figure out how to improve. In this case, success meant an improved process that could absorb the higher client volume Inside Sales now faced.
The first step was a Needs Assessment of Inside Sales and their supporting team members, and the first Chicken in our metaphorical case study. This particular Needs Assessment included a Current State Process Map for each of the process steps within the overall value stream of bringing a new client onboard. Several low-hanging-fruit opportunities were highlighted for quick improvement, but one task in particular stuck out as a huge risk factor in the overall onboarding value stream: a gating training session conducted with each client on a one-on-one basis.
With only a handful of qualified trainers able to complete that task, this was likely where the current onboarding process was breaking down the most. At that point we had no data, but the Needs Assessment articulated our hypothesis. Next, we needed to collect data to prove or disprove that this step was a true process bottleneck. Our Chicken was about to lay an Egg.
Luckily for us, this data was available to be mined from historical timestamps in the company’s workflow software. Not every company will have this data on hand; when it is missing, this is the time to collect as much data as possible to establish a good baseline of the current situation.* In our study, we were able to calculate how many clients passed through each process step each week for more than 20 weeks. We were also able to show where every currently active client was backlogged in the overall process.
*Anecdotally, a minimum sample size of 30 data points is often suggested, though this is subject to many conditions. For the sake of getting anything done in this lifetime, my recommendation is to gather as many as you can, and stop at 30 if collecting more is extremely painful!
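For readers who want to try this kind of mining on their own workflow exports, the weekly-throughput and backlog calculation can be sketched in a few lines of Python. The record fields and values below are hypothetical stand-ins, not the client’s actual data or system:

```python
from collections import Counter
from datetime import date

# Hypothetical workflow-export records: (client_id, step, completion_date).
# Field names and values are illustrative, not the client's actual data.
records = [
    ("C01", "Step 1", date(2023, 1, 2)),
    ("C01", "Step 2", date(2023, 1, 5)),
    ("C02", "Step 1", date(2023, 1, 3)),
    ("C02", "Step 2", date(2023, 1, 9)),
    ("C03", "Step 1", date(2023, 1, 4)),
]

# Weekly completions per step: count records by (step, ISO week number).
weekly = Counter((step, d.isocalendar()[1]) for _, step, d in records)

# Backlog per step: each in-flight client is queued right after the step
# it most recently completed.
last_step = {}
for client, step, d in sorted(records, key=lambda r: r[2]):
    last_step[client] = step
backlog = Counter(last_step.values())
```

The same two tallies, run over 20-plus weeks of real timestamps, are what produced the chart discussed below.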
Below is a chart depicting this data. Actual values have been changed, but the trend is representative.
Figure 1: Volume Data by Process Step (Completion, Backlog, Cumulative)
It was obvious to everyone reviewing this chart that Step Four (the aforementioned one-on-one training step) was the bottleneck. Not only did the second-fewest weekly client volume pass through this step, but the average and current backlogs there dwarfed those at every other step. The decision was easy: Step Four had to be re-evaluated. The next task was to figure out how, and to keep measuring flow and backlog at this step to prove we were successful. Because the step was a one-on-one client training session, it was easily transitioned to a two-on-one client-to-trainer ratio, immediately doubling capacity.
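The chart’s conclusion can be sanity-checked numerically: a bottleneck shows up as low throughput paired with a deep backlog, and here the backlog was the decisive signal. The step names and figures below are illustrative stand-ins echoing the disguised chart, not real client numbers:

```python
# Hypothetical weekly averages and current backlog per onboarding step;
# numbers are illustrative stand-ins for the disguised chart data.
avg_weekly_completions = {"Step 1": 40, "Step 2": 35, "Step 3": 38,
                          "Step 4": 12, "Step 5": 10}
current_backlog = {"Step 1": 5, "Step 2": 8, "Step 3": 6,
                   "Step 4": 90, "Step 5": 4}

# The backlog is the decisive signal: Step 4 moved only the second-lowest
# weekly volume, but its queue dwarfed every other step's.
bottleneck = max(current_backlog, key=current_backlog.get)

# Doubling capacity at the bottleneck (one-on-one training moved to a
# two-on-one client-to-trainer ratio) doubles its weekly throughput.
improved = dict(avg_weekly_completions)
improved[bottleneck] *= 2
```

Continuing to measure flow and backlog after the change then confirms whether the constraint has been relieved or has simply moved to another step.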
In the case study, we did not start with the data. We had a problem statement that instigated a Needs Assessment, which defined a hypothesis about the strongest driver of our probable cause. We then gathered and analyzed data to test the hypothesis and help us make an informed decision for action. To follow our metaphor, the data was the “Egg.” Once we had the data, however, this Egg hatched into a new Chicken, which drove us to define strategies and implementation plans to improve this particular step, and those became our new Egg.
Is Data the Chicken or the Egg? The best answer truly is: both.
Read this article and more on AJC’s blog, and sign up for our newsletter online at: http://andreajonesconsulting.com/blog.aspx If you sign up for our newsletter by February 10, we will send you a free copy of AJC’s Prioritization Matrix Template.