Outside-In Insights from the Inside
A technique to get rapid traction on customer improvements.
Insight about Customers without Asking Customers
Many organisations invest in research and feedback methods to understand how customers perceive the experience they provide and to identify opportunities for improvement. While these methods certainly have their place, we think that as customers grow more fatigued by surveys, organisations will need to get even better at using the interactions they already have to drive insights and improvements.
In this paper we’ll show how a rapid diagnostic completed internally can reveal quantified opportunities, deliver win-wins for company and customer, and get the whole business focused on customer improvements.
We call it an “Outside-In Insights” process because it reveals impacts on customers, who are outside the company, by probing more deeply into what occurs “on the inside”, i.e. within the customer-facing operations. This “Outside-In Insights” process can:
quantify issues faster (than surveys and research)
assess the experience in ways that customers can’t
hear the customer’s issues without putting them to work
get to root causes and solutions faster by being closer to the current internal operating model
look more broadly across the experience than any individual customer can, by spanning long-running processes from end to end
Common Situations:
This rapid diagnostic process has been most valuable where an organisation knows that improvement is needed or has found that surveys and feedback aren’t driving improvement fast enough. The “Outside-In Insights” process has helped address common issues such as failing or sub-par service levels, downward or stuck NPS scores and aggressive cost reduction targets. Recent cases include a claims contact centre struggling with high volumes of work, a superannuation fund recovering from regulation-driven record contact volumes and a care provider going through dramatic process and tax changes. In two cases the companies had made major investments in their digital offering but still needed extra staff to manage the increased contacts. Other methods, such as speech and text analytics, had been tried but hadn’t got to the heart of the issues.
Seeing customer issues from the inside: how it works
The “Outside-In” diagnostic process uses three distinct techniques that quantify opportunities in a few weeks:
a) Listening to the voice of customer through the issues and contacts they raise
We find that the issues and problems customers face “wash up” in the contact channels. Spending time in service teams, complaints, email and chat reveals the issues and problems that customers have. Macro data like contact rates describe whether problems are getting better or worse. The diagnostic process includes detailed, structured sampling of hundreds of contacts using external and internal experts. It integrates with internal data to highlight what the customer is really experiencing and not liking. These days the process also provides rich insights on digital channels and where they are failing. The process puts real facts against hard-to-measure customer experience metrics like resolution rates, repeat rates and drivers of negative experiences.
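As a rough illustration, the structured tagging behind this sampling can be as simple as the sketch below. The field names, reason codes and summary metrics are our illustrative assumptions for this sketch, not a prescribed coding frame.

```python
# Minimal sketch of tagging sampled contacts and rolling them up into the
# headline metrics. All field names and categories are illustrative assumptions.
from collections import Counter
from dataclasses import dataclass

@dataclass
class ContactObservation:
    channel: str           # e.g. "phone", "email", "chat"
    driver: str            # short reason code for why the customer made contact
    resolved: bool         # was the need met by the end of the contact?
    reported_repeat: bool  # did the customer say they had already tried elsewhere?

def summarise(sample):
    """Turn a list of tagged observations into headline metrics."""
    n = len(sample)
    return {
        "contacts_sampled": n,
        "resolution_rate": sum(c.resolved for c in sample) / n,
        "reported_repeat_rate": sum(c.reported_repeat for c in sample) / n,
        "top_drivers": Counter(c.driver for c in sample).most_common(5),
    }

sample = [
    ContactObservation("phone", "where is my X", False, True),
    ContactObservation("chat", "password reset", True, False),
    ContactObservation("phone", "where is my X", True, True),
]
print(summarise(sample))
```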
b) Longitudinal analysis or “Follow the X”
The voice of customer analysis shows issues at points in time in a process. In contrast, “following the process” shows what is and isn’t working across any multi-stage process. Techniques here include “happy and unhappy path analysis”, where the things that can go wrong are captured and quantified.
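A minimal sketch of how happy and unhappy paths can be quantified is shown below, assuming simple per-customer event histories; the stage names and failure events are illustrative only.

```python
# Sketch of "happy and unhappy path analysis" over a multi-stage process,
# assuming simple per-customer event lists. Names are illustrative only.
from collections import Counter

FAILURE_EVENTS = {"form_rejected", "repeat_contact", "chased_progress", "complaint"}

def classify_journey(events):
    """Return 'happy' if the journey completed with no failure events,
    otherwise the first failure event observed (the 'unhappy path')."""
    for event in events:
        if event in FAILURE_EVENTS:
            return event
    return "happy" if events and events[-1] == "completed" else "abandoned"

journeys = {
    "cust_1": ["applied", "assessed", "completed"],
    "cust_2": ["applied", "form_rejected", "applied", "completed"],
    "cust_3": ["applied", "chased_progress", "assessed", "completed"],
}

print(Counter(classify_journey(events) for events in journeys.values()))
# e.g. Counter({'happy': 1, 'form_rejected': 1, 'chased_progress': 1})
```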
c) Maturity analysis of customer facing operations
The third branch of analysis looks internally at how the operating model works. It uses a “maturity assessment” to evaluate six key dimensions: Processes, Resourcing, Incentives and Indicators, Structure, Technology and Management. The assessment uses a combination of observation within the operation, internal data and structured workshops to assess gaps and opportunities in the way the business works. Whilst the voice of customer analysis shows what is going wrong and how often, this analysis identifies many of the causes of those issues.
Examples of the key issues the process uncovers
Within three to four weeks the diagnostic process quantifies a range of customer issues.
1. It quantifies the volume of failure demand
The process sorts contacts into four strategies using what we call a “value/irritant” matrix (shown). In this case over 35% of contacts were customers asking things like “where is my X” and “I don’t understand why”.
Even more startling, 54% of contacts were simple transactions that did not use the multiple digital mechanisms already available. Nearly 90% of interactions represented some form of failure.
The analysis gets to the bottom of why customers were failing in the automation. The most frequent pain point in this case was the login process, as the password criteria were the most rigorous we have observed in any industry. Worse still, the online help didn’t explain the restrictions on key words, special characters, strings and sequences. After multiple attempts, customers would call for assistance.
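The classification itself is straightforward to reproduce. The sketch below assumes the commonly used quadrant labels (leverage, automate, simplify, eliminate) and uses made-up reason codes and volumes rather than the case data above.

```python
# Rough sketch of a "value/irritant" classification, assuming the common
# quadrant labels. Reason codes, mapping and volumes are illustrative only.
from collections import Counter

def strategy(valuable_to_customer, valuable_to_company):
    if valuable_to_customer and valuable_to_company:
        return "leverage"
    if valuable_to_customer and not valuable_to_company:
        return "automate"     # e.g. simple transactions better done digitally
    if not valuable_to_customer and valuable_to_company:
        return "simplify"
    return "eliminate"        # irritant to both, e.g. "where is my X?"

# (reason, valuable_to_customer, valuable_to_company, sampled_volume)
sampled_reasons = [
    ("where is my X",          False, False, 180),
    ("I don't understand why", False, False, 90),
    ("update my address",      True,  False, 300),
    ("discuss a new product",  True,  True,  60),
]

totals = Counter()
for reason, cust, comp, volume in sampled_reasons:
    totals[strategy(cust, comp)] += volume

grand_total = sum(totals.values())
for quadrant, volume in totals.most_common():
    print(f"{quadrant:<9} {volume / grand_total:.0%} of sampled contacts")
```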
2. It quantifies the volume of repeat contact
The side-by-side observation process gives a strong indication of repeats because callers are quick to tell agents if they have been in contact before in any channel (“I already tried to do this online but …”).
By capturing these “reported repeats”, the process provides a realistic estimate of the real number of repeats and the channels causing them. It also enables analysis of which processes are driving repeats and why. In one company it exposed the high failure rate in processing complex forms. The forms were very long, and the process showed how often customers failed to get them right and why. Customers were calling either out of confusion while filling them in or because incomplete information had led to the transaction being rejected by the back-office teams. At one client the diagnostic process was also able to highlight why so many customers used physical forms: a combination of missing digital forms, hidden forms and a contact process that encouraged agents to send forms because they didn’t have quick links and access to suitable digital processes.
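A minimal sketch of how reported repeats can be tallied from side-by-side observation is shown below; the channels and counts are made up for illustration.

```python
# Sketch of estimating repeat contact from "reported repeats" captured during
# side-by-side observation. Channel names and counts are illustrative only.
from collections import defaultdict

# (channel of current contact, channel the customer says they tried first);
# None means the customer gave no indication of a prior attempt.
observations = [
    ("phone", "web"), ("phone", None), ("phone", "phone"),
    ("phone", "web"), ("email", "phone"), ("phone", None),
]

repeats_by_prior_channel = defaultdict(int)
for _, prior in observations:
    if prior is not None:
        repeats_by_prior_channel[prior] += 1

total = len(observations)
repeat_rate = sum(repeats_by_prior_channel.values()) / total
print(f"reported repeat rate: {repeat_rate:.0%}")
for channel, count in sorted(repeats_by_prior_channel.items(), key=lambda kv: -kv[1]):
    print(f"  prior attempt in {channel}: {count / total:.0%} of contacts")
```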
3. It explains the real rate of first contact resolution
The diagnostic process lifts the carpet on resolution by making visible the state of the process and the customer at the end of any contact.
It classifies whether the customer’s need was met and, if not, what will happen next. It thus shines a light on the true rate of resolution (how often the first contact doesn’t actually sort things out) and on the hand-offs and customer actions that prevent resolution. In one claims call centre, no less than 70% of contacts resulted in an action for either the customer or the company, yet resolution was being reported as 80%. The process uncovers the causes of non-resolution by following many of the hand-offs to see “what happens next”. For this claims operation we enabled a dramatic rethink of the end-to-end process.
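The arithmetic behind that gap is simple, as the sketch below shows; the outcome labels and volumes are illustrative assumptions rather than the claims operation’s actual data.

```python
# Sketch of the end-of-contact classification behind "true" first contact
# resolution. Outcome labels and volumes are illustrative assumptions.
from collections import Counter

# Each sampled contact is tagged with its state when the interaction ended.
sample = Counter(
    resolved=30,
    customer_action_needed=40,
    company_action_needed=25,
    handed_off=5,
)

total = sum(sample.values())
true_fcr = sample["resolved"] / total
print(f"true first contact resolution: {true_fcr:.0%}")
print(f"contacts leaving an open action or hand-off: {1 - true_fcr:.0%}")
# In the claims example, ~70% of contacts left an action outstanding even
# though resolution was being reported at 80%.
```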
4. It shows where processes are breaking down
During an inside-out diagnostic, process variations and issues become very clear. In one company we saw staff trained in the industry and products but not in the best way to solve problems.
The lack of definition of “how to do it” meant that agents created their own methods and workarounds. Some were good, but many weren’t and resulted in long contacts and uncertain answers! In another operation the process showed very high rates of hold time and of staff seeking help from team leaders and floor walkers. This created uncertainty for customers and kept team leaders too busy to run the operation.
In another diagnostic we were able to see the disconnects between marketing and operations. The marketing team had sent a badly worded SMS to the entire customer base explaining some product changes. The wording was so bad that many customers called just to check it wasn’t a spam text. Others responded to clarify whether it affected them. Worse still, this all happened on a busy Monday morning when volumes were 20% higher than on any other day.
5. Inside-out diagnostics expose operating model flaws
There is no better way to understand how a place really works than by spending time in the heart of the operations. This analysis benchmarks the management process against models of well-run operations. This is where we assess the six operating model dimensions of Processes, Resourcing, Incentives and Indicators, Structure, Technology and Management.
We find that you have to look at multiple dimensions in parallel. A measure such as service level, for example, can drive a poor management process. Many customers experience wildly fluctuating levels of service because the company manages to an average service level across days or months. That’s great for those who ring or write when it’s not busy but poor for those waiting in the busy times. We assess whether management has processes to address short periods of poor service or simply takes them for granted within an overall average.
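A small, made-up illustration of why averages mislead is shown below; the interval volumes are assumptions chosen only to make the point.

```python
# Illustration of how an averaged service level hides poor intervals.
# The interval volumes and results below are made up for illustration.
intervals = [
    # (interval, calls offered, calls answered within threshold)
    ("09:00-10:00", 500, 250),   # busiest hour: only 50% within threshold
    ("10:00-11:00", 300, 240),
    ("11:00-12:00", 200, 190),
    ("12:00-17:00", 600, 590),   # quiet afternoon
]

offered = sum(o for _, o, _ in intervals)
answered = sum(a for _, _, a in intervals)
print(f"daily service level: {answered / offered:.0%}")   # ~79%, near a typical target

for name, o, a in intervals:
    print(f"  {name}: {a / o:.0%} of {o} calls answered within threshold")
# The daily figure looks close to an 80% target even though the busiest hour,
# when the most customers call, sits at 50%.
```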
The analysis exposes how processes and teams really work. In one organisation the back-office teams were supposed to spend time calling customers when applications were incomplete. A day in the team showed that no one ever picked up the phone because it was quicker just to send a letter and let the call centre deal with it. This resulted in many delays and complaints.
Watching people do their work reveals the level of disconnect between strategy and execution. Most organisations have invested in digital and want to see those investments exploited. The studies have exposed many instances of processes and incentives fighting the digital strategy. In one sales operation, the sales measures encouraged staff to keep sales away from the web site and self-serve kiosks. In another company we were amazed to learn that a new customer self-service app had even been released: it wasn’t mentioned once in 700 calls and 200 emails. The process was consistent but unfortunately not what the company wanted.
6. Follow the X
During the analysis we put ourselves in the customer’s shoes and follow processes “end to end”. This includes looking at the company’s web sites, IVRs and correspondence. A recent IVR review found so many layers of menus that it took customers five minutes just to get into the queue. Many of the menus were redundant as customers ended up with the same agents anyway! In one company we found that it took over three minutes of recorded messages before customers even got to a menu.
One company couldn’t understand why their digital application process was generating calls. We demonstrated three places where new customers would get stuck in the application form and have to call. We also highlighted over 20 instances of bewildering jargon on the online application form.
After the analysis – example improvements