Myth: Customer experience improvement is all about feedback surveys
Feedback Frenzy
Everyone seems to be jumping on the feedback bandwagon. As customers we are often asked what we thought of the last call we had with an agent, the last stay in a hotel, the last flight, the last visit, or even our team's last football season (yes, that happens). Of course we understand that it's great to listen to customers, and that it can be useful to measure front line staff and processes as well. We'd also agree that listening to some customer feedback is probably better than none at all. However, we do have to watch out for four things:
1. That we really are listening;
2. That we aren't making customer experiences worse through these listening processes;
3. That we aren't getting blinkered to other valuable measures (feedback blinkers); and
4. That we really know how to get to solutions.
Let us look at each of those in turn:
1. Are we really listening or is it one-sided listening?
We’ve seen many post-event surveys with no open questions at all, merely scores out of 5 or 10. We’ve seen a number of major companies use surveys that say “This will take 10-15 minutes of your time”. If customers aren’t already put off by that, they then find that the survey allows almost no free-format feedback. We think both of those are examples of companies not really listening, because they are only interested in measuring the things they think are important.
This “one-sided listening” narrows what an organisation can hear and in fact means that it just reconfirms what it wanted to prove. Listening involves being more open than that. It is more complicated because it’s harder to control and harder to analyse, but in our normal conversations it is what we do. For example, we ask our friends and family more open listening questions, like “How was your holiday?” or “How was the dinner?” We don’t ask for scores out of 10.
An irony here is that when customers try to give feedback in ways the company doesn’t expect, the company often has no way to listen to it. So while companies are investing in surveys, they are deaf to other sources of feedback. For example, their front line staff aren’t given a process to pass on the things they hear every day, or really great customer suggestions. We tried it with a travel company last week: the only way to give feedback as a customer was to lodge a complaint, but we didn’t have a complaint.
Often when scores are requested, the opportunity to go just that extra step is missed. We call it dynamic questioning. For example, if you are scoring NPS as part of your surveys, it takes very little extra time to ask dynamically adjusted questions: if a customer scores us 0-6, “What did we do wrong?”; if 7-8, “What can we do to improve?”; and if 9-10, don’t forget to ask “What are we doing so well to get this score?”. All three offer valuable feedback to your Continuous Improvement teams. Don’t forget that ‘Promoters’, ‘Neutrals’ and ‘Detractors’ are sensing different things!
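To make the mechanics concrete, here is a minimal sketch in Python of that score-dependent routing, assuming the standard NPS bands (0-6 Detractor, 7-8 Neutral, 9-10 Promoter); the function name and question wording are illustrative, not taken from any particular survey tool:

    def follow_up_question(nps_score: int) -> str:
        """Return the open follow-up question matched to an NPS score."""
        if not 0 <= nps_score <= 10:
            raise ValueError("NPS scores run from 0 to 10")
        if nps_score <= 6:   # Detractors: find out what went wrong
            return "What did we do wrong?"
        if nps_score <= 8:   # Neutrals: harvest improvement ideas
            return "What can we do to improve?"
        return "What are we doing so well to get this score?"  # Promoters

The same branching works in any survey platform that supports conditional logic; the point is that one well-targeted open question costs the customer a few extra seconds and gives the improvement team the “why” behind the score.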
2. Making the experience worse
Let’s not forget that every time we survey the customer, it is effort for the customer. A one-minute survey after a five-minute call still adds 20% to the customer’s effort. A 10-15 minute survey is a big chunk of anyone’s day (some lawyers and doctors charge for every ten minutes). Organisations which do that are really imposing on the customer. We have also seen that some organisations are using surveys to measure and motivate their front line. It’s almost like saying: “We haven’t got time to measure and motivate our team properly, so please do it for us, Customer X.”
In response to this barrage of surveys, we see survey fatigue already setting in. Customers may respond to a new survey system or process, but responses normally fall away as the novelty wears off. So we think response rates will drop and only certain customer types will respond, creating potential survey bias.

Therefore, survey designers and customer experience teams need to treat the customer’s time with respect. They should see a survey as an imposition on the customer, consider what is in it for the customer, and treat every second of customer time as valuable: nothing should be asked for that won’t be used. If we ask for a verbatim, we had better have a process to analyse and use it. Many surveys even ask customers for details the company already knows. How often have you been emailed a survey and then been asked for your email address within the survey! Of course it’s more complex to pre-populate surveys, but doing so shows respect for the customer. So our very simple conclusion here is to re-assess the experience that the survey imposes.
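As a simple illustration of pre-population, here is a hedged sketch assuming the company holds a customer record keyed by the same field names the survey uses; the field names and function are hypothetical:

    KNOWN_FIELDS = {"email", "name", "account_number"}

    def prepare_survey(questions, customer_record):
        """Split survey fields into pre-filled answers and questions still to ask."""
        prefilled = {q: customer_record[q] for q in questions
                     if q in KNOWN_FIELDS and q in customer_record}
        still_to_ask = [q for q in questions if q not in prefilled]
        return prefilled, still_to_ask

Every field moved into the pre-filled set is a question the customer never has to answer again.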
3. Feedback blinkers
In some organisations things like Net Promoter Score (NPS) have become all-pervasive. We applaud that leadership teams and boards are starting to worry about customers. However, there are some issues with NPS. One company we know admitted that its most profitable and long-lasting customers were those who had never had to call. But those customers gave a lower NPS than those who had a problem fixed and who added cost to the business. So NPS can be biased towards the most recent interaction and not indicate deeper preferences and needs. Another risk is that organisations are driven by the score rather than by real business outcomes. They start chasing NPS improvements as an end goal; however, the score is not an end in itself and can hide opportunities.
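A small worked example shows how the single number can hide very different customer bases; the two distributions below are invented for illustration:

    def nps(promoters, neutrals, detractors):
        """Standard NPS: percentage of promoters minus percentage of detractors."""
        total = promoters + neutrals + detractors
        return 100 * (promoters - detractors) / total

    print(nps(60, 10, 30))  # polarised base: many fans, many critics -> 30.0
    print(nps(40, 50, 10))  # quieter base: fewer fans, fewer critics -> 30.0

Both bases score an NPS of 30, yet they call for very different action, which is exactly why chasing the score alone can hide opportunities.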
We also see organisations focusing on one measure that imposes on the customer while they ignore other critical customer experience measures that don’t impose on the customer at all. For example, understanding contact types and contact rates may give clearer and more actionable insights than Net Promoter Scores. We don’t have to ask customers whether they are unhappy about having to make a repeat contact, and yet very few companies can measure this accurately.
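To show how little customer effort such a measure requires, here is a minimal sketch that estimates a repeat-contact rate from an ordinary contact log; the seven-day window and record layout are assumptions, not a standard:

    from datetime import datetime, timedelta

    def repeat_contact_rate(contacts, window=timedelta(days=7)):
        """Share of contacts arriving within `window` of an earlier contact
        from the same customer -- a rough proxy for repeat contacts."""
        last_seen = {}
        repeats = 0
        for customer_id, ts in sorted(contacts, key=lambda c: c[1]):
            previous = last_seen.get(customer_id)
            if previous is not None and ts - previous <= window:
                repeats += 1
            last_seen[customer_id] = ts
        return repeats / len(contacts) if contacts else 0.0

    log = [("C1", datetime(2024, 5, 1)), ("C1", datetime(2024, 5, 3)),
           ("C2", datetime(2024, 5, 2))]
    print(repeat_contact_rate(log))  # 1 of 3 contacts was a repeat -> 0.33

No customer is asked anything; the signal is already sitting in the contact history.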
4. How to get to solutions
We think the goal of all this measurement is to make things better. Unfortunately, any feedback system just gives indicators of potential issues, and often these are averages across all surveys that hide the real detail. The real work starts when we have to figure out the real issues and implement solutions. A common frustration with NPS and Customer Effort Score is that they don’t provide any rationale and they are an average across all experiences. They always necessitate a deeper dive to get to the real issues and possible solutions.
Of course going back to a customer who reported a bad experience in a survey may repair that relationship, but it doesn’t fix the systemic issue. If we only fix things for those who took the survey, how many others were too frustrated to bother? Organisations can be so busy responding to and fixing things for low-scoring customers that they never work on the systemic issues. That needs a much deeper dive to understand how common these issues are and what will work as solutions.
Possible Answers
So how do we stop over-surveying, creating customer effort and getting obsessed with surveys, and get to solutions faster? We are working with companies who are starting to survey less and do periodic deep dives (we call them diagnostics) that get to the heart of what is going on in a business. This type of analysis looks at what the customer is experiencing and at the causes and effects at the same time. It cuts the corner on the survey-type process and imposes nothing on customers. By assessing a detailed sample of contacts, and measuring and listening to what customers said on those contacts and how they were treated, not only do consistent problems become clear, but solutions also emerge.

For example, on a range of calls we may hear customers say they can’t understand their statement or that they haven’t received something they expected. If the agent handles that well, the customer may give good feedback in any survey, so the organisation never assesses why the contact occurred at all. But if we observe the contact in our diagnostic process, we see it as an issue in itself and look for a solution. It’s exciting to work with front line representatives who can integrate their daily experiences with the observed data and provide clear, reasoned and quantified solutions to customer problems.
Some of the results of this kind of analysis have been stunning. In one organisation it led to an almost instant fix that knocked out 20% of wasted customer and company effort. Another diagnostic led to redesigned teams and processes that delivered 40% less work for the customer and for the company. In another business, once we understood what the contacts were and how they were handled, we were able to make process and structural changes that halved the rate of repeat contacts and reduced the effort the customer made on each contact by 30%.
In summary, we hope that companies won’t get too dependent on customer feedback surveys and single scores. We believe they have their limitations, do not reflect what customers may really be feeling and, worst of all, unduly impose on the very customers we are trying to help. If you want quicker results and fast, positive change, we have some amazing tools to share, so please feel free to contact us.