Bad customer satisfaction survey

I recently met with a large residential cleaning company. They had been running an online customer satisfaction survey every year for 3 years and wanted to find a new way of doing things.

In the spirit of co-operation, they shared their survey findings with me. What immediately struck me was – even though they asked dozens of questions – how little useful information the survey results actually gave them.

For almost every (bland) question …

  • rate how professional our staff are
  • rate whether our skills are a good fit for your needs
  • rate the level of interaction we’ve had with you

… the company had been rated around the 7 out of 10 mark. Great! Their customers were broadly, but not very, happy.

As I read the results of the survey, two things struck me:

1. It’s very difficult to see what you could do to improve things for customers

The results presented in the survey had been genericised to the point where they were simply not useful. The multiple-choice questions were more of an exercise in the art of form-filling than a genuine attempt to get to the bottom of how their customers were really feeling. The questions being asked just weren’t getting to the heart of the matter – so no immediate actions were obvious. When I asked what had happened as a result of running the survey (surely the major reason you would do one at all is to effect change where it’s needed?)…

“I was told the results had been ‘circulated internally,’ but ‘nothing had happened’ for six months.”

Worse than the pointlessness of the questions being asked was the fact that the survey wasn’t tying feedback to individuals – making the exercise even less actionable. Assessing customer satisfaction shouldn’t be a market research exercise! It simply doesn’t make sense to ask valued customers for their feedback and then make the results anonymous so that you can’t tell who said what!

In summary, the survey was a retrospective of what had gone before, rather than something that could be actioned now. The horse had bolted months ago. Even worse, more than 80% of the customers it was sent to didn’t even start to fill it in, and another 10% dropped out before completing it! It’s not exactly a ringing endorsement for the online survey approach. Do you really want to put your customers, people you are supposed to value above all else, through such a painful exercise?

2. Long, multiple-choice surveys typically ask 20 bland questions that don’t matter in order to avoid asking the one or two that really do!

Neither I nor the company I was with had any clue how to start getting 9s or 10s instead of 7s in their ratings.

We started to realise that the very nature of these surveys means you are compelled to ask 20 or 30 questions that don’t matter, because the format is set up that way. It makes it easy for every business to ask lots of pointless things and avoid the one or two that really do matter. Instead of forcing the customer to tick 30 boxes, why not ask them one or two simple questions…?

  • “How did you feel about the clean you got from us?”
  • “How are we doing for you?”

… This is what matters to customers. So it’s what should matter to you, the survey creator.

If you find out a customer is unhappy, it’s your job to find out why and fix the issue, not theirs. That’s great customer service…

“Forcing them to fill in a long form is not great customer service. In fact, you could argue it shows a sizeable disregard for their time.”

Both the cleaning business and I agreed that the process of finding out how happy their customers are needed a major overhaul. It needed to be more real for the business, and more fun and engaging for their customers. I’m happy to report that they chose our tool Customer Thermometer to do this and are delighted with the results they are getting in the first month.

Instead of commissioning a report that sits on the shelf this year, why not make 2012 the year you try a different approach to customer satisfaction surveys? You won’t go back!

3 replies
  1. Anonymous says:

    “It simply doesn’t make sense to ask valued customers for their feedback and then make the results anonymous so that you can’t tell who said what!”

    I think you make a great point, especially if there was a situation you could have fixed if you knew who the feedback was coming from. Knowing there was a problem after the fact doesn’t help you improve the customer experience when it needed help.

  2. Andy Stansfield says:

    I recently got a similarly useless survey after a rather unproductive call to a ‘customer helpline’. At the end of the call I was asked to complete a survey. All the questions were about the agent, who was fine: he answered my questions as best he could, was polite and helpful, and I gave him top marks.

    Unfortunately the survey didn’t ask the one important question: how was your experience with us? That would have given some useful, hopefully actionable, feedback: that I was extremely unhappy and likely to move to a competitor.

    The upshot was that a national utility company thinks I rate them highly. I rated a single agent highly. Not the same thing, as they will (or actually probably won’t) find out when I leave.

    • Customer Thermometer says:

      Thanks for your comments, Andy – fascinating, and we completely agree. So often these surveys are positioned to avoid asking the very question that really matters to customers. The results look pretty on a management report, but they don’t change anything for you, the customer.
