We have previously provided some examples of what we have found to be the best customer satisfaction survey questions to ask – along with some useful templates.
But when designing effective customer surveys, it’s also important to recognise bad survey questions and understand why they are bad. We have previously addressed the topic of bias in survey questions, and why it’s important to thoroughly assess proposed questions to ensure that bias will not skew the survey results. Here we’ve compiled some examples of bad customer survey questions seen in various surveys.
Here’s the first:
More people use our free post and packing service than our paid services. Have you used our free post and packing service?
This is an example of a biased, leading question: it sets the stage by telling the respondent that more people are using the free P&P service. To get a more representative response, the question could simply be composed as follows:
Have you used our free post and packing service?
In your opinion how would you rate the speed and usefulness of our service response?
Select from: Excellent, Good, Fair, Poor.
This is a double-barrelled question: it asks about two things (speed and usefulness) at once. Better to break it down into separate, clear questions:
How would you rate the speed of our service response?
How would you rate the usefulness of our service response?
Here’s a third example:
Did you hear about our service:
– from a friend?
– from a newspaper?
– from television or radio?
– from a colleague at work?
The answer options are not mutually exclusive. The respondent may have heard from all or none of the listed sources. Better to use something like:
How did you hear about our service? Select all that apply.
Provide a list of sources, allow the respondent to select more than one, and be sure to include a ‘none of these’ option plus a text box so they can describe how they heard.
Here’s a fourth example question derived from a real survey:
How do you feel about the following statement? ‘We should not reduce the number of options in our product.’
The statement is phrased negatively (‘should not’), so a respondent who disagrees must process a confusing double negative. The following revised question is clearer and less likely to cause confusion:
How do you feel about the following statement? ‘We should reduce the number of options in our product.’
Here’s our fifth example of bad survey questions:
Who did you purchase these products for?
The key issue with this question is the limited number of answer options. These don’t represent all of the possible answers that a customer might want to provide to this question. A more exhaustive list should be offered along with an ‘other’ option and possibly a text box for respondents to let you know who they purchased the product for, if that is important information for your business.
Failing to provide all of the required answer options is a common mistake in multiple choice survey questions.
Here’s a sixth bad survey question example:
Are you satisfied with your customer service?
This simple question might at first appear to deliver exactly what you want. But a yes/no ‘absolute’ question like this gives respondents no room to convey the strength of their feelings, so it rarely produces useful feedback. It would be far more advantageous to most businesses to use a rating scale like this:
On a scale of 0 to 10, where 0 is completely unsatisfied and 10 is completely satisfied, how satisfied are you with our customer service?
Here’s a seventh poor survey question example:
Our records tell us that you have contacted our service department within the past 12 months. From the following scale would you kindly rate your interaction?
Timeliness is vitally important when surveying customers. Ideally they should be surveyed as soon as possible after the event being assessed. Asking customers to recall the details of a transaction carried out up to 12 months ago is unlikely to elicit accurate or useful responses.
Here’s our eighth bad survey question example:
You recently purchased a [product]. When you use your [product] do you:
This is an assumptive question: it assumes that the purchaser of the product is also its user. Better to first determine whether the respondent is the product user and then, if so, ask questions about product use.
Here’s the ninth example of poor survey questions:
How did our amazing customer support team perform today?
This is clearly a biased question, leading the respondent by describing the customer support team as ‘amazing’. Far better to allow the respondent to impartially feed back their experience of your customer support team.
On a scale of 0 to 10 where 0 is entirely unsatisfied and 10 is entirely satisfied – how satisfied were you with our customer support team today?
Here’s a common survey question that has issues:
Please indicate your age from the following:
18 to 25
25 to 50
50 to 70
This demographic question does not offer mutually exclusive answer options: a respondent who is 25 falls into two of the answer categories, and the same applies if they are 50. The options are also not exhaustive, since anyone under 18 or over 70 has no category at all.
Another important point is that a question like this should always offer a ‘prefer not to answer’ option, as some people are not happy to disclose their age.
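The overlap problem has a simple mechanical fix when building a survey: define each bracket as a half-open range (lower bound included, upper bound excluded) so every age maps to exactly one option. A minimal Python sketch, with illustrative bracket labels of our own choosing:

```python
# Mutually exclusive age brackets using half-open ranges [lower, upper).
# Labels here are illustrative, not from any particular survey tool.
AGE_BRACKETS = [
    (18, 25, "18 to 24"),
    (25, 50, "25 to 49"),
    (50, 70, "50 to 69"),
]

def bracket_for(age):
    """Return the single bracket label for an age, or None if no bracket covers it."""
    matches = [label for lo, hi, label in AGE_BRACKETS if lo <= age < hi]
    assert len(matches) <= 1, "brackets overlap"
    return matches[0] if matches else None

# Boundary ages now fall into exactly one bracket.
print(bracket_for(25))  # prints: 25 to 49
print(bracket_for(50))  # prints: 50 to 69
```

The `None` case is a reminder to cover the gaps too: a real survey would add ‘under 18’, ‘70 and over’, and ‘prefer not to answer’ options.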
Here’s a bad technical survey question example:
Do you think that our software update system is adequate?
This question assumes a lot of technical knowledge. If it had been preceded by questions establishing that the respondent is the right person to ask, it might be useful; but for an average respondent who knows nothing about your software update system, it is complex and confusing.
It should be clear that it’s easy to get it wrong when composing customer survey questions. That’s why it’s vital to thoroughly evaluate and test your surveys before deployment. For more great examples of real customer survey questions that work, take a look around our site and blog.
Want to improve your customer survey response rates? Customer Thermometer’s 1-click survey will up your feedback game. Send yourself an example: