5 Critical Shortfalls of Surveying Customers and How to Avoid Them

We’ve probably all been directed to a customer survey at some point, whether immediately after speaking with a customer service agent or a few days after purchasing a product. Given what I do for a living, I’m more inclined than most to take the survey and experience the process.

Occasionally, I’m impressed with the survey design. But more often than not, I’m left frustrated that I can’t provide the type of feedback I want to and, more importantly, feel that the survey has been designed for analysing results rather than for allowing me to express my personal opinions and thoughts.

Here’s a list of some shortfalls that we’ve experienced, together with some potential solutions to improve your survey process:

1. Immediacy and Personalisation

I’m not a big fan of generic website surveys or bulk email campaigns. There is no opportunity to personalise the survey based on the nature of an enquiry or the channel being used, and the resultant uptake is often very low.

Personally, I am much more likely to complete a survey directly after an interaction. Whether contact is via voice, email, chat or social media, look to provide a post-interaction survey personalised for each channel.

2. Generic Surveys

Just because someone thought it was a good idea to measure NPS (Net Promoter Score) three years ago does not mean that it’s still appropriate today. Please don’t just use the same standard old survey month after month.

Mix it up. Have a range of surveys, with differing questions, across channels, teams and agents. Use surveys that match the type of enquiry and nature of contact. Prepare surveys for special events, such as new product releases or campaigns, and glean relevant feedback.
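If you do keep NPS in the mix, it helps to remember how simple the metric itself is: respondents scoring 9–10 are promoters, 0–6 are detractors, and NPS is the percentage of promoters minus the percentage of detractors. A minimal sketch of the calculation (the function name `nps` is ours, purely illustrative):

```python
def nps(scores):
    """Compute Net Promoter Score from a list of 0-10 responses.

    Promoters score 9-10, detractors 0-6 (7-8 are passives);
    NPS = %promoters - %detractors, giving a value from -100 to +100.
    """
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# 5 promoters, 3 passives, 2 detractors out of 10 responses:
print(nps([10, 9, 9, 10, 9, 8, 7, 8, 4, 6]))  # 50 - 20 = 30
```

Because the score is a single number, it travels well across different surveys, but on its own it tells you nothing about *why* customers feel the way they do, which is exactly why varying the surrounding questions matters.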

3. Dynamic Survey Design

Nothing frustrates me more than when I score a question poorly and the survey simply passes me on to the next question without any recognition of my poor score. I also get bored with the same old 1-10 scoring parameters…

Try to make the survey more engaging with a mixture of open, closed and multiple-choice questions. If a customer does score you poorly on a question, recognise it and look to understand more about the issue through dynamic routing.
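Dynamic routing of this kind is just score-based branching: when a response falls below a threshold, the survey acknowledges it and asks an open follow-up instead of marching on. A minimal sketch, with illustrative question IDs, prompts and threshold (none of these names come from any particular survey tool):

```python
# Illustrative follow-up prompts, keyed by question ID.
FOLLOW_UPS = {
    "satisfaction": "Sorry to hear that. What could we have done better?",
}

def next_question(question_id, score, threshold=6):
    """Branch to a follow-up prompt when a score falls at or below the
    threshold; return None to continue with the scheduled questions."""
    if score <= threshold and question_id in FOLLOW_UPS:
        return FOLLOW_UPS[question_id]
    return None  # no branch needed; proceed as normal

print(next_question("satisfaction", 3))  # triggers the follow-up
print(next_question("satisfaction", 9))  # None: carry on to next question
```

The point is not the code but the behaviour: a poor score is acknowledged in the moment, which both feels more human and captures the reason behind the score while it is fresh.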

4. Verbatim Comments

Do not underestimate the value of verbatim comments. Scores and multiple-choice answers may be easier to analyse and report on, but free-text comments can be the richest source of customer feedback available, highlighting issues that you weren’t even aware of.

Look to ask open questions and capture verbatim comments throughout your survey, not just at the end. In an ideal world, you should look to stream these comments onto a real-time wall board. By using basic text analytics, you can very easily identify issues that customers are experiencing right now.
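“Basic text analytics” can be as simple as counting which terms recur across comments once common stopwords are stripped out. A crude sketch (the stopword list and sample comments are invented for illustration; a real programme would use a proper NLP library):

```python
from collections import Counter
import re

# Tiny illustrative stopword list; real text analytics would use a fuller one.
STOPWORDS = {"the", "a", "was", "is", "and", "to", "i", "my", "it", "of",
             "but", "no", "too"}

def top_issues(comments, n=3):
    """Return the n most frequent non-stopword terms across verbatim
    comments - a crude but quick way to surface recurring issues."""
    words = []
    for comment in comments:
        words += [w for w in re.findall(r"[a-z']+", comment.lower())
                  if w not in STOPWORDS]
    return Counter(words).most_common(n)

comments = [
    "Delivery was late and the driver was rude",
    "Late delivery again, no apology",
    "Great product but delivery took too long",
]
print(top_issues(comments))  # 'delivery' and 'late' dominate
```

Even something this naive, fed from a live comment stream onto a wall board, will surface a word like “delivery” spiking within minutes of a problem starting.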

5. Coordinate Across Channel

Consumers now contact us across multiple channels and we should be capturing feedback across each of those channels. If I’m unhappy with a brand, I may even turn to social media to vent my frustrations…

Coordinate all of your surveys, across each contact channel. Try to ensure there is at least one common question and metric that can be easily compared across voice, email, chat and social channels. Ideally, bring relevant social mentions into your VoC programme.
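Once the same question runs on every channel, comparing it is straightforward aggregation. A minimal sketch, assuming responses arrive as (channel, score) pairs; the channel names and figures are illustrative:

```python
def average_by_channel(responses):
    """Group (channel, score) pairs and return the mean score per channel,
    so one common metric can be compared side by side across channels."""
    grouped = {}
    for channel, score in responses:
        grouped.setdefault(channel, []).append(score)
    return {ch: round(sum(s) / len(s), 1) for ch, s in grouped.items()}

responses = [
    ("voice", 8), ("voice", 6),
    ("email", 9),
    ("chat", 7), ("chat", 8),
]
print(average_by_channel(responses))  # e.g. voice 7.0, email 9.0, chat 7.5
```

A gap between channels on the shared metric is often the first sign that one channel’s experience needs attention.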

For more information on real-time customer satisfaction surveys from CSAT Central, visit our site or follow us via Twitter, Facebook and Linkedin.