Survey Failures in Action: The Impact on Strategic Decisions for Business

Hi all. For those who know me, you know that I could talk about Net Promoter® for a day and a half, but I thought I would detail just one example of poor NPS implementation and the consequences it has for wider benchmarking.

I was called for a telephone interview just an hour ago regarding a recent energy installation at my property. The pleasantries were nice and appropriate, and I thought, "Why not just answer a quick survey?" In my experience these interviews should take between two and five minutes.
To my horror, the person said this would take 15 minutes. I didn't think I would be the first person to ask "Why?" on this phone call (NPS people, laugh now), but alas, this was the case. I later deduced that this was what I would call a "Touchpoint to Death" survey: it seemed clear during the call that every single department within the energy company had wanted its own input into the questionnaire.

To make matters worse for the respondent, a lot of the questions didn't even make sense in context. To give one example, I was asked whether the process used to install X made me want to recommend the brand to a colleague or friend. I stated clearly that this process is transactional and doesn't affect my relationship with the brand. The response to each of my points was the same: "I don't have an option to take your comments." So they have no way of listening to what their respondents are saying... and...
"Computer says no".
Another major issue I observed was their incorrect use of the NPS metric: they used a scale of 1-10 rather than the standard 0-10. This means that when comparing with the industry (benchmarking), this energy supplier would have positively skewed data, because the detractor range should be 0-6 but on their instrument can only be 1-6; the 0 response simply doesn't exist. To those who know how meticulous I am about the correct implementation of NPS, you all know that at this revelation I shuddered.
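For readers less familiar with the mechanics, here is a minimal sketch of the standard calculation (the function name and sample scores are my own, purely illustrative, and not the supplier's actual system). It makes the point concrete: the buckets are defined on a 0-10 scale, so an instrument that never offers the 0 response is not comparable with 0-10 benchmarks.

```python
def nps(scores):
    """Net Promoter Score from 0-10 ratings: %promoters minus %detractors.

    Standard buckets: detractors 0-6, passives 7-8, promoters 9-10.
    """
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if 9 <= s <= 10)
    detractors = sum(1 for s in scores if 0 <= s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical respondents on the correct 0-10 scale:
responses = [0, 3, 6, 7, 8, 9, 10, 10]
print(nps(responses))  # 3 promoters, 3 detractors out of 8 -> 0
```

Note that a 1-10 questionnaire removes one detractor response option entirely, so any score it produces sits on a different footing from industry figures gathered on the standard scale.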
I suppose because the validity of NPS, to me the most scientific of all business metrics, seemed to be under threat, I decided to be patient and detail all of the failings at the end of the survey in the dreaded "Extra Comments" field. It is my vain hope that someone who owns the survey will look at this and go, "Aha! Here is the light!"
To add insult to injury, I was also asked a few demographic and firmographic questions that I knew perfectly well I had already answered, and whose answers were already available to the survey owners! One example:
"Did you have X installed?" - of course I did.
The most salient survey best practices that I outlined on the call were:
(1) Keep the survey short and simple with three questions:
(a) The NPS recommend question
(b) Why?
(c) What would you improve?
(2) Analyse and categorise what the customer is saying in response to (b) and (c), and let the customers' own words create the categories; do not impose your own beliefs or thoughts, as that affects the scientific outcome.
(3) Do not repeatedly ask demographic and firmographic questions to which you already know the answer!
I did make the point very clear that if you are using these data to compare against the competition, it would skew the data set in their favour. The person interviewing me did laugh at that... so perhaps the people on the ground are aware! Wink.
Thanks for reading. Thoughts? Comments?