The annual staff survey is a commonplace event in today’s corporations, and the format – dozens of multiple-choice questions – is more or less universal. Go to almost any third-party organisation that might run your survey for you, and you will be offered much the same feature set. Indeed, these offerings are so similar that differentiating between survey providers is by no means an easy task for a company looking to implement a survey. You might reasonably conclude that an annual multiple-choice questionnaire is the perfect solution, because that is the format the entire survey industry has converged on, and if everybody does it, it must be the best way. Furthermore, the fact that employers continue to use these surveys suggests that they too are happy with the format, and believe it continues to meet their needs.
However, it seems that few of these employers ever ask their staff whether this survey format meets their needs. If they did, they might well form a substantially different view of its fitness for purpose. This blog therefore looks at the traditional survey from the employee’s point of view, as a starting point for exploring alternatives.
When it comes to participation, the traditional survey format suffers from one major problem: it’s mind-numbingly boring. Typically consisting of 60, 80 or even more multiple-choice questions, and taking anything up to 30 minutes to complete, the sheer effort required from uninterested employees inevitably affects the quality of the responses. More dedicated employees will, of course, complete the survey conscientiously and accurately, but not all will; as boredom rapidly sets in, attention levels drop to the point where filling the survey in randomly (or even perversely) becomes the preferred option, just to get to the end as fast as possible so that the torture stops. Why should the employee care? There’s very little in it for them in exchange for their time, so it should not be surprising that these surveys are treated with anything from begrudging acquiescence to outright contempt. The coercion that has to be applied to get a reasonable response rate is further evidence that only a proportion of employees take part willingly.
The questions themselves also don’t help the situation. Who hasn’t wondered whether the same questions were asked last year, and can anybody honestly remember what they gave as a reply? Many questions also seem to be disturbingly similar to each other; the impression this gives is that this is a trap, and that the company is trying to catch employees out. There are apparently good methodological reasons for this, but nevertheless it simply makes employees suspicious, and that’s not good.
Finally, the questions are asked by the company to get answers for the company about the things the company cares about; sometimes there is a nod to things which the employee might care about, but it’s the exception rather than the rule. The opportunity for employees to freely express how they feel about their jobs is severely curtailed by the format, and frequently limited to a single free-form comment. It’s abundantly clear that these surveys are not organised for the benefit of the employee, and it shows. This is never going to help participation rates.
A once-a-year survey has to be held at some point during the year (or rather, during some period, given how long it takes for employees to respond). So when should it be done? What we find is that the date is usually determined by a desire to get the highest response levels. So we have to avoid all the holiday periods (typically Christmas, Easter and the whole summer), and all those times when everybody is busier than usual (year ends, appraisal times, product releases) … so often it’s scheduled for October/November, which is when other stuff mostly doesn’t happen. There’s nothing necessarily wrong with this, but it’s pretty arbitrary, and the employee may well have a different view.
What would employees prefer: to be able to give their opinions only once a year, at a time determined by the company, or whenever they have something important to say? Employees with a problem in January may have to wait until November – nearly a year – before this problem can be communicated via the survey … assuming they are still with the company by then. And views that change during the year can never be communicated, because the survey isn’t there at any other time, so this useful information is never collected.
The once-a-year format is a major obstacle to finding out what’s going on, and any responses rapidly become stale and increasingly unreliable as employees join and leave in the months following the survey, and as opinions change. The format does employees a disservice, and is no longer fit for purpose (not that it ever was). By way of analogy, there are very good reasons why hospitals check the vital signs of their patients several times a day rather than once a week … and the same logic applies here.
It is probably true that many employees are not overly bothered by the typical Likert* multiple-choice format of the traditional survey. And yet, for others, it can reduce them to incandescent rage. Many surveys are designed not to give their respondents a neutral option; but people do have neutral opinions, and some will feel outraged that they are forced to get off the fence and submit a response that doesn’t reflect their views. The same applies when an employee genuinely feels that neither the “agree” nor the “strongly agree” option is appropriate. There are never enough options to cope with the full range of views that employees want to express.
Surveys which insist that every question is answered are also a source of irritation. What if the employee wants to provide views on some subjects but not all? They can’t. Is this reasonable? Of course not; it merely leads to more frustration and less respect for the survey (and those responsible for it). This is possibly not what the organisers had in mind, but they are nevertheless the architects of their own misfortune and of others’ fury.
And what happens to the results? The “agree” and “strongly agree” views often then get added together into a single category, which makes one wonder what the point was of having two categories in the first place. It’s all very unsatisfactory from an employee perspective.
In a similar vein, many surveys now offer the employee the chance to provide one or more free text comments (but typically only once they’ve waded through the rest of the survey … some won’t get that far). Surveys which don’t provide a comment facility restrict the expression of the employee view only to those subjects on which questions are asked, effectively preventing employees from mentioning anything else that might matter to them; the message this sends is “we don’t care what matters to you, we’re only interested in what matters to us”.
Sometimes the amount of text that can be entered is limited (usually due to the survey provider’s limitations on database field sizes); but what if the employee needs to exceed this limit? If unlimited text is allowed, employees still have to be very careful. If there’s one comment box, the employee could easily reveal their identity through the combination of things they want to say. Even if multiple boxes are allowed, the employee has no guarantee that these won’t be presented to the company as being provided by the same (anonymous) author. None of this is helpful in obtaining the employee’s true thoughts.
Many surveys are genuinely anonymous, as they should be. Sadly, this level of professionalism has not reached every organisation, and with some surveys it is transparently obvious to employees that the employer could, if they chose to, work out who thinks what by cross-referencing the details employees are asked to provide (e.g. location, seniority and function). If these details are hidden from the employee (through their behind-the-scenes provision by the employer to the survey company), there is always a lurking suspicion that surveys are never quite as anonymous as the company would have everyone believe. Whether or not this is true, it’s the perception that is the issue, because it is the perception (rightly or wrongly) that leads to a lack of trust. Lose enough trust, and the employee will simply not participate. It’s sad that people think this way, but they do; and the worse the general relations are between employees and employer, and hence the more desperately the employer needs the employees’ views, the less likely they are to get them. This cynical cycle of suspicion has to be broken.
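To see why cross-referencing is such a realistic worry, here is a minimal sketch – using entirely made-up respondent data, not any real survey provider’s format – of how just three demographic fields can single people out: any combination of details held by exactly one respondent is, in effect, a name.

```python
from collections import Counter

# Hypothetical respondents: the demographic details each is asked to
# provide alongside their "anonymous" answers (invented for illustration).
respondents = [
    ("London", "Senior", "Finance"),
    ("London", "Junior", "Finance"),
    ("London", "Junior", "Finance"),
    ("Leeds",  "Senior", "Engineering"),
    ("Leeds",  "Junior", "Engineering"),
    ("Leeds",  "Junior", "Engineering"),
]

# Count how many respondents share each combination of details.
group_sizes = Counter(respondents)

# A combination shared by only one person uniquely identifies them.
identifiable = [combo for combo, size in group_sizes.items() if size == 1]
print(identifiable)
# → [('London', 'Senior', 'Finance'), ('Leeds', 'Senior', 'Engineering')]
```

Even in this tiny example, two of the six respondents are uniquely identifiable from their details alone; in a real company, the only senior finance person in a small office has no anonymity at all, whatever the survey claims.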
Clearly not all employees are going to feel that all the issues listed above apply to them. But it would be a rare employee that had absolutely no issues with the format, and every issue, every doubt, and every dissatisfaction is going to reduce the response rate, reduce the authenticity of the responses, and reduce the accuracy of the survey as a whole. None of this is desirable, yet there has been little innovation from survey providers to reduce these problems, and little demand from employers for better solutions. It’s hard to say why, but it can’t go on like this; employees deserve better. The bottom line is that, no matter how much you’re paying for it, your survey probably isn’t giving you the quality of results you think it is.
It would be irresponsible of us to complain so much without actually offering some helpful suggestions. So part 2 of this blog will take a close look at how all these problems can be solved – because they can, indeed, be solved.
*Likert Scale http://en.wikipedia.org/wiki/Likert_scale