Your engagement survey could be fake news

Beware the ‘fake news’ that can lurk in engagement survey participation and results

In some firms engagement surveys are now sent round with great regularity. But frequency isn’t necessarily a guarantee of success. Establishing a well-rounded, well-responded-to survey, where the data gathered gives a true picture of workforce engagement, is more of a minefield than many might imagine. ‘Fake news’, it seems, isn’t reserved for social media platforms and political campaigns. Rather, there are plenty of headlines HR might attach to survey results that bear little relation to the truth.

So what are the most common hoaxes making their way into engagement survey news? Here are the fake headlines to spot and how to avoid falling into the trap of taking them at face value.

Read all about it: The results are gospel

It’s one of the biggest paradoxes of engagement surveys: organisations crave the feedback, yet many agree that taking the results too literally can be dangerous. “Organisations must view any results with a pinch of salt,” argues Mark Winter, consultant at employee engagement firm Rambutan and former head of internal communication at Tesco. “Surveys often attract those with the most polarised views – good or bad – so making conclusions based on these people alone probably misses the picture.

“At Tesco you’d hear stories of managers not letting people go home ‘til they’d filled their surveys in, giving rise to greater doubt about the worthiness of the information,” he adds.

“Surveys can literally be a measure of buy-in at that precise moment in time, so they can be affected if people feel swamped with work or if they’re having personal problems,” says Michael Silverman, former head of employee insight at Unilever and founder of real-time survey organisation Crowdoscope. “To draw out a whole strategy for the year ahead based on this would be risky.”

Ashridge Business School research supports this thinking. Senior faculty member Amy Armstrong explains that her team set out to determine the differences between engaged and disengaged workers, using surveys to identify those they thought were examples of each. “But some 14% of the so-called engaged group were only ‘content’ while 29% were what we call ‘pseudo-engaged’: they gave a pretence of being engaged. We later found engagement results fluctuated daily,” she says. “We’d always take the results of surveys circumspectly.”

Read all about it: Questions must be comparative and quantitative

There’s a saying among data analysts that if you ‘put s**t in you’ll get s**t out’, and engagement surveys are no exception.

But it seems poor survey design is a trap many HRDs fall into. “Often there’s huge pressure to keep the questions the same survey after survey because that’s how improvements can be seen,” says David Godden, VP of sales and marketing at engagement survey provider Thymometrics. “But if the business strategy has changed it’s essential new questions are added or old ones are dropped, keeping core ones to a minimum.”

Then there’s the issue of bad question-writing in the first place. One school of thought encourages adopting vague psychometrics-based questions – those that get at deeper-held, truer ‘values’ by requiring more thought to answer, and which are less vulnerable to people clicking through quickly once bored.

But others disagree. “If a question’s ambiguous people will give an ambiguous answer,” says Godden. “The worst examples are questions that ask staff to rate things one to 10 when free text is more appropriate – rating how well you get on with your boss is an emotion hard to quantify like this. Better to record the emotion itself rather than an attempt to gauge it.”

Linked to this is survey length. Charmi Patel, associate professor of international HRM at Henley Business School, recommends short surveys because “people have a cognitive span of 10 minutes at most”. Others say they should be shorter still. “Ideally your survey should take less than five minutes to complete,” says Will Craig, managing director of car leasing comparison site LeaseFetcher. “This means HRDs should be looking to include around 10 short questions only.”

Read all about it: Surveys should always be mandated or incentivised

In the chase to garner high response rates all sorts of tactics are applied, from offering incentives to all but compelling people to submit the survey, none of which speak to the central aims of running the survey in the first place: encouraging two-way communication and bolstering employee voice.

“The biggest pitfall is managers using surveys to promote themselves; i.e. that they’ve achieved 95% engagement levels and ‘look here’s the proof’,” says Ricky Martin, founder of Hyper Recruitment Solutions. “This forgets the whole point of why the survey is happening, which is to reinforce the message that we’re listening to staff.”

For Jenny Perkins, head of engagement at leadership consultancy Cirrus, mandating or incentivising completion can give a false indication of engagement. “Sometimes people can feel quite cynical about surveys if they think action will not be taken,” she says. “The percentage of employees who respond to a survey is in itself an indication of how engaged your workforce is, so for this reason alone incentives are not a great idea.”

Communicating the purpose of the survey is critical though. As Silverman suggests: “Create some buzz, a reason why it’s being done. Have a three- or four-week countdown with reminders set into a communication plan. This way you’ll have a better chance of reaching the middle third that never respond, and you might even reduce the suspicion that surveys are simply the questions management want to ask.”

Timing is also crucial, adds Martin Blackburn, UK people director at KPMG UK. “Because our annual survey falls around the same time as our salary and bonus review we can sometimes get distorted results around reward,” he concedes. “It’s why we also have quarterly pulse surveys, asking some of the same questions at different times of the year, to give a more meaningful view.”
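
As a rough illustration of the wave-on-wave comparison Blackburn describes, the sketch below compares mean scores for the same pulse question across quarters to flag waves that may be distorted by timing. It assumes Python with pandas; the data, column names and threshold are invented for this example and are not KPMG’s actual approach.

```python
# Minimal sketch: comparing mean scores for the same pulse question across
# survey waves to spot timing effects. Data and column names are invented.
import pandas as pd

responses = pd.DataFrame(
    {
        "wave": ["Q1", "Q1", "Q2", "Q2", "Q3", "Q3", "Q4", "Q4"],
        "question": ["reward_fairness"] * 8,
        "score": [3.1, 3.4, 3.8, 3.9, 3.7, 3.6, 2.9, 3.0],  # 1-5 scale
    }
)

# Mean score per wave for this question
by_wave = responses.groupby("wave")["score"].mean()
print(by_wave)

# Flag waves sitting well below the yearly average for the same question,
# e.g. a dip in the wave that coincides with the salary and bonus review.
dips = by_wave[by_wave < by_wave.mean() - 0.3]
print("Waves to sanity-check against the survey calendar:", list(dips.index))
```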

Read all about it: It’s all bad... or it’s all good

Defaulting only to repetitive or one-to-10 rating questions is a sign of one of the most serious pitfalls: not investing in interpretation. “Having a mixture of yes/no questions, scales of strength and free text is the marker of a good survey, but most fail to have this spread because it makes analysis harder,” says Silverman. “The problem is most HR professionals aren’t experts in interpreting data. Yet sentiment analysis, collective intelligence research, social network analysis – they’re all vital to determine human emotion and engagement scores.”
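
As an illustration of the kind of interpretation Silverman is describing, here is a minimal sketch of sentiment scoring for free-text comments. It assumes Python with NLTK’s VADER analyser; the comments and the cut-offs used to label responses are invented for this example, not taken from any of the tools or firms mentioned above.

```python
# Minimal sketch: scoring free-text survey comments with NLTK's VADER
# sentiment analyser. Comments and thresholds are invented for illustration.
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-off lexicon download

comments = [
    "My manager genuinely listens and acts on feedback.",
    "Workload is unmanageable and nobody seems to care.",
    "It's fine, I suppose.",
]

analyser = SentimentIntensityAnalyzer()
for text in comments:
    # Compound score runs from -1 (most negative) to +1 (most positive)
    score = analyser.polarity_scores(text)["compound"]
    label = "negative" if score < -0.05 else "positive" if score > 0.05 else "neutral"
    print(f"{score:+.2f}  {label:8s}  {text}")
```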

The biggest trap when analysing results is thinking that everything is a negative, says Sarah Robson, senior communications consultant at Aon. “People tend to only write comments if something bad has happened or if they’re dissatisfied with the service. HR departments will need to consider how to rank those who have not left comments, as these people are most likely satisfied and this could outweigh those who are not.”

At the same time there’s a tendency to focus only on the good news, warns Alissa Burn, internal communications consultant at Spottydog Communications. “Proper analysis involves getting a realistic view of what’s happening in the business and the information needed to course-correct it,” she advises. One solution KPMG adopts is to split analysis into engagement teams spread throughout the business, explains Blackburn: “They bring people together from all levels of seniority and they’re entrusted with the survey results for their area and empowered to make change”. This, he argues, “makes the experience around the results more personal and meaningful”.