Putting people on the analytics map, part one

People analytics has been around for decades, but HR has yet to fully tap into its bounty. In this month's cover story, Beau Jackson finds out what’s holding them back.

For years, HR has been sailing on a sea of data – engagement, skills, pay, demography – all of it invaluable information about a business's biggest asset: its people.

Lying deep at the bottom of this statistical sea, we’re promised, is a treasure trove of insights that could help the people profession steer their organisations towards better, more inclusive cultures, complete with predictions for potential obstacles ahead. Yet the map to these riches still proves elusive.

A 2018 report from Workday and the CIPD found that almost two-thirds of companies with a strong people analytics culture said business performance is strong compared with the competition.

Only half of HR professionals, however, said they use people data to address business problems, and roughly the same proportion said they have the confidence in their numerical and statistical skills to undertake data analysis.

Still, the global feeling is that people analytics is being under-utilised. Operational and descriptive data is what HR has been using for reporting for years. Diagnostic, predictive and prescriptive analytics are the next stages in this journey, communicating respectively the reasons behind change, what will happen next and what to do about it.

As the profession strives to be more evidence-based, moving away from biased and unreliable decisions, using data is inevitable. Put simply, Neil Morrison, group human resources director at Severn Trent, argues: “I don’t think you can do our jobs without good data.”

Data is one of the four types of evidence recognised in evidence-based practice by the CIPD. Using it as a basis for practice, therefore, is crucial, Morrison adds. 

“Being able to base our discussions or interventions on evidence is really critical,” he says. “And it will be more and more so as we progress as a profession.”

According to the CIPD, over three-quarters of practitioners globally already have access to data they can use for decision making. 

The quality of data for key areas of risk – like talent management, diversity and equality, and health and safety – is also rated highly. 

So, what is it exactly that’s preventing professionals from turning the data they have at their fingertips into strategic gold?



Don’t drown in the data

CIPD research found that the most common reason HR professionals aren't making the most of people analytics is that they don't know where to start.

Many of the conversations Melissa Paris, lead people scientist at CultureAmp in EMEA, and her team are having with clients centre on how to use people data to tell a story and drive change.

She says: “Lots of organisations are at different stages with people analytics, but I think it would be safe to say that where we are today, it’s still generally talked down.”

Currently, she says, too many organisations focus only on reading the data, yet the real value of people analytics lies in being able to use it to support decision making.

“When we talk about people analytics, it’s so focused on the actual analytics piece of collecting and understanding the data – it is very much like ‘this is what the data says, these are some insights’,” she says.

Gemma Kelly, head of people analytics at the Office for National Statistics (ONS), recognises this in a lot of organisations.

“There are so many buzzwords about people analytics, it’s a little bit overwhelming, and people probably feel there’s all this stuff they should be doing, and they don’t know how to just get the basics right,” she says.

“It’s almost a ‘I can’t do that, so I won’t do anything’ type approach.”

Having data stored in silos feeds this inertia, says Kelly. In an ideal world, you would have one whole system to bring it all together, but realistically she says it might be more practical to have a team member who understands how to collate it instead.

“It’s easy to say we can’t get all this data in one place so we’re not going to do anything with it, whereas actually, you don’t need to, you just need someone to help you bring that together so that you can look at the data and interpret it,” she says.

To get started with people analytics, the main thing HR practitioners need is a question, or a hypothesis, worth investigating. The idea is then to test it using people data.

Holding on to this question and returning to it throughout the exercise will prevent teams from drowning in the data. 

ONS director of people and business services UKSA and HR director for the Government Analysis Function, Philippa Bonay, explains: “It’s using your judgement to link it back to the strategy of the organisation. It’s asking ‘What’s the bigger picture? What’s my strategic objective? How should I care about this? And what therein should I really care about?’”

In its guide on getting started with people analytics, the CIPD identifies two types of question – non-effect questions and effect questions. 

Non-effect questions are the kind that would typically be asked for reporting purposes – for example, “What is the sickness rate for apprentices?” 

Effect questions, on the other hand, probe cause and effect, with a clear link to business aims like improving retention, for example: “Does our learning and development programme reduce employees’ intention to leave?”

Severn Trent is using people analytics to better understand risk associated with an ageing workforce and skills gaps. Morrison says: “You should always start off with ‘what is the problem I’m trying to solve?’ or ‘what’s the hypothesis?’ and then you test this with the data.”

Once you have the business question, the next step is making sure the data collected is fit to answer it. For this reason, Severn Trent has spent years improving the quality of its core data. “We’ve understood that without that, any level of analytics isn’t going to be particularly helpful, because rubbish in, rubbish out,” says Morrison.

He defines good data as accurate, relevant and understandable. For accuracy, employees are encouraged to access and verify their own data. Regular updates – adding newly acquired skills, for example – also help keep the data relevant.

On relevance, the team at Severn Trent are working on building demographic data so they can make moves on diversity and inclusion.

“We’re doing a lot around the inclusion piece and that is really about quality of data,” Morrison adds. “So many organisations make statements about diversity and inclusion, but they haven’t got a clue about what they’ve got in their organisation. How they’re supposed to improve diversity when they don’t even know where they’re starting is pretty shocking.”

The final part – making sure data is understandable – involves combining multiple datasets to paint the bigger picture of the problem you’re trying to solve.

Morrison adds: “That’s when stuff like regression analysis is really helpful because it starts to use statistical tools to really understand exactly what’s going on between two sets of data.”

As an example, Morrison describes how in his previous role at Penguin Random House, the team were able to pinpoint that someone had a higher risk of leaving if they had not been promoted within three years of joining.

This prediction was made by combining data on career progression with turnover – two sources that would otherwise have been stored separately.
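The kind of analysis Morrison describes can be sketched in a few lines of Python. Everything below – the field names and the records – is invented for illustration; the point is simply that joining promotion history with leaver records lets you compare attrition rates between the two groups.

```python
# Hypothetical sketch: compare attrition between employees promoted within
# three years of joining and those who were not. All data is invented.

def attrition_by_promotion(employees):
    """employees: dicts with 'promoted_within_3y' (bool) and 'left' (bool)."""
    groups = {True: [0, 0], False: [0, 0]}  # promoted? -> [leavers, total]
    for e in employees:
        g = groups[e["promoted_within_3y"]]
        g[0] += e["left"]   # count leavers
        g[1] += 1           # count group size
    return {k: leavers / total for k, (leavers, total) in groups.items() if total}

staff = [
    {"promoted_within_3y": True,  "left": False},
    {"promoted_within_3y": True,  "left": False},
    {"promoted_within_3y": True,  "left": True},
    {"promoted_within_3y": False, "left": True},
    {"promoted_within_3y": False, "left": True},
    {"promoted_within_3y": False, "left": False},
]
rates = attrition_by_promotion(staff)
# A higher rate for the not-promoted group would support the hypothesis
# that employees not promoted within three years are at greater risk of leaving.
```

A fuller version of this analysis would use regression, as Morrison suggests, to control for other factors such as tenure or function; the cross-tabulation above is just the first cut.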

 

When do you stop digging?

With the right data and a valuable question in hand, the next thing HR needs to drive value with people analytics is to test and validate the hypothesis. Missing this step, according to Paris, is one of the most common pitfalls.

“I think the common misconception is that it doesn’t lead to action, which I think is a symptom of the way that we’re using people analytics today,” she says.

If companies are too focused on understanding the data and the different things analysis can tell you, they can forget to do anything with the results to test possible solutions, Paris argues. This can lead companies to fall into analysis paralysis. 

She says: “You could easily get sucked into analysing data until there’s no holes in it, but that stops you from taking action.” 

But can you act too quickly on the results? Paris argues that once you have collected enough information, action is preferable. 

She says: “It’s better to do something with the results even if it’s not the most pristine perfect piece of analytics, then you can just test to see whether what you’ve done is working.”

A test-and-validate mindset may be newer territory for HR. “In the product world, we have things like A/B testing and in marketing, we have a good method of trialling and failing, but I don’t feel that’s crossed over into HR yet,” says Paris. “You feel like there’s so much pressure for us to get it right the first time.”
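The A/B comparison Paris mentions can be run with a standard two-proportion z-test. As a sketch only – the variants, sample sizes and retention counts below are invented – this compares, say, one-year retention between two onboarding approaches:

```python
from math import sqrt, erf

def two_proportion_test(x_a, n_a, x_b, n_b):
    """Two-sided z-test comparing the success rates of variants A and B."""
    p_a, p_b = x_a / n_a, x_b / n_b
    p_pool = (x_a + x_b) / (n_a + n_b)                     # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_a - p_b) / se
    # Normal approximation to the two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Invented numbers: 120 of 200 employees on variant A stayed a year,
# versus 90 of 200 on variant B
z, p = two_proportion_test(120, 200, 90, 200)
significant = p < 0.05  # enough evidence the variants really differ
```

If `significant` comes back false, that is the cue Paris describes to adapt or drop the trial rather than treat the first result as final.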


"Analytics should build your confidence in the action you're going to take"


Developing this process of experimentation would, Paris says, help HR to be more creative in its approach and bring value to business strategy.

“We see a lot of companies investing so much time and effort into showing impact – for example, what impact does engagement have on performance, what’s the highest driver of retention at our company. That’s great, but you’ve skipped all the steps of doing something with the data,” she says.

Instead of rushing ahead to the results, Paris encourages teams to test out potential solutions so they get a better idea of why something might be having an impact, and what they can do to improve it.

“We should be more open to experimenting and then dropping it if it doesn’t work or just adapting based on future information, because then you can move a little bit quicker and get people comfortable with iterating as you go,” she says.

“The whole point of analytics is to build your confidence that the action you’re going to take is going to be the right one.”

To get better at experimentation, and knowing when to stop digging into the data, Paris advises being clear with aims and intentions. She adds: “If you still have questions – e.g. I have data that says that career progression is important, but I don’t know what would help – maybe you do need to unpack that a little bit and ask a couple more questions or speak to some people to build your confidence that this is the right area.”

Similarly, Sarah Andresen, principal people scientist at Sage AI Labs, says a good plan can help with any reluctance to experiment.

“Often, we don’t feel like we can experiment with things or test hypotheses because we’re not doing it in a deliberate manner. Make it really clear that you’re following this experimental process, and it’s going to be data driven,” she says.

With its hypotheses and experimentation, it can be easy to get caught up in the scientific language surrounding people analytics. Yet Bonay reinforces that the process is simply an extension of what HR does every day without noticing it.

“I tell HR professionals we are analysts; we do that all the time. We’re dealing with people, making decisions and analysing what people are telling us to give them advice,” she says.

“You don’t have to have done data analytics or a finance degree or something like that – it’s something that we’re doing all the time, but we probably don’t recognise it as such.”

Where practitioners sometimes have trouble, she thinks, is connecting the dots between hard data and their intuition – as one may challenge the other.

“I think possibly quite a lot of HR people will go ‘Oh here’s a hard fact’ and they’ll get a little bit nervous,” says Bonay. “They won’t necessarily go ‘I can draw insights from this’ or ‘how do I bring this into the mix?’” 

Morrison suggests that the root of this problem is ingrained in HR’s practice. “Historically, a lot of HR has been based around either belief or looking externally and saying, everybody’s doing this, so we should be doing this too,” he says.

Although this intuition and knowledge of best practice helps in strategic decision-making, HR needs data too, he argues, to back it up.

He says: “We need to get more evidence-based in terms of understanding why we make decisions, what decisions need to be made, what interventions need to be made, and using the data to help us be more forward-thinking.”

 


Digging deeper into your gender pay gap 

Charles Cotton, senior performance and reward adviser at the CIPD, gives some example analytics questions and metrics that could be used to dig deeper into an organisation’s gender pay gap reporting

Question: Is there a problem in attracting female or male candidates?

Suggested metrics: Number of job applicants by gender, job offers by gender, job acceptances by gender or internal promotions by gender.

Question: Is there a problem in developing male and female performance?

Suggested metrics: Training days/courses by gender or training assessment by gender.

Question: Is there a problem in how we pay our employees?

Suggested metrics: Job evaluation scores by gender, pay/bonus award outcomes, pay band progression speed by gender, employee feedback for pay processes and outcomes.

Question: Is there a problem with the benefits we offer our employees?

Suggested metrics: Pension contribution rates by gender, take up of benefits by gender, take up of flexible working by gender, number of benefit choices by gender, and proportion of males and females in the company share plan.
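The metrics above feed into headline figures such as the median pay gap itself. As a minimal sketch – with invented hourly rates – the UK reporting convention expresses the gap as the difference between median male and median female hourly pay, as a percentage of median male hourly pay:

```python
from statistics import median

def median_pay_gap(male_hourly, female_hourly):
    """Median gender pay gap, as a percentage of median male hourly pay
    (the UK gender pay gap reporting convention)."""
    m, f = median(male_hourly), median(female_hourly)
    return (m - f) / m * 100

# Invented hourly rates for illustration
men   = [14.0, 16.0, 18.0, 22.0, 30.0]   # median 18.0
women = [13.0, 15.0, 16.2, 20.0, 28.0]   # median 16.2
gap = median_pay_gap(men, women)          # 10.0 (per cent)
```

The headline number on its own says little; Cotton’s diagnostic questions are what turn it into something actionable.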


 

GDPR-iceberg ahead

The Cambridge Analytica scandal, the introduction of the General Data Protection Regulation (GDPR) and increased media attention on how companies use and sell data have made some organisations cautious about the information they hold on employees – and, likewise, made employees cautious about what they are willing to share.

When allaying these concerns, Morrison at Severn Trent says HR should be careful in drawing parallels between what data people share with their employer and what they share on social media.

“People have a different expectation of their relationship with their employer,” he says.

“I don’t think it’s an easy parallel to draw and we’ve just got to be careful about the trust boundaries. That’s where the transparency of the data is really important, so people can see what’s held, and why it’s held, to build trust about how it’s used.”




Anonymising data is one way HR can provide assurance while making full use of the results. But it’s not without its challenges, for businesses large and small. Adam Penman, labour and employment associate at McGuireWoods, says: “You can only retain data for as long as it is lawful, but you can anonymise it and then keep it forever. The problem with that is that if you have over 250 employees, to anonymise the data you would have to invest heavily in technology and professional advice in order to analyse that data.”

On the other end of the scale, in businesses of 250 or fewer employees, minority populations can stick out in the data, therefore compromising their anonymity, he says. “It’s going to be difficult to anonymise that data,” he says, “But it’s not impossible.”
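One practical way to see the small-population problem Penman describes is a k-anonymity-style check: count how many people share each combination of potentially identifying attributes and flag any group below a threshold. The attributes, records and threshold below are all invented for illustration:

```python
from collections import Counter

def small_groups(records, quasi_identifiers, k=5):
    """Flag attribute combinations shared by fewer than k people –
    groups whose members could be re-identified in 'anonymised' output."""
    counts = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return {combo: n for combo, n in counts.items() if n < k}

# Invented workforce: one department/gender combination appears only once
staff = (
    [{"dept": "Ops", "gender": "F"}] * 6
    + [{"dept": "Ops", "gender": "M"}] * 7
    + [{"dept": "Finance", "gender": "F"}] * 1
)
risky = small_groups(staff, ["dept", "gender"], k=5)
# The lone record sticks out; aggregating or suppressing that cell
# would be needed before publishing results
```

Checks like this are a first screen, not a guarantee of anonymity; combinations with other published data can still re-identify people.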

As with any method, there are limits to data. It can’t tell you everything, and it can sometimes be misleading if you only focus on what you want to see. The saying that springs to mind for Alan Price, founder and CEO of BrightHR, who has seen plenty of scepticism about people analytics among SMEs, is: “There are three kinds of lies: lies, damned lies, and statistics.”

“I think that’s part of the challenge, that it’s so powerful as a tool, but it can be used wrongly if it’s not analysed correctly,” he says.

Best practice to avoid falling into this trap, Kelly says, is about presenting both sides of the story. “The data won’t always be all the answers,” she explains, “So we present it as a starting point to open the discussion.”

Referencing outliers and relevant context that might be skewing the data – for example, a spike in attrition where a project ended and everyone was going to leave – is critical. Kelly adds: “You’re calling out why there might be challenges, but then you have to present it and give people the opportunity to digest it because you’re not acting on those decisions in the right way if you’re only looking at the favourable information.”

HR should also avoid jumping to conclusions, she says: “From a data perspective, as long as you’ve tested it, rationalised it, you can share that data for discussion.

“I suppose a lot of people will have a knee-jerk reaction, so that’s about how you then act on that data and thinking from an HR professional perspective, what’s the best way to tackle that issue.”

Bonay adds: “If you think it could be misinterpreted, it’s about understanding when you’re going to introduce that to a wider audience. That’s about knowing your organisation and how people in your organisation react.”

 

This is part one of an article that appears in the July/August 2021 print issue. Check out part two here, and subscribe today to have all our latest articles delivered right to your desk.