It sounds like a dystopian sci-fi novel: employees fitted with headsets that monitor their brainwaves and emotions, badges that give feedback on how often and in what way they interact with colleagues, and sensors in their clothes that file real-time reports on how fast they are working. Increasingly, though, this isn’t fiction.
Welcome to the future of work, where the possibility exists to monitor and report on almost everything your workforce does, and you can use that data to maximise efficiency, increase productivity, or encourage greater collaboration.
What HR professionals need to address, however, is what is desirable and ethical when it comes to collecting and using employee information in a data-rich world. It’s a question not many seem to have engaged with fully yet, but it’s one HR leaders need to get their heads around sooner rather than later. “There is an ethical dimension to this that has to be sorted out,” believes Michael Meyer, professor of philosophy at Santa Clara University. “We are at the front end of new territory, but the risks are genuine.”
“The agenda has been left to technologists and computer scientists when what’s really needed are people who are experts in human behaviour,” feels Colin Strong, global head of behavioural science at Ipsos. “Economists think of work in terms of the market dimension, but there’s a social and psychological dimension and HR people are more sensitive to the social bits,” adds Meyer. “I may be preaching to the choir here, but the choir might not know its role in the new workplace.”
This new workplace territory is characterised by immense and swift technological change. But while the tools exist, our understanding of their implications remains shaky. “Innovation in the world of technology moves at a very rapid pace, but people’s understanding of how to get value from it doesn’t keep pace,” says Anthony Bruce, a partner in the UK HR consulting practice at PwC. “Right now we have solutions looking for problems; hammers looking for nails. The supply side hasn’t been clear enough about what the value proposition is.”
The “hammers” available include wearables like fitness trackers, telematics and sociometric badges (devices capable of measuring face-to-face interaction, conversation time and the wearer’s physical proximity to other people), computer monitoring software and ‘sentiment analysis’ tools, which scan unstructured data (for example email text) to build up a picture of someone’s personality or feelings about a particular subject. Many of them suggest Taylorism’s scientific management taken to its logical, if uneasy, extreme, and they are beginning to affect white-collar workers rather than just blue-collar ones. And they are none too popular in the mainstream press. Recent headlines about such devices include ‘Spy in the office’, ‘Bosses track you night and day with wearable gadgets’ and ‘They call it fun but digital giants are turning workers into robots’.
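To make the ‘sentiment analysis’ idea concrete, here is a minimal lexicon-based sketch in Python. The word lists, the `sentiment_score` function and the scoring rule are all illustrative assumptions, not any vendor’s actual method; commercial tools use far richer language models.

```python
# Toy lexicon-based sentiment scorer, illustrating the general idea
# behind 'sentiment analysis' tools. The word lists and scoring rule
# are hypothetical; real products use far more sophisticated models.

POSITIVE = {"great", "thanks", "happy", "excellent", "agree"}
NEGATIVE = {"frustrated", "problem", "unhappy", "blocked", "disagree"}

def sentiment_score(text: str) -> float:
    """Return a score in [-1, 1]: below zero leans negative, above zero positive."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

print(sentiment_score("Thanks, happy to help"))       # 1.0
print(sentiment_score("I am frustrated and blocked")) # -1.0
```

A real system would also have to handle negation, sarcasm and context, which is exactly where such tools can mislead if their output is taken at face value.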
A productivity misfire
“Everyone has a sense of excitement about what’s possible through the use of data, and it is creating exciting opportunities for us to get new forms of insight into human behaviour,” says Strong. “Being able to understand more is an exciting possibility, but we are still at the point of people thinking it’s a good idea but not being sure what to do with it. We can’t just be collecting data for the sake of it.”
This uncertainty can be dangerous if organisations jump into using tools whose potential impact they don’t grasp. Jack Aiello, professor of psychology at Rutgers University, has carried out extensive research on the regulation and control of social interaction. “When I was consulting back in the 80s one of the things that started to show itself clearly was that the psychological contract was vulnerable to changes in technology,” he recalls. “We have seen the darker side of that occur as often as the productivity gains.”
Surveillance, Aiello has found, can often have the opposite effect to the one managers hope for. While most of the tools are intended to increase productivity, they can achieve the reverse. “People feel defensive, they aren’t as confident, and that can create all kinds of negative impacts,” he explains. Knowing one is being observed, and judged or ranked on a second-by-second basis, can also lead to people gaming the system. Have you ever been at home waiting for a parcel only to find a note saying: ‘Sorry we tried, but you weren’t in’? You’re not going mad; it’s more likely the delivery person decided to shave a few seconds off their time by not attempting a delivery in the first place. “The ripple effect that’s created to beat the system can be a downward cycle,” Aiello warns.
“Even though some people say we’ve moved to a post-privacy era with social media, privacy still matters to people as a way of shaping their lives,” believes Meyer. “If it became a condition of the workplace that you had to bare your soul for the number of hours you worked that would be pretty terrible.”
However, those creating and using these tools emphasise that the focus should be on individual employees understanding, and therefore improving, themselves and by extension the performance of the wider organisation.
“Technology enables humans to focus on what they are good at,” says Ben Waber, CEO of sociometric badge provider Humanyze. “The badge we use provides data for individuals allowing them to compare themselves to top performers in their role. It provides quantitative objective feedback. It allows managers to look at how teams are spending their time. Assuming we have the data protection in place, it’s about whether you want to learn more about improving yourself and the organisation.” He adds that in Japan the tool is being used to tell managers when people are overworking, allowing for interventions if necessary.
Diana Barea, who leads Accenture’s UK and Ireland talent and organisation practice, says there is a distinction to be made between who or what is looking at the data. “It’s not about your boss looking at you, it’s about a robot looking at you,” she says. “It’s about being enabled rather than being monitored.” The question becomes one of intent (what will the data be used for?) and transparency (do employees know why these tools are being used?). She gives examples of organisations utilising insight from data to “get ahead” of potential problems, such as financial services firms fitting traders with devices that alert managers if someone is more likely to make a risky decision, signalled by raised blood pressure for example, which prompts them to take the individual off the trading floor until the risk has passed. “They are using it to enable people to work more effectively,” Barea adds.
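The trading-floor example Barea describes is essentially a rule-based alert. A sketch of how such a rule might look, where the 150 mmHg threshold, the three-reading window and the function shape are hypothetical assumptions rather than any firm’s actual system:

```python
# Sketch of a rule-based alert like the trading-floor example above.
# The threshold, window size and reading format are hypothetical.

RISK_THRESHOLD_SYSTOLIC = 150  # illustrative cut-off, in mmHg

def should_alert(readings: list, window: int = 3) -> bool:
    """Alert only if the last `window` readings all exceed the threshold,
    so a single noisy spike does not pull someone off the floor."""
    recent = readings[-window:]
    return len(recent) == window and all(
        r > RISK_THRESHOLD_SYSTOLIC for r in recent
    )

print(should_alert([120, 125, 160]))  # False: one spike is not enough
print(should_alert([155, 158, 162]))  # True: sustained elevation
```

Requiring a sustained pattern rather than a single reading is one way to keep such interventions proportionate, which matters given the intent-and-transparency questions raised above.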
On using health-related data, Chris Tomkins, head of proactive health at Axa PPP healthcare, says third parties can act as “neutral arbitrators” and ensure data is kept safe and used ethically. “With the best of intentions people can unwittingly stray, that’s why this separation between employer and employee is important,” he says. “In the health service space I cannot support an intervention unless I have the data. If we can’t see the data we can’t deliver the service.”
Avoiding a breach
Embracing any of these tools would lead to an organisation being even more awash with data than many already are. And where there are large volumes of data there is risk. That means at a compliance level HR is “absolutely critical” in helping companies to get ahead of the General Data Protection Regulation coming in 2018, says Ardi Kolah, who leads the training programme for data protection officers at Henley Business School. Organisations, he advises, need to be thinking about “reputation not regulation”. “This whole move in the way we process personal data has put a spotlight on how important HR is,” he says.
He advises HRDs to ask themselves two simple questions: “Do you know where your personal data is? And what is the oldest piece of personal data your organisation has? Because you probably shouldn’t have it.” According to Kolah about 60% of data-holding organisations possess data they shouldn’t, and “HR departments are among the worst offenders”.
This reputation point becomes increasingly important when, as Kolah puts it, “everything is transparent” and therefore at risk of being breached. Clive Humby, chief data scientist at Starcount and one of the people behind the data used by Tesco Clubcard, says bluntly that “no-one is immune from a breach”. Nick Taylor, UK and Ireland managing director of Accenture Strategy, is advising clients to approach the new regulation from the perspective of building and earning “digital trust”, rather than simply seeing it as compliance.
There is an issue, however, with legislation failing to keep up with technology. “The difficulty is that legislators are so behind the technology,” says Humby. “As a result they tend to be quite draconian, which isn’t necessarily a good thing. We can’t judge what smart technology can do.” The Internet of Things – smart objects sending data to each other (Waber suggests a coffee machine that moves into the middle of a group of co-workers to encourage them to interact more, should collaboration be lower than bosses would like) – raises more legal questions.
Cool or creepy?
It is somewhat ironic that the man behind the Tesco Clubcard – arguably the most influential example of customer data mining in recent history – is so concerned over employer data gathering going too far. But Humby is worried.
“I have concerns that as a society we don’t know what we are doing yet,” he says. While taking an interest in staff wellbeing can only be a positive, he believes wearable technology could have unintended consequences. “What are the consequences we haven’t thought through yet?” he asks. “Does the fact I use the gym mean I am more committed and therefore more likely to get a promotion?”
Accenture’s Taylor, who until recently lived in South Korea, recounts an anecdote about a friend in a Korean organisation being ticked off for not going to the gym in more than a week and gaining some weight. His wearable device informed his employer of these facts. “There is a broader question of ethics around the extent to which an employer knows about your life outside work,” feels Strong. “Is it right that your employer sets expectations about what is appropriate behaviour in terms of sleep or exercise?” He adds, though, that this is part of a wider conversation about “the data that’s held on individuals and what’s done with it” and that there are “broader conversations to be had about living in a data-intensive society”.
In many ways HR is wrestling with the same dilemmas as marketing. Strong uses the phrase “uncanny valley” to explain the phenomenon of something turning from cool to creepy. In aesthetics the uncanny valley describes the point at which a replica of a human, like a doll or robot, becomes too lifelike, leading to revulsion. In data science Strong explains it as such: “There’s a point with marketing personalisation at which people think: this is too much. They start peeling away and don’t want anything to do with it. If we can get all this personal insight from quite basic data it creates uncanniness. People feel creeped out as they increasingly understand how much information is held about them.”
This means, Humby says, that HR professionals need to ask themselves: “Does this pass the cool versus creepy test – not for you but for your employees?”
In passing that test, what is critical is having a clear sense of why the data is being collected and how it will be used, and sharing this with employees. In the UK employers must justify why they are monitoring employees, whereas in the US, Aiello says, only two out of 50 states have any regulation around telling employees they are being monitored. But beyond compliance it’s about employee voice. “Having a voice is crucial so people don’t feel like they are having something imposed on them,” says Aiello. “The process should be participative.”
Are you in?
Humanyze CEO Waber stresses that people opt to wear his firm’s badges and that implementation is treated like any cultural change programme. “We don’t just show up one day and say ‘this is your sensor’,” he explains. “We go in and talk to employees about what data we collect, that we don’t share it, that it’s anonymous and not about recording your bathroom breaks. It’s a legal contract that people choose to sign.” He says at every client more than 90% of staff opt in, and anyone who doesn’t is given a fake badge so that no-one else knows. Data is only used in aggregate. “You need to let people know what you are doing,” he says. “If you are doing this in the right way people will be excited and want to buy into it. It’s about making the company a better place to work. Do something surreptitiously and it will come back and bite you.”
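Waber’s “data is only used in aggregate” principle can be sketched as a reporting rule that suppresses small groups, a common privacy safeguard. The minimum group size of five and the function shape here are illustrative assumptions, not Humanyze’s actual implementation:

```python
# Sketch of aggregate-only reporting: individual readings are never
# shown, only group averages, and groups below a minimum size are
# suppressed. The minimum size of 5 is a hypothetical choice.

MIN_GROUP_SIZE = 5

def team_average(minutes_talking: list):
    """Return the team's average face-to-face time, or None if the team
    is too small to report without risking identifying individuals."""
    if len(minutes_talking) < MIN_GROUP_SIZE:
        return None  # suppress small groups
    return sum(minutes_talking) / len(minutes_talking)

print(team_average([30, 45, 50, 20, 55]))  # 40.0
print(team_average([30, 45]))              # None (too few people)
```

The suppression step is the point: an “aggregate” over two people is barely anonymous at all, so the reporting rule, not just the promise, has to enforce the principle.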
However, Humby worries that the concept of opting in risks becoming insidious and potentially leading to “digital ghettoisation”. “I’m concerned we will end up with people who will, because of their personal circumstances or beliefs, be disadvantaged because they are unwilling to sign [their data away],” he says.
In consumer life this could mean having higher insurance premiums, say. But in the employer/employee relationship an imbalance of power exists that’s not reflected in the consumer/brand relationship. “It’s more anonymous sharing your data with a brand and the risks are reduced,” says Strong. “There is a power issue with your employer.” Meyer portentously states that opting in to being monitored can become a “different sort of oppression” and that “oppression that people choose is still oppression”.
Anna Gudmundson is group CEO of wellness technology company Fitbug. She has noticed some worrying trends in the US that play into Humby’s digital ghettoisation concept, such as firms excluding employees from health insurance because they won’t take part in wellness programmes. “Excluding that person is going to make them spiral in the other direction and sends a bad signal,” she says. She adds that the “sense of being constantly monitored increases stress”, which undermines any attempt to help people get healthier.
Gudmundson advocates communicating clearly what you are getting out of the data and how you are keeping it safe. She also advises her clients to “take a lean data approach”, as “the more you hold the higher the risk”. “The more aware the HR community is, the more pressure it puts on us as providers to consider these things,” she adds.
Barea and Taylor echo this advice on being transparent about why data is being collected and how it is being used. They speak enthusiastically about the possibilities in tools that scan email, employ facial recognition or use DNA, but stress that it’s all about improving performance for the common good and understanding your colleagues better. “It’s about how do you get the individual to own how they respond and shape the environment around them using the insight from the data,” says Barea.
“If you are purpose-driven, employees are going to feel their part in driving your purpose is worth giving some information,” she explains. She advises organisations to think about what is acceptable for them, draw a spectrum of possible information-gathering tools (taking in everything from wearables to DNA and genetics) and then work out where they lie on that spectrum.
A question of trust
In terms of how employees feel about being tracked, recent surveys suggest a certain degree of openness. “People are happy to accept there are trade-offs,” says Strong. “There is a willingness to understand themselves a little better.” According to AXA’s Health Tech & You State of the Nation report, 57% of working adults would be open to wearing a fitness band or similar if supplied by their employer, and 58% would be comfortable sharing data with their employer if it was used to help improve health and wellbeing benefits.
However, 31% said they would not be comfortable sharing their data with their employer. And in a separate survey on wearables by PwC, 37% said they did not trust their employer not to use the data against them in some way.
This question of trust is crucial. But Steve Rockey, people director at Home Grown Hotels and Lime Wood Group, argues that “monitoring means there is no trust”. “That relationship starts to break down if you’re reverting to old school command and control, which is what this could mean,” he says. “You’ve got to give people an overview and trust most will do vaguely sensible things. It’s great to get relevant data to support what you do, but never lose sight of the fact you are in charge of people, and that’s not always quantifiable.”
Aviva HR director Nathan Adams feels that although dealing with aggregate data is fine, “when you get down to the individual it will be a whole different ball game”. “People need to trust the organisation to do the right thing with their data,” he adds. “But it would only take one misuse [to break that trust].”
Even though Rockey does not like the idea of deploying monitoring devices he acknowledges that “ignoring it is the wrong thing to do”. “All this stuff is here and real and it is growing,” he says. “As an HR person you should be understanding it and making the right decision for your business. It might be really important to your business; central to who you are and the values. If it’s none of those things, maybe it’s not relevant.”
What is important is that HR is involved in these conversations at the very start, as the boundaries shift and the possibilities unfold, to ensure that any technology is deployed in the most positive way possible and that the use of any tools does not backfire. “The reality has not sunk in yet for most businesses that there is a downside to this desire for productivity gains,” Aiello warns. “HR can be the person with the slightly different point of view, expressing some of the concerns and thinking about the implications and ramifications.”
Strong believes the opportunity is there to use technology and data to unpick human behaviour; we just have to be considered in doing so. “This doesn’t have to be dystopian,” he says. “We can use this technology in a positive way. But that’s why it needs to start in HR.”