AI voice cloning is an ‘industrial scale’ threat to business

Employees should look out for voice and video cloning scams, as well as ChatGPT-generated emails

Cybersecurity and finance experts have warned that businesses are at risk from widespread AI scams in 2024.

An estimated one in 12 Brits has fallen victim to AI scams, with 77% of victims losing money as a result, according to research from cybersecurity firm McAfee.

Voice cloning is a popular new scam.

According to McAfee, scammers can use a sample of just three seconds of someone's voice, clone it, and send bogus messages by voicemail or voice message.

These voice samples can be harvested from professional webinars or social media posts.

Other AI scams include ChatGPT-generated phishing emails, which are harder to detect because the tell-tale spelling and grammar mistakes are no longer present.

Tom Holloway, head of cybersecurity at Redcentric, said AI scamming will only become more common.

Speaking to HR magazine, he said: “In the past 12 months, we have seen countless examples of businesses being stung by these extremely skillful cyber-criminals.

"Next year, this is something we only expect to become more common as scammers trial new methods and AI tools develop further.

“It is vital that business leaders, finance teams and individual employees understand the risks they are being exposed to. Ultimately, cyber-criminals have the ability to bring an entire business down at the click of a button.”


Read more: BBC, British Airways and Boots payroll hacked


Simon Litt, finance expert at fintech blog The CFO Club, said finance teams are particularly at risk.

He said: “We’re used to seeing criminals use AI to create personalised phishing emails appearing to come from senior finance directors and chief financial officers, asking finance teams to transfer large funds. But that isn’t the extent of it any longer.

“We’re also seeing cyber criminals scrape information from finance teams’ LinkedIn profiles, and use this to create personalised targets. I’d anticipate this is only going to become an increasingly common occurrence in 2024.”

Holloway added that financial leaders should be scrupulous about what they share on social media.

He said: "Exposing the fact that you are responsible for budgets worth over £10 million on LinkedIn is going to attract cyber criminals like a moth to a flame. Once they have this information, they know exactly what they’re working with, and can begin to create their personalised target.

“Whilst it can be tempting to shout out about significant financial responsibilities, or indeed major new account wins, the risks that come with this should not be underestimated.”


Read more: Government invests in workplace cyber security


Litt also said staff should be extra cautious during high-risk periods, such as CFO appointments, large acquisitions or losses.

He added: “These events provide cyber criminals with the opportunity to create a personalised narrative for their attack, which would appear plausible to most employees. It’s vital that businesses, and particularly finance teams, take extra caution during this time by carefully reviewing every external email they receive.”