Aussie businesses are being warned to treat audio messages with caution after a series of ‘deep fake’ recordings were used to scam senior financial managers overseas earlier this month.

As first reported by the BBC, cyber security firm Symantec has tracked at least three successful attacks on private companies in which chief executives were impersonated to convince financial managers to transfer money.

In each case, the fraudsters employed so-called ‘deep fake’ technology, where artificial intelligence programs are used to manipulate audio or video for the purpose of impersonating someone.

Any accessible audio sources can be harvested for information, including conferences, keynotes, presentations and media appearances like podcasts.

‘Deep fake’ technology has grown in sophistication in recent years, and the scam activity has cyber security experts worried, particularly given that what may seem like a routine conversation with a colleague could be completely faked.

Nik Devidas, managing director of Rock IT, says while it still requires a substantial amount of accessible audio to compile deep fakes, prices are falling on the black market as the technology becomes more widely available.

Devidas tells SmartCompany the technology highlights the growing difficulty of protecting against cyber threats, advising businesses to protect themselves with multi-factor authentication on devices and sensitive accounts.

“It could be as simple as some sort of code word you have between yourself and staff,” Devidas says.

Andrew Bycroft, chief executive of the International Cyber Resilience Institute, says the threat of deep fakes is likely to grow as artificial intelligence technology advances.

“This is the sort of thing we used to see in movies like Mission Impossible, where Ethan Hunt would assume the identity of one of the henchmen to gain access to and fool the master villain, but now technology has made this a reality,” he tells SmartCompany.

“It does involve the need to find enough audio samples and is a painstaking process, as painstaking as cutting letters out of newspapers to construct ransom notes in the prior century.”

As experts overseas begin to wrangle with the deep fake threat, there are warnings software solutions will be few and far between.

Devidas says deep fakes present an interesting challenge in this respect.

“For this sort of thing, software is only going to solve part of the problem,” he says.

If you get a call, Devidas advises the following: “Hang up, call your boss’ mobile, develop a strategy internally to circumvent it.”

“Always verify; once the money has left, it’s very hard to get back.

“There’s no insurance policy for sending money to an overseas account,” Devidas says.

While there have been no documented video deep fake attacks so far, a video circulated last year of a fake President Obama delivering a piece to camera demonstrates how convincing the fakes can be.

In the cases reported by the BBC, scammers even used customary background noise to mask the less convincing words in their recordings.

Bycroft’s advice to businesses involves adopting a sceptical mindset and listening carefully for anything out of the ordinary in important phone calls, particularly if the circumstances are suspicious.

“Err on the side of caution and take a moment to think about whether the request demanded of you seems legitimate and ethical. If not, hang up the phone and attempt to call the person you believe you were speaking with,” he says.

How deep fakes can be used by scammers
1. Impersonating senior managers to secure fraudulent transfers of funds.
2. Impersonating suppliers or other third parties to entice fraudulent payments.
3. Gaining access to secure company servers or files protected by audio or photographic passwords.
4. Spreading misinformation about a business, its directors and/or employees.
What businesses can do
1. Always verify the identity of anyone enquiring about funds transfers or payments.
2. Avoid securing company files with audio, video or photographic passwords (including smartphones).
3. Ensure deep fakes are included in cyber security policies and take active steps to raise awareness about the risk in the workplace.
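The verification advice above — call back on a known number, and use an agreed internal code word — can be sketched as a simple internal policy check. This is only an illustrative sketch; all names, amounts and account details below are hypothetical and not drawn from the cases reported in this article.

```python
# Illustrative sketch of an "always verify" payment policy: funds are
# released only after independent, out-of-band checks pass, so a
# convincing phone voice alone is never enough. All identifiers here
# are hypothetical examples.

from dataclasses import dataclass


@dataclass
class PaymentRequest:
    requested_by: str   # who asked for the transfer (e.g. over the phone)
    amount: float
    destination: str    # destination account


def release_payment(request: PaymentRequest,
                    confirmed_by_callback: bool,
                    code_word_matched: bool) -> bool:
    """Release funds only when BOTH out-of-band checks pass:
    1. the requester was called back on a known, trusted number, and
    2. the agreed internal code word was given correctly.
    """
    return confirmed_by_callback and code_word_matched


# A phone request alone is never enough to release the money:
req = PaymentRequest("CEO (by phone)", 35000.0, "overseas-account")
print(release_payment(req, confirmed_by_callback=False, code_word_matched=True))   # False
print(release_payment(req, confirmed_by_callback=True, code_word_matched=True))    # True
```

The point of structuring the check this way is that no single channel — however convincing the voice on it sounds — can authorise a transfer on its own.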

Source: Smart Company