Experts concerned about increasing level of deepfake attacks

Increase highlights importance of engagement with insurers

By Duffie Osental

Fraudsters are increasingly using technology to doctor their voices and faces in a con known as the “deepfake” – and cyber security experts and insurers are warning companies to make sure they have the necessary safeguards in place.

Deepfake attacks involve criminals using artificial intelligence (AI), machine learning, and other technologies to impersonate a company’s most senior executives. By using advanced rendering techniques such as “face-swaps” and syncing video with AI-powered voice recordings, criminals can easily convince targets – often middle-management finance employees – to transfer large amounts of money.

In one case, The Financial Times reported that a company transferred $10 million to fraudsters who managed to convincingly impersonate a senior manager.

“Deepfakes… will allow cyber criminals to up their game in terms of social engineering,” John Farley, managing director and cyber practice group leader at insurance broker Arthur J. Gallagher, told The Financial Times. “There’s a whole host of nightmare scenarios.”

Experts are worried that the frequency of deepfake attacks will only increase as the technology becomes cheaper and more sophisticated. Michael Farrell, executive director of the Institute for Information Security and Privacy at Georgia Tech, told The Financial Times that this is why companies need to engage with cyber security firms and insurers on how to prevent these attacks.

“Not only are deepfakes evolving rapidly and improving their level of realism, but also the barrier to entry to create and distribute deepfakes is getting lower,” said Farrell. “There is a significant opportunity for cyber security companies to play in this space when it comes to fraud prevention.”
