How WhatsApp voice notes are putting you at risk: a one-minute voice note is enough to clone a voice

Voice notes made via platforms such as WhatsApp or Facebook Messenger, social media posts and phone conversations can be used to clone voices by hackers or cyber criminals. Picture: Dado Ruvic/Reuters

Published Aug 16, 2023

The use of artificial intelligence techniques to clone voices has created a completely new arena of risk for both businesses and individuals.

Generative AI (GAI) has become a change agent, ushering in new methods of doing business, managing data, gaining insights and curating information.

It has become a significant instrument in the company toolbox as an intelligent and highly competent technology, enabling speedy analysis, support and functionality.

Unfortunately, hackers are exploiting GAI's enormous potential, using the technology for malicious ends such as constructing convincing deepfakes and executing alarmingly realistic voice scams.

“The scammers are incredibly clever. Using readily available tools online, they can create realistic conversations that mimic the voice of a specific individual, using just a few seconds of recorded audio.

“While they have already targeted individuals making purchases on platforms like Gumtree or Bob Shop, as well as engaged in fake kidnapping scams, they are now expanding their operations to target high-level executives with C-Suite scams,” said the co-founder and business development director of Nclose, Stephen Osler.

According to Osler, in 2019 the technology was used to imitate the voice of the chief executive of a UK energy firm in order to defraud it of $243,000 (approximately R4,676,049).

In 2021, a company in Hong Kong was also defrauded of $35 million (R673,727,600).

Where do these voice excerpts come from? Well, from voice notes made via platforms such as WhatsApp or Facebook Messenger, social media posts and phone conversations, to name a few.

Given the number of people who use voice notes to quickly relay instructions to a team member or approve payments, it is easy to see the opportunity this creates for hackers.

Busy leaders frequently use platforms like WhatsApp to message colleagues while driving or racing between meetings, making it difficult, if not impossible, for staff to detect a false message.

“An IT administrator might receive a voice note from their manager, requesting a password reset for their access to O365. Unaware of the malicious intent, the administrator complies, thinking it’s a legitimate instruction.

“However, in reality, they unintentionally provide privileged credentials to a threat actor. This information can then be exploited to gain unauthorised access to critical business infrastructure and potentially deploy ransomware,” warned Osler.

He added that this is the next level of cyber threat, and that deepfake technology will only get better at duping victims and penetrating companies.

To guard against this, organisations must put robust protocols and procedures in place that demand multiple layers of verification, particularly for financial or authentication-based transactions.
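To illustrate the principle, here is a minimal sketch (not Nclose's actual protocol, and all names are hypothetical): a sensitive request received on one channel, such as a voice note, is held until it is independently confirmed on a second channel, such as a call-back to a known number.

```python
# Illustrative two-channel verification sketch: a sensitive action is only
# executed once every required, independent channel has confirmed it.
from dataclasses import dataclass, field

# The channels that must independently confirm a request before it runs.
REQUIRED_CHANNELS = {"voice_note", "callback"}

@dataclass
class SensitiveRequest:
    action: str                          # e.g. "password_reset" or "payment"
    requester: str                       # who the voice note claims to be
    channels_confirmed: set = field(default_factory=set)

def confirm(request: SensitiveRequest, channel: str) -> None:
    """Record a confirmation received on a given channel."""
    request.channels_confirmed.add(channel)

def may_execute(request: SensitiveRequest) -> bool:
    """Allow the action only once every required channel has confirmed."""
    return REQUIRED_CHANNELS.issubset(request.channels_confirmed)

req = SensitiveRequest(action="password_reset", requester="CFO")
confirm(req, "voice_note")               # the (possibly cloned) voice note alone...
assert not may_execute(req)              # ...is not enough to act on
confirm(req, "callback")                 # a call-back to a known number confirms it
assert may_execute(req)
```

The point is simply that a voice note on its own never authorises anything: even a perfect clone of a manager's voice fails the check until a second, attacker-inaccessible channel agrees.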

IOL