AI Voice Cloning Enables Sophisticated Voicemail Scams Exploiting Trust
AI voice cloning technology can produce a convincing replica of a person's voice from as little as three seconds of real audio using freely available tools. Criminals use this capability to run scams in which a voicemail or call appears to come from a distressed loved one claiming an emergency and urgently requesting a money transfer.
The audio used to create these cloned voices can be harvested from social media posts, captured by quietly recording real conversations, or even collected by calling the target and staying silent while the victim speaks. The tactic is a form of spear phishing: a highly personalized attack that exploits the victim's emotions and trust to provoke swift action.
Oliver Devane, a researcher at McAfee, noted that as little as three seconds of audio can yield a good voice match, and that scammers often draw on publicly available information about the person being impersonated to make the deception more convincing. Criminals also spoof phone numbers so the call appears to come from the loved one's own number, adding to the scam's credibility.
Experts advise verifying any such request through a trusted, separate communication channel before transferring funds. One recommended precaution is to agree on a codeword with family members, especially children, so the legitimacy of an emergency request can be confirmed.
If a scam is suspected, recipients are urged to pause, verify through other channels, and avoid transferring money until confirmation has been obtained.