AI exposes your biggest security flaw: your voice

You can make your passwords super-strong, you can protect accounts with multi-factor authentication, but there’s almost nothing you can do to protect what’s fast becoming your biggest security liability: your voice. Or more specifically, AI voice cloning.

Regulators around the world are warning people to be on alert for the rapidly growing wave of AI voice clone attacks.

Victims are being scammed out of huge sums of money, having received plaintive calls from loved ones to send money to solve an immediate crisis. Of course, it’s not a family member on the other end of the line, but a criminal who’s convincingly cloned their voice with low-cost AI tools.

AI has “turbocharged” such scams, the chair of the US Federal Trade Commission, Lina Khan, warned last week, adding that law enforcement needs to proactively tackle the problem. The problem is, it’s a ridiculously hard con to stop.

Voice cloned in seconds

Voice cloning tools are not difficult to come by. A recent study by security firm McAfee claimed to have found more than a dozen that were freely available on the internet, and even that seems like a modest figure given the recent proliferation of AI apps. We’ve previously reported on how a journalist managed to bypass Lloyds Bank’s Voice ID security by using an AI tool. 

What’s more, AI tools can be trained to accurately mimic someone’s voice with only a few seconds of sample audio. “These tools required only a basic level of experience and expertise to use,” McAfee’s researchers found. “In one instance, just three seconds of audio was enough to produce a clone with an 85% voice match to the original (based on the benchmarking and assessment of McAfee security researchers). Further effort can increase the accuracy yet more. By training the data models, McAfee researchers achieved a 95% voice match based on just a small number of audio files.”

Getting hold of a short voice sample is not difficult. Videos posted on social media or voicemail messages can provide a sufficient sample for cloning. And even if a person’s voice isn’t readily available online, calling a victim and pretending to be a salesperson or even a wrong number is often all that’s needed to record a usable voice sample.

The expensive con

Voice cloning has been used to pull off high-value crimes. In the US, a couple were conned out of $21,000 when someone purporting to be a lawyer put their 'son' on the line, begging for money to pay his legal fees after he'd been involved in a car accident. Their son, Benjamin Perkins, said the AI-generated voice – believed to have been cloned from YouTube videos about his snowmobile hobby – sounded "close enough for my parents to truly believe they did speak with me," according to a report in The Washington Post.

Voice cloning has also been used in attempted ransom scams, with the voices of children cloned to extort money from parents. More commonly, it's used to leave WhatsApp voice messages or voicemails urgently requesting that modest sums be sent to help loved ones out of a short-term crisis, such as a car breakdown or a stolen phone.

How to guard against AI voice cloning

Short of never appearing in any online videos or answering your phone, there’s not much you can do to prevent your voice from being cloned.

Instead, security experts suggest implementing extra security measures to stop a voice clone from succeeding. McAfee suggests creating a "verbal codeword" for family and close friends that's only used in cases of genuine emergency. Asking callers to confirm a piece of information only they could know – such as the family's Wi-Fi password – is another prevention tactic.

Either way, if you receive a distress call from a family member or friend, don't immediately assume the worst – take a moment to verify who's really on the line before sending any money.


Barry Collins

Barry has 20 years of experience working on national newspapers, websites and magazines. He was editor of PC Pro and is co-editor and co-owner of He has published a number of articles on TechFinitive covering data, innovation and cybersecurity.