The ability to clone a human voice has reached new heights thanks to advanced AI tools. These technologies can take a voice sample, process it, and generate a near-perfect replica that can be made to say anything in the original speaker’s voice. While voice cloning has been possible since 2018, today’s tools are faster, more accurate, and easier to use than ever before.
Don’t believe it? Earlier this year, OpenAI, the company behind ChatGPT, previewed Voice Engine, a model that can clone a voice from just a 15-second recording.
Though OpenAI’s tool isn’t available to the public and reportedly includes safeguards against misuse, other widely accessible services, such as ElevenLabs, offer similar capabilities: for as little as $6, anyone can clone a voice from a one-minute audio sample.
It’s easy to see how this technology could be exploited. Scammers can gather voice samples from phone calls or social media videos, then use those samples to clone voices and carry out fraudulent schemes.
A common example is the “grandparent scam,” in which a scammer clones the voice of a grandchild and calls an elderly family member, pretending to be in urgent need of money.
The scam often involves the “grandchild” claiming to be in an accident or legal trouble and urging the grandparent to keep the call a secret, especially from the grandchild’s parents, so the fraud isn’t exposed. The emotional pressure can be overwhelming, and many people fall for it.
Awareness of voice cloning might help, but in a moment of crisis, even the most vigilant can be fooled. The scam works by manipulating emotions, which makes it hard to stop and question whether the call is genuine.
As AI-powered voice cloning technology continues to advance, this type of scam is expected to increase in frequency.
To safeguard yourself and your loved ones, hold a family meeting and agree on a specific code word to use in emergencies, ideally one that can’t be guessed from anything your family has posted online. If you receive a distressing call asking for money, ask the caller to confirm the code word. A voice clone gives a scammer your loved one’s voice but not your family’s private knowledge, so if the caller can’t supply the word, you’re likely dealing with a scammer.