AI Voice Cloning Scams Targeting Seniors: How Scammers Copy Voices to Steal Money
Already been scammed? Read our First 24 Hours Emergency Guide for critical steps to take immediately.
This page is part of our AI-Powered Scams Targeting Seniors series.
FBI IC3 2025: The FBI’s Internet Crime Complaint Center received 22,364 complaints citing AI as a tool in the commission of fraud, with total losses exceeding $893 million across all age groups. Voice cloning was identified as one of the primary AI techniques used in impersonation and emergency/grandparent scams. For seniors aged 60+, AI-related fraud accounted for $352 million in reported losses from over 3,100 victims.
What Is an AI Voice Cloning Scam?
An AI voice cloning scam is a fraud in which criminals use artificial intelligence to replicate the voice of a trusted person — typically a grandchild, son, daughter, or spouse — and then call the victim pretending to be that person. The cloned voice is often indistinguishable from the real thing, making this one of the most emotionally devastating forms of AI fraud.
Modern AI voice cloning technology requires as little as 3 to 10 seconds of sample audio. Scammers can source this from social media videos, voicemail greetings, YouTube content, TikTok posts, or even a brief phone call where they trick the person into speaking a few sentences.
How Voice Cloning Scams Work:
- Step 1 — Harvesting the voice: The scammer finds audio of the person they want to impersonate. This may come from a public social media post, a YouTube video, or even an earlier phone call. Publicly available audio is sufficient — no hacking is required.
- Step 2 — Creating the clone: Using commercially available or underground AI tools, the scammer generates a voice model that can speak any phrase in the cloned person’s voice, with natural intonation and emotion.
- Step 3 — The emergency call: The scammer calls the victim, often spoofing the caller ID to show the impersonated person’s phone number. The cloned voice delivers an urgent, emotional message: “Grandma, I’ve been in an accident,” “Mom, I’m in jail,” or “I’m in the hospital and need money right now.”
- Step 4 — The pressure: A second person may come on the line posing as a lawyer, police officer, or hospital administrator, adding legitimacy and urgency. The victim is told not to call other family members because it could “make things worse.”
- Step 5 — The payment: The victim is instructed to send money immediately via wire transfer, gift cards, cryptocurrency, or cash pickup. Once the money is sent, it’s nearly impossible to recover.
Real-World Cases:
- The “$15,000 Daughter” Call: In July 2025, Sharon Brightwell of Dover, Florida, received a call from what sounded exactly like her daughter, crying and claiming she had been in a car accident, lost her unborn child, and needed money immediately for a lawyer. Brightwell later said: “I know my daughter’s cry.” She sent $15,000 in cash to a courier. The voice was an AI clone harvested from her daughter’s Facebook and Snapchat videos. She only discovered the deception after speaking to her real daughter.
- Philadelphia Grandmother: An 86-year-old grandmother in South Philadelphia lost $6,000 after receiving a call from what sounded exactly like her granddaughter, claiming she had been in an accident and was being detained by police. A man claiming to be a lawyer named “John” then took over the call. Her granddaughter later told NBC10: “It’s not hard to find something of my voice. I’m on social media.”
- Scale of the Threat: A 2023 McAfee study found that 1 in 4 Americans had encountered an AI voice cloning scam or knew someone who had. Of those who engaged with the scam call, 77% reported losing money. Of those who lost money, 36% lost between $500 and $3,000, and 7% lost between $5,000 and $15,000. Alarmingly, 70% of people surveyed said they were not confident they could distinguish a cloned voice from the real thing.
Red Flags of a Voice Cloning Scam:
- An urgent call from a “family member” asking for immediate money
- Caller asks you NOT to contact other family members to verify
- Payment requested via wire transfer, gift cards, or cryptocurrency
- The caller can’t answer personal questions only the real person would know
- Emotional pressure to act immediately — “Don’t tell anyone, just send the money”
- A “lawyer” or “officer” takes over the call to add urgency
How to Protect Yourself:
- Establish a family code word. Choose a secret word or phrase that only family members know. If someone calls claiming to be a relative, ask for the code word before taking any action.
- Hang up and call back. If you receive a distress call, hang up and dial the person’s real phone number directly. If it was a scam, you’ll reach the real person who knows nothing about an emergency.
- Ask a personal question. Ask something only the real person could answer — not information that could be found on social media.
- Limit public audio and video. Make social media accounts private where possible. The less audio of your family members available online, the harder it is to clone their voices.
- Tell your family. Make sure everyone in your family — especially grandparents — knows that voice cloning exists and that a familiar voice on the phone is no longer proof of identity.
If You’ve Been Targeted:
- Report it to the FBI’s Internet Crime Complaint Center (ic3.gov) — mention that AI voice cloning was used
- Contact your local police department
- If you sent money, contact your bank or the payment service immediately
- Report to the FTC at reportfraud.ftc.gov
Sources: FBI Internet Crime Complaint Center (IC3) 2025 Annual Report; FBI IC3 Public Service Announcement, December 2024: “Criminals Use Generative Artificial Intelligence to Facilitate Financial Fraud.”
