AI-Powered Romance Scams Targeting Seniors: Fake Faces, Fake Feelings, Real Losses
Already been scammed? Read our First 24 Hours Emergency Guide for critical steps to take immediately.
This page is part of our AI-Powered Scams Targeting Seniors series.
FBI IC3 2025 Annual Report: Romance scams cost American seniors $584 million in 2025. The FBI specifically highlighted AI as a growing factor, noting that scammers now use AI-generated profile images that defeat reverse image searches, AI chatbots that maintain convincing emotional conversations around the clock, and deepfake video for “proof of identity” calls. Many romance scams now transition into investment fraud (“pig butchering”), with AI managing both the emotional manipulation and the fake trading platform.
What Are AI-Powered Romance Scams?
AI-powered romance scams are an evolution of traditional romance fraud where criminals use artificial intelligence to create entire fake identities — faces, voices, and personalities — to build emotional relationships with seniors and ultimately steal their money. In 2025, romance scams cost American seniors $584 million, and AI is making these scams more convincing and scalable than ever.
How AI Transforms Romance Scams:
- AI-generated profile photos: Criminals use AI to create photorealistic images of attractive, trustworthy-looking people who don’t actually exist. These faces won’t appear in any reverse image search, defeating what was once a reliable detection method.
- AI chatbots for conversations: Large language models can maintain emotionally engaging conversations 24 hours a day, remembering details, expressing affection, and adapting to the victim’s personality. One scammer can now run dozens of “relationships” simultaneously.
- Deepfake video dates: Some scammers use real-time deepfake technology to appear on video calls as the fake persona, eliminating the last line of defense victims had: “I’ve seen them on video, they must be real.”
- AI voice messages: Scammers send personalized voice messages generated in a consistent synthetic voice, so the fake persona always “sounds” the same — adding another layer of believability to the fake relationship.
Real-World Cases:
- The AI Boyfriend Fraud: In multiple cases reported to the FBI, elderly women developed months-long relationships with people who existed entirely as AI constructs — AI-generated photos, AI chatbot conversations, and AI voice messages. Some victims reported the “partner” being available to chat at any hour, never making mistakes about previous conversations — hallmarks of AI that felt like attentiveness.
- Romance-to-Investment Pipeline: The FBI’s 2025 report documented a major trend in which romance scammers use AI to build trust, then gradually steer victims toward “investment opportunities” — a technique known as “pig butchering.” The average loss in romance-turned-investment scams was $116,000 per senior victim, with AI making the initial romance phase faster and more convincing.
- Deepfake Video Dates: A growing number of victims reported having video calls with their romantic interest that appeared completely natural. In reality, the scammer was using real-time deepfake software to appear as the AI-generated person in their profile photos. Victims who had been cautious enough to insist on a video call were still deceived.
Red Flags of an AI Romance Scam:
- The person’s profile photo looks perfect but appears nowhere else online
- They are always available to chat, respond instantly, and never seem to need sleep
- The relationship moves unusually fast — strong emotions and talk of the future within days or weeks
- They always have a reason they can’t meet in person (deployed military, working overseas, travel restrictions)
- Financial requests begin — medical bills, travel costs, investment opportunities, or emergency needs
- They ask you to move communication to a private platform away from the dating site
How to Protect Yourself:
- Never send money to someone you haven’t met in person. This remains the single most important rule, regardless of how real the person seems on screen.
- Be aware that video calls can be faked. A person appearing on a video call is no longer proof they are real.
- Talk to family or friends. Share the relationship with people you trust. Scammers work to isolate victims — resist that isolation.
- Be cautious with anyone who won’t meet in person. If they always have a reason they can’t meet face-to-face, that itself is a warning sign.
If You’ve Been Targeted:
- Report it to the FBI at ic3.gov
- Report the profile to the dating platform
- Contact your bank if you sent money
- Report to the FTC at reportfraud.ftc.gov
Sources: FBI Internet Crime Complaint Center (IC3) 2025 Annual Report; FBI IC3 Public Service Announcement, December 2024: “Criminals Use Generative Artificial Intelligence to Facilitate Financial Fraud.” View the full AI Scams hub page with state-by-state data.
Back to AI-Powered Scams Hub | Traditional Romance Scams Guide | Find Your State Attorney General
