AI Deepfake Video Scams Targeting Seniors: When You Can’t Trust What You See

Already been scammed? Read our First 24 Hours Emergency Guide for critical steps to take immediately.

This page is part of our AI-Powered Scams Targeting Seniors series.

FBI IC3 2025: The FBI reported that deepfake video technology is now routinely used in investment fraud, romance scams, and authority impersonation targeting seniors. Deepfake-related fraud in the US surged 700% in early 2025. Total deepfake-related losses in the US reached $1.1 billion in 2025, more than tripling the $360 million recorded in 2024. The FBI’s December 2024 public advisory specifically warned that real-time deepfake video calls are being used to defeat identity verification during financial transactions.

What Is a Deepfake Video Scam?

A deepfake video scam uses artificial intelligence to create or manipulate video in real time, making it appear as though someone is saying or doing something they never actually did. In the context of senior fraud, criminals use deepfakes to impersonate trusted individuals — bank officials, financial advisors, romantic interests, or even family members — during live video calls.

Unlike pre-recorded fake videos, today’s deepfake technology can operate in real time, allowing scammers to hold interactive video conversations while wearing someone else’s face. As a result, a video call, once considered a reliable way to verify identity, can no longer be trusted on its own.

How Deepfake Video Scams Work:

  • Romance scams: A scammer uses AI-generated video to “prove” they’re real during video dates. The senior sees a convincing face that matches the profile photos, building deep trust before the financial requests begin.
  • Investment fraud: Criminals create deepfake videos of celebrities or financial experts endorsing an investment opportunity, or impersonate a “personal advisor” on video calls to guide victims through depositing money into fraudulent platforms.
  • Authority impersonation: A scammer video-calls a senior while wearing the deepfaked face of a bank manager, government official, or law enforcement officer, creating intense pressure to comply with demands.
  • Family impersonation: Combined with voice cloning, a scammer can appear and sound exactly like a family member on a video call, making emergency scams nearly impossible to detect through visual or audio cues alone.

Real-World Cases:

  • The $25.6 Million Video Call: In early 2024, a Hong Kong finance worker was tricked into transferring $25.6 million after a video call with what appeared to be his company’s CFO and several colleagues — all of whom were deepfakes. The criminals recreated every participant’s face and voice using publicly available footage. While this case involved a corporate employee rather than a senior, it demonstrates the technology now being deployed against individuals of all ages.
  • Deepfake Romance Video Dates: The FBI reported multiple cases in 2025 where romance scammers conducted deepfake video calls with elderly victims, appearing as the attractive persona they had created. Victims who had been cautious enough to demand a video call — once considered proof of identity — were completely deceived.
  • Fake Financial Advisors: Scammers have used deepfake video to impersonate real, licensed financial advisors during video consultations. Seniors who verified the advisor’s name and credentials online found real SEC-registered professionals — not realizing the person on their screen was an impostor wearing a digital mask.

Red Flags of a Deepfake Video Scam:

  • The person on video avoids turning their head or making sharp movements (deepfakes sometimes glitch with sudden motion)
  • Lighting on the face doesn’t match the background
  • The person keeps the call short or makes excuses about poor connection quality
  • Lip movements are slightly out of sync with the audio
  • The person refuses to meet in person or at an official location
  • You are being asked to make financial decisions during or immediately after the video call

How to Protect Yourself:

  • Never make financial decisions based on a video call alone. Always verify through a separate, independent channel — call the person’s known number, visit the bank in person, or consult a trusted family member.
  • Ask the caller to perform unusual actions. Ask them to hold up a specific number of fingers, touch their ear, or turn their head rapidly. Deepfakes may struggle with unexpected movements.
  • Be cautious of “video proof.” A video call is no longer reliable proof of identity. Treat it as one factor, not the only factor.
  • Use your family code word if the caller claims to be a relative. If your family hasn’t set one up, agree in advance on a private word or phrase that a scammer could not know, and ask for it on any unexpected call.

If You’ve Been Targeted:

  • Report it to the FBI at ic3.gov — note that deepfake video was involved
  • Contact your local police and your bank if money was sent
  • Save any screenshots or recordings of the video call as evidence

Sources: FBI Internet Crime Complaint Center (IC3) 2025 Annual Report; FBI IC3 Public Service Announcement, December 2024: “Criminals Use Generative Artificial Intelligence to Facilitate Financial Fraud.” View the full AI Scams hub page with state-by-state data.


Back to AI-Powered Scams Hub | 2025 FBI Elder Fraud Data | Find Your State Attorney General