AI deepfake scam cost this woman her house and $81,000

Quick answer: A California woman lost $81,000 in cash and was manipulated into selling her paid-off house at a steep discount after scammers used AI deepfake video of actor Steve Burton to convince her she was in a relationship with him. The scam used persuasive AI-generated video calls and voice messages that her family said were indistinguishable from the real person. This case is part of a growing wave of AI-powered romance fraud that cost Americans $1.3 billion in 2024.

She thought she was talking to a famous actor. The video calls seemed real. The voice sounded real. By the time her family discovered the truth, she had lost her home, her savings and her retirement security.

According to a Fox News investigation, a Southern California woman identified as “Abigail” lost more than $81,000 in cash and was manipulated into selling her paid-off home, worth about $550,000, to a real estate wholesaler for just $350,000. Her family estimates the total financial damage at more than $280,000.

How the AI deepfake scam worked

The scam followed a methodical pattern that exploited both technology and human psychology:

  • First contact on Facebook: A profile claiming to be actor Steve Burton (of “General Hospital” fame) sent Abigail a direct message
  • Moved to encrypted messages: The scammer moved conversations to WhatsApp, isolating Abigail from friends and family who might recognize the fraud
  • AI deepfake video calls: Scammers used artificial intelligence to generate a realistic video of ‘Steve Burton’ speaking directly to Abigail. Her daughter told Fox News: “It wasn’t grainy… you couldn’t see it with the naked eye”
  • Escalating financial demands: Starting with a $500 gift card, requests grew to 110 gift cards ranging from $25 to $500, then cash and Bitcoin
  • Required confidentiality: The scammer told Abigail to hide the ‘relationship’ from her family, a textbook isolation tactic
  • Pressured home sale: The scammer claimed they would buy a beach house together and convinced Abigail to sell her paid-off home at a deep discount
  • Nearly sent more: Abigail was about to hand over another $70,000 from the house sale before her family discovered what was going on

  • $81,000: cash taken
  • $200,000: lost home equity
  • $1.3 billion: total romance scam losses (2024)

The technology has changed: This is not the old “Nigerian prince” email. AI-generated deepfake video can now reproduce a person’s face, expressions and voice in real time. As Ricardo Amper, CEO of Incode Technologies, told Fox News: “It’s scary because biology has taught us to believe the person we see and the voice we hear. You can no longer rely on your instincts.”

The human cost

By the time Abigail’s daughter, Vivian Ruvalcaba, discovered the scam, the damage was catastrophic:

  • Abigail’s husband, in his mid-seventies and still working, lost the house that was his retirement plan
  • The couple was evicted from their own former home and locked out
  • Abigail now lives with family out of state
  • The LAPD assigned an investigator but acknowledged “how little they can help” because the scammers operate abroad
  • Attorneys initially quoted more than $150,000 to pursue the case

The most dangerous scams don’t look like scams. When AI can replicate someone’s face and voice in a video call, the rules of ‘trust but verify’ have fundamentally changed. – Steve Rhode

Why deepfake fraud with AI is different

Traditional romance scams relied on stolen photos and text messages. AI deepfakes have eliminated the last reliable way most people identified fraud: seeing and hearing the person.

Old scam tactics

  • Stolen photos from someone else’s social media
  • Communication only via text (always an excuse to avoid video)
  • Broken English or generic messages
  • Easy to expose with a reverse image search

AI Deepfake Tactics

  • Real-time AI-generated video of a person’s face
  • Voice cloning that reproduces tone, cadence and speech patterns
  • Personalized conversations that reference shared ‘memories’
  • Almost impossible to detect without technical tools

The FBI has warned that criminals are increasingly using generative AI to facilitate financial fraud, including creating realistic video and audio to impersonate trusted individuals.

The magnitude of the problem

Abigail’s case is devastating, but far from unique:

  • Romance scams cost Americans $1.3 billion in 2024, according to the FTC
  • The average romance scam victim loses about $15,000
  • Total US fraud losses exceeded $12.5 billion in 2024
  • A French woman lost about $850,000 to scammers who used an AI-generated Brad Pitt impersonation
  • Only about 22% of scam victims contact the FBI; less than 30% contact local police

Why shame keeps victims silent: Most victims of fraud never report the crime. Shame and embarrassment are the scammer’s greatest allies: they keep victims isolated and prevent others from noticing the warning signs. If someone you know has been scammed, judgment only makes the problem worse. Understanding makes it better.

Warning signs to look out for

  • Platform change: Contact starts on social media and then quickly moves to encrypted apps like WhatsApp or Telegram
  • Confidentiality requirements: “Don’t tell your family about us” is always a warning sign
  • Gift card requests: No legitimate person asks for gift cards as payment or as proof of love
  • Cryptocurrency demands: Scammers prefer untraceable payment methods
  • Urgency and emotional pressure: Every request is urgent and tied to an emotional appeal
  • Video that seems “too good”: If a celebrity video calls you on WhatsApp, it’s not real
  • Don’t assume that a video call proves someone’s identity; AI deepfakes can fool anyone
  • Don’t send money, gift cards, or cryptocurrency to someone you’ve only met online
  • Don’t let shame keep you from telling your family or reporting the crime

How to protect elderly relatives

Abigail’s case highlights why proactive protection is more important than reactive advice.

  • Have regular conversations about online scams – no lectures, conversations
  • Set account alerts on bank and credit card accounts for transactions over $100
  • Establish a family verification protocol – agree on a code word or question that no AI can replicate
  • Consider a living trust or power of attorney to prevent the sale of real estate without a trusted co-signer
  • Monitor social media activity for new “friendships” with celebrities or unknown contacts
  • Freeze credit reports at all three agencies to prevent scammers from opening new accounts

Before you sign anything: If you are being pressured to sell real estate, sign contracts or transfer money as part of a ‘relationship’, stop. Run each contract or agreement through the Contract Decoder first. It’s free and it can save you from a decision you can’t undo.

What to do if you or someone you know has been scammed

  • Stop all contact with the suspected scammer immediately
  • Contact your bank to freeze accounts and dispute transactions
  • File a report with the FBI’s IC3 at ic3.gov
  • Report to the FTC at reportfraud.ftc.gov
  • Contact your local police to create a paper trail
  • Consult a lawyer about recovering assets, especially if real estate was sold
  • Seek emotional support — Victims of fraud often experience trauma similar to domestic violence

The reality: Getting money back is extremely difficult once money has been sent via gift cards, cryptocurrency or bank transfers. Prevention is the only reliable defense. If you or a loved one are dealing with debt due to scams, take the Find Your Path quiz to explore your options.

Key Takeaways

  • A California woman lost $81,000 and her paid-off house after AI deepfake scammers impersonated actor Steve Burton in realistic video calls
  • AI-generated deepfake video and voice clones make it virtually impossible to tell real from fake based on images and sound alone
  • Romance scams cost Americans $1.3 billion in 2024 – and AI is making them more persuasive
  • Scammers isolate victims from family, demand secrecy and escalate from small gift cards to real estate sales
  • Protect elderly family members with account alerts, verification protocols, and regular conversations about online safety
  • If someone you’ve only met online asks for money, gift cards, or cryptocurrency, it’s a scam, no matter what the video shows

(Sources: Fox News | FTC Consumer Sentinel 2024 | FBI IC3 PSA)

Frequently asked questions

How do AI deepfake romance scams work?

Scammers use artificial intelligence to generate realistic videos and voices that mimic a real person, often a celebrity. They connect on social media, move to encrypted apps, build an emotional relationship, and then gradually escalate financial demands from gift cards to cash, cryptocurrency, and even real estate sales.

Can AI really create convincing fake video calls?

Yes. Current AI deepfake technology can generate real-time video that reproduces a person’s face, expressions, and voice with enough fidelity to fool most people. In Abigail’s case, her daughter said the deepfake video “wasn’t grainy” and that “you couldn’t tell with the naked eye” that it was fake.

How much do Americans lose to romance scams?

According to FTC data, romance fraud cost Americans $1.3 billion in 2024 alone, with an average loss of about $15,000 per victim. The actual figure is probably higher, as most victims never report the crime out of shame and embarrassment.

How can I protect elderly relatives from deepfake scams?

Set up bank account alerts for transactions over $100, set a family codeword for verification, have regular non-judgmental conversations about online safety, consider a living trust to prevent unauthorized real estate sales, and monitor social media for new unknown contacts. A video call is no longer proof of identity.

What should I do if I have been scammed by a deepfake?

Stop all contact with the scammer immediately. Contact your bank to freeze accounts and dispute charges. File reports with the FBI’s IC3 (ic3.gov) and the FTC (reportfraud.ftc.gov). Contact your local police. Consult an attorney, especially if property or large sums of money are involved. Recovery is difficult, but documentation is essential for any legal action.

Consumer debt expert and investigative writer. Survivor of personal bankruptcy (1990). Award-winning author featured in the Washington Post. Exposing debt fraud since 1994.

