Deepfake Kidnappings
Spies, Lies & Cybercrime by Eric O'Neill
BIG NEWS! Operation October 7: Your Mission Awaits
The countdown is on. October 7 marks the official launch of my new book, Spies, Lies, and Cybercrime. Consider this your invitation to the mission briefing at the International Spy Museum, sponsored by DeleteMe.
The launch event is running in person and online—so whether you’re reporting to HQ or tuning in from a safehouse, you can be part of it. Secure your clearance now.
Intel update: every week until launch, one lucky subscriber will win a free copy of the book. Entry is simple—stay subscribed and leave a comment on the newsletter. That’s all it takes to get your name on the list.
The winning Spy Hunter from last week is Janice M. I’ll be sending you a free signed copy of Spies, Lies, and Cybercrime!
Agents, the mission is live. Don’t sit this one out.
In This Issue
Title Story: A Florida mother’s terrifying $15,000 loss shows how AI is turning fake kidnappings into a global epidemic.
Cybersecurity Tip of the Week: How you can spot deepfakes and avoid this insidious scam.
Cybersecurity Breach of the Week: Divorce by chatbot—why outsourcing your marriage to AI is a breach of trust you won’t recover from.
Tech of the Week: A Virginia grandmother asked ChatGPT for Powerball numbers, won $150,000, and gave it all away.
Appearance of the Week: Live near Atlanta? Join me at Georgia Tech for a book talk.
Closing Mission Brief: Join the October 7 launch of Spies, Lies, and Cybercrime—and don’t forget, subscribers who comment are entered to win a free copy every week.
Title Story
The Deepfake Kidnapping Scam

“When I picked up the phone, it was my daughter’s voice,” Sharon Brightwell recalled. “It was her crying voice, she was hysterical.”
The Dover, Florida, mother froze. On the other end of the line, her daughter begged for help, sobbing and gasping between words. Sharon recognized the cadence, the tone, the way her daughter cried when she was younger. Every nerve in her body told her it was real.
Her daughter explained that she had been in a car accident. She’d been texting while driving, had struck a pregnant woman, and was now in serious trouble. Her phone had been confiscated by authorities, which explained the unfamiliar number.
Then another voice took over — a man who introduced himself as a public defender. He said Sharon’s daughter had been taken into custody and needed $15,000 in bail money immediately.
“I said, ‘You have got to be kidding me,’” Sharon remembered.
But the fear was stronger than the doubt. Desperate to help, she drove to her bank, withdrew the cash, and waited at her home for a so-called “legal courier.” She handed the money over without hesitation.
“When I saw them pull off, I had the most sick feeling in my stomach,” she said.
Her daughter hadn’t been in an accident. She wasn’t in jail. The crying voice had been an AI clone, generated from scraps of audio stolen online. Sharon had just handed over $15,000 to a crime that didn’t exist. Hillsborough County detectives are now investigating the case, but with only slim leads and little prospect of catching the criminals.
A Familiar Scam With a Terrifying Upgrade
I’ve written before in this newsletter about “virtual kidnappings,” where scammers call a parent and bluff their way through a ransom demand.
AI voice-cloning technology, available to anyone with a laptop and a social media clip, can replicate a person’s speech in seconds. That’s all it takes to spin a plausible story and keep a victim hooked. Sharon didn’t stop to question what she heard because the voice was real enough in the one moment when fear overwhelmed logic. Her ordeal isn’t isolated.
Families in Washington state’s Highline School District have reported nearly identical calls — cloned voices of their children crying for help, paired with ransom demands. Schools rushed to alert parents and suggested creating family “safe words” to guard against the scam. In New York’s Westchester County, parents were targeted too, forcing police and school officials to issue public warnings.
Outside the U.S., the crime has become an industry. In Brazil, police have disrupted organized groups running sophisticated false-kidnapping operations with trained callers and coordinated payment systems. In India, where hundreds of millions rely on WhatsApp, AI-generated voices are being blasted at scale, tricking families into ransom payments or “emergency” transfers.
And the tools are only getting easier to access. On Telegram and other underground forums, criminals can now buy deepfakes as a service. For a few hundred dollars, a scammer with no technical skills can purchase a convincing voice or video tailored to their next target.
Why It Works
Technology is part of the story, but psychology is the engine.
The human brain is wired to recognize — and trust — the voices of loved ones. When a parent hears their child screaming, instinct takes over. There is no careful analysis, no skepticism, no time to double-check. The only thought is to act.
Scammers know this. Urgency is the weapon. It’s the same tactic behind ransomware — act immediately or lose everything. Here, the loss feels even greater: the life of a child.
Cybersecurity Tip of the Week
Act Like a Spy Hunter
No easy technological fix for AI deepfake scams is on the horizon. For now, defense must come from awareness and preparation.
Family code words. A secret phrase only you and your children know, to verify an emergency call. In my family we use the first line from an obscure poem.
Secondary channels. If a call feels wrong, reach out to someone who can confirm the situation. Text your loved one or someone who can contact them while you stall the “kidnapper.”
Community training. Schools, churches, and employers can prepare families before scammers strike.
Just as we teach children to “stop, drop, and roll,” families now need a new reflex: pause, verify, and have a safe word.
Cybersecurity Breach of the Week
When AI Breaks a Marriage
Not every breach involves stolen data. Some tear through trust.
A husband of 15 years thought he and his wife were repairing their marriage. Then he discovered she wasn’t turning to him or even a therapist—she was leaning on ChatGPT. She fed their arguments, her frustrations, and even their son’s pleas for them to stay together into the chatbot. What came back was a mirror of her anger, polished and amplified.
Soon she began using ChatGPT’s words in texts and conversations. Old conflicts resurfaced, magnified. Issues they’d resolved in therapy re-emerged stronger. Within four weeks, the marriage collapsed.
According to reporting, more couples are pulling AI into their most intimate conflicts. But chatbots aren’t therapists. They’re validation engines. Ask them to confirm your pain, and they’ll hand it back to you wrapped in authority. That feedback loop can turn a rocky patch into a full-blown collapse. If you’re outsourcing your marriage to a chatbot, don’t worry—AI can draft the divorce papers too.
Act Like a Spy Hunter: AI is not a substitute for human connection or professional counseling. It will tell you what you want to hear, not what you need to face. And sometimes, that’s the most dangerous breach of all. Human connection is the glue that holds a marriage together.
Tech of the Week
Grandma, ChatGPT, and a Powerball Surprise

Carrie Edwards, a grandmother from Virginia, wanted to shake up her usual Powerball routine. Instead of birthdays or “lucky numbers,” she asked ChatGPT to pick for her. The chatbot spit out a set of digits, and Carrie rolled the dice.
Lightning struck. Her ticket matched enough numbers to win $150,000.
Most of us would be planning a vacation, a car, or at least a bigger grocery budget. Carrie didn’t keep a dime. She donated the entire prize to charity, giving to causes close to her heart—including dementia research in honor of her late husband, Steve, who passed away in 2023.
After the heavy stories of the week, it’s refreshing to see technology spark joy instead of fear. Sometimes AI doesn’t need to change the world. Sometimes it just needs to pick the right numbers—and remind us what people can do with a little good fortune.
Appearance of the Week
Friends in Atlanta! Join me on September 29 at Georgia Tech for a Scholars Event and book signing.
Closing Note
We’ll wrap this week on a high note. My new book, Spies, Lies, and Cybercrime, officially launches on October 7—and you’re invited to the celebration.
The launch will be both in person and virtual, so whether you can join me on-site or from your desk, you can be part of the event. You can register HERE.
And don’t forget—the promotion is still live. Every week, I’m giving away a copy of Spies, Lies, and Cybercrime to one lucky subscriber. Entering is simple: stay subscribed and leave a comment on the newsletter. That’s it. It’s my way of saying thanks for being part of this community as we head toward the book’s release. I can’t wait to share it with you.

Become An AI Expert In Just 5 Minutes
If you’re a decision maker at your company, you need to be on the bleeding edge of, well, everything. But before you go signing up for seminars, conferences, lunch ‘n learns, and all that jazz, just know there’s a far better (and simpler) way: Subscribing to The Deep View.
This daily newsletter condenses everything you need to know about the latest and greatest AI developments into a 5-minute read. Squeeze it into your morning coffee break and before you know it, you’ll be an expert too.
Subscribe right here. It’s totally free, wildly informative, and trusted by 600,000+ readers at Google, Meta, Microsoft, and beyond.
Like What You're Reading?
Don’t miss a newsletter! Subscribe to Spies, Lies & Cybercrime for our top espionage, cybercrime and security stories delivered right to your inbox. Always weekly, never intrusive, totally secure.
Are you protected?
Recently, nearly 3 billion records containing our sensitive data were exposed on the dark web for criminals, fraudsters, and scammers to mine for identity fraud. Were your Social Security number and birthdate exposed? Identity threat monitoring is now a must. Use this affiliate link to get up to 60% off Aura’s cybersecurity, identity monitoring, and threat detection software!

Use this link to get a 30-day trial + 20% off Beehiiv!

Ready for Next Week?
What do YOU want to learn about in my next newsletter? Reply to this email or comment on the web version, and I’ll include your question in next month’s issue!
Thank you for subscribing to Spies, Lies and Cybercrime. Please comment and share the newsletter. I look forward to helping you stay safe in the digital world.
Best,
Eric
Let's make sure my emails land straight in your inbox.
Gmail users: Move this email to your primary inbox
On your phone? Hit the 3 dots at top right corner, click "Move to" then "Primary."
On desktop? Close this email, then drag and drop it into the "Primary" tab near the top left of your screen
Apple mail users: Tap on our email address at the top of this email (next to "From:" on mobile) and click “Add to VIPs”
For everyone else: follow these instructions
Partner Disclosure: Please note that some of the links in this post are affiliate links, which means if you click on them and make a purchase, I may receive a small commission at no extra cost to you. This helps support my work and allows me to continue to provide valuable content. I only recommend products that I use and love. Thank you for your support!