A New Type of Sextortion Is Emerging — Powered by AI and Targeting Small Business Owners

Online sextortion has long been a weapon of manipulation and fear. Traditionally, scammers would lure victims into compromising situations via video chat, then record or fake explicit material to use as blackmail. But a more insidious method is now on the rise — one that no longer requires direct contact with the victim at all.

With the rise of AI technologies, scam cartels — many operating overseas — have evolved their tactics. They no longer need to convince a victim to undress or engage in explicit conversations. Instead, they leverage artificial intelligence to create fake, hyper-realistic images or videos of a person, often depicting them performing sexual acts on a simulated video call.

Who Are They Targeting?

These scammers aren’t randomly casting a net anymore. They strategically research their victims, using publicly available data from social media, business listings, directories, and local news.

They’re looking for a specific profile:

  • People who are financially stable but not high-profile

  • Individuals from smaller towns, who are less likely to have access to immediate legal help or high-tech resources

  • Those in their late 40s or older, often with limited technological fluency

  • Small business owners with minimal or part-time staff who have a LinkedIn profile and/or a Facebook Business Page

These scams tend to focus on service-based businesses, including:

  • Independent contractors

  • Real estate agents

  • Owners of small financial or accounting practices

  • Local therapists

  • Landscaping and home repair providers

  • Tutors and private educators

  • Cleaning services

  • Beauty and wellness professionals

  • Mobile car detailers

  • Dog groomers and pet sitters

We’re seeing a disproportionate number of cases reported from the Eastern U.S., particularly in Georgia, North and South Carolina, Virginia, and Pennsylvania. The pattern is striking — and deeply concerning.

A Real Case We Recently Handled

Just a few weeks ago, a man contacted us in a panic. He owns a small service business in Pennsylvania and had received a threatening email with AI-edited images of himself attached, made to look like he was engaging in explicit acts during a video call.

The scammers gave him a 48-hour deadline and demanded $1,500 in Bitcoin, threatening to send the footage to his family, employees, neighbors, and clients. They weren’t bluffing about how much they knew: they had already run a background check and confronted him with accurate personal information — the names of his children and business partners, and even the names of neighbors who were never on social media.

Terrified, he paid the $1,500. But that wasn’t the end. Days later, they demanded an additional $5,000.

Interestingly, a few months before the extortion attempt, he had received a FaceTime call that appeared to come from a deceased relative; at least, that was the name displayed on the incoming request. Confused and emotional, he answered the call. He said “hello” several times but got no response. The call lasted about a minute before he hung up. At the time, he brushed it off as a strange glitch, but now it’s clear this was likely a setup. The perpetrators most likely recorded that interaction and later used AI tools to fabricate realistic footage for the scam.

Thankfully, that’s when he reached out to us. We were able to step in, assess the threat, block further damage, and shut down the blackmail operation. But this situation highlights how far these scammers are willing to go — and how real the threat has become.

What You Can Do

If you run a small business and receive any suspicious messages or AI-generated images:

  • Do not respond to the blackmailer

  • Do not pay — this only invites further demands

  • Document everything — screenshots, emails, usernames

  • Contact a professional immediately

We specialize in helping victims of online sextortion and impersonation. Our team works discreetly and quickly to assess your situation, stop the damage, and protect your name.

Need help now? Contact us 24/7. The consultation is free. The solution is real.

by Benjamin J., March 18, 2025.