Pics AND it Didn’t Happen: Sex Deepfake FBI Alert

Sextortionists stealing your innocent pictures to make AI nudes.

The Federal Bureau of Investigation is warning of an uptick in sextortion: Scrotes are targeting victims by feeding their headshots into deepfake apps and shaking them down for money. Generative AI is now making horrifically lifelike videos and images.

Lock up your social media! In today’s SB Blogwatch, we fear it’s probably too late.

Your humble blogwatcher curated these bloggy bits for your entertainment. Not to mention: Trek technobabble.

Fake Pr0n Hint

What’s the craic? Bill Toulas reports—“Sextortionists are making AI nudes from your social media”:

Creates a hostile environment
[There’s] a rising trend of malicious actors creating deepfake content. [They] threaten their targets with publicly leaking explicit images and videos … to scare victims into paying an extortion demand.

In many cases … compromising content is not real: … Sextortionists are now scraping publicly available images of their targets [which] are then fed into deepfake content creation tools that turn them into … sexually explicit content. [But] they look very real, so they can serve the threat actor’s blackmail purpose: [It] could still cause victims great personal and reputational harm. … This media manipulation activity has, unfortunately, impacted minors too.

There are multiple content creation tool projects available for free via GitHub, which can create realistic videos from just a single image of the target’s face. [It] creates a hostile environment for all internet users, particularly those in sensitive categories. … The UK has recently introduced a law … that classifies the non-consensual sharing of deepfakes as a crime.


Fake news? Not according to Fortesa Latifi—“Deepfake porn victims are speaking out”:

The law lags
When Lauren* watched a video of her having sex with a man she had never been intimate with, she had a panic attack. … It was undeniably her face. She couldn’t stop crying. How could it look so realistic?

“I’d never had sex with Dan* so it didn’t seem possible.” But thanks to the rise in deepfake and face swap technology, it was possible: Dan had created a video that appeared to show him and Lauren having sex. … “Even though it was fake, it still made me feel really ashamed.”

Lauren turned to a lawyer who advised her … she couldn’t sue under revenge porn laws because it wasn’t technically revenge porn. … The law lags behind technological advances.

*—not their real names.

Horse’s mouth? Los Federales publish a PSA—“Malicious Actors Manipulating Photos and Videos”:

Exercise caution
Technology advancements are continuously improving the quality, customizability, and accessibility of … AI-enabled content creation. The FBI continues to receive reports from victims, including minor children and non-consenting adults, whose photos or videos were altered into explicit content.

[T]he FBI has observed an uptick in sextortion victims reporting the use of fake images or videos created from content posted on their social media sites or web postings, provided to the malicious actor upon request, or captured during video chats. … The FBI urges the public to: …

    • Monitor children’s online activity and discuss risks. …
    • Exercise caution when accepting friend requests. …
    • Use discretion when interacting with known individuals online who appear to be acting outside their normal pattern of behavior. …
    • Secure social media and other online accounts using complex passwords … and multi-factor authentication.


Uh-oh. Unintended consequences ahoy. It’s vindication for people like ArsScene:

I hope that everyone who welcomes AI because it enables this or that trivial pursuit … takes a moment to reflect on this story. This sort of malicious use of AI is an inevitable consequence.

And it cuts both ways. As narcc observes:

AI isn’t the problem: … The problem is that deepfakes are going to be used to cast doubt on real photos and recordings. The last thing we need in this post-truth world.

Too late. YetAnotherLocksmith has the key: [You’re fired—Ed.]

Elon has tried to use “it could be a deep fake” as a defence in court — despite him saying … Tesla would be full self driving within the year … onstage in front of thousands of people, more than once … (which he’s now being sued over).

Hang on, is this sustainable? Coppercloud makes an interesting point:

I’m not sure this will work out for the extortionists. … A lot of people are going to go, “Well, if it’s that easy … making fake nudes … paying you isn’t going to stop anyone [else].” This is, unfortunately, a lose-lose situation.

What to do in our new un-utopia? Helcat suggestifies thuswise:

Unless it’s verified, don’t believe that it’s true. There [have] been some very convincing Photoshopped pictures over the years: That’s been enough to question the validity of anything.

Meanwhile, I fear aliksy might be disappointed:

I hope to one day live in a world where a nude or sexual photo of someone is treated with a shrug, and not something to be blackmailed over.

And Finally:

Treknobabble

Previously in And Finally


You have been reading SB Blogwatch by Richi Jennings. Richi curates the best bloggy bits, finest forums, and weirdest websites … so you don’t have to. Hate mail may be directed to @RiCHi or [email protected]. Ask your doctor before reading. Your mileage may vary. Past performance is no guarantee of future results. Do not stare into laser with remaining eye. E&OE. 30.

Image sauce: Vladislav Nahorny (via Unsplash; leveled and cropped)

Richi Jennings

Richi Jennings is a foolish independent industry analyst, editor, and content strategist. A former developer and marketer, he’s also written or edited for Computerworld, Microsoft, Cisco, Micro Focus, HashiCorp, Ferris Research, Osterman Research, Orthogonal Thinking, Native Trust, Elgan Media, Petri, Cyren, Agari, Webroot, HP, HPE, and NetApp, as well as on Forbes and CIO.com. Bizarrely, his ridiculous work has even won awards from the American Society of Business Publication Editors, ABM/Jesse H. Neal, and B2B Magazine.

