Deepfake porn: where AI goes to die

In Dante's Inferno, Hell is imagined as a conical pit with ever-deepening rings dedicated to the torment of worse and worse sinners. At the very bottom is Satan himself, constantly gnawing on Judas, the betrayer of Jesus Christ.

While much of Dante's medieval imagery would be lost on most people today, we still recognize a connection in language between lowness and badness. Calling deepfake porn the nadir of how artificial intelligence is used expresses my opinion of it, and also the opinion of the women whose faces have been stolen and grafted onto pornographic images.

A recent article by Eliza Strickland in IEEE Spectrum shows both the magnitude of the problem and the largely ineffective measures that have been taken to mitigate this evil—for evil it is.

With the latest AI-powered software, it can take less than half an hour to turn a single photograph of a woman's face into a 60-second porn video that makes it look as though the victim was a willing participant in whatever debauchery the original video portrayed. A 2024 research paper cites a survey of 16,000 adults in ten countries, in which 2.2 percent of respondents reported being a victim of "non-consensual synthetic intimate imagery," which is apparently just a more technical way of saying "deepfake porn." The US was one of the ten countries included, and 1.1 percent of US respondents reported being victimized. Because virtually all the victims are women, and assuming men and women were represented equally in the survey, that works out to roughly one out of every fifty women in the US having been a victim of deepfake porn.

That may not sound like much, but it means that over 3 million women in the US have suffered the indignity of being visually raped. Rape is not only a physical act; it is a shattering assault on the soul. And simply knowing that one's visage is serving the carnal pleasure of anonymous men is a horrifying situation that no woman should have to face.
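The arithmetic behind these estimates is simple enough to check in a few lines. The US adult-female population figure below is my own assumption for the sake of the calculation, not a number from the article or the survey:

```python
# Back-of-the-envelope check of the survey figures quoted above.
# ASSUMPTION: roughly 135 million adult women in the US (not from the article).
us_adult_women = 135_000_000

overall_rate = 0.011  # 1.1% of all US respondents reported victimization

# If virtually all victims are women and the survey sampled men and women
# equally, the rate among women is roughly double the overall rate.
rate_among_women = overall_rate * 2  # about 2.2%

one_in_n = round(1 / rate_among_women)  # about 1 in 45, i.e. roughly 1 in 50
victims_millions = us_adult_women * rate_among_women / 1e6

print(f"Roughly 1 in {one_in_n} US women")              # → Roughly 1 in 45 US women
print(f"About {victims_millions:.1f} million victims")  # → About 3.0 million victims
```

Doubling the 1.1 percent figure, rather than using it directly, is the step that turns "one in ninety respondents" into "one in fifty women."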

If a woman discovers she has become the victim of deepfake porn, what can she do?

Strickland interviewed Susanna Gibson, who founded a nonprofit called MyOwn to combat deepfake porn after she ran for public office and the Republican party of Virginia mailed out sexual images of her made without her consent. Gibson said that although 49 of the 50 US states have laws against nonconsensual distribution of intimate imagery, each state's law is different. Most of the laws require proof that "the perpetrator acted with intent to harass or intimidate the victim," and that is often difficult, even if the perpetrator can be found. Depending on the state, the offense can be classified as either a civil or criminal matter, and so different legal countermeasures are called for in each case.

Removing the content is so challenging that at least one company, Alecto AI (named after a Greek goddess of vengeance), offers to search the Web for misuses of a person's image, although the startup is not yet ready for prime time. In the absence of such help, victimized women have to approach each host site individually with legal threats and hope for the best, which is often pretty bad.

The Spectrum article ends this way: "…it would be better if our society tried to make sure that the attacks don't happen in the first place." Right now, I'm trying to imagine what kind of a society that would be.

All I'm coming up with so far is a picture I saw in a magazine a few years ago of Holland, Michigan. I have no idea what the rates of deepfake porn production are in Holland, but I suspect they are pretty low. Holland is famous for its Dutch heritage, its homogeneous culture, and its 140 churches for a town of only 34,000 people. The Herman Miller furniture company is based there, and the "What Would Jesus Do?" meme that became popular some years ago originated there.

Though I've never been to Holland, Michigan, it seems like it's a place that emphasizes human connectedness over anonymous interchanges. If everybody just put down their phones and started talking to each other instead, there would be no market for deepfake porn, or for most of the other products and services that use the Internet either.

As recently as 40 years ago, we had a society in which deepfake porn attacks didn't happen (at least not without a lot of work and movie-studio-quality equipment), because the technology wasn't available. So there's one solution: throw away the Internet. Of course, that's like saying "throw away the power grid," or "throw away fossil fuels."

But people are saying the latter, though for very different reasons.


This little fantasy exercise shows that logically, we can imagine a society (or really a congeries of societies—congeries meaning "disorderly collection") in which deepfake porn doesn't happen. But we'd have to give up a whole lot of other stuff we like, such as the ability to use advanced free Internet services for all sorts of things other than deepfake porn.

The fact that swearing off fossil fuels—which are currently just as vital to the lives of billions as the Internet—is the topic of serious discussions, planning, and legislation worldwide, while the problem of deepfake porn is being dealt with piecemeal and at a leisurely pace, says something about the priorities of the societies we live in.

I happen to believe that the devil is more than an imaginative construct in the minds of medieval writers and thinkers. And one trick the devil likes to pull is to get people's attention away from a present immediate problem and onto some far-away future threat that may not even happen. His trickery appears to be working fine in the fact that deepfake porn is spreading with little effective legal opposition, while global warming (which is undeniably happening) looms infinitely larger on the worry lists of millions. 


Do you have any ideas about how to combat the scourge of deepfake porn? 


Karl D. Stephan is a professor of electrical engineering at Texas State University in San Marcos, Texas. His ebook Ethical and Otherwise: Engineering in the Headlines is available in Kindle format and also in the iTunes store.

This article has been republished, with permission, from his blog Engineering Ethics.

Image credit: Bigstock


 

Showing 11 reactions

  • Tim Lee
    Mucho gracias, Mrs Cracker! Have a blessed day too!
  • mrscracker
    What a kind and decent man you are, Mr. Lee.
    May God bless you today. 🙏
  • Tim Lee
    I shared my comment with a friend and she said that my calling the evil of deepfake porn an expression of the loss of personal integrity in an age of instant gratification sounded cruel to the victims. If you are reading this and have experienced or know someone who has experienced deepfake porn’s “shattering assault on the soul”, I am deeply sorry. Its evil goes far beyond an (un)natural progression from porn.
  • mrscracker
    It’s sad to see how selectively we can understand the issues of privacy, humiliation & pornography.
  • James Dougall
    According to Wikipedia: In September 2023, a Republican operative provided The Washington Post with videos showing Gibson performing sex acts with her husband on the adult streaming site Chaturbate. No law violated as she, her husband, and viewers were all consenting adults. The videos had been illegally publicly archived on the website Recurbate, which, according to her lawyers, Gibson was not aware of and did not authorize.

    Gibson characterized the situation as “an illegal invasion of my privacy designed to humiliate me and my family”
  • Tim Lee
    Exploding hacked pagers and walkie talkies in Lebanon and Syria has led some to ask if this can happen to our mobile phones. It’s not a credible threat; our electronic devices are far more likely to be hacked in other ways, often with severe consequences:
    https://news2.greatnortherneis.org/page/can-your-phone-explode-the-truth-about-hacker-threats

    We live in an age of instant gratification and there’s a price to pay in loss of personal integrity. Deepfake porn is one of the most horrendous expressions of this but it is an (un)natural progression from porn, which offers disembodied sex at the expense of personal relationships.

    Instant knowledge has distanced us from deeper thinking and instant connection has distanced us from deeper bonding. We are becoming islands in an ocean of lonely souls. The power at our fingertips is turning us into gods of our own little worlds. It’s not exploding devices that we need to fear but imploding minds and hearts.
  • Lakshmi Mullahoo-Garro
    commented 2024-09-20 02:41:00 +1000
    “Republican party of Virginia mailed out sexual images of her made without her consent.”
    This is a false statement. Gibson took and streamed the videos herself while soliciting viewers for money. Deep fake porn is terrible, but it has nothing to do with the Gibson porn videos.
  • Emberson Fedders
    commented 2024-09-19 10:06:27 +1000
    I agree with the sentiments of this essay. Governments need to crack down hard on this.
  • mrscracker
    I thought climate change was related to everything Mr. Mouse. I’m surprised you didn’t see that connection here.
    :)
  • Anon Emouse
    commented 2024-09-18 23:18:20 +1000
    Karl,

    How are climate change and deep fake porn necessarily related? Why are you bringing up topics out of left field when you had a good essay before bringing it up?
  • Karl D. Stephan
    published this page in The Latest 2024-09-18 23:13:23 +1000