
Deep Fakes Lead to Deep Trouble

Deep fake: AI-generated monkey

After I read & blurbed Deep Fake Double Down, I asked my friend and TKZ blogmate, Debbie, if she’d give us an in-depth look at deep fakes. The more we learn, the better we’ll be able to protect ourselves. Or at least pause — take a breath — before we presume everything we see online is real.

“Reality bleeds through the pages of Deep Fake Double Down—creepy, intense, and unputdownable. Burke’s intersecting plot lines and endearing characters strangle-held me from the first page to the last. A memorable thriller!”— Me 😉

The floor is yours, Debbie!

 

What Is a Deep Fake?

Suppose one day you check your social media feed and there’s a new video of you, wearing clothes you’d never wear, speaking words you never said, and passionately making out with someone you’ve never met.

The face is yours, the body is yours, the voice is yours. But it isn’t you.

That’s called a deep fake. Artificial intelligence (AI) software can clone you (or anyone) and create video and audio that are indistinguishable from the real you.

Deep fakes have been called Photoshop on steroids.

Almost as soon as deep fake software was introduced in 2017, celebrities became popular targets. Jim Carrey’s face was swapped onto Jack Nicholson’s in The Shining.

But there are darker sides to deep fakes that could radically change world events, such as the fake video of Zelenskyy ordering Ukrainian troops to surrender.

On a more personal level, what if you’re in a custody dispute with your ex and a fake video goes viral showing you abusing your child?

Or a fake video captures you committing a crime. How do you defend yourself against such evidence?

Revenge deep fakes affect the lives of celebrities, world figures, and just plain folks.

Even scarier is how easy deep fakes are to create.

A few short years ago, creating deep fakes required a tech wizard with vast amounts of computer memory and expensive graphics processing units (GPUs). Today, dozens of easy-to-use programs are free to download to your phone.

According to a March 2023 NPR article, Wharton business professor Ethan Mollick created a deep fake video of himself delivering a lecture written by ChatGPT that he never gave. Time to create: eight minutes. Cost: $11.

The speed at which deep fake technology is advancing makes one’s head spin. I started researching deep fakes around March 2022 for a new thriller I was writing. The premise is how the technology can be (mis)used and weaponized to implicate an innocent woman in crimes she didn’t commit.

In my book, Deep Fake Double Down, a corrupt prison warden covers up the murder of an inmate by creating deep fake videos. They show the inmate escaping with the aid of a female guard who appears to be his lover. When the videos go viral, social media mobs pressure law enforcement to shoot first and ask questions later.

I wanted to explore how easily evidence can be created and/or manipulated to make an innocent person appear guilty. Turns out it’s pretty darn easy. And that’s scary.

When I started drafting the book, a trained eye could detect early versions of deep fakes. Blinking, mouth movements, and different lighting often gave clues. When someone else’s face was superimposed on a moving body, the planes and shadows of the face were often lit differently and didn’t match the movement.

I incorporated descriptions of those detection methods into the story. But by the time the book was finished in April 2023, those methods were obsolete because of improved software. Fortunately, an AI expert reviewed the manuscript and gave me updated information.

Deep Fakes: NeRFs & Voice Cloning

NeRFs (neural radiance fields) can now take a two-dimensional photo and recreate it as a three-dimensional virtual person, moving and speaking as naturally as real life. To detect fakes, you need to go through a video frame by frame to determine which frames are original and which were computer generated.
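In practice, that frame-by-frame pass is usually scripted rather than done by eye. Here is a rough sketch in Python using OpenCV; the video file name and the looks_synthetic() check are placeholders I’ve invented for illustration, not any real detection tool.

```python
# Rough sketch: walk through a video frame by frame and flag suspect frames.
# looks_synthetic() is a hypothetical placeholder; a real detector would run
# a trained forensic model on each frame.
import cv2

def looks_synthetic(frame) -> bool:
    # Placeholder decision; substitute a real frame-level classifier here.
    return False

cap = cv2.VideoCapture("suspect_clip.mp4")  # hypothetical file name
flagged = []
index = 0
while True:
    ok, frame = cap.read()
    if not ok:  # end of video (or unreadable file)
        break
    if looks_synthetic(frame):
        flagged.append(index)
    index += 1
cap.release()

print(f"Frames flagged as possibly computer generated: {flagged}")
```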

Voice cloning adds another layer of synthetic authenticity. With only a few seconds of someone’s voice—a snippet of audio from a social media clip, a brief phone conversation, even the outgoing message on voicemail—AI can recreate that voice with the same tone, tenor, expression, and speech mannerisms.

I had to do some hasty rewriting to meet the publishing deadline, but I made it. Deep Fake Double Down was current at that moment in time.

Now, a few weeks later, the software has probably leapt forward even farther.

No one can keep up, not even the tech engineers who develop AI.

AI trains itself. Here’s an explanation from the U.S. Naval Institute:

“Deep fakes are most commonly described as forgeries created using techniques in machine learning (ML)—a subfield of AI—especially generative adversarial networks (GANs). In the GAN process, two ML systems called neural networks are trained in competition with each other. The first network, or the generator, is tasked with creating counterfeit data—such as photos, audio recordings, or video footage—that replicate the properties of the original data set. The second network, or the discriminator, is tasked with identifying the counterfeit data. Based on the results of each iteration, the generator network adjusts to create increasingly realistic data. The networks continue to compete—often for thousands or millions of iterations—until the generator improves its performance such that the discriminator can no longer distinguish between real and counterfeit data.”
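For the technically curious, here is a minimal sketch of that generator-versus-discriminator loop in Python with PyTorch. It trains on toy one-dimensional numbers rather than photos or video, and the network sizes, learning rates, and iteration count are illustrative assumptions, not taken from any real deep fake system.

```python
# Minimal GAN sketch: a generator learns to counterfeit samples from a simple
# "real" distribution while a discriminator learns to spot the counterfeits.
import torch
import torch.nn as nn

torch.manual_seed(0)

def real_batch(n=64):
    # "Real" data: numbers drawn from a normal distribution the generator must imitate.
    return torch.randn(n, 1) * 0.5 + 2.0

generator = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
discriminator = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

for step in range(3000):
    # 1) Train the discriminator to label real data 1 and counterfeit data 0.
    real = real_batch()
    fake = generator(torch.randn(64, 8)).detach()
    d_loss = loss_fn(discriminator(real), torch.ones(64, 1)) + \
             loss_fn(discriminator(fake), torch.zeros(64, 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2) Train the generator to make counterfeits the discriminator calls real.
    fake = generator(torch.randn(64, 8))
    g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

# After many iterations, the fake samples' statistics approach the real ones.
print("real mean:", real_batch(1000).mean().item())
print("fake mean:", generator(torch.randn(1000, 8)).mean().item())
```

Real deep fake systems apply the same adversarial idea to images, video frames, and audio, with far larger networks and training sets.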

Big Tech & Deep Fakes

Competition among Big Tech companies like Google, Microsoft, Meta (Facebook), and Amazon is ferocious. Without enough testing or oversight, the virtual arms race is already leading to unexpected, unpleasant surprises. AI “hallucinations” and “crazy and unhinged things” are being reported.

A March 2023 NPR article covered incidents of Microsoft’s new Bing chatbot talking back to the humans testing it. One bot became hostile and insulting to a reporter. Another bot professed love for the human using it.

“The bot called itself Sydney and declared it was in love with him. It said [New York Times reporter Kevin Roose] was the first person who listened to and cared about it. Roose did not really love his spouse, the bot asserted, but instead loved Sydney.”

Remember HAL in 2001: A Space Odyssey?

Maybe I should write sci-fi romance. How about: The Bot Who Loved Me.

In March 2023, more than 1,000 software developers, engineers, and scientists published an open letter asking for a six-month pause on AI development. Tech leaders like Elon Musk, Steve Wozniak, Andrew Yang, and many more are concerned about “the profound change in the history of life on Earth…AI labs [are] locked in an out-of-control race to develop and deploy ever more powerful digital minds that no one – not even their creators – can understand, predict, or reliably control.”

I wrote Deep Fake Double Down as an entertaining, fast-paced beach read that would also make people think.

Every day there’s more to think about.

 

Debbie Burke is privileged to blog with Sue on The Kill Zone and writes the award-winning Tawny Lindholm Thriller series.

Please visit Debbie’s website for a chance to win a signed copy of Deep Fake Double Down and a custom-designed pen.


Sue Coletta is an award-winning crime writer and an active member of Mystery Writers of America, Sisters in Crime, and International Thriller Writers. Feedspot and Expertido.org named her Murder Blog one of the “Best 100 Crime Blogs on the Net.” She also blogs on the Kill Zone (Writer's Digest "101 Best Websites for Writers"), Writers Helping Writers, and StoryEmpire. Sue lives with her husband in the Lakes Region of New Hampshire. Her backlist includes psychological thrillers, the Mayhem Series (books 1-3), the Grafton County Series, and true crime/narrative nonfiction. Now she exclusively writes eco-thrillers in the Mayhem Series (books 4-9 and continuing). Sue has appeared on the Emmy award-winning true crime series Storm of Suspicion and three episodes of A Time to Kill on Investigation Discovery. When she's not writing, she loves spending time with her murder of crows, who live free but come when called by name. And nature feeds her soul.

26 Comments

  • Diana Peach

    Wow. That was super scary. I clicked over to the Zelensky video and my heart nearly stopped. That’s so dangerous! I can just imagine this being used to ruin lives, cast blame, and generally create chaos. Human beings can’t be trusted with any of this! If we can find a way to misuse it, we will. Eek. A fascinating post, Debbie. I can see how you had to stay on your toes to keep up while writing your book. Congrats on the new book. Thanks for hosting, Sue.

    • Sue Coletta

      My thoughts exactly, Diana. Humans should not have this kind of power. I fear for my grandchildren. Will they ever be able to trust their own eyes? Imagine what that instability will do to a young mind. Frightening.

  • Steve Hooley

    Great blog post, Debbie and Kay. And great book, Debbie! This is certainly an area that will bear watching as things get worse and new technology is unleashed for nefarious purposes.

  • sherry fundin

    The book sounds almost as terrifying as reality. I was ‘chosen’ by Google to participate in an AI study and I didn’t even think about it…I declined. I have gotten into reading a lot of apocalyptic/dystopian/science fiction novels that make me think of what our future could be. Our disregard for consequences may be our own undoing. But, I can’t help myself. No matter how terrifying the stories, series and movies, I can’t look away.
    sherry @ fundinmental

    • Sue Coletta

      I also find it terrifying, Sherry. In the not-so-distant future, we won’t be able to trust ANYTHING we see online. Imagine what that type of paranoia will do to the younger generation.

    • Debbie Burke

      “Our disregard for consequences may be our own undoing.” Sherry, you are sooooo right. Can’t stuff that genie back in the bottle.

      Sue, one recent article I read said, in 2030, people will not know what is real history. I think that’s already happened.

  • Vera Day

    I didn’t know AI was THAT advanced. How scary. But what a fascinating topic, Debbie, and congratulations on the recent release of Deep Fake Double Down. It sounds fantastic!

    • Debbie Burke

      Thanks, Vera. AI is fascinating at the same time it’s scary. Kinda like movies where a character grasps the knob to open the door and the audience yells, “No! Don’t open that door!” b/c they know the monster is on the other side.

      Forgot to mention: Deep Fake Double Down on sale for $.99 on Kindle, Nook, Apple, and more.

  • Staci Troilo

    Fabulous post. I have to say, this topic freaks me out. My son and I have watched hours of specials on AI, and the entire thing leaves me on edge. The creators have a sense of hubris that is going to come back and bite them, but unfortunately, it will also bite us. And likely be unstoppable.

    Your book sounds fascinating, Debbie. Wishing you much success with it.

    Thanks for hosting today, Sue.

  • Garry Rodgers

    Fascinating subject, Debbie & Sue. I’ve spent the past three months down the AI rabbit hole, and in those three months the advancements in AI have been immeasurable. I can’t say if the cons outweigh the pros – time will tell – but one thing is for sure and that’s that AI technology has only just begun and the race is on. BTW, best for your book, Debbie!

    • Debbie Burke

      Garry, I’m following your research with great interest. You’re braver than I am cuz you’re actually digging into the process! Keep us posted. If a chatbot insults you, Sue and I will go after it!

    • Sue Coletta

      This stuff freaks me out, Garry. How you venture down these rabbit holes is beyond me. Stay safe!

      • Debbie Burke

        Sue and friends, please forgive my short responses. My computer is in the shop and now HAL is trying to take over my phone!!!
        Will respond later when I get my computer back.

        • Sue Coletta

          Oh, no, Debbie! I had a similar thing happen last Friday… right before I was scheduled to go live on TikTok. Not fun.

          • Debbie Burke

            Sue, don’t these stinkers pick the wrong time to go wacko? But I’m sure you did a great job on TikTok.

            Just brought my computer home from the shop. All is well…so far! At least I can finally answer comments from your great readers.