The truth about “Deep Fakes”


Siwei Lyu, Deep Fake Specialist

Our Full Measure investigation into fascinating and fast-moving “Deep Fake” technology explains why we in the news media must be more mindful than ever not to jump to conclusions based on what we think we see with our own eyes. And why you should be extra skeptical, too.

Watch the video at the link below and read the transcript:

Today, a fascinating and somewhat frightening trip down a technological rabbit hole. To a place where it’s becoming nearly impossible to believe your own eyes. Deep Fakes are videos that use Artificial Intelligence to make it look like someone said or did something they never said or did. As you can imagine, they’re raising all kinds of legal and policy concerns. That’s today’s cover story.

Siwei Lyu is our intrepid Deep Fake detective.

He’s an associate professor of computer science at the University at Albany in New York.

Sharyl: In very simple terms, can you define Deepfake?

Lyu: Deepfake is just an AI-based algorithm software that can swap faces.

Before we go there, it’s worth noting that we’ve been faking images for as long as we’ve been recording them.


Lyu: It’s actually not a real photograph of Lincoln; it’s actually a composition of Lincoln’s head with somebody else’s body.

Sharyl: Maybe this faked image crossed your internet path in the ’90s.

Sharyl: So, this says it’s a real photo, but it’s not?

Lyu: This is not.

The shark was added.

Lyu: And with the help of Photoshop, you can load these two images in, crop the original out, and put it back in to compose this fake photograph in about 10 minutes.

Full Measure’s David Bernknopf and Bryan Barr with Professor Lyu in Albany, New York

The meteoric leap forward came in December 2017, when an anonymous internet user by the name “deepfakes” demonstrated a new face-swapping Artificial Intelligence capability, using it to insert celebrity faces into porn videos.

One unfortunate victim of the X-rated face puppetry was Wonder Woman star Gal Gadot. “Deep fakes” made the technology available to anyone through a free app. And pretty soon, countless technophiles joined the party.

One popular extrapolation involves swapping actor Nicolas Cage’s face into all kinds of scenarios: Nicolas Cage as Steve Carell.

Video: No God, please no. No.

Nicolas Cage as captain of the starship Enterprise. Using Cage is an inside joke: he starred in the 1997 film Face/Off,

Face/Off: I will become him

where his character switches faces with John Travolta’s.

Face/Off: Let’s just kill each other.

Sharyl: This is Nicolas Cage’s face?

Lyu: Right, deep faking Nicolas Cage’s face onto different people, including President Trump, Tom Cruise, Ben Stiller.

Sharyl: In simple terms, the process has to do with taking hundreds or thousands of images of the person to be swapped in and sending them through an automated training process, putting Hollywood-quality special effects within almost anyone’s reach. By the way, that’s Donald Trump as Frankenstein’s monster.
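
For readers curious what that “automated training process” can look like, below is a rough, illustrative sketch of the shared-encoder, two-decoder autoencoder idea commonly described for early face-swap tools: one network learns a common face representation, a separate decoder is trained for each person, and the swap is performed by decoding person A’s frames with person B’s decoder. The layer sizes, the random stand-in data, and the tiny training loop are assumptions for the example, not the specific software Lyu’s team or the “deepfakes” app uses.

```python
# Illustrative sketch (not the actual "deepfakes" code) of the shared-encoder /
# two-decoder autoencoder idea behind early face-swap tools. One encoder learns
# a common face representation; one decoder is trained per person. Swapping =
# encode person A's face, then reconstruct it with person B's decoder.
# Layer widths, image size, and the random stand-in data are assumptions.
import torch
import torch.nn as nn

IMG = 64 * 64 * 3  # flattened 64x64 RGB face crops (assumed size)

encoder = nn.Sequential(nn.Linear(IMG, 1024), nn.ReLU(), nn.Linear(1024, 256))
decoder_a = nn.Sequential(nn.Linear(256, 1024), nn.ReLU(), nn.Linear(1024, IMG))
decoder_b = nn.Sequential(nn.Linear(256, 1024), nn.ReLU(), nn.Linear(1024, IMG))

opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-4,
)
loss_fn = nn.MSELoss()

# Stand-ins for the "hundreds or thousands of images" of each person.
faces_a = torch.rand(32, IMG)
faces_b = torch.rand(32, IMG)

for step in range(100):  # real training runs far longer than 100 steps
    opt.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
           loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    opt.step()

# The "swap": encode a frame of person A, reconstruct it with person B's decoder.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a[:1]))
```

In the real tools the networks are typically convolutional, the faces are aligned crops, and the swapped face is blended back into each original video frame, but the shared-representation idea is the same.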

Jennifer Lawrence: My favorite is probably Lisa Vanderpump.

Here, the face of actor Jennifer Lawrence is swapped with that of actor Steve Buscemi…

Lawrence: I don’t know what to say, because who knows when you’re gonna run into these people.

Which Buscemi seemed to find pretty creepy when he was shown a clip on a comedy show.

Steve Buscemi: It makes me sad that someone spent that much time on that. I’ll bet that was hard to do.

Not as hard as it used to be. To see how easy it can be to make someone say words they never actually uttered, Lyu’s team created a demonstration for Full Measure. First, I make it clear that I don’t like donuts.

Sharyl: I did not take the donuts from the break room. I have never eaten a donut in my life.

Next we record Full Measure correspondent Lisa Fletcher saying the opposite.

Lisa: I love donuts and I ate all of the ones that were in the break room.

Face swapping technology literally puts Lisa’s words in my mouth.

Lisa, deepfake: I love donuts and I ate all of the ones that were in the break room.

Now, on the right, I’m even blinking and moving my head in Lisa’s pattern, not my own.

Lisa, deepfake: I took the donuts from the break room, I have eaten donuts all my life.

But it’s hardly all fun and games.

Obama deepfake: We’re entering an era in which our enemies can make it look like anyone is saying anything at any point in time.

Filmmaker and comedian Jordan Peele was part of an effort to put words into former President Obama’s mouth as a warning.

Obama deepfake, Jordan Peele: You see, I would never say these things, at least not in a public address. But someone else would. Someone like Jordan Peele. This is a dangerous time. Moving forward, we need to be more vigilant with what we trust from the internet.

Sharyl: What are the potential dangers of this technology?

Lyu: If somebody wanted to manipulate the stock market, generating a short video of a company CEO announcing the performance of the company, this probably will cause a stir and then cause some movement in the market. So, that’s a likely scenario.

To try to stay ahead of the bad guys, Lyu and his team are working under a contract from the military’s Defense Advanced Research Projects Agency, DARPA. Early on, he discovered one sign of a Deep Fake.

Lyu: And one day I realized something not correct, not right. And that is those figures in the fake videos, they don’t blink. They never blinked actually. Their eyes keep open.

As shown in this unblinking Deep Fake of Nicolas Cage as Tom Cruise. But no sooner do analysts like Lyu figure out how to detect a Deep Fake than the method becomes obsolete.

Lyu: This detection of fake media and the synthesis of fake media is playing a cat-and-mouse game. We’re always trying to beat the other side. So, once they notice there’s a way fake videos can be detected, they actually improve their algorithm.

Sharyl: Now they blink?

Lyu: They blink now, yes.
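
For technically minded readers, the blink cue Lyu describes can be made concrete with a generic measure of eye openness. The sketch below uses the eye aspect ratio computed from six eye landmarks, a common, simple heuristic; it is only an illustration, not the detector Lyu’s team built, and the landmark values, threshold, and blink-counting logic are assumptions for the example.

```python
# A minimal, generic sketch of one way to quantify eye openness over time:
# the "eye aspect ratio" (EAR) computed from six landmarks around an eye.
# This only illustrates the blink cue discussed above; it is not the detector
# Lyu's team built. The coordinates, threshold, and flagging idea are
# assumptions for the example.
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """eye: array of shape (6, 2), landmarks ordered around the eye.
    EAR is larger when the eye is open and drops toward zero during a blink."""
    vertical = np.linalg.norm(eye[1] - eye[5]) + np.linalg.norm(eye[2] - eye[4])
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return vertical / (2.0 * horizontal)

def count_blinks(ear_per_frame, closed_threshold=0.2):
    """Count how many times the EAR dips below the 'closed' threshold."""
    blinks, closed = 0, False
    for ear in ear_per_frame:
        if ear < closed_threshold and not closed:
            blinks, closed = blinks + 1, True
        elif ear >= closed_threshold:
            closed = False
    return blinks

# Toy example: landmarks for an open eye give an EAR of about 0.3.
open_eye = np.array(
    [[0, 0], [2, 0.9], [4, 0.9], [6, 0], [4, -0.9], [2, -0.9]], dtype=float
)
print(round(eye_aspect_ratio(open_eye), 2))   # -> 0.3

# In a real pipeline, a face-landmark detector would supply the six points for
# every video frame; a clip whose EAR never dips could then be flagged as
# suspicious, which is roughly the cue described above.
ears = [0.31, 0.30, 0.12, 0.05, 0.28, 0.30]   # one simulated blink
print(count_blinks(ears))                      # -> 1
```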

Kalev Leetaru: Historically in Hollywood that took a lot of resources.

Sharyl: Special effects?

Leetaru: Exactly. Exactly. And now using deep learning and all this AI technology, machines are able to do that almost in point and click simplicity.

Kalev Leetaru is an analyst with the Center for Cyber and Homeland Security, a think tank. He worries about the accessibility of Deep Fake technology. The app is free. There are tutorials online.

Sharyl: If a Deepfake were used for malicious purposes, what is the fear?

Leetaru: …What Deepfakes has done, it’s lowered that bar where you can just literally take a video of Donald Trump speaking, load it into a package and then set the camera, have yourself talking and literally make him talk, hit record, submit it back to Twitter, and suddenly you have a video of him giving a speech somewhere.

Recently, Congressman Adam Schiff sounded warnings about deep fakes and the 2020 presidential campaign.

Rep. Adam Schiff: That concern now is heightened exponentially, given that this new technology would allow the Russians or any other foreign actor, or any other malicious domestic actor, to push out, in proximity to an election, video or audio which is completely a forged product.

Meantime, the technology moves forward. Early iterations relied on using a lookalike to make the face swap more convincing. Now, that’s no longer needed.

Beckham: Malaria isn’t just any disease.

English soccer star David Beckham appears to be flawlessly speaking multiple languages in this video about malaria.

Beckham deepfake: (Arabic) “And it still kills a child every two minutes.” (French) “But we can end it. We have the knowledge, we have the opportunity.”

Whether the technology is used for good or evil, there’s a brave new artificial reality.

Sharyl: Are we quickly approaching a time do you think when we shouldn’t believe at face value anything we see online?

Lyu: I think, at least, I will say everybody should be careful and keep vigilant about this kind of visual media we’re seeing, simply because we have the capacity of changing them.

Lyu is working on ideas to make the photos we post on social media harder to steal and use in face-swapping technology. But the warning not to believe anything raises its own problem: it undermines our trust in all videos, including those that are genuine, when we can no longer be sure what is real and what is not. And there’s the issue of plausible deniability. People can say something real “is not me.”

Watch the video investigation by clicking the link below:

https://fullmeasure.news/news/cover-story/deep-fakes




