Artificial intelligence is advancing faster than the speed of light, it seems, and a majority of Americans say they’re concerned about realistic deep fakes impacting the election. Here’s Scott Thuman.
The following is a transcript of a report from “Full Measure with Sharyl Attkisson.”
Perhaps it is a measure of just how close this election is, or how divided the nation has become.
Throughout the campaign season, AI-generated fake videos and images have proliferated.
Sometimes the aim is to amuse, but more often the idea is to mislead voters or throw a wrench into the machinery of democracy.
That’s what happened last January in New Hampshire, when a robocall with a voice sounding like President Biden appeared to tell voters to skip the state’s primary election.
The political consultant who allegedly created that call was later indicted for deterring people from voting.
To better understand deep fakes, we had a Zoom call with Oren Etzioni in Seattle.
An AI expert, he’s launched a website that uses human and computer tools to inspect files for telltale signs of AI generation.
Like this recent one, a fake call between Kamala Harris and Tim Walz.
Scott: You know, we throw around that phrase ‘deep fake’ because we understand it, but I’m not sure the average person understands just how big or wide that scope is. Explain to someone who’s never heard of a deep fake why they should.
Oren Etzioni: When you see an image or a video, we’re visual animals, it’s natural to say, okay, I see this with my own eyes, I believe it. Unfortunately, we’ve reached the point where AI is sophisticated enough that it’s easy to fake these images, to fake these videos so that they look real, but they’re not.
Scenes that look unlikely, like President Trump hugging Dr. Anthony Fauci, can still look convincing. In this case, the fake images appeared in a video attacking President Trump’s decision not to fire Fauci.
Peter Loge is director of the Project on Ethics in Political Communication at George Washington University in the nation’s capital.
Peter Loge: The real downside of this, though, is because we’re all afraid of AI because there’s so much stuff. Anybody following these campaigns is looking at a wall of noise. We don’t know what’s true, what’s not, what’s fake, what’s not fake, what the candidates are saying, what people are pretending. So, maybe, we don’t believe any of it.
Loge says the big campaigns have largely avoided AI in their advertising and videos because of the risk of being caught out. But smaller campaigns and foreign actors seeking influence can have a large impact without spending much.
Scott: So, if you or I had malicious intent and we wanted to alter the way people saw a candidate in this election, it wouldn’t take much?
Etzioni: It wouldn’t take much. It would take a few dollars. We’re not even talking about a hundred dollars, we’re just talking about a few dollars and existing tools.
Leaving voters to wonder if what they’ve seen and heard over the campaign is real or a sophisticated con.
For Full Measure, I’m Scott Thuman.
Sharyl: A number of states passed legislation to ban or punish deep fakes in elections. But California’s law was recently ruled unconstitutional.
