(WATCH) AI Ethics


From humanoid robots to advanced AI, we’re not just approaching a new era; we’re already in it, raising never-before-addressed legal and ethical questions about crimes involving altered or entirely fabricated images, or even AI robots. Lisa Fletcher reports.

The following is a transcript of a report from “Full Measure with Sharyl Attkisson.”
Watch the video by clicking the link at the end of the page.

The video introduces Aria, a humanoid robot designed not for solving difficult problems but for social interaction.

But there’s a dark side to the latest exciting advances, and it’s forcing society to confront new dilemmas: Should harm or exploitation of an artificial being, whether it’s an AI creation like this or a human-looking robot, be a crime even if no real person is hurt?

There are already early cases of new technology testing the question. Often, the first reports of child sexual abuse involving AI come here, to the National Center for Missing and Exploited Children outside Washington, D.C. Kathryn Rifenbark is director of the CyberTipline.

Kathryn Rifenbark: Last year, we really started to see reports coming into our CyberTipline regarding generative AI images that were being used to exploit children. As we started to see those reports come in, they originally were coming in more from our traditional social media or other online companies, not as much from the AI companies. Last year, we saw over 4,700 of those reports come into the National Center.

Lisa: Describe what they said or what you saw.

Rifenbark: We have cases where offenders are using generative AI to create completely fake images of a child that is engaged in a sex act. We also see situations where they’re taking existing child sexual abuse material and using generative AI to change the sex acts or the egregiousness of that picture.

For nearly a hundred years, Hollywood has been playing with the idea of realistic robots and the ethical dilemmas they raise. In the movie “A.I.,” a robot boy is eventually abandoned by his adoptive parents.

Tori Hirsch, a lawyer for the National Center on Sexual Exploitation, says the ethical dilemmas seem increasingly imminent, with some states already passing legislation.

Tori Hirsch: So, certain states, like Wisconsin, for example, created a separate crime of virtual child pornography, whereas other states, like Arizona, made it a civil wrong to create sexually explicit deepfakes. But they put that in with their code about creating political deepfakes, so there are a lot of nuances and variations across the states, but they are acting much quicker than the federal legislature on these issues.

Then there are AI images that are entirely computer-generated and not based on any real person.

A case now in court in Wisconsin involves a man charged with creating more than 13,000 AI-generated obscene images of children. A judge recently threw out one of the charges, saying the First Amendment protects possession of virtual child pornography in one’s home. Prosecutors are appealing that ruling.

When it comes to humanoid robots, the Campaign Against Sex Robots (CASR) has pushed for bans on robots designed to emulate sexual offenses, arguing they reinforce harmful norms.

Hirsch says lawmakers across the country and in Washington should be thinking about this problem now.

Hirsch: I don’t think our laws have gotten there to even conceptualize what that would be, but again, a robot is not a person. Even if they’ve taken on some very personal attributes through their programming, I don’t think we can call that a person. So I hesitate to say that the law would apply to that situation. But then again, we have to think about the ramifications, the repercussions of what that would be.

Two years ago, attorneys general across the country called on Congress to act, and last year, a bill was introduced to create an AI-child exploitation commission to study the issue and propose new legislation, but so far, it’s just an idea; nothing has been passed.

For now, this is a legal gray area, where rapid advances in technology and artificial intelligence are creating potential opportunities for exploitation and harm that our existing laws don’t cover.

For Full Measure, I’m Lisa Fletcher in Washington.

Watch video here.

