AOC Reacts To Her Horrifying Sexually Explicit Deepfake Video: 'It Has Real, Real Effects'
Alexandria Ocasio-Cortez, also known by her initials AOC, broke her silence on becoming the latest victim of AI deepfake technology.
Ocasio-Cortez, 34, reacted to a recent AI-generated deepfake video circulating the internet that depicts her performing sexually explicit acts.
The politician was mortified, saying she stumbled onto the video while scrolling through social media in the middle of the day.
"There's a shock to seeing images of yourself that someone could think are real," the activist began. "As a survivor of physical sexual assault, it adds a level of dysregulation," she told Rolling Stone.
AOC described to the publication the trauma of carrying the mental image of a deepfake version of herself performing sexual acts. The Democratic representative from New York said such digital images are damaging to the people who view them as well, and that they "can't leave a person," even if the viewer knows the content is fake.
"It has real, real effects not just on the people that are victimized by it, but on the people who see it and consume it," she said. "And once you've seen it, you've seen it."
According to the New York Post, Ocasio-Cortez believes the harm from "digitizing violent humiliation" is akin to physical rape. "Kids are going to kill themselves over this. People are going to kill themselves over this," she warned.
The Bronx native isn't the only victim of explicit deepfake content.
In January, The New York Times reported that fake, sexually explicit images of Taylor Swift had spread from a 4chan thread, prompting lawmakers in Congress to push the No AI FRAUD Act. The bill aims to protect a person's voice and likeness from unauthorized AI replicas while accounting for First Amendment rights.
RIAA Chairman and CEO Mitch Glazier called the bill a meaningful step toward a "safe, responsible, and ethical AI ecosystem."