ST. PAUL, Minn. (news agencies) — Molly Kelley was stunned to discover in June that someone she knew had used widely available “nudification” technology to create highly realistic and sexually explicit videos and images of her, using family photos that were posted on social media.
“My initial shock turned to horror when I learned that the same person targeted about 80, 85 other women, most of whom live in Minnesota, some of whom I know personally, and all of them had connections in some way to the offender,” Kelley said.
Backed by her testimony, Minnesota is considering a new strategy for cracking down on deepfake pornography. A bill that has bipartisan support would target companies that run websites and apps allowing people to upload a photo that then would be transformed into explicit images or videos.
States across the country and Congress are considering strategies for regulating artificial intelligence. Most states have banned the dissemination of sexually explicit deepfakes or revenge porn, whether or not the material was produced with AI. The idea behind the Minnesota legislation is to prevent the material from ever being created — before it spreads online.
Experts on AI law caution the proposal might be unconstitutional on free speech grounds.
The lead author, Democratic Sen. Erin Maye Quade, said additional restrictions are necessary because AI technology has advanced so rapidly. Her bill would require the operators of “nudification” sites and apps to turn them off to people in Minnesota or face civil penalties up to $500,000 “for each unlawful access, download, or use.” Developers would need to figure out how to turn off the function for Minnesota users.
It’s not just the dissemination that’s harmful to victims, she said. It’s the fact that these images exist at all.
Kelley told reporters last month that anyone can quickly create “hyper-realistic nude images or pornographic video” in minutes.