ST. PAUL — A Minnesota Senate committee heard a bill Wednesday that would prohibit technology companies from allowing software that is used to create AI-generated nude photos to be downloaded or accessed by users.
The bill, SF 1119, would require companies and company owners in Minnesota to turn off the “nudification function” that allows users to generate pornographic and nude images of non-consenting individuals through AI technology.
The chief author of the bill, Sen. Erin Maye Quade, DFL-Apple Valley, said the technology is readily accessible through widely available apps.
“These apps are available on every cellphone for any person of any age, on every computer, and all downloadable on the app store,” Quade said Wednesday, Feb. 19, during a hearing of the Judiciary and Public Safety Committee. “It’s creating problems, especially for kids.”
Testifier Megan Hurley spoke Wednesday about an experience she had in June 2024 with deepfake technology.
“A man I had known for 20 years, and at one point considered a dear friend of mine, had used a picture I posted to my private Facebook page, in an easily accessible website, to create hyper-realistic nude images and pornographic videos of myself and around 80 other women,” Hurley said. “This has created irreversible harm to me and these other women, and I cannot overstate the damage this technology has done.”
According to the current version of the bill, an individual victimized by a violation would be able to sue for "mental anguish or suffering," and a violator could be forced to pay a fine of up to $500,000.
Sen. Maye Quade introduced the bill Feb. 6, and it was referred to the Commerce and Consumer Protection Committee. It was later re-referred to the Senate Committee on Judiciary and Public Safety.
After making several amendments to the language of the bill, the committee tabled it for potential additional edits at a later date.
If passed, the bill would go into effect Aug. 1, 2025.
A bill passed in 2023, also authored by Maye Quade, targeted users who share this type of content by banning the dissemination of sexual deepfake images.