People have been “nudifying” photos since Photoshop. This is entirely theatrical and won’t really keep it from happening; it’ll just keep it from appearing as easily in public.
As opposed to real AI nudes?
Under the law, developers of websites, apps, software, or other services designed to “nudify” images risk extensive damages[…]
I’m not going to take the time to read the language of the law, but I worry that “designed to” could give developers a lot of deniability.
“No, see… My app is designed to show you what you look like in user-created outfits. Like a virtual closet mirror! What do you mean users are trying on tiny bikinis and clear cellophane dresses? How could I ever have planned for that?”
Still, a good step in the right direction, I suppose.
(I’m citing the law, not the article)
There are a few things that I think help prevent something like that from happening.
“Nudify” or “nudified” means the process by which: an image or video is altered or generated to depict an intimate part not depicted in an original unaltered image or video of an identifiable individual
“Intimate parts” includes the primary genital area, groin, inner thigh, buttocks, or breast of a human being.
So a reasonably sized bikini probably wouldn’t qualify, because it still covers intimate areas to some degree, but anything too skimpy would.
The prohibitions in subdivision 2 do not apply when the website, application, software, program, or other service requires the technical skill of a user to nudify an image or video.
So something like Photoshop wouldn’t qualify because you’d need the skills to actually edit images yourself.
I think this:
“No, see… My app is designed to show you what you look like in user-created outfits. Like a virtual closet mirror! What do you mean users are trying on tiny bikinis and clear cellophane dresses? How could I ever have planned for that?”
Would be prevented by this law, and with very good reason. Anyone developing a feature like that could simply build a filter that detects when a sensitive area is exposed that wasn’t exposed in the original image. If they put technical safeguards in place and it takes a reasonably large amount of effort for a user to bypass them, then the site wouldn’t be liable, because nudifying an image would require the “technical skill of a user”.
A site like that can exist, and being able to digitally try on outfits is nice, but it shouldn’t be allowed to ignore the obvious consequences of not putting restrictions on how much skin can be shown.
I hope they can actually get the money.
instead of this, how about a tax on companies that replace employees with AI?
or how about a tax on products that come from companies that use AI?
it’s “impossible” to tell if the product benefited from research that came from the state, so all states should benefit from it, no?
Way to go, Minnesota! 👏
500k fines? lmao. So just a rounding error?
Did you read the article? It’s 500k per nude.
So just a rounding error?
Grok generated about 6,700 undressed images per hour.
That would be 3.35 BILLION dollars per hour if the maximum 500k fine was enforced for each.
These things are nonconsensual sexual image machine guns, dude.
And this company’s revenue is in the billions per year. BILLIONS. A 500k fine is not even spare change for them; it’s literally a rounding error. It’s as if when you go buy groceries, they round a $0.95 price up to $1.
And that’s for a fucking CSAM picture.
Let’s say a user generates one image per minute. If they do that for an hour, that’s 60 images, or $30M in fines. If they did that for an hour once a day for a month, that’s almost a billion dollars.
These services have hundreds of millions of users. Collectively, if they don’t stop, the fines will be total-world-GDP levels in days or weeks.
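For anyone who wants to sanity-check those numbers, here’s the back-of-envelope math in a few lines of Python. It assumes, purely for illustration, that the maximum $500k were awarded for every single image, which a court wouldn’t necessarily do:

```python
# Illustrative back-of-envelope math only; assumes the statutory maximum
# of $500k is awarded per image, which is not guaranteed in practice.

FINE_PER_IMAGE = 500_000  # USD, maximum per nonconsensual image

# Scenario 1: the reported ~6,700 images per hour from a single service
images_per_hour = 6_700
print(f"Service-wide, per hour: ${images_per_hour * FINE_PER_IMAGE:,}")  # $3,350,000,000

# Scenario 2: one user generating one image per minute
per_user_hour = 60 * FINE_PER_IMAGE        # $30,000,000 for one hour
per_user_month = per_user_hour * 30        # ~$900,000,000 if repeated daily for a month
print(f"One user, one hour: ${per_user_hour:,}")
print(f"One user, an hour a day for 30 days: ${per_user_month:,}")
```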
Unfortunately it’s unlikely a court will fine them based on a single person generating thousands of images. And it’s also unlikely they will fine them based on many people generating a few each, as it would be a slow and arduous process. I just can’t see it happening in reality.
They need to fear the first image generated, not the 1000th one.
If a user generates 1 pic a minute, in a day they generate 1,440 pics; that means potentially 1,440 people hurt by a single asshole in a day.
I don’t care how much it will cost Xitter; I care about how much those people will suffer.
Xitter needs to be worried about a single user generating one image, not about a bunch of assholes generating thousands or even a million images, because for each image there’s gonna be a person (or more) suffering.
Obviously yes, one image is too many. But companies only care about money, and if their money is going to evaporate because of their stupid technology, they’ll figure out a way to shut it off before it costs them.
The alternative is the status quo, which is that there are no consequences so there are no changes. Or maybe you make the fine ten quadrillion dollars for the first image generated, which is never gonna fly.