• wrinkle2409@lemmy.cafe · 1 hour ago

    People have been “nudifying” photos since Photoshop. This is entirely theatrical and won’t really keep it from happening; it will just keep it from appearing as easily in public.

  • spizzat2@lemmy.zip · 6 hours ago

    Under the law, developers of websites, apps, software, or other services designed to “nudify” images risk extensive damages[…]

    I’m not going to take the time to read the language of the law, but I worry that “designed to” could give developers a lot of deniability.

    “No, see… My app is designed to show you what you look like in user-created outfits. Like a virtual closet mirror! What do you mean users are trying on tiny bikinis and clear cellophane dresses? How could I ever have planned for that?”

    Still, a good step in the right direction, I suppose.

    • AmbitiousProcess (they/them)@piefed.social · 5 hours ago

      (I’m citing the law, not the article)

      There are a few things in it that I think help prevent something like that from happening.

      “Nudify” or “nudified” means the process by which: an image or video is altered or generated to depict an intimate part not depicted in an original unaltered image or video of an identifiable individual

      “Intimate parts” includes the primary genital area, groin, inner thigh, buttocks, or breast of a human being.

      So a reasonably sized bikini probably wouldn’t qualify, because it still covers intimate areas to some degree, but anything too skimpy would.

      The prohibitions in subdivision 2 do not apply when the website, application, software, program, or other service requires the technical skill of a user to nudify an image or video.

      So something like Photoshop wouldn’t qualify because you’d need the skills to actually edit images yourself.

      I think this:

      “No, see… My app is designed to show you what you look like in user-created outfits. Like a virtual closet mirror! What do you mean users are trying on tiny bikinis and clear cellophane dresses? How could I ever have planned for that?”

      would be prevented by this law, and with very good reason. Anyone developing a feature like that could develop a filter that detects when a sensitive area is exposed that wasn’t exposed in the original image. If they put technical safeguards like that in place, and it takes a reasonably large amount of effort for a user to bypass them, then the site wouldn’t be liable, because circumventing them would require the “technical skill of a user”.

      A site like that can exist, and being able to digitally try on outfits is nice, but it shouldn’t be allowed to ignore the obvious consequences of not putting restrictions on how much skin can be shown.
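      As a sketch of the kind of safeguard described above (the function names, region labels, and threshold are hypothetical, not from the law or any real service): a service could compare how many flagged regions a classifier marks as exposed before and after an edit, and block any edit that newly exposes a region that was covered in the original.

```python
# Hypothetical before/after safeguard check. In a real service the
# region -> exposed mapping would come from an image classifier; here
# it is passed in directly so the gatekeeping logic is self-contained.

def exposed_fraction(regions):
    """Fraction of flagged intimate regions marked as exposed.

    `regions` maps a region label (e.g. "groin") to a bool: True if a
    (hypothetical) classifier says that region is exposed in the image.
    """
    if not regions:
        return 0.0
    return sum(regions.values()) / len(regions)

def edit_allowed(original_regions, edited_regions, max_increase=0.0):
    """Allow the edit only if it does not newly expose any flagged
    region that was covered in the original image."""
    delta = exposed_fraction(edited_regions) - exposed_fraction(original_regions)
    return delta <= max_increase

# An outfit swap that keeps all flagged regions covered passes:
print(edit_allowed({"groin": False, "chest": False},
                   {"groin": False, "chest": False}))  # True

# An edit that newly exposes a previously covered region is blocked:
print(edit_allowed({"groin": False, "chest": False},
                   {"groin": True, "chest": False}))   # False
```

      The classifier itself is the hard part; the point is only that the before/after comparison is what makes the restriction enforceable.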

  • GreenKnight23@lemmy.world · 7 hours ago

    instead of this, how about a tax on companies that replace employees with AI?

    or how about a tax on products that come from companies that use AI?

    it’s “impossible” to tell if the product benefited from research that came from the state, so all states should benefit from it, no?

          • AbsolutelyNotAVelociraptor@piefed.social · 15 hours ago

            And this company’s revenue is in the billions per year. BILLIONS. A $500k fine is not even spare change for them; it’s literally a rounding error. It’s as if, when you go buy groceries, they round a $0.95 price up to $1.

            And that’s for a fucking CSAM picture.

            • becausechemistry@piefed.social · 14 hours ago

              Let’s say a user generates one image per minute. At $500k per image, doing that for an hour is $30M. Doing that hour once a day for a month is almost a billion dollars.

              These services have hundreds of millions of users. Collectively, if they don’t stop, the fines will be total-world-GDP levels in days or weeks.
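              Spelling out the arithmetic behind those figures (the $500k-per-image fine is the number cited earlier in the thread, not a quote from the law text):

```python
# Back-of-envelope check of the fine totals above.
fine_per_image = 500_000             # USD per violating image (cited upthread)

one_hour = 60 * fine_per_image       # one image per minute, for an hour
print(f"${one_hour:,}")              # $30,000,000

one_month = 30 * one_hour            # that same hour, once a day, for 30 days
print(f"${one_month:,}")             # $900,000,000 -- "almost a billion"
```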

              • Pyr@lemmy.ca · 9 hours ago

                Unfortunately it’s unlikely a court will fine them based on a single person generating thousands of images. And it’s also unlikely they will fine them based on many people generating a few each, as it would be a slow and arduous process. I just can’t see it happening in reality.

              • AbsolutelyNotAVelociraptor@piefed.social · 14 hours ago

                They need to fear the first image generated, not the 1000th one.

                If a user generates one pic a minute, they generate 1,440 pics in a day. That’s potentially 1,440 people hurt by a single asshole in a day.

                I don’t care how much it will cost Xitter; I care about how much those people will suffer.

                Xitter needs to be worried about a single user generating one image, not about a bunch of assholes generating thousands or even a million images, because for each image there’s gonna be a person (or more) suffering.

                • becausechemistry@piefed.social · 14 hours ago

                  Obviously yes, one image is too many. But companies only care about money, and if their money is going to evaporate because of their stupid technology, they’ll figure out a way to shut it off before it costs them.

                  The alternative is the status quo, which is that there are no consequences so there are no changes. Or maybe you make the fine ten quadrillion dollars for the first image generated, which is never gonna fly.