• AbsolutelyNotAVelociraptor@piefed.social
    16 hours ago

    Yeah, but let’s get to the worst-case scenario: a bunch of assholes start generating images like crazy. They manage to create millions of images of innocent people. So many images that xitter can’t pay the fine, gets dissolved, and that’s all.

    Meanwhile, millions of people have pornographic deepfakes of them on the internet, FOREVER.

    No. We can’t wait for that to happen, xitter needs to close the technology before the first image puts them in trouble.

    • AmbitiousProcess (they/them)@piefed.social
      7 hours ago

      We can’t wait for that to happen, xitter needs to close the technology before the first image puts them in trouble.

      That is why this bill imposes heavy fines. No company has a good reason to operate a tool like this if every single nude it generates can cost them half a million dollars.

      This bill isn’t a “you can do this but we’ll give you a slap on the wrist each time” bill; it’s a “if you build a tool like this and let it loose, your company goes bankrupt and we’re taking your life savings, so you’d better not” bill.