• moroninahurry@piefed.social · 12 hours ago

    Laws like this are great for these companies. This is how they will justify removing access to useful information and putting it behind paywalls. But oh, you need a prescription, so now the insurance companies are involved (spoiler: they already are), and so you don’t even have the option to pay through the nose for medical information.

    Then when Google search has been completely replaced with AI, you won’t even be able to search for medical information.

    Healthcare companies aren’t about to provide anything for free.

    • Routhinator@startrek.website · 10 hours ago

      Most of the medical information coming up these days is garbage and you should be going to a known, reputable site and searching their database. LLMs have been trained on absolute garbage. There is nothing of value being kept from anyone here.

    • Soup@lemmy.world · 9 hours ago

      LLMs and chatbots should not be giving medical advice. You are afraid of the private healthcare system, not the lack of access to the most janky bandaid fix for its failures.

      • moroninahurry@piefed.social · 8 hours ago

        Neither should Wikipedia or Google. So I guess by your logic nobody should search or learn about medical conditions on a computer.

        • Soup@lemmy.world · 7 hours ago

          You know damn well there’s an important difference related to the confidence of a bot that has been a key problem since this whole thing started.

      • douglasg14b@lemmy.world · 9 hours ago

        The line between medical advice and personal research is pretty freaking gray, so if we ban medical advice, does that also ban talking to LLMs about anything that is medical-adjacent?

        Does medical-adjacent mean personal disabilities? Drug-related interests? Pet health?

        …etc

        It’s a slippery slope and we don’t need to be sliding down it

        • moroninahurry@piefed.social · 8 hours ago (edited)

          People are so vicious about this tech that they would rather have poor, disabled people with cancer suffer and die under inadequate care than do anything about that inadequate care. Ban the tech, but let all of this go on.

          If you are perfectly able and well, you can ignore all advice that isn’t perfect.

          The perspective they seem to lack is frightening. The empathy they refuse to extend is massive. This is ableism.

          Tech companies are bad, but use of tech will cure and ease cancer, HIV, and chronic disease. Bring on the downvotes.

          • Soup@lemmy.world · 7 hours ago

            “Would rather have disabled people with cancer suffer and die…”

            My guy, that’s not a lack of LLM access; it’s a completely fucked US healthcare system that forces people onto the internet because they can’t get what they need from the state, you goofy-ass weirdo.