• BarrelAgedBoredom@lemmy.zip · 19 hours ago

    I may only be in a respiratory therapy program, but I was an EMT for 10 years before that. If that experience is worth anything, I'd say verifying information before making a clinical decision is a far more important habit to build than memorizing two obscure values for a test (values you'll almost certainly have forgotten by the time you're a licensed physician).

    An AI study guide is liable to make mistakes, but the bigger problem here is a prospective physician who can't be bothered to make sure they have the correct information before acting on it. Ditto for the lawyers or researchers relying on AI to do the work for them (an inappropriate use of AI imo). Throwing a practice test together and drafting legal paperwork or writing an academic piece are planets apart.

    • medgremlin@midwest.social · 18 hours ago

      The AI short-circuits the process of critical thinking, though. I make my own review notebooks for my boards and clinical rotations by taking the time to figure out what's important and what I don't yet know, and putting those things in the notebook. I write these out by hand on paper, so I have to be judicious about what will actually matter, and just the act of setting those priorities gives me a better understanding of my own deficiencies.

      Making a good study guide requires critical thinking skills, and if that work gets outsourced to AI, the critical thinking isn't being done by the person who needs to learn it.