• Dave@lemmy.nz · 1 day ago

      Ah, this is a different risk than I thought was being implied.

      This is saying that if a doctor relies on AI to assess results, they lose the skill of finding them on their own.

      Honestly, this could go either way. Maybe it’s bad, but if machine learning can outperform doctors, then it could just be a “you won’t be carrying a calculator around with you your whole life” type of situation.

      ETA: there’s a book, Noise: A Flaw in Human Judgment, that details how wherever you have human judgement you get a wide range of results for the same thing, and generally this is bad. If machine learning is more consistent, the average standard of care is likely to rise.