In addition to making people stupid, I wonder what affect will LLMs like Claude will have on programmers? How will new programmers learn if companies start using Claude?

  • iglou@programming.dev
    link
    fedilink
    English
    arrow-up
    1
    ·
    edit-2
    54 minutes ago

    Again an article that draws the wrong conclusions.

    No, it does not make people stupider. It makes people lazier. Just. Like. All. Tech.

    How many of us do research in libraries rather than on the internet these days? Back when internet became popular there were similar criticisms to what we have today on AI.

    Essays are AI generated, show poor critical thinking, and you can tell? Great, grade it like what it is. A piss poor work. Just like someone who would copy a wikipedia article 15 years ago would be graded like shit, perhaps even considered cheating and given a 0 (or F or whatever is the worst grade in your system)

    If you can’t tell, then the tool was properly used. If you can’t tell the difference between an AI generated essay and a human-made essay, then perhaps essays are no longer good tests of someone’s abilities.

    Rather than pushing back against a tech that is probably never going away, even when the bubble pops, how about we start thinking productively and adapt how we learn, evaluate, and work instead?

  • LedgeDrop@lemmy.zip
    link
    fedilink
    English
    arrow-up
    3
    ·
    5 hours ago

    Has anyone found an effective way to pair-up and “learn” the syntax faster/better compared to not using AI?

    I’ve written a lot of code in the past, but recently started doing more with golang… and have been using AI for an assist, but at the end of the day (and enough reiterations) - it creates readable and maintainable code. But (unfortunately), I don’t think I could rewrite it.

    I was contemplating seeing how I could change my workflow, so I’d write the code, but AI would offer fast guidance.

    • StenSaksTapir@feddit.dk
      link
      fedilink
      English
      arrow-up
      1
      ·
      1 hour ago

      invest more time before you ask for help. Same as when you’re working with a person.

      Also, add instructions to your agent to use a Socratic question teaching mode.

      You need to force yourself to think, else you won’t learn.

  • nonentity@sh.itjust.works
    link
    fedilink
    English
    arrow-up
    3
    ·
    6 hours ago

    s/could\ be/are/

    They’re lobotomising tools. Vibe coding is just shoving one end of an ice pick up your nose, setting the other on a keyboard, and replacing its handle with a mains powered personal massager.

  • leftzero@lemmy.dbzer0.com
    link
    fedilink
    English
    arrow-up
    4
    ·
    7 hours ago

    I have no doubt “AI” companies are sitting on studies proving their shit causes irreversible brain damage, much like tobacco cartels used to sit on studies proving their shit caused cancer.

    By the time the bubble pops and their shit gets properly regulated it’ll have crippled a whole generation (on top of all the other damage like destroying the Internet, causing unfathomable damage to science culture, and society in general, and infecting any information produced after this shit became commonly used).

    I have very little hope for our civilization being able to survive this self inflicted disaster (and given how we’ve squandered natural resources and caused a runaway greenhouse effect that’ll make our world mostly uninhabitable for humans without massive industrial effort that will be impossible after our fall, no new civilization will be rising after this dark age). But hey, at least some sociopath CEOs will have made a lot of money out of it. Who cares if they murdered the future for their short term profit.

  • Em Adespoton@lemmy.ca
    link
    fedilink
    English
    arrow-up
    2
    ·
    6 hours ago

    I wonder… are Google and Bing search indexes being intentionally left to moulder specifically to drive people to Gemini and ChatGPT?

  • minorkeys@sh.itjust.works
    link
    fedilink
    English
    arrow-up
    1
    ·
    5 hours ago

    Substituting your own information syntheses, memory and deduction Willa trophy this faculties, leading to people who can’t think. Everything their mind needs to complete thought, particularly complex thoughts, connect to an external device. Their thoughts will resemble Swiss cheese, partial ideas with numerous large gaps.

  • BaroqueInMind@piefed.social
    link
    fedilink
    English
    arrow-up
    17
    arrow-down
    6
    ·
    9 hours ago

    If any of you actually read the article, they only tested 54 college students with writing a fucking essay. This is also undergoing a limited peer review.

    Ultimately tells you no results that reflect reality, and only provides justification for certain people’s feelings who were anti-AI already

    • themachinestops@lemmy.dbzer0.comOP
      link
      fedilink
      English
      arrow-up
      2
      ·
      1 hour ago

      The study might not be through, but it is pretty obvious if you don’t use your brain your are going to lose your skills. Even without the study this should be obvious, the more you practice your skill the better you get if you rely on Chatgpt you will not get better and might get worse.

    • givesomefucks@lemmy.world
      link
      fedilink
      English
      arrow-up
      7
      arrow-down
      3
      ·
      9 hours ago

      The sun still rose every day before we knew the planet spins…

      Tobacco still caused cancer before the studies came out…

      If you’ve studied biopsychology, you know what happens to any offloading of a cognitive function.

      That it’s being handed off specifically to AI, just doesn’t matter in the slightest. Because the offloading itself is what causes the atrophy.

      The issue here, is what is being offloaded is critical thinking…

      Which makes it incredibly difficult to explain what is happening to someone who is experiencing it, somewhat like Alzheimer’s.

      People reliant on chatbots to do their critical thinking, simply don’t have the critical thinking to understand the problem. The only way to get them out of it, is making them go cold turkey like with drugs. And eventually the brain will begrudgingly start doing critical thinking again, but it’s gonna take a while, because offloading cognitive tasks from the conscious mind is literally why humans are the dominant species.

      It’s why it takes so little time for people to become reliant on it.

  • givesomefucks@lemmy.world
    link
    fedilink
    English
    arrow-up
    42
    arrow-down
    3
    ·
    12 hours ago

    The vast amount of people don’t understand how their brain works…

    What we think of “us” isn’t our brains, it’s just our consciousness. And that’s just a middle manager that’s getting all types of shit thrown at it.

    Our consciousness can’t tell the difference between the prefrontal lobe handling something, or a laptop with a chatbot open.

    It just takes the input and processes it.

    When we throw stuff to an AI, the part of our brain that normally handles it, just starts doing other stuff.

    If you don’t have the AI, your prefontal lobe doesn’t want to take the old stuff back, it’s already got its plate full with the new stuff it picked up.

    Your consciousness knows the chatbot can puke out an answer, so when your prefrontal love won’t/can’t do it, you just got hyper focused on getting access to the chatbot.

    It’s “making people stupider” but the real problem is it’s abusing how every mammals brain has worked for millions of years. It’s not something people can resist,bits the brain as a whole working as intended. We just didn’t evolve for something that at any moment could become prohibitively expensive.

    Think of how Uber was cheap till people needed it.

    If people get hooked on cheap AI, they’re not gonna be able to survive without it and will pay anything. I think this is why its pushed on coders so hard, they want everyone to use it so everyone becomes dependent on it. Instead of paying for 4-8 years for a degree, people will have to pay monthly for an AI just to earn a living

    That’s the end goal of the techbros. No one being able to work unless they pay for AI.

    • Em Adespoton@lemmy.ca
      link
      fedilink
      English
      arrow-up
      1
      ·
      6 hours ago

      It’s pushed on coders because it gives every developer a team of never sleeping junior devs for a fraction of the price.

      And if the competition is doing it, you won’t compete unless you do it too. Until the price matches that team of junior coders.

      • hikaru755@lemmy.world
        link
        fedilink
        English
        arrow-up
        1
        ·
        3 hours ago

        a team of never sleeping junior devs

        As a senior dev, that sounds like my worst nightmare tbh

    • classic@fedia.io
      link
      fedilink
      arrow-up
      7
      ·
      11 hours ago

      Asking from a place of agreement, curious if you have any readings to suggest, on that impact to the brain. Always looking for solid content to send along to others

      (beyond this article and the MIT research it cites)

      • givesomefucks@lemmy.world
        link
        fedilink
        English
        arrow-up
        6
        arrow-down
        1
        ·
        edit-2
        11 hours ago

        on that impact to the brain.

        I mean, I pulled a whole bunch of stuff together in that comment, I’d be shocked if any source existed that touched on every part.

        As far as “us” delegating tasks to other parts of the brain, this looks pretty good:

        The driving force behind human brain evolution

        Although many species can transfer behavior from volitional to habitual function (Poldrack et al., 2005; Barton, 2007; Seger and Spiering, 2011; Krubitzer and Seelke, 2012; Barton and Venditti, 2013), the shift from quadrupedal to bipedal locomotion nonetheless may have been a powerful driver for the rapid elaboration of the distinctively human “delegation” mode of information processing. Bipedality is rare in mammals, seen commonly only in humans and in some apes (Hardman et al., 2002; Alexander, 2004; Doyon et al., 2009). Although bipedality plausibly affords a number of adaptive advantages (e.g., it facilitates surveillance in densely vegetated areas, and frees the arms for other tasks Carrier, 2011), it also imposes a massive information-processing challenge. Compared to the stability conferred by quadrupedal locomotion, a bipedal organism rests its body mass on only two support points. This inherently unstable posture means that even a tiny shift in position will cause a fall, unless the animal instantly detects and responds to that change. Presumably for this reason, quadrupedal animals that resort to bipedality for surveillance typically do so only briefly, or in highly stereotyped poses (as is the case with meerkats). Moving about while bipedal poses extraordinary challenges, whereby the individual must constantly respond to ever-changing subtle shifts in weight distribution (Preuschoft, 2004), reducing its ability to attend to other aspects of its environment (such as the detection of food sources or approaching predators).

        Despite these challenges, adult humans spend little time consciously thinking about maintaining their balance as they move around, except when placed in a challenging circumstance, such as walking on a narrow beam or when leaving a pub. The means of achieving that liberation is very clear as one watches a young child learning to walk. This is a long process, with every step initially requiring full concentration. Through time, however, the skills develop as control over fine motor movements improves–and full concentration on movement is no longer needed as the tasks involved become “automatized” and are delegated to other parts of the brain, such as the basal ganglia (Poldrack et al., 2005; Ashby et al., 2010; Seger and Spiering, 2011; Sepulcre et al., 2012) and the cerebellum (Duncan, 2001; Desmurget and Turner, 2010; Balsters and Ramnani, 2011; Callu et al., 2013). Plausibly, then, the adoption of bipedalism in proto-humans posed a strong selective advantage for individuals with brains capable of using their full processing power to learn bipedalism, but that were also able to delegate the basic tasks of walking and running to “lower” neural centers, freeing up the higher segments for detecting unpredictable opportunities and challenges (be they related to predators, food, or social cues), and rapidly responding to that information.

        In summary, we suggest that (1) the ability to delegate routine tasks from the cortex to other parts of the brain is more highly developed in humans than other species; and (2) that elaboration arose during our evolutionary history because the computational challenges associated with balancing on two legs enhanced individual fitness in proto-humans who were capable of transferring the control of routine tasks in this way. To this we can add (3) that once this “delegation” mode of neural functioning had evolved, it was co-opted for many other cognitive tasks–essentially, liberating the cortex to deal with novel unpredictable events.

        https://pmc.ncbi.nlm.nih.gov/articles/PMC4010745/

        Although to be upfront I didn’t take the time to read the whole study, I just skimmed it. I was already aware of how this works from school and just searched real quick for a source

        But that study starts out assuming what pushed us to delegation forward brains was how fucking hard it was to stand on two feet without a giant tail. And once we got good at delegating that away from conscious thinking, why wouldn’t we keep delegating everything else as long as there isn’t an immediate negative consequence?

        • classic@fedia.io
          link
          fedilink
          arrow-up
          1
          ·
          4 hours ago

          Thank you. That was a cool read. We squander this amazing organ. I’m with you that it’s hard to find all these concepts in one place. Sapolsky’s lectures captures some of it.

    • WhoIzDisIz@lemmy.today
      link
      fedilink
      English
      arrow-up
      1
      ·
      edit-2
      4 hours ago

      If you’re relying on AI, well then…

      (Yes, I know they can have their uses if done properly, but for too many that’s a HUGE “if”.)

  • mermella@piefed.social
    link
    fedilink
    English
    arrow-up
    5
    arrow-down
    1
    ·
    9 hours ago

    If anything, it increased my threshold for complexity and I’m tits deep in some very cool projects. The concern is they are going to have to make it super expensive if they can’t get any meaningful efficiency gains.

    • givesomefucks@lemmy.world
      link
      fedilink
      English
      arrow-up
      1
      ·
      9 hours ago

      The concern is they are going to have to make it super expensive if they can’t get any meaningful efficiency gains.

      They’re 100% going to make it more expensive regardless…

      Like, they’re pushing it on coders like crack dealers give out their first rock.

      Like, you just said you can do things with it you couldn’t without it. How long until you can’t do what you could before without it?

      Have you tried lately? Not a guess of what it would be like if you tried. Actually trying to code without it. If you haven’t, you’re going to be shocked how hard it is to resist, and how bad you are at it if you manage to do it manual.

      It’s going to be cheap till everyone is hooked, till they’ve gotten a promotion using AI, or forgot how to work without it.

      When the choice is to send half your paycheck to the AI or get fired, a lot of people are going to sign over half their checks, just for the health benefits of employment.

      They don’t have to replace humans with AI, they know they can’t do that.

      But they absolutely can trick people (at least coders) into be coming reliant on something that can increase in price a thousand fold overnight.

      C’mon bro, think of Uber or any “disruptive” tech, there’s always what they say they want, and the actual goal that would have stopped anyone from using it to begin with. This shouldn’t need pointed out to people “in tech”

  • Franconian_Nomad@feddit.org
    link
    fedilink
    English
    arrow-up
    2
    arrow-down
    1
    ·
    7 hours ago

    They can also make you smarter if you use them right. Key is to use local models and not giving the techbros any money.

    • StarryPhoenix97@lemmy.world
      link
      fedilink
      English
      arrow-up
      4
      ·
      edit-2
      6 hours ago

      That’s on my to-do list. I’m currently reworking my entire build because I realized I had enough last generation parts to build a media server. Once I have windows set up to only run on VM and get my stuff moved and backed up I’m going to install an LLM