• stoy@lemmy.zip · 1 day ago

    I’d much rather have a more powerful generic CPU than a less powerful generic CPU with an added NPU.

    Very few people would benefit from an added NPU. OK, I hear you say, what about local AI?

    Ok, what about it?

    Would you trust a commercial local AI tool to not be sharing data?

    Would your grandmother be able to install an open source AI tool?

    What about having enough RAM for the AI tool to run?

    Look at the average computer user: if you are on Lemmy, chances are very high that you are far more advanced than they are.

    I am talking about the users who don’t run an ad blocker, don’t notice the YT ad-skip button, and who in the past would have installed a minimum of five toolbars in IE yet wouldn’t have noticed the reduced view of the actual page.

    These people are closer to the average user than any of us are.

    Why do they need local AI?

        • unexposedhazard@discuss.tchncs.de · 22 hours ago

          The fact that I didn’t know about those means that consumers have zero need for them, and building them into consumer hardware is just an attempt to keep the AI bubble afloat.

      • stoy@lemmy.zip · 1 day ago

        Exactly!

    I could even see the cards having RAM slots, so you can add dedicated RAM for the NPU and remove the need to share RAM with the system.

    • biggerbogboy@sh.itjust.works · 20 hours ago

      There’s also the fact that many NPUs are pretty much useless unless used for a very specific model built for the hardware, so there’s no real point in having them.
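
      For example, in a runtime like ONNX Runtime an NPU is only reachable through a vendor-specific “execution provider” (QNN for Qualcomm, OpenVINO for Intel, and so on). A rough, hypothetical sketch of what that looks like, assuming onnxruntime is installed and “model.onnx” stands in for some real model file:

      ```python
      # Rough illustration only: whether the NPU gets used depends entirely on
      # the execution providers compiled into this onnxruntime build and on
      # whether they can handle this particular model. "model.onnx" is a
      # placeholder, not a real file.
      import numpy as np
      import onnxruntime as ort

      print("Providers built into this install:", ort.get_available_providers())

      # Ask for an NPU provider first (QNN = Qualcomm's NPU backend), with the
      # plain CPU provider as the fallback.
      session = ort.InferenceSession(
          "model.onnx",
          providers=["QNNExecutionProvider", "CPUExecutionProvider"],
      )

      # get_providers() shows what the session actually ended up using; if the
      # NPU backend is missing or can't run this model, only the CPU shows up.
      print("Providers actually in use:", session.get_providers())

      # One dummy inference; shape and dtype depend on the placeholder model.
      inp = session.get_inputs()[0]
      dummy = np.zeros([d if isinstance(d, int) else 1 for d in inp.shape],
                       dtype=np.float32)
      print(session.run(None, {inp.name: dummy})[0].shape)
      ```

      If that provider isn’t present, or can’t handle the model, the session quietly falls back to the CPU and the NPU silicon just sits there.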

    • tal@lemmy.today · edited · 1 day ago

      My understanding from a very brief skim of what Microsoft was doing with Copilot is that it takes screenshots constantly, runs image recognition on them, and then makes them searchable as text, with the ability to go back and view those screenshots in a timeline. Basically, adding more search without requiring application-level support.

      They may also have other things that they want to do, but that was at least one.

      EDIT: They specifically called that feature “Recall”, and it was apparently the “flagship” feature of Copilot.
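
      To make that concrete, here is a rough, hypothetical sketch of the “screenshot → OCR → searchable timeline” loop in Python. It assumes the third-party mss and pytesseract packages (plus a local Tesseract install) and a throwaway SQLite table; it only illustrates the concept, not how Recall is actually implemented.

      ```python
      # Hypothetical sketch of "constant screenshots, OCR, searchable timeline".
      # Assumes: pip install mss pillow pytesseract, plus a Tesseract install.
      import sqlite3
      import time
      from pathlib import Path

      import pytesseract           # OCR wrapper around Tesseract
      from mss import mss          # cross-platform screen capture
      from PIL import Image

      DB = sqlite3.connect("timeline.db")
      DB.execute("CREATE TABLE IF NOT EXISTS shots (ts REAL, path TEXT, text TEXT)")

      def capture_once(out_dir: Path) -> None:
          """Grab the screen, OCR it, and index the text under a timestamp."""
          ts = time.time()
          path = out_dir / f"{int(ts)}.png"
          with mss() as sct:
              sct.shot(output=str(path))             # save a full-screen capture
          text = pytesseract.image_to_string(Image.open(path))
          DB.execute("INSERT INTO shots VALUES (?, ?, ?)", (ts, str(path), text))
          DB.commit()

      def search(term: str):
          """Return (timestamp, screenshot path) rows whose OCR text mentions term."""
          return DB.execute("SELECT ts, path FROM shots WHERE text LIKE ?",
                            (f"%{term}%",)).fetchall()

      if __name__ == "__main__":
          out = Path("shots")
          out.mkdir(exist_ok=True)
          while True:                                # "constantly" = every 30 s here
              capture_once(out)
              time.sleep(30)
      ```

      Search is then just a text query over the OCR output, keyed back to the stored screenshots by timestamp, which is roughly the “search without application-level support” part.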

      • Spuddlesv2@lemmy.ca · 1 day ago

        Do you mean Copilot the local indexer and search tool, or do you mean Copilot the web-based AI chatbot, or do you mean Copilot the rebranded Office suite, or do you mean … etc.?

        Seriously, talk about watering down a brand name. Microsoft’s marketing team are all massive, massive fuck knuckles.

        • ozymandias117@lemmy.world · 21 hours ago

          Hey, the last one is great.

          Now when I get asked “what do you think about Copilot,” I can just say, “I prefer LibreOffice”

    • Endmaker@ani.social · 1 day ago

      an added NPU

      cmiiw but I don’t think NPUs are meant to be used on general-purpose personal computers. A GPU makes more sense.

      NPUs are meant for specialised equipment, e.g. object detection in a camera (not the personal-use kind).

      • vithigar@lemmy.ca · 1 day ago

        They are in general-purpose PCs, though. Intel has them taking up die space in a bunch of its recent Core Ultra processors.

      • altkey (he\him)@lemmy.dbzer0.com · 1 day ago

        Probably not even general-purpose GPUs, although we sucked it up when RT and tensor cores were put on our plates whether we liked it or not. Those, though, at least provided something to the consumer, unlike NPUs.