• TommySoda@lemmy.world · 18 hours ago

    This isn’t sustainable. Almost all of our infrastructure runs on computers, and eventually we’ll reach a point where a computer in charge of vital infrastructure won’t be able to get replacement parts, and it’ll just fail.

    • SuperSpruce@lemmy.zip · 9 hours ago

      We used to get by with much less. If only we could start writing more efficient software again…

    • Samskara@sh.itjust.works · 9 hours ago

      There’s existing infrastructure that runs on hardware from the 1980s. Especially in industrial applications, there are still plenty of gigantic machines controlled by a 386 or a C-64.

      The used vintage market can keep these running for a long time. Eventually you replace them with an emulator or an FPGA that runs the same software.
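
      To make the emulator idea concrete, here’s a minimal sketch of the fetch-decode-execute loop every CPU emulator is built around. The opcodes and the toy program are invented for illustration; a real 6502 core (the C-64’s CPU) fills in the full documented instruction set the same way.

      ```python
      # Toy fetch-decode-execute loop: the skeleton of any CPU emulator.
      # Opcodes are invented for illustration, not real 6502 encodings.
      def run(memory, pc=0):
          acc = 0  # accumulator register
          while True:
              opcode = memory[pc]              # fetch
              if opcode == 0x00:               # halt
                  break
              elif opcode == 0x01:             # load immediate into accumulator
                  acc = memory[pc + 1]
                  pc += 2
              elif opcode == 0x02:             # add immediate, 8-bit wraparound
                  acc = (acc + memory[pc + 1]) & 0xFF
                  pc += 2
              else:
                  raise ValueError(f"unknown opcode {opcode:#x} at {pc:#x}")
          return acc

      # The original binary keeps running unchanged on whatever hosts the loop:
      print(run([0x01, 0x2A, 0x02, 0x08, 0x00]))  # load 42, add 8 -> prints 50
      ```

      An FPGA takes the same idea one level down: instead of software stepping through a loop, the original chip’s logic is recreated in hardware.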

      Big banking, insurance, airlines, shipping, governments, and militaries bought huge IBM mainframes from the 1960s onwards. They ran for decades. Many of those systems were later migrated onto virtual machines, still running their ancient FORTRAN code.

      There’s also the story of the (IIRC Minuteman) nuclear missile force needing 8-inch floppies in its command and control system. Those were still in use until 2019. Lots of military weapons systems run on ancient hardware.

      • frongt@lemmy.zip · 7 hours ago

        Banks, insurance, and aviation all run on very well-tested, established code, and are very, very resistant to change.

        But people who know COBOL and FORTRAN are getting fewer and farther between, so those systems are slowly changing. Fortunately, with modern software development practices, you can much more easily write verified software.
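
        As one concrete example (a hedged sketch, not aviation-grade formal verification): property-based testing tools like Python’s hypothesis check an invariant against thousands of generated inputs instead of a handful of hand-picked test cases. The interest function below is a toy stand-in for the kind of routine those old COBOL systems run.

        ```python
        # Property-based test: hypothesis generates many inputs and checks
        # that the stated invariant holds for all of them. Toy function.
        from hypothesis import given, strategies as st

        def monthly_interest(balance_cents: int, annual_rate_bp: int) -> int:
            """One month of simple interest, in cents; rate in basis points."""
            return balance_cents * annual_rate_bp // (10_000 * 12)

        @given(st.integers(min_value=0, max_value=10**12),  # balances to $10B
               st.integers(min_value=0, max_value=2_000))   # rates up to 20%
        def test_interest_is_sane(balance, rate):
            interest = monthly_interest(balance, rate)
            assert 0 <= interest <= balance  # never negative, never > balance
        ```

        Run it under pytest and hypothesis hunts for a counterexample automatically, which beats the handful of fixtures most legacy code ever got.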

    • imjustmsk@lemmy.world · 17 hours ago

      Nah, all of the datacenters they’re building for AI will come into use then.

      They’ll say: “Need computing? Don’t worry, just rent from us, for an ever-increasing and enshittifying subscription.”

      • TommySoda@lemmy.world · 16 hours ago

        We can’t even get them to upgrade our infrastructure to the 21st century in some cases, so good luck with that. We’ve still got shit running on Windows 7 or even Windows XP.

        • BritishJ@lemmy.world · 10 hours ago

          Windows 7? Don’t moan, it was the last good Windows. Plus all the themes and hacks you could get for XP. Times were good.

      • Link@rentadrunk.org · 14 hours ago

        You can’t interact with a computer in the cloud though without some kind of computer in front of you.

        • HereIAm@lemmy.world · edited · 13 hours ago

          We’ll just return to terminals. Just a screen and input devices connected to a server :(

          • Link@rentadrunk.org · 12 hours ago

            Right, but surely you still need a CPU and RAM, at the very least, to process the Remote Desktop connection.

            • HereIAm@lemmy.world · 12 hours ago

              A bit of RAM, but beyond that I’d imagine you only need some purpose-built chip for the connection, input, and display logic. Effectively you’d need little more than a Chromecast-like device.
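
              Something like this sketch, say: the client only draws whatever frames come back, so a tiny purpose-built SoC covers it. The sketch shows the display half (input forwarding would be a similar tiny loop in the other direction); the endpoint, port, and one-line handshake are made up for illustration, not a real RDP/VNC protocol.

              ```python
              # Sketch of a dumb thin client: draw frames received from a server.
              # Host, port, and framing here are hypothetical stand-ins.
              import socket

              HOST, PORT = "desktop.example.com", 5900  # hypothetical server

              def thin_client():
                  with socket.create_connection((HOST, PORT)) as conn:
                      conn.sendall(b"HELLO 1920x1080\n")  # announce display size
                      while True:
                          frame = conn.recv(65536)        # compressed screen update
                          if not frame:                   # server ended the session
                              break
                          draw(frame)                     # real computing stays remote

              def draw(frame: bytes) -> None:
                  ...  # hand pixels to the framebuffer; no app logic client-side
              ```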

              • ParlimentOfDoom@piefed.zip · 10 hours ago

                Chips of any kind are the issue. All the silicon fabs are being diverted to cover these insane datacenter orders. Like they’re backordered out over a year at this point. All to boost a tech bubble for a product that doesn’t work.

    • 8oow3291d@feddit.dk · 12 hours ago

      Stuff is just getting more expensive because of demand competition with AI. There is no reason to think that production for non-AI computing will ever hit literally zero.