My rack is finished for now (because I’m out of money).

Last time I posted I had some jank cables running through the rack, and now we’re using patch panels with color-coordinated cables!

But as is tradition, I’m already thinking about upgrades and I’m eyeing that 1U filler panel. A mini PC with a 5060 Ti 16GB or maybe a 5070 12GB would be pretty sick for moving my AI slop generation into my tiny rack.

I’m also thinking about the Pi cluster at the top. Currently that’s running a Kubernetes cluster that I’m trying to learn on. They’re all Pi 4 4GB, so I was going to start replacing them with Pi 5 8/16GB. Would those be better price/performance for mostly coding tasks? Or maybe a Discord bot for shitposting (rough sketch below).
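If I go the shitpost bot route, the bot itself would be something tiny like this running on the cluster (a minimal discord.py sketch; the token comes from the environment and the trigger word is just a placeholder):

```python
# shitpost_bot.py - minimal discord.py sketch; trigger word is a placeholder
import os
import discord

intents = discord.Intents.default()
intents.message_content = True  # needed to read message text
client = discord.Client(intents=intents)

@client.event
async def on_ready():
    print(f"Logged in as {client.user}")

@client.event
async def on_message(message):
    if message.author == client.user:
        return  # don't reply to ourselves
    if "ping" in message.content.lower():
        await message.channel.send("pong (from the Pi cluster)")

client.run(os.environ["DISCORD_TOKEN"])  # token would come from a k8s Secret
```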

Thoughts? Mini PC recs? Wanna bully me for using AI? Please do!

  • Melvin_Ferd@lemmy.world · 2 days ago

    The AI hate is overwhelming at times. This is great. What kind of things are you doing with it?

    • nagaram@startrek.website (OP) · 1 day ago

      Not much. As much as I like LLMs, I don’t trust them for more than rubber duck duty.

      Eventually I want a “Copilot at Home” setup where I can feed it a notes database and whatever manuals and books I’ve read, so it can draw on those when I ask it questions.
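      Roughly what I’m picturing, assuming Ollama running locally and ChromaDB for the notes index (the model name and the example notes are just placeholders):

      ```python
      # copilot_at_home.py - rough RAG sketch; assumes Ollama is running locally
      # and chromadb is installed. Model name and notes are placeholders.
      import requests
      import chromadb

      # Index some notes (in reality this would walk the notes database / manuals)
      chroma = chromadb.Client()
      notes = chroma.create_collection("notes")
      notes.add(
          ids=["note1", "note2"],
          documents=[
              "The rack uses patch panels with color coordinated cables.",
              "The Pi cluster runs Kubernetes on Pi 4 4GB nodes.",
          ],
      )

      def ask(question: str) -> str:
          # Pull the most relevant notes, then hand them to the model as context
          hits = notes.query(query_texts=[question], n_results=2)
          context = "\n".join(hits["documents"][0])
          prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
          r = requests.post(
              "http://localhost:11434/api/generate",
              json={"model": "qwen2.5:14b", "prompt": prompt, "stream": False},
              timeout=300,
          )
          return r.json()["response"]

      print(ask("What is the Pi cluster running?"))
      ```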

      The problem is that my best GPU is my gaming GPU, a 5060 Ti, and it’s in a Bazzite gaming PC, so it’s hard to get the AI workloads onto it because of Bazzite’s “No, I won’t let you break your computer” philosophy, which is exactly why I chose it. My second-best GPU is a 3060 12GB, which is really good, but if I built a dedicated AI server, I’d want it to be better than my current server.

      • mierdabird@lemmy.dbzer0.com · 1 day ago (edited)

        I’m actually right there with you, I have a 3060 12GB and tbh I think it’s the absolute most cost-effective GPU option for home use right now. You can run 14B models at a very reasonable pace.
        Doubling or tripling the cost and power draw just to get 16-24GB doesn’t seem worth it to me. If you really want an AI-optimized box, I think something with the new Ryzen AI Max chips would be the way to go - like the ASUS ROG Flow Z13, Framework Desktop, or the GMKtec option, whatever it’s called. Apple’s new Mac Minis are also great options. Both Ryzen AI Max and Apple use shared CPU/GPU memory, so you can go up to 96GB+ at a much, much lower power draw.
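        For reference, here’s the back-of-the-envelope VRAM math on why 14B is about the ceiling for a 12GB card (assuming a ~4-bit quant; the overhead number is a rough guess):

        ```python
        # rough VRAM estimate for a 14B model on a 12GB card
        params = 14e9
        bytes_per_weight = 0.5      # ~4-bit quantization (Q4)
        weights_gb = params * bytes_per_weight / 1e9
        overhead_gb = 1.5           # rough allowance for KV cache + CUDA context
        print(f"~{weights_gb + overhead_gb:.1f} GB needed of 12 GB")  # ~8.5 GB
        ```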