• brucethemoose@lemmy.world · 11 hours ago

    Trying to run Borderlands at 4K sounds about as stupid to me as…

    On the contrary, it should be perfectly runnable at 4K: it’s a 2025 FPS game, and the cel-shaded graphics should be easy to render.

    ‘Unreal Engine’ is no excuse either. Try Satisfactory, which renders literally thousands of dynamic machines with Lumen, like butter, on 2020 GPUs and a shoestring budget, and tell me that’s a sluggish engine.

    This is on Gearbox, who’ve developed on Unreal for 2 decades. And ‘sorry, we’ll work on it’ would have been a fine response…

    • CheeseNoodle@lemmy.world · 11 hours ago

      My understanding is that Unreal is a garbage engine for optimization at face value: it has a lot of useful tools, a lot of incorrect/dated documentation for those tools, and some of them are just configured wrong in their default settings. If effort is put into configuring things correctly, and tools like Nanite or Lumen are only used in their actual use cases (rather than just thrown on everything), you can get some pretty fantastic optimization.

      TLDR: Good but complex tools marketed as low-effort, with bad defaults and exaggerated marketing.
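
      As a concrete sketch of what that per-project tuning means in practice: UE5 exposes these features as console variables, so a team can profile and scale them back where they don’t pay off. The cvar names and the console-variable API below are real UE5 ones, but the function is hypothetical and the values are purely illustrative, not anything Gearbox ships.

      // Sketch only: dial back expensive UE5 defaults for a scene that
      // doesn't benefit from them. Runs inside a UE5 project; not standalone.
      #include "HAL/IConsoleManager.h"

      static void UseCheaperLightingPath() // hypothetical helper, for illustration
      {
          // Dynamic GI method: 0 = none, 1 = Lumen (the UE5 default),
          // 2 = screen-space GI. Illustrative choice, not a recommendation.
          if (IConsoleVariable* GI = IConsoleManager::Get()
                  .FindConsoleVariable(TEXT("r.DynamicGlobalIlluminationMethod")))
          {
              GI->Set(2, ECVF_SetByGameSetting);
          }

          // Reflections: 1 = Lumen, 2 = cheaper screen-space reflections.
          if (IConsoleVariable* Refl = IConsoleManager::Get()
                  .FindConsoleVariable(TEXT("r.ReflectionMethod")))
          {
              Refl->Set(2, ECVF_SetByGameSetting);
          }
      }

      None of this is exotic; it’s a handful of settings, but someone has to actually profile the game and flip them instead of shipping the defaults.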

      • brucethemoose@lemmy.world · 11 hours ago

        Gearbox has developed on Unreal Engine since 2005. They have ~1,300 employees.

        I’m sorry, I know game dev is hard. But if small, new studios can get it to work, Gearbox should get it to fly. They have no excuse.

    • cerebralhawks@lemmy.dbzer0.com · 11 hours ago

      Right, because Cyberpunk looks amazing in 2025, and it ran like shit five years ago at launch.

      But 4K gaming is more demanding than people realise and gaming companies shouldn’t be getting so much flak for it.

      • acosmichippo@lemmy.world · 8 hours ago

        Why are we comparing a game from 5 years ago to one released today? Hardware is much more capable now.

        • paraphrand@lemmy.world · 7 hours ago

          Eh, I guess it is. But it also isn’t. We have stagnated a bit in the rasterization and VRAM departments when you talk about the affordable entry-level cards that most people buy.

          And the PlayStation 5 has not changed since it launched…

          Sure, newer cards have ray tracing hardware and new video encoder pipelines and some other things. But when you look at classic rasterization on xx60 and xx70 class cards, it’s not the sort of leap we used to get. Node shrinks are not what they used to be.

          A 3060 is about as fast as a 2070 Super, just as an example I have first-hand experience with. There used to be much larger performance gaps between generations.

          • zalgotext@sh.itjust.works · 3 hours ago

            You shouldn’t need leaps in hardware to render a highly stylized cel-shaded video game in the year 2025.

            • paraphrand@lemmy.world · 1 hour ago

              Yeah, but it’s not just a toon/cel shader.

              I’ve seen footage; this game has full (highly stylized) PBR and all sorts of fancy lighting along with the cel-shaded look. Reducing it all to “cel shading” isn’t the whole story.
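
              For context on why the classic look is cheap: a bare-bones cel shader basically just quantizes the diffuse lighting term into a few flat bands. Here’s a minimal standalone sketch of that math (my own illustration, with hypothetical function names, nothing to do with BL4’s actual shaders):

              // Sketch: classic cel/toon diffuse = Lambert diffuse snapped
              // to a few discrete bands instead of a smooth gradient.
              #include <algorithm>
              #include <cmath>
              #include <cstdio>

              struct Vec3 { float x, y, z; };

              static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

              // Smooth (Lambert) diffuse term: N·L clamped to zero.
              static float LambertDiffuse(Vec3 n, Vec3 l) { return std::max(0.0f, Dot(n, l)); }

              // Cel version: snap the same term to `bands` flat levels.
              static float CelDiffuse(Vec3 n, Vec3 l, int bands = 3)
              {
                  float d = LambertDiffuse(n, l);
                  return std::min(std::floor(d * bands) / bands, 1.0f); // 0, 1/3, 2/3, 1
              }

              int main()
              {
                  Vec3 normal{0.0f, 1.0f, 0.0f};
                  Vec3 light{0.0f, 0.707f, 0.707f}; // light at 45 degrees
                  std::printf("smooth=%.3f cel=%.3f\n",
                              LambertDiffuse(normal, light), CelDiffuse(normal, light));
                  return 0;
              }

              That per-pixel trick is nearly free; it’s the PBR materials and fancy lighting layered on top of it in BL4 that cost.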

              Hi-Fi Rush is much closer to classic Borderlands than BL4 is.

              • zalgotext@sh.itjust.works · 1 hour ago

                Right, I’ve seen the footage too, and I don’t think I’d recognize it as a Borderlands game if the name wasn’t included in the ads. I’m struggling to understand why they chose to go that route, rather than sticking to the beloved, iconic, performant art style of the other games in the series.

                • paraphrand@lemmy.world · 59 minutes ago

                  That’s a fair take. I’m sure they feared looking dated. I think that’s a misguided fear, the kind you have when you don’t think the other aspects can live up to expectations.

                  Which is weird, because it sounds like they tried really hard with the story, as in they tried to elevate it beyond the usual dick and fart jokes and snark. (They didn’t eliminate that stuff, but rather tried to strike a balance.) BL3 was terrible in that dept.; it went way too hard on that shit imo.

      • brucethemoose@lemmy.world · 11 hours ago

        Honestly, Cyberpunk’s raytracing runs like poo compared to Lumen (or KCD2’s CryEngine) relative to how good it looks. I don’t like any of the RT effects except RT Reflections; both RT shadow options flicker, and RT lighting conflicts with the baked-in lighting, yet doesn’t replace it if you mod it out.

        More of Cyberpunk’s prettiness comes from good old rasterization than most people realize.

        PTGI looks incredible, but it’s basically only usable with mods and a 4090+.