• leaky_shower_thought@feddit.nl · 4 months ago

    The premise seems flawed, I think.

    I feel what he’s saying is: we suck at optimizing graphics performance now because gamers deem AI upscaling quality passable.

    This feels opposite to what the PS poll says: gamers enable performance mode more because the priority is stable frames rather than shiny anti-aliasing/post-processing.

  • MudMan@fedia.io · 4 months ago

      I don’t see how that’s the case. Most people prefer more fps over image quality, so minor artifacting from DLSS is preferable to the game running much slower with cleaner image quality. That is consistent with the PS data (which wasn’t a poll, to my understanding).

      I also dispute the other assumption, that “we suck at optimizing performance”. The difference between now and the days of the 1080 Ti, when you could just max out games and call it a day, is that we’re targeting 4K at 120fps and up, as opposed to every game maxing out at 1080p60. There is no target for performance on PC anymore; every game can be cranked higher. We are still using Counter-Strike for performance benchmarks, running at 400–1000 fps. There will never be a set performance target again.

      If anything, optimization now is sublime. It’s insane that you can run most AAA games on both a Steam Deck and a 4090 out of the same set of drivers and executables. That is unheard of. Back in the day the types of games you could run on both a laptop and a gaming PC looked like WoW instead of Crysis. We’ve gotten so much better at scalability.

    • leaky_shower_thought@feddit.nl · 4 months ago

        Most people prefer more fps over image quality, so minor artifacting from DLSS is preferable to the game running much slower with cleaner image quality.

        I don’t think we’re much different on this point. AI upscaling is passable enough that gamers will choose it. If presented with a better, non-artifacting option, gamers will choose that, since the goal is performance and not AI. If the stat is from PS data, and not from a poll, I think it only strengthens the point that users want performance more.

        There will never be a set performance target again.

        It’s not that there’s no set performance target. The difference is merely one target, in the Counter-Strike era, versus many now. There are more performance targets for PC than in the Counter-Strike days, and games just can’t keep up. Saying “there will never be a set performance target” is just washing one’s hands when publishers/directors won’t set direction and priorities for which performance point to prioritize.

        It might be that your point is optimizing for scalability, and that is fine too.

      • MudMan@fedia.io · 4 months ago

          Yeah, optimizing for scalability is the only sane choice from the dev side when you’re juggling hardware ranging from the Switch and the Steam Deck to the bananas nonsense insanity that is the 4090. And like I said earlier, often you don’t even get different binaries or drivers for those, the same game has to support all of it at once.

          It’s true that there are still some set targets along the way. The PS5 is one, the Switch is one if you support it, and the Steam Deck is there if you’re aiming to support low-power gaming. But that’s beside the point: the PS5 alone requires two to three setups to be designed, implemented and tested. PC compatibility testing is a nightmare at the best of times, and with a host of display refresh rates, arbitrary resolutions, and all sorts of integrated and dedicated GPUs from three different vendors expected to be supported, it’s outright impossible to do granularly. The idea that PC games have become less supported or less scalable is absurd. I remember the days when a game would support one GPU. As in, the one. If you had any other one, it was software rendering at best. Sometimes you had to buy a separate box for each supported card.

          We got used to the good stuff during the Nvidia 900 and 1000 series basically running console games maxed out at 1080p60, but that was a very brief slice of time; it’s gone and it’s not coming back.