• Fushuan [he/him]

    I got quite good AA by rendering the game at 4K and letting the graphics card downscale it to the screen’s 1080p resolution. No AA needed, looks fine.
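    What the driver does there is essentially 2x2 supersampling with a box filter: every 2x2 block of the 4K frame gets averaged into one 1080p pixel. A minimal numpy sketch of that downsample (frame_4k is a random placeholder, not an actual rendered frame):

    ```python
    import numpy as np

    def box_downsample_2x(img: np.ndarray) -> np.ndarray:
        """Average each 2x2 pixel block into one; 4K -> 1080p is exactly 2x per axis."""
        h, w, c = img.shape
        return img.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

    # Placeholder "rendered frame" at 4K; a real game produces this on the GPU.
    frame_4k = np.random.rand(2160, 3840, 3)
    frame_1080p = box_downsample_2x(frame_4k)
    print(frame_1080p.shape)  # (1080, 1920, 3)
    ```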

    • LouNeko@lemmy.world

      That is basically MSAA without the edge detection. Rendering in 4K and downscaling is the dirtiest but most effective AA method. But downscaling the whole screen also applies to UI elements, which often results in tiny blurry fonts if the UI isn’t scaled appropriately. More and more games have started adding a render resolution scale option that goes beyond 100% without affecting the UI. Downscaling also causes latency issues: I can run Metal Gear Solid 5 at a stable 60 FPS at 4K, but the display latency is very noticeable compared to 1440p at 60.
      I miss the time when you could just disable a game’s native AA and force MSAA through the Nvidia Control Panel. But most newer titles don’t accept Nvidia’s override, especially Unreal games.
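      For what it’s worth, those render-scale options presumably work by rendering only the 3D scene at the scaled resolution, resolving it to native, and then drawing the UI on top at native resolution. A rough numpy sketch of that compositing order (the buffers and the nearest-neighbor resolve are made-up stand-ins, not any engine’s actual code):

      ```python
      import numpy as np

      def resolve_to_native(scene, native_hw):
          # Nearest-neighbor resample as a stand-in for the GPU's filtered resolve.
          sh, sw, _ = scene.shape
          nh, nw = native_hw
          ys = np.arange(nh) * sh // nh
          xs = np.arange(nw) * sw // nw
          return scene[ys][:, xs]

      native = (1080, 1920)
      render_scale = 2.0  # "200%": scene rendered at 4K, UI stays 1080p
      scene = np.random.rand(int(native[0] * render_scale),
                             int(native[1] * render_scale), 3)

      ui_rgb = np.zeros((*native, 3))    # UI drawn at native resolution
      ui_alpha = np.zeros((*native, 1))  # opaque only where UI elements exist

      frame = resolve_to_native(scene, native)
      frame = ui_rgb * ui_alpha + frame * (1 - ui_alpha)  # UI composited last, stays sharp
      ```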

      • beefcat@lemmy.world

        MSAA only samples the geometry multiple times, not the whole scene. It doesn’t work very well in games with a lot of shaders and other post-processing work, which is basically every game made in the last decade.

        What GP is describing is SSAA (supersampled anti-aliasing).
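        The cost difference comes down to where the extra samples go. A toy Python sketch of the idea (pure illustration, nothing like a real GPU pipeline): MSAA tests coverage at several sub-pixel positions but runs the expensive shading once per pixel, while SSAA shades every sub-sample.

        ```python
        import numpy as np

        SUBSAMPLES = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

        def inside_edge(x, y):
            # Toy "geometry": everything left of a diagonal edge is covered.
            return x + y < 1.0

        def shade(x, y):
            # Stand-in for an expensive pixel shader; MSAA wants to call this once.
            return np.array([x, y, 0.5])

        def msaa_pixel(px, py):
            # Coverage tested 4x, shading evaluated once at the pixel center.
            coverage = sum(inside_edge(px + dx, py + dy) for dx, dy in SUBSAMPLES) / 4
            return shade(px + 0.5, py + 0.5) * coverage

        def ssaa_pixel(px, py):
            # Shading evaluated at all 4 sub-samples: ~4x the shader cost.
            samples = [shade(px + dx, py + dy) * inside_edge(px + dx, py + dy)
                       for dx, dy in SUBSAMPLES]
            return np.mean(samples, axis=0)

        print(msaa_pixel(0, 0), ssaa_pixel(0, 0))
        ```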

        • LouNeko@lemmy.world

          That’s what I meant by edge detection. I think part of the downfall of MSAA in modern gaming is foliage. Nowadays every field in video games is filled with lush grass, and the same goes for trees and bushes. They aren’t flat textures on low-poly models anymore. Most engines use completely different rendering methods for foliage to get thousands of swaying leaves and grass blades on screen with minimal performance impact. Having to detect all the edges of every single blade of grass and oversample them would make any game run at single-digit frame rates. There are certainly other things that GPUs have to render in bulk to justify novel rendering methods, but foliage is by far the best example. So I can understand why post-processing AA is easier to implement. But is TAA really the best we can do? Especially since things like swaying grass become a green blob through TAA. Slow, fine movement like swaying is really the bane of temporal sampling.
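          The green-blob problem falls straight out of how TAA accumulates frames: each new frame is blended into a running history, so anything moving slowly drags a fading trail of its old positions behind it. A tiny sketch of that accumulation (real TAA adds reprojection and neighborhood clamping to fight exactly this; the alpha = 0.1 blend weight and the moving pixel are illustrative):

          ```python
          import numpy as np

          alpha = 0.1  # blend weight: 10% new frame, 90% history

          def taa_accumulate(history, current):
              # Without reprojection/clamping, anything that moved leaves a trail.
              return alpha * current + (1 - alpha) * history

          # A bright pixel moving one step per frame smears across its old positions.
          history = np.zeros(8)
          for frame in range(4):
              current = np.zeros(8)
              current[frame] = 1.0  # "grass blade" slowly swaying to the right
              history = taa_accumulate(history, current)
          print(history.round(3))   # ghost trail behind the moving pixel
          ```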

    • beefcat@lemmy.world

      That is an insanely expensive solution to this problem. You are cutting performance by 75% or more to make that possible, meaning your 30 FPS game could be doing 120 if you stuck to native 1080p.
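      The 75% figure follows directly from the pixel counts, assuming per-pixel work dominates frame time:

      ```python
      pixels_4k = 3840 * 2160      # 8,294,400
      pixels_1080p = 1920 * 1080   # 2,073,600
      ratio = pixels_4k / pixels_1080p
      print(ratio)                 # 4.0 -> ~4x the per-frame cost
      print(30 * ratio)            # 120.0 FPS at native 1080p, all else equal
      ```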

      • Fushuan [he/him]

        That’s the thing, my game is running at 60+, and I don’t need more.

        In any case, new graphics cards are built to run games at 4K, so having a 1080p screen, which I’m content with, is a godsend performance-wise; it lets me do stuff like this without practical performance losses.