• jsomae@lemmy.ml

    It’s pretty easy to discern refresh rate with the human eye if one tries. Just move your cursor back and forth really quickly. The spacing between the ghost cursors in the trail it leaves behind (which btw only exist in your eye’s perception, not on screen) is inversely proportional to the refresh rate.
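
    To put rough numbers on that: the gap between ghost cursors is just cursor speed divided by refresh rate. Here’s a minimal sketch (in Python, with made-up speed and persistence values, purely for illustration) of why a 60 Hz trail looks so much choppier than a 240 Hz one:

    ```python
    # Rough illustration: spacing of ghost cursors at different refresh rates.
    # The cursor speed and persistence values are assumptions, not measurements.

    cursor_speed_px_per_s = 3000  # assumed speed of a fast mouse flick
    persistence_s = 0.05          # assumed ~50 ms of visual persistence

    for refresh_hz in (60, 120, 240):
        spacing_px = cursor_speed_px_per_s / refresh_hz     # gap between adjacent ghosts
        ghosts_visible = round(persistence_s * refresh_hz)  # ghosts in the trail at once
        print(f"{refresh_hz:>3} Hz: ~{spacing_px:.0f} px between ghosts, ~{ghosts_visible} visible")
    ```

    With those assumed numbers, 60 Hz leaves roughly 50 px gaps between ghost images while 240 Hz leaves about 12 px, so the faster display’s trail looks nearly continuous.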

    • Fushuan [he/him]

      Sure, but wasting double or triple the resources for that is not fine. There are very few games where it’s even a gain, because outside of those super-competitive titles it’s not like it matters.

      • jsomae@lemmy.ml

        Yeah I agree with you, but I was just refuting your claim that it’s not perceivable even if you try.

        • Fushuan [he/him]

          Oh yeah, I’ve read and heard of plenty of people saying they definitely notice it. I’m lucky enough not to, because most ARPGs I play don’t hold 60 FPS during intense combat, let alone 120 FPS, even on an RTX 3080 lmao.

          I was talking more about the jump to 240 Hz and beyond, which I find surprising that people notice during intense gaming encounters rather than while calmly checking or testing. I guess there are people who do notice, but again, running games at such a high frame rate is very expensive for the GPU and a waste most of the time.

          I’m just kinda butthurt that people treat screens below 120 Hz as bad when most games I play hardly run a smooth 60 FPS. The market will follow that sentiment, and in a few years what I consider normal monitors will be hard to find, while graphics cards eat way more electricity for very small gains.