• SuddenDownpour@sh.itjust.works · +48/-1 · 7 months ago

    A developer tells this anecdote about a PS2-era project. They showed a fairly late build of the game to their publisher, just a few weeks before it had to be ready to begin the distribution process, with an FPS counter displayed in a corner of the screen; it never quite fell below 30 FPS, but it fluctuated considerably. The people from the publishing company said: “Everything about the game looks fine, except for the FPS. 30 FPS is unacceptable, and we cannot publish it if you can’t reach a consistent 60 FPS”.

    You don’t need to know much about development to understand that making such a demand weeks before the launch of a medium-sized project is asking the impossible, and this dev team knew it. In the end, they changed the function that calculated the real FPS so that it subtracted only a small fraction of the shortfall below 60 from the 60 target; the next time the publisher looked at the game, the FPS counter always showed a value between 58 and 60, even though there was no time to really optimize the game. The publisher didn’t notice the deception, and the game was a commercial success.
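
    A minimal sketch of what that fudged counter might look like (a reconstruction, not the actual code from the anecdote; the function name, the 60 FPS target, and the scaling factor are all assumptions):

    ```cpp
    // Hypothetical version of the trick described above: instead of reporting
    // the measured frame rate, report 60 minus a small fraction of the
    // shortfall, so the counter always hovers between roughly 58 and 60.
    float fudgedFps(float realFps) {
        const float target = 60.0f;
        const float fudge  = 0.05f;            // assumed scaling factor
        float shortfall = target - realFps;    // e.g. 30 real FPS -> 30
        return target - shortfall * fudge;     // 30 real FPS -> 58.5 displayed
    }
    ```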

    • AeroLemming · +3/-36 · 7 months ago

      That seems highly unethical. 30 FPS really is unplayable, and I’m sure that if this is true, it caused headaches for many players.

      • Soggy@lemmy.world · +26/-1 · 7 months ago

        30 is unplayable? I guess I haven’t played all these games then.

        • AeroLemming · +2/-9 · 7 months ago

          I don’t know about you, but I get headaches when playing first person games with rapid camera movement if my FPS is too low. I guess that must not be common, based on the votes.

          • rautapekoni@sopuli.xyz · +12/-1 · 7 months ago

            Did you not see the part where it said “PS2 era”? The turning rates of the player/camera in any console first person shooter from those times are downright sluggish by mouse/keyboard standards, but the games were also designed around that slower pace.

            • AeroLemming · +2/-22 · 7 months ago

              I never had a PS2, so how would I know that? No need to be so hostile!

                • AeroLemming · +1 · 6 months ago

                  Because I didn’t realize there was a critical design difference that allowed 30 FPS to work without causing headaches. I’ve played my fair share of FPS games and have never, ever seen that slow camera behavior anywhere, so how the fuck would I even have any indication that it was present, especially if there was demand for 60 FPS? Lemmy truly is more toxic than Reddit 💀

  • Underwaterbob · +50/-5 · 7 months ago

    Meh. 60 is enough for me. I didn’t notice 144 being that much better than 60.
    30 can fuck right off though.

    • SgtAStrawberry@lemmy.world · +14 · 7 months ago

      I can go down to 30, probably a bit lower, as long as it is consistent; that is the most important part.

      It may also have a bit to do with me powering through Watch Dogs at 1 frame per second in some parts. You never notice how good 25-30 is until your frames start camping in the single digits.

      • jaycifer@kbin.social · +2 · 7 months ago

        I remember playing Assassin’s Creed II on PC with a 9500 GT and getting sub-20 fps constantly, to the point that I had to wait for character animations to catch up with the dialogue so the next person could talk. Halfway through the game I upgraded to a GTX 560 and was astounded that everything was in sync and oh so smooth. I always remember that when I start getting annoyed that I can’t get over 90 fps in a game. As long as it’s playable!

    • PM_Your_Nudes_Please@lemmy.world · +9/-2 · 7 months ago

      It pretty much only makes a difference in FPS games, where you’re constantly switching back and forth between crosshair focus and peripheral-vision flick reactions. At 144 Hz, motion blur between frames is largely eliminated, so your flicks are more accurate and your vision at the crosshair is much sharper.

      • Carnelian@lemmy.world · +8 · 7 months ago

        FPS games, and also just anything in general where the camera pans quickly, such as character-centered 2D games.

      • kilinrax@lemmy.world · +4 · 7 months ago

        You can’t play racing games if you believe that. I’d far rather play an FPS at 30 than a racing game at 60. Low frame rates can give me motion sickness at high camera speeds.

  • Vespair · +35/-9 · 7 months ago

    All you FPS kids are just doing the new version of “eww that game has 2d graphics; polygons or bust!!” from the PlayStation era.

    Yes, progress is cool and good, but no it’s not the end-all be-all and no not every game has to have bleeding edge FPS to be good.

    Like, we’ve literally already done this shit, guys; can’t we just learn from the past?

    • CH3DD4R_G0B-L1N@sh.itjust.works · +32/-3 · 7 months ago

      My brother or sister in pixels, this is not the same. I’m not a graphics snob. I still play pixelated, barely discernible nonsense games. When I updated from 30 to 144, it was a whole new world. Now even 60 can feel sluggish. This is not a graphical fidelity argument. It’s input and response time and motion perception. Open your mind, man. Accept the frames.

      • Vespair · +9/-5 · 7 months ago

        And that matters for certain games, a lot. But it doesn’t functionally matter at all for others. Same as the transition to polygons. My point, which I thought I stated clearly, was not “FPS BAD!!”, it was “FPS generally good, but stop acting like it’s the single most important factor in modern gaming.”

        • And009@reddthat.com · +4/-1 · 7 months ago

          Simply put, if everything was 144fps then it would be easier on the eyes and motions would feel more natural. Even if it’s just navigating menus in a pixel style game.

          Real life has infinite frames per second. In a world where high fps gaming becomes the norm, a low 24 fps game could be a great art style and win awards for its ‘bold art direction’.

            • jaycifer@kbin.social · +5 · 7 months ago

              That article states people can perceive images as rapidly as once every 13 milliseconds, which they math out to 75 fps, 25% higher than 60.
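
              As a quick sanity check of that conversion (a sketch; the 13 ms figure comes from the study, the rest is just arithmetic): one image every 13 ms works out to roughly 77 fps, in the same ballpark as the 75 the article quotes.

              ```cpp
              // One image every 13 ms corresponds to about 1000 / 13 ≈ 76.9 fps,
              // close to the 75 fps figure quoted above.
              #include <cstdio>

              int main() {
                  const double msPerImage = 13.0;
                  std::printf("%.1f fps\n", 1000.0 / msPerImage);  // prints 76.9
                  return 0;
              }
              ```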

              Looking at the study itself, they were testing whether participants could pick out a picture that displayed for 13-80 ms when “masked” by other brief pictures, with a focus on whether it made a difference if the participant was told what image they were looking for before or after seeing the images. What they found was that participants could pick out the image as low as the 13 ms mark (albeit with less accuracy) and could generally do so better if told what to look for beforehand.

              What this tells me is that your source has nothing to say about anything over 75 fps. It also was testing in a fundamentally different environment than a video game, where your brain will constantly expect an image similar to and stemming from the image before it rather than seeing a completely different image. If you were to draw conclusions based on the study despite the differences, what the study would suggest is that knowing what to look for, as your brain does gaming, would make you better able to pick out individual frames. This makes me want to think that your source does not support your assertion, and that in a game you could perceive frame rates higher than 75 fps at a minimum.

              From my own knowledge, there’s also a fundamental difference between perceiving reality and computer screens, in the form of motion blur. Objects moving in real time leave a faint blur behind as you perceive them, which your brain uses to fill in any blanks it may have missed, making reality appear smoother than it is. For an example of this, wobble a pencil back and forth to make it “bend.” Movies filmed at 24 fps capture this minute motion blur as they film, which makes it easier for our brains to watch them despite the lower frame rate. Real-time rendered video games do not have this effect, as there are no after-images to fill in the blanks (unless you turn on motion blur, which doesn’t do a good job of emulating this).

              This means video games need to compensate, and the best way to do that is more frames per second, so your brain doesn’t need to fill in the blanks with the motion blur it’s used to seeing in the real world. You’ll obviously get diminishing returns from each equal increase, but there will still be returns.

      • whofearsthenight · +2 · 7 months ago

        Yeah, as much as I can give a shit about ray tracing or better shadows or whatever, as a budget gamer, frame rate is really fucking me up. I have a very low-end PC, so 60 is basically max. Moving back to 30 on the PS4 honestly feels like I’m playing PS2. I had the [mis]fortune of hanging out at a friend’s house and playing his PC rig with a 40-series card, 240 Hz monitor, etc., and suffice it to say it took a few days before I could get back to playing on my shit without everything feeling broken.

      • nevemsenki@lemmy.world · +4/-8 · 7 months ago

        Now even 60 can feel sluggish.

        That’s more or less the placebo effect at work, though. Most people cannot see “faster” than 60 FPS; the only actual upside of running a higher FPS is that you don’t drop below 60 if the game starts to lag for whatever reason. Now, you may be one of the few who actually perceive changes better than normal, but for the vast majority it’s more or less just placebo.

        • V0lD@lemmy.world · +4/-1 · 7 months ago

          That’s more or less the placebo effect at work, though. Most people cannot see “faster” than 60FPS;

          You can literally see the difference between 60 and 144 just by moving the cursor or a window on your desktop. What are you on about?

        • CommanderCloon@lemmy.ml · +2 · 7 months ago

          That’s just wrong. I couldn’t go back to my 60Hz phone after getting a 120Hz new one. It’s far from placebo, and saying otherwise is demonstrably false.

      • Vespair · +5/-1 · 7 months ago

        The discussion is about 144

        • abbotsbury@lemmy.world · +5/-1 · 7 months ago

          144 FPS isn’t even bleeding edge, there are monitors with refresh rates higher than that.

    • dumpsterlid@lemmy.world · +7/-2 · 7 months ago

      One of the most insufferable aspects of video game culture (PC gaming in particular), other than the relentless toxic masculinity from insecure nerds, is the obsessive focus on having powerful hardware and shitting on people who think they are getting a good experience when they don’t have good hardware.

      The point is to own a computer that other people don’t have, so you can play a game and get an experience other people don’t get; the point isn’t to celebrate a diversity of gaming experiences and value accessibility for those without the money for a nice computer. It really doesn’t matter whether these people intend to do this consciously or not; this is a story as old as time. It is the same exact bullshit as guitar people who only think special exotic or vintage guitars are beautiful and claim to absolutely love guitar, but have never once in their life stopped to think about how much more beautiful it is that any random chump can get an objectively wonderful-sounding guitar for a couple hundred dollars than it is that they own some stupid special-edition guitar with a magic paint job that cost as much as my shitty car.

      Good thing these people don’t fully dictate the flow of all of video game development, but they will never ever learn because this is the kind of pattern that arises not from conscious intention but rather from people uninterested in critically examining their own motivations.

      It is the same damn nauseating thing with photography too….

    • Nikki@lemmy.world · +3 · 7 months ago

      It depends on whether it’s a good 30 or not. If inputs are quick and responsive and the framerate stays at 30, then it’s fine. But if my device is struggling to run the game and it’s stuttering and unresponsive, then it’s awful.

      SM64 comes to mind as the best 30 fps experience I’ve had, and I am spoiled rotten on high-refresh-rate games.

  • gmtom@lemmy.world · +25/-2 · 7 months ago

    I get that it’s a meme, but I usually play at 144 fps, and when I go back to 60 fps I literally don’t notice a difference; even down to like 40-45 I barely see much difference. 30 is noticeable and a bit shit, but my eyes get used to it after like 30 minutes, so it’s not a big deal.

    • dumpsterlid@lemmy.world · +4 · 7 months ago

      I think it takes significantly more mental extrapolation between frames, and a general adjustment to your eyes receiving frames at a slower rate, but if the frame rate is fairly stable the human brain adapts.

      The brain’s visual processing is so powerful that the difference between 30 fps and 144 fps on paper is much smaller in reality, especially if your brain has already learned the muscle memory of “upscaling” a low framerate to work with its perception of a 3D environment.

      Competitively, for games like arena shooters or Rocket League, the frame rate is real; but for most games it is a matter of whether the smoothness occurs on the physical monitor screen or at some level of mental image processing. What someone sees when they have let the mental skill of processing a lower frame rate atrophy is a temporary sensation, like putting on colored glasses for a while, then taking them off and seeing everything washed out in a particular color. Weird, uncomfortable, but temporary.

      The real problem is an inconsistent framerate, where the clock your brain has gotten used to (new visual information arriving with each new frame) is slow enough to be on the edge of perception but keeps speeding up or slowing down chaotically. Your brain can’t just train a static layer of mental image processing to smooth that out. Time almost feels like it is speeding up and slowing down a little, and it becomes emotionally discouraging that every time something fun happens, the framerate dips and reality becomes choppier.

    • Franklin@lemmy.world · +5/-1 · 7 months ago

      For me, frame-time inconsistency is the most noticeable. FPS, as long as it’s consistent and 40 or above, is fine.

      I will notice the difference in fluidity of motion, but a large frame-time variance destroys the experience.

    • BleatingZombie@lemmy.world · +3 · 7 months ago

      It seems to depend on the game for me. Some just seem like they should move more smoothly (like Arkham Asylum on PC).

    • CommanderCloon@lemmy.ml · +3/-4 · 7 months ago

      Yeah, no: a game I regularly play just had 120 Hz support added, and I’m never changing that back. I once even tried editing the .ini config just to change the framerate, after coming back to it from a 120 Hz game. It is just night and day, both in the input lag and in the smoothness of the image.

  • Cornpop@lemmy.world · +18/-1 · 7 months ago

    My monitor is 240 Hz, and Counter-Strike runs at about 400 fps on max settings at 1080p on my new system. It was absolutely insane playing it for the first time, coming from my Steam Deck running 30-40 fps on lowest settings.

  • setsneedtofeed@lemmy.world · +13 · 7 months ago

    Not playing TekWar at 18 frames per second

    Cowards. It’s like you don’t even care about Capstone Software, the pinnacle of entertainment.

  • Fosheze@lemmy.world · +18/-5 · 7 months ago

    People always go on and on about framerate, but I’d take 4K at 60 fps over 1080p at 144 fps any day. I never really noticed a difference above 60 fps, but the resolution makes a massive difference.

    • jaycifer@kbin.social · +2 · 7 months ago

      For me it depends on the game. A menu game from Paradox like Crusader Kings? 4k 60fps. A competitive shooter? Ideally the max resolution (for greater pinpoint accuracy) and 144fps, but between the two I’d want maximum fps for the reaction speed and responsiveness. A pretty game like Ori and the Will of the Wisps? Crank the graphics up and I’m happy with 60fps.

    • RaoulDook@lemmy.world · +1 · 7 months ago

      They both make a difference. 1080p is the minimum resolution for console TV gaming; a good PC can do much better. I like my ultra-widescreen 144 Hz monitor more than a plain, boring 16:9 high-res monitor.

      But even games at 3440x1440 @ 144 Hz don’t look as good as full-on VR at 120 Hz and up. Spend much time gaming in VR and gaming on a monitor starts to feel obsolete.

    • The Picard Maneuver@startrek.website (OP) · +21/-1 · 7 months ago

      Every time I get a taste for something better, things get more expensive… I’m going to avoid trying anything higher than 144 for a while.

      • pivot_root@lemmy.world · +18 · 7 months ago

        It’s for the best that you do that.

        Sincerely, someone who “had” to buy an RTX 4080 after buying a new 200 Hz monitor.

        • Alteon@lemmy.world · +8 · 7 months ago

          Oh. Fuck…you got the Odyssey G9 as well?

          In order for me to even taste the sweet potential of that monitor, I’m having to build a whole new computer. I’m dreading it.

          • pivot_root@lemmy.world · +18 · 7 months ago

            [Hide the Pain Harold meme]

            The best advice I can give you is to turn off the FPS counter. If the game feels like it’s stuttering, turn down the quality. If it feels fine during gameplay, don’t fuck with it, and under no circumstances should you enable an FPS counter or frame timing graph.

            If you’re anything like me and you do enable the FPS counter or frame timing graphs, you’ll spend more time optimizing performance than actually enjoying the game…

        • Cornpop@lemmy.world · +2 · 7 months ago

          I just built a new PC with a Ryzen 7800X3D CPU and a Radeon RX 7900 XTX GPU; loving it so far. 400 fps in Counter-Strike 2 at max settings, 1080p.

            • Cornpop@lemmy.world · +1 · 7 months ago

              B650. I wasn’t planning on overclocking and figured it was better to put the money toward the CPU and GPU.

              • pivot_root@lemmy.world · +1 · 7 months ago

                Great pick. The X670 and E variants are insanely overpriced for what they provide. People don’t need PCIe 5 when there aren’t even any non-SSD components that use it.

        • XIN · +1 · 7 months ago

          I made the awful mistake of getting a 1440p OLED instead of the 540, and now a TN panel is going to be very difficult to get used to.

    • EvolvedTurtle@lemmy.world · +6 · 7 months ago

      I’ve been there lmao

      Lately though I’ve been using Fabric mods, and damn is it optimized.

      Like, it’s frustrating because Forge has a much, much larger mod selection, but it’s just so slow.

  • soggy_kitty@sopuli.xyz · +12/-6 · 7 months ago

    Who plays games at 30 fps? I’m fairly sure 60 is the industry standard now, no?

    • Blackmist@feddit.uk · +1 · 7 months ago

      Even most console games run at 60 now, with an option to turn on some RT graphical wankery and run at 30.

      I often turn it on to see what it looks like, and then decide it’s not worth it. Ratchet and Clank actually played decently at 30, and one of the Ghostwire: Tokyo options allowed you to have RT and decent framerates with a minor hit to resolution.

      G-Sync/FreeSync/VRR is a game changer for lower-end hardware, because then all those dips below 60 get smoothed out to an even 45 or so. I’ve spent a lot less time fucking about with settings on PC since getting a monitor that supports it.

  • ThatWeirdGuy1001@lemmy.world · +14/-9 · 7 months ago

    Meanwhile I pull out my old CRT, slap in my N64, and play just like I would as a kid.

    Y’all’s obsession with graphics and frames is weird to me.

    No hate, just confusion.

  • JackGreenEarth · +16/-49 · 7 months ago

    Can’t your eye only see like 30 frames per second in the center?

    • yggdar@lemmy.world · +46/-1 · 7 months ago

      Our eyes and brains don’t perceive still images or movement in the same way as a computer. There is no simple analogy between our perception and computer graphics.

      I’ve read that some things can be perceived at 1000 fps. IIRC, it was a single white frame shown for 1ms between black frames. Of course most things you won’t be able to perceive at that speed, but it certainly isn’t as simple as 30 fps!

      • Zron@lemmy.world · +7 · 7 months ago

        The human brain evolved to recognize threats in the wilderness.

        We see movement and patterns very well because early hominid predators were very fast and camouflaged, so seeing the patterns of their fur and being able to react to sudden movements meant those early people didn’t die.

        But evolution doesn’t optimize. Things only evolve up to the point where something lives long enough to reproduce. Maybe over extremely long time spans things will improve if they help find mates, but that is all evolution does.

        Your brain perceives things fast enough for you not to get eaten by a tiger. How fast is that? Who the fuck knows.

        All that being said, I like higher-Hz monitors. I feel like I can perceive motion and react to things more quickly if the frame rate is higher. The smoother something looks, the more likely I feel that I can detect something like part of a character model rounding a corner. But no digital computer is ever going to have analog “frame times”, so any refresh rate you think feels comfortable is probably fine.

    • olutukko@lemmy.world · +19/-3 · 7 months ago

      No. This is something console fanboys used to spread when PC gamers showed off their 30+ FPS games.

    • zout@kbin.social · +11/-3 · 7 months ago

      That’s what I’ve heard. But also: the frequency of electricity in the USA is 60 Hz because Tesla found through experimentation that that’s the frequency at which you stop noticing a lightbulb flickering. Since a lightbulb at 60 Hz flickers 120 times per second (the filament brightens on both halves of each AC cycle), you could argue that any framerate lower than 120 fps is noticeable.

    • SendPicsofSandwiches@sh.itjust.works · +9/-9 · 7 months ago

      Technically yes, but the more fluid the video is in the first place, the fewer gaps your brain has to fill in. At 30 fps you can see the moving image just fine, but your brain is always assembling the pieces and ignoring the gaps. Higher framerates reduce the number of gaps and make a surprising difference in how smooth something looks in motion.

    • TheSlad@sh.itjust.works · +3/-57 · 7 months ago

      Also, most monitors only go up to 60 fps, and even if you have a fancy monitor that goes higher, your OS probably doesn’t bother to go above 60 anyway. Even if the game itself says the FPS is higher, it just doesn’t know that your PC/monitor isn’t actually bothering to render all the frames…

      • fiah@discuss.tchncs.de · +27 · 7 months ago

        my man, just because you’ve never seen the refresh rate option in the monitor settings doesn’t mean it hasn’t been there since basically forever

      • pivot_root@lemmy.world · +27 · 7 months ago

        This is blatantly false.

        Windows will do whatever frame rate the EDID reports the display as being capable of. It won’t do it by default, but it’s just a simple change in the settings application.

        Macs support higher than 60 Hz displays these days, with some of the laptops even having a built-in one. They call it by some stupid marketing name, but it’s a 120 Hz display.

        Linux requires more tinkering with modelines and is complicated by the fact that you might be running either X or Wayland, but it’s supported as well.

        • drcobaltjedi@programming.dev · +8 · 7 months ago

          To add on to this: there are phones coming out now with 90+ Hz screens, and they are noticeably smoother than the 60 Hz ones. My current phone does 120 Hz.

          Yeah, the OS can and will shove out frames as fast as the hardware can support them.

        • Voyajer@lemmy.world · +4 · 7 months ago

          Wayland picks up my 155, 144, and 60 Hz monitors and sets them to the correct refresh rates on its own nowadays, so it’s even more painless.

      • olutukko@lemmy.world · +17 · 7 months ago

        You just won the award for the stupidest comment in the whole comment section. That is just completely false and makes no sense in any way. Your computer doesn’t just skip calculations it’s told to do. Where did you even get this idea lmao

        • fiah@discuss.tchncs.de · +1 · 7 months ago

          No, it wasn’t true back then either; CRTs were doing 100 Hz and more decades ago, and that was very much supported by OSes and games.