• circuitfarmer@lemmy.sdf.org

    It’s BS though. People with top-of-the-line hardware are having issues. Those systems don’t underperform because the game is advanced or anything like that; the game underperforms because it’s a new release that is poorly optimized. That’s also to be expected, because it’s built on a senior citizen of a game engine that likely needs a few other nudges.

    Todd Howard forgets that PC users see this shit all the time, and it’s pretty obvious with this one. Hoping to see talk of optimization in a coming patch instead.

    Edit: a good example: not hitting 60 fps in New Atlantis while, at the same time, CPU usage sits in the 50s and GPU usage in the 70s (percent). Neither component is close to saturated, and that’s a sign of poor optimization.
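
    For anyone who wants to reproduce that kind of check, here is a minimal sketch, assuming an NVIDIA GPU and the third-party Python packages psutil and nvidia-ml-py (pynvml), neither of which is part of the original comment. It just logs CPU and GPU utilization once a second so the numbers can be compared against an in-game fps counter; AMD users would need a different GPU query.

    ```python
    import psutil   # cross-platform CPU utilization
    import pynvml   # NVIDIA Management Library bindings (from nvidia-ml-py)

    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

    try:
        while True:
            cpu = psutil.cpu_percent(interval=1)              # averaged over 1 s, all cores
            util = pynvml.nvmlDeviceGetUtilizationRates(gpu)  # .gpu is a percentage
            print(f"CPU {cpu:5.1f}%  GPU {util.gpu:3d}%")
            # If fps is below target while both numbers stay well under ~90%,
            # neither component is the bottleneck; that points at the game itself.
    finally:
        pynvml.nvmlShutdown()
    ```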

    • Alto@kbin.social

      I’m starting to think that maybe, just maybe, brute-forcing a 26-year-old engine that makes Skyrim have a stroke if you try to play above 30 fps isn’t a good idea.

        • Animoscity@lemmy.world

          No. I’m not a fan of the game personally, but a quick search shows they are using Creation Engine 2, which is a newer version of their engine.

          • azertyfun@sh.itjust.works

            They could have called it Creation Engine 129030129784.32985 for all it matters. It’s just a name for an engine update, which they do for every new game. They didn’t rewrite it from scratch; that would be a billion-dollar venture.

            From what I’ve read, it’s the exact same engine as FO4 with better lighting (and of course, as with every new game, some improvements locally relevant to the gameplay).
            But fundamentally, underneath the fancy lights, it’s still the same engine. That explains the 2008-esque animations, the bugs, the performance issues, and the general flatness of the game. It can’t be more than “Skyrim in Space” because that’s what it technically is.

          • mordack550@lemmy.world

            Because putting a 2 after the name makes it a new engine? It’s just a new iteration of the same old engine that runs Fallout 3, Skyrim, and Fallout 4.

          • Alto@kbin.social

            I’ll see if I can find it when I’m at my PC, but in an interview a dev said it was still using significant amounts of code from their Gamebryo engine from ’97.

      • _waffle_@sh.itjust.works

        What game engine is 26 years old other than the Unreal engine?

        Edit: stepped on some toes I guess, lmfao

    • Th3D3k0y@lemmy.world

      My friend and I were just discussing the likelihood that some hardware makers pay game devs to purposely ship poorly optimized games so users are encouraged to spend more on upgrades.

      • circuitfarmer@lemmy.sdf.org

        In this case, you get Starfield free with the purchase of select AMD CPUs or GPUs.

        But it’s weird for Todd Howard to come out with this push now, because it’s in response to those already playing the game.

        • Rheios@ttrpg.network

          I mean, that’s probably why he would make the push. The bait’s in the mouth (people have the game), then comes the pull of the hook (they have to upgrade to try to handle its poor optimization, which delivers the benefit to AMD for backing them). And Beth doesn’t lose anything if it’s too frustrating and people stop playing over it, because they already have the money.

          EDIT: Admittedly I keep forgetting that Game Pass is a thing, but maybe even that doesn’t really matter to Microsoft if the game got people onto Game Pass. That makes my earlier point a bit shakier.

        • MentalEdge@sopuli.xyz

          While I’m no fan of paid sponsorships holding back good games, this is untrue.

          Neither Nvidia nor AMD blocks their partner devs from supporting competing tech in their games. They just won’t help them get it working, and obviously the other side won’t either, since that dev is sponsored. There are some games out there that support both, some of them even partnered.

          So yes, it’s bullshit. But it’s not “literally paid” bullshit. Bethesda could have gone the extra mile, and didn’t.

          • hypelightfly@kbin.social

            AMD blocks partners from implementing DLSS. You’re probably right that it’s not paid bullshit, as the payout isn’t monetary, but it’s still being blocked due to the partnership.

            This is hardly the first game to do this; Jedi Survivor and RE4 have the same problem: AMD sponsored, FSR2 only. The work required to implement FSR2 or DLSS is basically the same (both need the renderer’s motion vectors; the sketch below illustrates the shared inputs). That’s why DLSS mods were immediately available.

            Since FSR2 was released, not a single AMD-sponsored game has had DLSS added, even games built on engines like Unreal where all the dev has to do is include the plugin.
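
            A deliberately simplified, hypothetical sketch of that point (the type and function names here are illustrative only, not the real FSR2 or DLSS SDK APIs): both upscalers consume essentially the same per-frame inputs from the renderer, so once one is wired up, the other is mostly a second dispatch path.

            ```python
            from dataclasses import dataclass
            from typing import Tuple

            @dataclass
            class UpscalerFrameInputs:
                """Per-frame data a temporal upscaler needs from the renderer."""
                color: bytes                  # low-resolution color buffer
                depth: bytes                  # depth buffer
                motion_vectors: bytes         # per-pixel motion vectors; the key shared requirement
                jitter: Tuple[float, float]   # sub-pixel camera jitter for this frame
                output_size: Tuple[int, int]  # target (width, height) after upscaling

            def dispatch_fsr2(frame: UpscalerFrameInputs) -> None:
                # Stand-in for handing the inputs to AMD's FSR2 dispatch.
                print("FSR2 upscaling to", frame.output_size)

            def dispatch_dlss(frame: UpscalerFrameInputs) -> None:
                # Stand-in for handing the same inputs to NVIDIA's DLSS evaluation.
                print("DLSS upscaling to", frame.output_size)

            # The renderer fills one structure; either (or both) upscalers can consume it.
            frame = UpscalerFrameInputs(b"", b"", b"", (0.25, -0.25), (3840, 2160))
            dispatch_fsr2(frame)
            dispatch_dlss(frame)
            ```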

  • gearheart

    I expected this once everyone kept buying into Nvidia’s DLSS.

    Nvidia and DLSS will be required to get titles to run decently.

    Minimal game optimization will be done on the majority of future titles.

    Fml

  • Pocketyeti@lemmy.world

    Why upgrade when I’ll just pick it up on the PS7, 10 years from now, along with the Skyrim bundle?

  • 1stTime4MeInMCU@mander.xyz

    I haven’t played Starfield yet, but many of the recent headliner releases have been performance hogs. It’s not unreasonable to expect people to either play with lower settings or upgrade if they want to run the best possible setup; that’s why there are performance sliders in most games. When you need a 3080 to run minimum settings, that’s when you start running into trouble (👀 KSP 2).

    • DoomBot5@lemmy.world

      At the same time, my 3080 runs these games just fine at 60-90 fps at 4K with high settings. I don’t need more than that for games that aren’t competitive.

    • Ookami38@sh.itjust.works

      Man, that’s why Armored Core blew me away. I completed the whole game, at launch, at maximum settings, and I don’t recall a single frame drop. That was on a 3060, with very mediocre other hardware. I know there’s a lot to be said about map sizes and instanced missions, but as fantastic as that game looks and plays…

      • weirdo_from_space@sh.itjust.works

        The same happened with Doom Eternal. The graphics were a showstopper when the game came out, and it didn’t even stutter. It’s so well optimized that I’m told you can even play it with integrated graphics.

        • Ookami38@sh.itjust.works

          It’s almost like having a giant open world comes with some massive drawbacks. I’m pretty fatigued by open-world games though, so that may just be me.

          • weirdo_from_space@sh.itjust.works

            Frankly, open world sucks. I played Far Cry 2 sometime last year because one of my friends spoke so highly of it, and I spent more time driving around than actually shooting anything. It served no purpose other than wasting the player’s time. The missions were rather basic too. And nothing in the reviews of more modern examples suggests that anything has changed.

      • 1stTime4MeInMCU@mander.xyz

        I have a 3060 Ti and play most games on max settings. There is the occasional game that explodes if I do that, but otherwise GPU power is out ahead of decently optimized games (probably because gaming is no longer the driving factor for GPU performance).

      • NotAFuckingBot@lemmy.world

        I own many games that I impulse-bought only to find I don’t care for them. That gets expensive.

        Now I’m much more selective, and tend to wait until the game’s been out long enough to get patches, updates, and reviews.

        Add my lack of interest in any Todd Howard product until ES6, which I may not live long enough to play (boomer puke here), as well as the offhanded arrogance of his ‘upgrade your PC’ statement, and that about covers why I’ve decided not to buy Starfield.

  • cyanarchy@sh.itjust.works

    Starfield also requires an SSD, a first for a modern triple-A PC game.

    I recall the same being said about Cyberpunk 2077, and I’m not sure that was the first either.

  • GBU_28

    Y’all are surprised the boss of a AAA studio suggested you buy hardware from companies he has a deeply vested interest in?

    It’s all one big circle jerk of companies, and anyone buying “cutting edge” gets what they deserve.

    You’re the product in more ways than one.

    • Zeppo@sh.itjust.works

      You’re literally the consumer in this instance. The game is the product. The computer is the product.

  • Neato@kbin.social

    It’s on Game Pass, Todd. If it doesn’t run well, I’ll just not play Skyrim-Space Edition.

    My partner, who is interested, has a PS5 and an older PC. If her PC doesn’t run it, she’ll probably just keep playing Stardew Valley. Honestly, it’s not like anyone is going to really be talking about Starfield in a month or two, except for ridiculous ship builds on social media.

    • nivenkos@lemmy.world

      I bought a new PC just to play Starfield (and BG3 with fewer issues).

      It looks alright overall. But it’s pretty crazy that even 30-series cards can’t run it well (I had a 1070, though).

      • circuitfarmer@lemmy.sdf.org

        I did a CPU/mobo/RAM upgrade for it, but I was quite overdue.

        It looks alright overall.

        That’s the thing: it looks alright, but it’s not the next-gen beauty fest they want people to think it is. Plenty of games look better and run better. I enjoy the game, but the argument that it’s a graphical standout doesn’t really hold water.

  • manastorm@sh.itjust.works

    I have an i9-13900K, a Radeon 7900 XTX, and 64 GB of RAM, and I had to refund it on Steam because it kept crashing to desktop every few minutes. Sometimes I would not even get past the Bethesda intro logo before crashing. A very frustrating experience, to say the least.

    • AmosBurton_ThatGuy@lemmy.ca

      I mean, the game definitely runs like shit, but if you keep crashing, that sounds like a you problem. My 7600X/6700 XT/32GB DDR5 build hasn’t crashed once in 15 hours of playtime, and I’ve heard a ton of complaints about the game but barely any about crashing.

    • entropicshart@sh.itjust.works

      I have an i7-10700K, 32 GB RAM, and a 3080 Ti. I’m playing the game at 4K with all settings maxed (without motion blur, of course), and with almost 80 hours in the game, I have yet to have a single crash or performance issue.

      Only realized people were having issues when I saw posts and performance mods popping up.

  • qyron@sopuli.xyz

    Not that I’ll be buying it anytime soon, but if the hardware specifications I’ve read are true, no graphics card is worth €500+ just to play a game. This is bonkers.

  • comedy@kbin.social

    Wish my computer weren’t dead so I could at least try to play it, although my 2070 wouldn’t have survived. It runs nicely on my Series X, but I hate playing this type of game with a controller.

    • circuitfarmer@lemmy.sdf.org

      I’m a PC gamer who generally likes playing with controllers (from the couch), but damn, I hate the way they mapped run and walk to the left analog stick. It feels horrible. I wish I just had autorun and could hold a button to walk. The key binding shuts off even if I try to force it with Steam controller config, because the game doesn’t technically support split inputs.

  • speedstriker858

    Ridiculous statement. I’ve got an RX 7900 XTX and a Ryzen 7 7700X with 64 GB of RAM at 5600 MHz, and the fucking game barely ever hits 144 fps. Usually it’s sitting around 100-110 fps, which is playable for sure, but literally every other game I’ve played on it has had no problem staying nailed at 144 fps. This is at low-medium settings, BTW (for Starfield).

    • Rykzon@discuss.tchncs.de

      Ridiculous statement. 100-110 fps is far above playable. Do people forget how The Witcher, Crysis, and others ran on release?

  • Sho@lemmy.world

    What, Todd Howard is being a dipshit tool again? I’m shocked… shocked, I tell you…

    • Rheios@ttrpg.network

      I’m a little shocked. Normally it’s Hines who’s caught with his foot that deep in his own mouth.

    • Joker@kbin.social

      Same here, except I use a 6600 XT, which isn’t anywhere near as good as your GPU. I’m running medium settings at 4K and it’s fine. It even runs on the Steam Deck, although the graphics are not so good there. Still, it’s playable, and I’ll probably play there when it’s convenient.

      IMO, ultra settings are for people with new, high-end hardware and for future-proofing a game for at least a couple of years. They’re not for people running a 2-3-year-old rig with a 1080p GPU. Medium and high settings are generally good; ultra is just a bonus mode for hardcore enthusiasts.

      • nfntordr@lemmy.world

        Yeah, the reason I mentioned my experience is that I keep finding people with better specs complaining, and I figure if we just turned the FPS counter off and enjoyed the game, we’d barely notice that it dips below 60 at times.