As someone who doesn't mind 30fps, there shouldn't be games running at 30 on new-gen hardware anymore lol
This right here. As a 40+ gamer, I don’t mind 30fps. Been dealing with lower fps for a long, long time and it’s fine for me. But that just seems like an unreasonably low expectation for AAA video games these days.
What’s really weird to me is the hard 30fps cap. Why not have at least an option to disable the cap and let VRR do its job?
It was new gen three years ago, and it’s kept up the 60fps dream for a lot of games over the last three years. However, developers were always going to hit the point of diminishing returns when their visions got bigger, and now we’re there.
If you target 60 fps you have to be more conservative with poly counts, draw calls, shader complexity, rendering capabilities, etc. If you target 30, you have more to play with on the rendering side and can technically have better visuals. It’s a dev decision. Devs will always need to make that decision until there are no hardware limitations.
Doubling the frame rate is always better than marginally better visuals no one would even notice unless they compared side by side with a magnifying glass.
and in this case they made the wrong decision imo
Games like Minecraft, RuneScape, or WoW are still popular, so why the hell are studios spending this much of their performance budget on 4K resolution for every rock, tree, and dust mite?
Beth has historically had to make serious gameplay concessions because of consoles. Console limitations killed open cities and levitation on their engine in Oblivion.
I don’t mind if they play it safe with Starfield.
I think they should at least give console players the choice between 4K 30fps or 1080p 60fps. Let’s be realistic here: 4K 60fps for a game of this size on this engine will require a BEEFY machine, nothing a current-gen console can offer.
The game is probably CPU bound, not GPU bound, based on past Bethesda games. If that’s the case, decreasing the resolution won’t necessarily increase the frame rate by a proportional amount.
Idk, if they release a game in 2023 that is still CPU bound that would be a big L from them. I really hope that’s not the case.
Especially because I bought a freaking 7900 XTX mainly for Starfield :D
> Idk, if they release a game in 2023 that is still CPU bound that would be a big L from them.
This is Bethesda we’re talking about here lmao. Starfield is still running on the Creation Engine, which they’ve been hacking together since the Morrowind days.
I love how Microsoft said that with their exclusive titles they’d only have to focus on one console and as such the performance would be better. Now here we are and seemingly all of these titles run at 30 FPS. I just hope they’ll offer a performance option, even if it runs at a lower resolution. Having these options is exactly what keeps me on the PC platform.
Not a deal breaker for this kind of game but a 60fps performance mode on series x at 1080p would’ve been a nice option.
Playing TOTK right now on Switch and it really proves how great games can overcome technical limitations. A masterpiece at 30fps is still a masterpiece. Here’s hoping Starfield can deliver as a great game first and foremost.
If the bottleneck is something like AI or physics calculations on the CPU then lowering the rendering resolution won’t help achieve a higher framerate unfortunately.
I suspect most games shipping this gen without 60 FPS modes are CPU bound.
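To make the CPU-bound point concrete, here's a toy mental model (made-up numbers, purely for illustration, nothing to do with Starfield's actual profile): a frame can't finish until both the CPU and GPU work for it are done, so frame time is roughly the longer of the two. Resolution scaling only shrinks the GPU side.

```python
# Toy model of frame time: bounded by whichever of CPU or GPU work takes longer.
# All numbers below are invented for illustration; real frame timing is far messier.

def frame_time_ms(cpu_ms, gpu_ms_at_4k, resolution_scale):
    """Lowering render resolution shrinks GPU work but leaves CPU work (AI,
    physics, scripting) completely untouched."""
    gpu_ms = gpu_ms_at_4k * resolution_scale  # 1080p is ~0.25x the pixels of 4K
    return max(cpu_ms, gpu_ms)

# GPU-bound game: dropping from 4K to 1080p helps a lot.
gpu_bound = frame_time_ms(cpu_ms=10, gpu_ms_at_4k=33, resolution_scale=0.25)
print(gpu_bound)  # 10 ms -> room for a 60fps (or higher) mode

# CPU-bound game: the exact same resolution drop changes nothing.
cpu_bound = frame_time_ms(cpu_ms=33, gpu_ms_at_4k=20, resolution_scale=0.25)
print(cpu_bound)  # still 33 ms -> stuck at ~30fps no matter the resolution
```

Which is exactly why a "1080p performance mode" is pointless if the bottleneck is simulation work on the CPU.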
They’ll save 60fps for the Special Edition in 3 years.
Only on the “Pro” consoles that will have been released by then :).
Xbox Series X X Xbox X (Elite edition)
So on a series X you are forced to run it at 4k?
I think that’s just the display resolution. I expect this game will use dynamic render resolution like most games these days. The render resolution will probably not hit 4K often (if at all).
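For anyone curious what dynamic render resolution means in practice, here's a minimal sketch of the idea (my own toy version, not how any particular engine actually implements it): each frame, nudge the render scale up or down so GPU frame time converges on the budget.

```python
# Toy dynamic resolution controller. The constants and step sizes are
# arbitrary choices for illustration, not values from any real engine.

TARGET_MS = 33.3                  # 30fps frame budget
MIN_SCALE, MAX_SCALE = 0.5, 1.0   # render between 50% and 100% of output res

def update_render_scale(scale, last_gpu_ms):
    """Over budget -> render fewer pixels; under budget -> claw quality back."""
    if last_gpu_ms > TARGET_MS:
        scale *= 0.95
    else:
        scale *= 1.02
    return max(MIN_SCALE, min(MAX_SCALE, scale))
```

So a "4K" game that's struggling might quietly spend most of its time rendering at some fraction of 4K and upscaling the result, which is why the box-art resolution and the actual render resolution are rarely the same thing.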
I’d rather see consoles be limited to what they can handle than a game to be limited for everyone because of what a single console can handle.
I want this game to be huge and look beautiful. If my PC can handle 60fps I don’t want to be locked to 30fps because that’s all an Xbox can handle. And if I want to play it on an Xbox I don’t want it to be a blurry mess just to get 60fps; I want it to look as good as it possibly can. Especially in a game like this where the visuals do the great majority of the storytelling when it comes to exploration and finding new things.
The fps lock isn’t on PC.
Yes, I was praising that. I may have worded it in a confusing way.
It still seems way too common for an engine to have other systems tied to FPS, so e.g. running at a higher frame rate also means the physics engine runs faster, or all animations play faster.
As a game dev: this is 100% the developers fault. The engine knows how long it’s been between frames. Devs can use that information to keep everything running at the same pace regardless of 30fps, 10, or 120.
Next time you see a game with its speed tied to the frame rate, make sure you make some noise! Maybe devs will finally fucking learn.
I wouldn’t be too sure that fps and physics are tied; they managed to separate them for 76.
This isn’t surprising. Todd Howard already stated that given the choice between fidelity or framerate they would choose fidelity every time. It’s disappointing that he thinks that’s still what people want in 2023, but it’s not surprising.
That’s absolutely what I want in 2023. Anything over 30fps is completely unnecessary outside of competitive multiplayer.
I find 30 FPS strictly bearable with a controller, unplayable on mouse and keyboard
Yeah 30 to 60 is a big difference. Past 60 things definitely start looking real samey though.
I’m curious about this kind of thing from an engine and console architecture perspective. Any gamedevs able to shed some light?
I work in the industry, but not directly on low-level engine implementation details. Personally, my gut feeling is that the Creation Engine is falling behind in terms of modern asset streaming techniques.
Similarly, I wonder if a lack of strong virtualized geometry / just-in-time LOD generation tech could be a huge bottleneck?
From what I understand, efforts like Nanite in UE5 were an enormous engineering investment for Epic, and unless Bethesda has a massive engine team of their own (they don’t), they simply won’t be able to benefit from an in-house equivalent in tech.
Ultimately, I do think the lack of innovation in the Creation Engine is due to internal technical targets being established as “30FPS is good enough”, with frame times below 33ms being viewed as “for those PC gamers with power to spare.”
My best guess would be that the engine just has vast amounts of technical debt. Skyrim (pre-LE at least) had a savegame corruption bug that had been around since Morrowind. And while I’m sure they have rewritten huge parts of the engine over the decades, it’s not rare to see bugs persist across generations, with modders complaining loudly about them. The engine has never been great about asset streaming either, so no surprise here.