I'm getting ~60fps on my 4090 at 1440p. Haven't had any framerate issues. My GPU has been pinned at 100% and my temps are sky high, but the game is smooth. They definitely need to fix this, but it's not as dire for everyone as it's being made out to be.
Same for me with a 3090 + 5950x, after the first patch. It's hammering the GPU with almost constant 100% load, while CPU gets by with ~40% or something like that.
The entire history of video cards has been "games that were hard on graphics cards"... leading to... a new generation of graphics cards that run the game with ease... leading to... a new generation of games. It's an ongoing battle.
making a comparison to a AAA game at 4k ultra is exactly why the simulation genre is niche and doesn't bother to appeal to that audience anymore. People underestimate how much of AAA development is aggressively culling behaviors and minimizing calculations for things that aren't (or aren't likely to be) on screen, and for a simulation you can't do this. But people don't value what they don't see.
what? it's a city building game. houses. basically boxes. a landscape mesh, trees. and some voxel traffic. without camera movement it should be infinite FPS, the scenes are ideal for all of the usual game engine standard optimizations, from z-culling to simply streaming in low-poly shit for distant houses and trees.
I don't even understand why it's not basically caching most of the rendered scene, using it as a background to move traffic on, and updating it as the simulated sun moves every 10 seconds or so.
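(To make the parent's point concrete, here's a minimal C++ sketch of the kind of distance-based LOD selection being described; the thresholds and names are invented for illustration, not taken from any engine.)

```cpp
#include <cstdio>

enum class Lod { Full, Low, Billboard };

// Pick a level of detail purely from distance to the camera.
Lod pick_lod(float dist) {
    if (dist < 100.0f)  return Lod::Full;   // full-detail mesh up close
    if (dist < 1000.0f) return Lod::Low;    // decimated mesh at mid-range
    return Lod::Billboard;                  // flat impostor in the distance
}

int main() {
    for (float d : {50.0f, 500.0f, 5000.0f})
        std::printf("dist %.0f -> LOD %d\n", d, static_cast<int>(pick_lod(d)));
}
```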
Yup, it's boxes that need to run logic every frame and don't just get removed from existence if you turn the camera away. It's doing a lot more than just rendering.
Compared to something like God of War, which tries to turn off anything not in the currently loaded level chunk (which itself is mostly static assets), a simulation game needs to spend its budget moving assets around its entire scene, or at least preparing approximations of where each asset is and what it's doing for when it comes back into view. This is why such games tend to be more CPU-intensive. But I wouldn't be surprised if the GPU is being used for compute in such a game.
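A minimal sketch of that idea, assuming a hypothetical agent simulation (none of these names come from a real engine): every agent keeps ticking whether or not it's on screen, but off-screen agents can be advanced in coarse, cheap steps so they're in a plausible state when the camera returns.

```cpp
#include <vector>

struct Agent {
    float x = 0, dx = 1;   // position and velocity along a path
    float pending = 0;     // time accumulated while off screen
    bool visible = false;  // set by the renderer's culling pass
};

void tick(std::vector<Agent>& agents, float dt) {
    for (Agent& a : agents) {
        if (a.visible) {
            a.x += a.dx * dt;         // full per-frame update
        } else {
            a.pending += dt;          // defer work while out of view...
            if (a.pending >= 0.5f) {  // ...and integrate in coarse 0.5s steps
                a.x += a.dx * a.pending;
                a.pending = 0;
            }
        }
    }
}

int main() {
    std::vector<Agent> city(100000);  // a city's worth of agents
    for (int frame = 0; frame < 600; ++frame)
        tick(city, 1.0f / 60.0f);     // the sim never stops, visible or not
}
```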
----
With all that said: I haven't played, but to answer your specific question, it sounds like the game has a very heavy and/or suboptimal GI scheme. People here have said that turning off GI and volumetrics gets them much better performance. That kind of dynamic lighting may be the main culprit in a game already using up a lot of CPU/GPU budget from the factors above.
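Some rough back-of-envelope arithmetic for why volumetrics in particular get expensive; the step count here is a typical-but-assumed value, not the game's actual setting:

```cpp
#include <cstdio>

int main() {
    long long pixels = 2560LL * 1440;   // 1440p render target
    int steps = 64;                     // assumed raymarch samples per pixel
    long long samples = pixels * steps; // density/lighting evaluations per frame

    std::printf("%lld million samples per frame\n", samples / 1000000);
    std::printf("%lld million samples per second at 60 fps\n",
                samples * 60 / 1000000);
}
```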
What is new is that 80% of current gamers are playing on a GPU on par with the highest-end model of about 7 years ago. Moore's law might not be dead, but it is at least economically forgotten. So where older games might have gotten over the performance bump in a year or two, now it's more like seven. Paradox failed to correct for economics on this one.
(My 1060 6GB is still above average on userbenchmark.com. That's a 2016 card, and it was never top of the line.)
That was before Moore's law was dead. Now devs need to write code that is at least not terrible. (Insert link to Casey Muratori ranting about modern software)
This is not a new-generation game; the graphics are at least 10 years behind the rest of the industry. (No PBR or other modern rendering techniques are used.)
I don't think that's accurate. There's tons of complaints and legit issues with performance for sure, but I don't think making up numbers (likely grossly exaggerated) is helpful.
It doesn't require a 4090 to run, or to run well. 99% of customers in fact do not consider the situation dire. That's completely fake and made up. By tweaking a couple default settings (depth of field and fog are the big ones) the game runs fine.
So it's safe to say the game ships with bad defaults and the performance could be better all around. But to say 99% of customers might consider the situation dire, to be completely honest, is fucking stupid and I'm disappointed to see it here.
>But to say 99% of customers might consider the situation dire, to be completely honest, is fucking stupid and I'm disappointed to see it here.
It's definitely the biggest effect of the reddit migration. Many technical topics here still feel fine for the most part (if not slightly more political), but game topics on here feel like I'm back on r/games or r/gaming.
I could have worded it better. Basically, we've been inundated with a bunch of pre-release players saying the game was unplayable on any GPU.
Post-launch, it's perfectly playable for a lot of people. Even on older cards. Performance is by no means good, but it's not a slideshow like everyone was worried it would be pre-release.
(fwiw I also tested this on a 2070. It wasn't great at 1440p, but at 1080p it performed just as well as my 4090)
It's strange to think about consoles, PCs, and ports. Consoles were nothing like PCs for a long time, so ports took real effort and that reduced cross-platform releases a bit. Then consoles started to look and act more like PCs and porting got relatively easier, until the hardware diverged again: the powerful, unified SoC systems we have now versus the sturdy-but-separated PCs that haven't had a major form-factor change in 20 years.
So PC ports are less efficient because they don't have that continuous, low-latency access to memory, and the best (cheapest?) solution is usually to just dump everything into VRAM and require more memory. It's frustrating when you can buy a game on a PS5 and it plays great, while the PC port needs at least double the specs to run well.
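A toy illustration of the difference, with plain heap buffers standing in for real driver allocations (nothing here is an actual graphics API):

```cpp
#include <cstring>
#include <vector>

int main() {
    // Console-style unified memory: the simulation writes straight into the
    // pool the GPU samples from; one allocation total.
    std::vector<float> unified_pool(1 << 20);
    for (float& v : unified_pool) v = 1.0f;  // CPU produces, GPU reads in place

    // Discrete-GPU PC: the same data exists in system RAM and again in VRAM
    // after a PCIe copy; the cheap/easy port keeps both copies resident,
    // which is why memory requirements balloon.
    std::vector<float> system_ram(1 << 20, 1.0f);
    std::vector<float> vram(system_ram.size());
    std::memcpy(vram.data(), system_ram.data(),
                system_ram.size() * sizeof(float));
}
```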
I guess PC gaming was kind of always expensive on the high end but the low end is just disappearing entirely when devs aren't allowed to spend time optimizing at all.
There have been so many Unity games that don't manage to throttle GPU usage. Battletech was another one: even at the menu it would keep the GPU at 100%.
Not sure why devs allow this, but with Battletech it was a known issue that was never fully resolved.
You'll have to enable v-sync to limit the framerate. I always enable it (or an fps limit) because I don't like blasting my GPU at 100%, which makes it hot and loud.
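For what it's worth, an fps limiter is conceptually just this (generic C++ sketch, not any engine's actual API): measure how long the frame took and sleep off the rest of the budget.

```cpp
#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::microseconds(16667);  // ~60 fps

    for (int frame = 0; frame < 60; ++frame) {
        auto start = clock::now();

        // ... simulate + render the frame here ...

        // Sleep off whatever is left of the 16.7 ms budget, so the GPU
        // isn't rendering uncapped frames at 100% load.
        auto elapsed = clock::now() - start;
        if (elapsed < frame_budget)
            std::this_thread::sleep_for(frame_budget - elapsed);
    }
}
```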