The entire history of video cards has been "games that were hard on graphics cards"... leading to... a new generation of graphics cards that run the game with ease... leading to... a new generation of games. It's an ongoing battle.


There are games that are hard on graphics cards.

Then there are okay-looking city building games that pin the best GPU that exists at 100% usage at 1440p.

A card that’s meant for playing AAA games at 4K Ultra.

Maybe a dev accidentally bundled a crypto miner they were running on company hardware after hours?


making a comparison to an AAA game at 4K ultra is exactly why the simulation genre is niche and doesn't bother to appeal to that audience anymore. People underestimate how much of AAA development is aggressively culling out behaviors and minimizing calculations for things that aren't (or aren't likely to be) on screen, and for a simulation you can't do this. But people don't value what they don't see.
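
To make that concrete, here's a minimal made-up C++ sketch (the Agent struct, the 1/8 tick rate, and the function names are purely illustrative, not taken from any real engine): an action game can skip or throttle work for anything the camera can't see, while a city simulation has to advance every agent every tick regardless.

    #include <vector>

    struct Agent {
        float x = 0, y = 0;    // world position
        float dx = 0, dy = 0;  // velocity
        bool  visible = false; // set by the renderer's frustum/occlusion pass
    };

    // AAA-style update: aggressively skip or down-tick anything off screen.
    void update_action_game(std::vector<Agent>& agents, float dt, int frame) {
        for (auto& a : agents) {
            if (!a.visible && (frame % 8) != 0) continue; // off-screen: 1/8 tick rate, or nothing at all
            a.x += a.dx * dt;
            a.y += a.dy * dt;
        }
    }

    // Simulation-style update: every citizen and vehicle feeds the traffic and
    // economy models, so it has to advance whether the camera can see it or not.
    void update_city_sim(std::vector<Agent>& agents, float dt) {
        for (auto& a : agents) {
            a.x += a.dx * dt;
            a.y += a.dy * dt;
        }
    }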


what? it's a city building game. houses. basically boxes. a landscape mesh, trees. and some voxel traffic. without camera movement it should be practically infinite FPS; the scenes are ideal for all of the usual game-engine optimizations, from occlusion (z) culling to simply streaming in low-poly meshes for distant houses and trees.
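
The "streaming in low-poly meshes" part is just distance-based LOD selection. A hypothetical C++ sketch, with made-up types and thresholds:

    struct Mesh { /* vertex/index buffers elided */ };

    // Three hand-authored detail levels for one building type.
    struct BuildingLods {
        Mesh high, medium, billboard;
    };

    // Pick a mesh by distance to the camera; thresholds are purely illustrative.
    const Mesh& select_lod(const BuildingLods& lods, float distance_to_camera) {
        if (distance_to_camera < 100.0f) return lods.high;   // close: full geometry
        if (distance_to_camera < 500.0f) return lods.medium; // mid-range: reduced poly count
        return lods.billboard;                               // far away: a textured quad is enough
    }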

I don't even understand why it's not basically caching most of the rendered scene and then using it as a background to draw the traffic over, updating it as the simulated sun moves every 10 seconds or so.
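
That's essentially a render-to-texture / impostor cache. A rough C++ sketch of the idea, purely as a guess at how it could work (the render_* calls stand in for real graphics-API work, and the invalidation rule is an assumption):

    // Cache the static part of the scene in an off-screen texture and only
    // re-render it when something that affects it changes.
    struct FrameContext {
        bool camera_moved = false;
        bool sun_advanced = false; // e.g. the simulated sun ticks every N seconds
        bool city_edited  = false; // player placed or demolished something
    };

    class CachedSceneRenderer {
        bool cache_valid_ = false;
    public:
        void draw_frame(const FrameContext& ctx) {
            if (ctx.camera_moved || ctx.sun_advanced || ctx.city_edited)
                cache_valid_ = false;

            if (!cache_valid_) {
                render_static_scene_to_texture(); // buildings, terrain, trees, lighting
                cache_valid_ = true;
            }
            blit_cached_texture();                // cheap full-screen copy
            render_dynamic_objects();             // traffic, pedestrians, particles
        }
    private:
        // These stand in for real graphics-API calls (OpenGL/Vulkan/D3D).
        void render_static_scene_to_texture() {}
        void blit_cached_texture() {}
        void render_dynamic_objects() {}
    };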


Yup, it's boxes that need to run logic every frame and don't just get removed from existence if you turn the camera away. It's doing a lot more than just rendering.

Compared to something like God of War, which tries to turn off anything not in the currently loaded level chunk (which itself has mostly static assets), a simulation game needs to spend its budget moving assets around all over its scene, or at least preparing approximations for where an asset is and what it is doing once it comes back into view. This is why such a game tends to be more CPU intensive. But I wouldn't be surprised if the GPU is being used for compute in such a game.
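
One common way to "prepare approximations" for off-view assets is to drop them to a coarser simulation level and rebuild detail when they re-enter view. A hypothetical C++ sketch (the Full/Coarse split and the catch-up step are my assumptions, not the game's actual design):

    enum class SimLevel { Full, Coarse };

    struct Vehicle {
        float    route_progress = 0;   // 0..1 along its planned route
        float    speed = 1.0f;         // route units per second
        SimLevel level = SimLevel::Full;
    };

    // Off-screen vehicles only advance a scalar "where along the route am I";
    // on-screen vehicles get full per-frame movement, lane logic, animation, etc.
    void update_vehicle(Vehicle& v, float dt, bool near_camera) {
        if (near_camera) {
            if (v.level == SimLevel::Coarse) {
                // Coming back into view: rebuild detailed state from the approximation
                // (spawn the mesh, snap to the road position implied by route_progress).
                v.level = SimLevel::Full;
            }
            // ... full movement, lane changes, animation ...
            v.route_progress += v.speed * dt;
        } else {
            v.level = SimLevel::Coarse;
            v.route_progress += v.speed * dt; // cheap approximation: just advance along the route
        }
    }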

----

With all that said: I haven't played it, but to answer your specific question, it sounds like the game has a very heavy and/or suboptimal GI scheme. People here have said they get much better performance after turning off GI and volumetrics. That kind of dynamic lighting may be the main culprit for a game already taking up a lot of CPU/GPU budget from the above factors.


Logic (simulation) is running on the CPU. And it's not the CPU that's running at 100%, even on an empty map.

Sure, maybe they run some neural network on the GPU (for ... pathfinding? or what?), but it's highly unlikely. And again, empty map.

Yes, probably someone went rather overboard with GI and other shaders.


What is new is that 80% of current gamers are playing on a GPU on par with the highest-end model of about 7 years ago. Moore's law might not be dead, but it is at least economically forgotten. So where older games might have gotten over the performance hump in a year or two, now it's more like seven. Paradox failed to correct for that economic shift here.

(My 1060 6GB is still above average on userbenchmark.com. That's a 2016 card, and it was never top of the line.)


That was before Moore's law was dead. Now devs need to write code that is at least not terrible. (Insert link to Casey Muratori ranting about modern software.)


This is not a new-generation game; the graphics are at least 10 years behind the rest of the industry. (No PBR or other modern rendering techniques are used.)


I didn't make my comment about this specific game.

I made it about Cyberpunk 2077, Crysis, and Doom 3.


Yeah, but generally those games that push for new GPUs look spectacular, not like SimCity 5 from over a decade ago.



