Most Xbox 360 and PS3 games were 720p at 30fps. 720p was mostly fine because 1080p TVs were luxury items back then.
The performance problems in modern games are often not caused by fillrate-vs-resolution bottlenecks though, but by poor engine architecture decisions (triggering shader recompilations in the hot path).
Shader recompilation causes stuttering, not general performance problems. Shader complexity does, though, and that is a function of render quality.
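Concretely (an illustrative sketch only, not any real engine's API; all the names here are made up): a shader/PSO cache miss in the middle of a frame blocks rendering once and shows up as a hitch, while per-pixel shader cost is paid every single frame.

    // Illustrative sketch only, not any real engine's API. The point: a cache
    // miss here blocks the frame once (a visible hitch), while per-pixel
    // shader cost is paid every frame.
    #include <chrono>
    #include <string>
    #include <thread>
    #include <unordered_map>

    struct Pso {};  // stand-in for a compiled pipeline state object

    std::unordered_map<std::string, Pso> g_psoCache;

    const Pso& GetPso(const std::string& stateKey) {
        auto it = g_psoCache.find(stateKey);
        if (it == g_psoCache.end()) {
            // First use of this state combination: a blocking compile in the
            // middle of the frame, simulated here with a sleep.
            std::this_thread::sleep_for(std::chrono::milliseconds(80));
            it = g_psoCache.emplace(stateKey, Pso{}).first;
        }
        return it->second;  // cached path: effectively free, no hitch
    }

    int main() {
        GetPso("opaque_skinned");  // first frame: compile stall -> stutter
        GetPso("opaque_skinned");  // later frames: cache hit, smooth
    }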
But I'm confused about why you think fill rate isn't an issue. If you're upgrading from 1080p to 4K, your GPU needs at the very least 4x the pixel-pushing power, and even then that only maintains the same level of detail; you bought a 4K screen for more detail.
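For reference, the 4x is exact pixel arithmetic (nothing engine-specific here):

    // Just the pixel arithmetic behind the "4x" figure.
    constexpr long long px1080p = 1920LL * 1080;  // 2,073,600 pixels
    constexpr long long px4k    = 3840LL * 2160;  // 8,294,400 pixels
    static_assert(px4k == 4 * px1080p, "4K is exactly 4x the pixels of 1080p");
    int main() {}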
> But I’m confused about why you think fill rate isn’t an issue?
Because that can be dealt with easily via upscaling or by buying a more expensive GPU, whereas fixing shader recompilation in the hot path requires a complete engine redesign.
There aren't faster GPUs affordable to most consumers; that's the point. Yes, DLSS is used as a crutch, because AI upscaling is easier than rendering at a higher resolution.
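To put a rough number on why upscaling is the cheaper path (illustrative only; the 1440p internal resolution is an assumption, not a claim about any specific DLSS mode):

    // Rough numbers only; the 1440p internal resolution is an assumption,
    // not a claim about any specific upscaler mode.
    constexpr double internalPx = 2560.0 * 1440.0;           // 3,686,400 shaded
    constexpr double outputPx   = 3840.0 * 2160.0;           // 8,294,400 displayed
    constexpr double shadedFraction = internalPx / outputPx; // ~0.44
    int main() {}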
You don't need a full engine redesign. UE5 provides tools for PSO bundling and precaching, but you need to actually use them.
Good material design and structure also help reduce the number of PSOs needed, but again, that requires knowing how the engine's material system works.
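Roughly the shape of it (a minimal sketch of the precaching idea with made-up names, not UE5's actual API): collect the state combinations you know you'll need, compile them behind a loading screen, and make the draw path a pure cache lookup.

    // Illustrative sketch only, not UE5's actual API; all names are made up.
    // Idea: compile every known pipeline state behind a loading screen so the
    // render loop never compiles anything.
    #include <string>
    #include <unordered_map>
    #include <vector>

    struct Pso {};  // stand-in for a compiled pipeline state object

    // Stand-in for the real (expensive) driver compile.
    Pso CompilePso(const std::string& /*stateKey*/) { return Pso{}; }

    class PsoCache {
    public:
        // Called from a loading screen with the "bundled" list of state keys
        // recorded ahead of time (e.g. during playtests or cooking).
        void Precache(const std::vector<std::string>& bundledKeys) {
            for (const auto& key : bundledKeys)
                cache_.emplace(key, CompilePso(key));
        }

        // Called from the render loop. A miss returns null so the caller can
        // skip or use a fallback instead of stalling the frame on a compile.
        const Pso* TryGet(const std::string& stateKey) const {
            auto it = cache_.find(stateKey);
            return it != cache_.end() ? &it->second : nullptr;
        }

    private:
        std::unordered_map<std::string, Pso> cache_;
    };

    int main() {
        PsoCache cache;
        cache.Precache({"opaque_static", "opaque_skinned", "translucent_glass"});
        const Pso* pso = cache.TryGet("opaque_skinned");  // hit: no compile at draw time
        (void)pso;
    }

Fewer unique material/state combinations also means a smaller bundle to precompile, which is where the material-design point comes in.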
Presumably people do this because they hate money; as you say, it's much harder to make the pixels just slightly more crisp and you'll pay dearly for the privilege.
I might be misremembering, but I seem to remember most games of that era were 540p scaled to 1080p. 720p would have been an upgrade. But your point still stands.