Standards have certainly changed over the years. This takes me straight back to 2003 when SimCity 4 came out, turned out to be an absolute resource hog, and I'd have been overjoyed with 20fps.
As the late Henry Petroski said: "The most amazing achievement of the computer software industry is its continuing cancellation of the steady and staggering gains made by the computer hardware industry."
The 2013 version of SimCity was a disaster for all sorts of reasons, not least of which was that they took game devs and just mysteriously expected them to know how to build and run online services, run databases, etc. A friend of mine was working for another EA subsidiary and ended up being parachuted in to try to help save the day. One of his first contributions made a phenomenal difference: he enabled connection pooling to the database servers. They'd done the entirely understandable, naive thing of just going with the client defaults.
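For anyone who hasn't seen the difference this makes: instead of opening and tearing down a fresh database connection per request (the typical client default), a pool keeps a bounded set of long-lived connections and hands them out. A minimal sketch, assuming a Java service using HikariCP; the anecdote doesn't say what stack SimCity 2013 actually ran, and the URL, credentials, table, and query here are made up for illustration:

    import com.zaxxer.hikari.HikariConfig;
    import com.zaxxer.hikari.HikariDataSource;

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class PooledDb {
        // Without pooling, every request opens a fresh TCP connection and
        // authenticates against the DB server; under launch-day load that
        // connection churn alone can flatten the database.
        private static final HikariDataSource POOL;

        static {
            HikariConfig cfg = new HikariConfig();
            cfg.setJdbcUrl("jdbc:postgresql://db.example.internal/citydata"); // hypothetical URL
            cfg.setUsername("app");
            cfg.setPassword("secret");
            cfg.setMaximumPoolSize(32);  // bounded number of long-lived connections
            cfg.setMinimumIdle(8);       // keep warm connections ready
            POOL = new HikariDataSource(cfg);
        }

        public static int regionCount(String playerId) throws Exception {
            // Borrow a connection from the pool and return it via try-with-resources;
            // no per-request connection setup or teardown.
            try (Connection c = POOL.getConnection();
                 PreparedStatement ps = c.prepareStatement(
                     "SELECT COUNT(*) FROM regions WHERE owner_id = ?")) {
                ps.setString(1, playerId);
                try (ResultSet rs = ps.executeQuery()) {
                    rs.next();
                    return rs.getInt(1);
                }
            }
        }
    }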
Seeing what folks in the demoscene can do nowadays with such limited hardware makes modern software feel all the more puzzling. I mean, yes, demoscene stuff isn't concerned with ease of development, security, or integration. But it does leave you yearning: think about the possibilities of modern hardware if treated with care.
This is the precise reason I prefer embedded development. The challenge of fitting my entire application into a handful of kilobytes with just a KB or two of RAM is a lot of fun. I get to build a program that runs really fast on a very slow system.
It's a point of personal pride for me to really understand what the machine is and what it's doing. Programming this way is working with the machine rather than trying to beat it into submission like you do with high level languages.
It seems a lot of programmers just see the CPU as a black box, if they even think about it at all. I don't expect more than a couple percent of programmers would truly grok the modern x86 architecture, but if you stop to consider how the CPU actually executes your code, you might make better decisions.
In the same vein, very high level languages are a big part of the problem. They're so far abstracted from the hardware that you can't reason about how your code will actually behave on any real machine. And then there's an invisible iceberg of layer upon layer of abstraction, indirection, and unknowable, unreadable code, so there's no reasonable way to know that your line of code does what you think and nothing else.
Modern software practices are bad and we should all throw away our computers and go back to the 8086. Just throw away the entire field of programming and start again.
I love embedded as a hobby, but God is it a silly statement to imply we should go back to low level asm/C bs for everything; we would get so little done. Oh, but it would run fast at least.
Problem isn't high level dev, it's companies skimping out on the optimisation process.
That's sort of the opposite of treating the hardware with care. It's all done with no allowances for varying hardware at all. This is like pining for a 70s text editor while refusing to admit the world has moved beyond 7-bit ASCII, and that stuff like Unicode support isn't "optional".
Treated with care and 1000x the development time, budget etc.
Things are slow because we prefer ease of development for these extraordinarily large and complex projects we call video games.
I think the smart thing really is to work it all out at a high level and then target the slow, considered and expensive fixes to where they're really needed.
I'm not excusing obviously lazy development though, but I do think we need to remember how crazy the scope of newer games can be. Imagine showing MSFS2020 to someone from 10-15 years ago; much of the earth scanned in and available for you to fly over, of course there are perf hiccups.
I will fully concede that the trend of game makers releasing half-baked, poorly optimized games that are buggy and unplayable at launch is totally a thing and it is frustrating and we should demand better (though we keep buying so why would they stop?).
BUT.... the online game community is so insufferable and this Cities Skylines II launch is a great example of it. The game is not about 4k 120 fps gameplay. It is a simulation game that runs fairly well even on last gen's hardware if you drop SOME of the fidelity settings. But that's not the predominant discourse. If people can't play it at 4k out of the box on their overpriced 4090 then they take straight to the internet to complain (and mind you they have tried fiddling with exactly 0 knobs to make it runnable).
I am by no means making excuses for game makers who certainly share much of the blame for creating an environment of distrust among game fans. But the online discourse is just rage baiting and looking for anything to hate with minimal evidence or sometimes even outright lies. Makes me want to go into a cave and play my games without seeing any content or discussion about it.
> If people can't play it at 4k out of the box on their overpriced 4090 then they take straight to the internet to complain (and mind you they have tried fiddling with exactly 0 knobs to make it runnable).
The top comment contains this extract from an IGN review:
> I have a 13900k, 64GB of RAM, and a RTX 4090, playing on a 1440p ultrawide monitor. I got 35fps at the main menu and in game on a brand new map w/o building a single thing. Turning off motion blur and depth field increased this from 35 to 50fps. Not a single other graphics setting changed the performance at all. I turned off every single setting I could or set it to the lowest possible, and still only got 50fps.
Yes, I was addressing the broader discourse more generally, specifically Reddit. But you're right that the article did directly address this, though I would say the tone and title of the article are incongruous with the simple fact that they were able to get the game to run well with minor tweaks.
I take issue with "only got 50fps". This is not Counter Strike or a game that demands 300fps. 50fps (if your 1% lows are within reason) is completely playable.
It's not about the number by itself, it's what the number implies. If an RTX 4090 can't get to 60 FPS on an empty map, what will happen when the game is running on an RTX 4060 and it has to render a complete city?
>I take issue with "only got 50fps". This is not Counter Strike or a game that demands 300fps. 50fps (if your 1% lows are within reason) is completely playable.
You're complaining about people complaining. If you're satisfied with the game being playable then play it, but others may have different expectations of quality, and it's not wrong for them to voice their opinion when the product they paid money for doesn't meet them. I personally don't remember participating in a congress of gamers where everyone agreed not to complain about a game unless it was practically unplayable.
>If you're satisfied with the game being playable then play it, but others may have different expectations of quality, and it's not wrong for them to voice their opinion when the product they paid money for doesn't meet them.
I'm all for voicing opinions in a civil and calm manner. Most people online voicing their opinions come off as know-it-all teens or children throwing tantrums. It's as if they have a _right_ to a CS2 with 120fps. Paradox warned about bad performance prior to launch (https://forum.paradoxplaza.com/forum/threads/updates-on-modd...). Nobody claimed or said performance was gonna be great. And still, people act surprised.
It's no surprise that the reviews were down at close to 30% a couple of hours into the release, and are at 52% today. Why are the reviews so heavily skewed in the first hours? Because many gamers love thrashing about. It's much more important than taking a step back and calming down.
The people who reviewed the game in the first few days were the ones who either pre-ordered it or bought it as soon as it came out. They were so excited to play the new installment they took a gamble and trusted that the developer would produce a polished product, because they wanted to be able to play it as soon as possible. When they got to play the game, they saw it ran poorly to the point that it might have spoiled the experience for them. They're right to be angry about it, especially when developers and publishers make most of their money during the first few weeks after launch. By releasing a half-finished product they're treating their most enthusiastic users like crap. They didn't have to do that, they could have delayed the launch. Be it because of decisions made by the publisher or by the developer, they chose to release when they did. They made their bed, now they have to lie in it. I don't blame anyone who raves about performance, because what was released was well outside the realm of what's acceptable for a finished product, regardless of what was said prior to launch. You don't get to sell a car with an asterisk that says "by the way, the fuel tank leaks so until we find a way to fix it you'll use twice as much fuel as normal".
> You don't get to sell a car with an asterisk that says "by the way, the fuel tank leaks so until we find a way to fix it you'll use twice as much fuel as normal".
Yeah. But in the case of CS2, gamers did buy the leaking car. The devs analogously said "by the way, the fuel tank leaks" and people just went with "OK" and bought CS2, after which the customers started to complain (rave?!) about leaking fuel tanks. The dealership, i.e. the retail store, said "Well, you can have all your money back, no questions asked, until you've driven at least 160km". Steam has generous refunds. What does the customer do? (S)he still goes onto review sites and bitches about bad leaking fuel tanks. It is very much in bad faith on the customer's part.
I wouldn't rush to Colossal Order's defense if customers just said "It ran bad for me on my 4090 for some reason so I refunded". That's not what's going on with the negative reviews, though. People act entitled.
I would argue a dealer telling you about a major defect directly before you buy the car is a bit different than a post on some forums that the product they're selling is not well made.
I would suggest it's not reasonable to expect that someone buying a game has to do research on a forum to know the game is unfinished -- if it's being sold as a finished game, it's reasonable to expect it's in a playable state. The original post was trying to say it would be unheard of in other industries for companies to sell known unfinished products as finished ones, even with the promise of completing the product later. And consumers would similarly balk at such a proposal for virtually any other product.
it was more the absurdity of the different way games are treated which is anti-consumer.
>I would suggest it's not reasonable to expect that someone buying a game has to do research on a forum to know the game is unfinished -- if it's being sold as a finished game it's reasonable to expect it's in a playable state
It is playable, though. Now, if you wanted it to be a perfectly optimized, polished experience with no hitches even on older hardware: well, you get what you pay for, I suppose.
>it would be unheard of for other products to allow companies to sell known unfinished products as finished products, even with the promise of completing the product.
if "120fps" is your minimum requirement of "playable", then you probably care enough about performance to the point where you need to research every game you buy. Similar to how someone interested in road rallies or drag style street racing probably won't be satisfied even with perfectly driveable cars.
The kinds of people making these complaints are those "street racers", so to speak.
You're still not getting it. Yeah, if a car dealership had such a generous return policy you could get your money back and get a car that does what you need within your budget. But these people didn't want just a city builder and they happened to buy this one. They wanted to play the new version of Cities: Skylines. They're loyal fans and they're treated like beta testers.
Yes, it's entitlement. Customers are entitled to get a quality product in exchange for their money. When Paradox goes to spend their earnings they're not going to be throttled to do it at 45 cents per second.
It runs perfectly fine on my 1080 with low settings and a city of 50k.
Quite frankly I'm tired of people speaking out of their ass when there are people who are actually fucking playing the game. Do you also postulate about the weather without looking out the window?
Even 30 FPS is perfectly playable in any game. People have truly gone off the deep end with this FPS shit. Back in the day, most (all?) console games ran at 30 FPS, and we somehow managed to play them and have fun. I would love to see the people complaining about "only" 50 FPS try to wrap their brains around that one.
In your opinion only, not one to be shared by all.
> Back in the day, most (all?) console games ran at 30 FPS
No they didn’t.
> I would love to see the people complaining about "only" 50 FPS try to wrap their brains around that one.
Well it helps that first of all, you’re wrong. Secondly, there is a SIGNIFICANT difference with CRT motion quality meaning it’s not at all comparable to frame rates on LCD/OLED today.
Most/all 3D games did for gen 5 and 6, and most of gen 7 before engines could fully utilize that hardware. Ocarina of Time ran at 15fps IIRC. Crash Bandicoot ran at 30fps. Final Fantasy 7 fell back down to 15fps.
Once we got to the PS2 era stuff started being 30fps, with the truly exceptionally optimized games hitting 60. Final Fantasy X would get bumped to 30fps, and Jak and Daxter: The Precursor Legacy would be 60fps but often dropped to 30fps. From what I can find, the original Halo also targeted 30fps.
>Well it helps that first of all, you’re wrong
but you just said it was their opinion not shared by all. This is why people don't take the gaming community discourse seriously.
As someone who primarily plays on PC, I haven't thought 30 FPS is acceptable since the NES era.
Cities skylines 2 is being released on PC where the expected baseline has been 60 fps for 25 years or so. 60 fps at a reasonable resolution should be available to most users. If even people with a top of the line CPU and $1800 GPU are limited to 50 fps you failed.
This take is not fair to the community. Almost all games have auto-set graphics settings based on your hardware for many years now.
any good developer would test these configurations on common hardware combinations (at least on the popular GPUs) before shipping. other than maybe the graphics preset, why are we expected to change all the dials just to start the game from the menu?
In the cross-platform era, PC has been treated as a second-class citizen for optimization. Forza Motorsport is another example where even exceeding their minimum requirements gives a slideshow on launch, despite lowering all settings.
expecting every gamer on PC to be tinkerers is just a myopic take that does not help with development priorities.
It's predominantly people with strong opinions who actively partake in online communities (instead of just lurking), so the vitriol is to be expected.
Seems like this was likely from before the hotfix that was released this morning which has improvements for some of the more egregious issues mentioned like DOF, LOD, and global illumination: https://store.steampowered.com/news/app/949230/view/37093367...
Still far from ideal but glad to see movement so quickly from the dev team and as has been mentioned the game is certainly playable albeit with some setting tweaks.
This is the week before go-live with a large customer. They've been doing lots of testing on their end (refreshing), and in the final stretch they found two glaring issues.
These two issues were glaringly, obviously wrong in core modules, and 100% reproducible.
Both were bad queries (static but parameterized). One resulted in an error from the DB server, and the other was a full join rather than filter by parameter (x=x vs x=:x), so spectacularly wrong results. Both were triggered by doing typical operations in our application.
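To make the second bug concrete, here's a sketch of the difference between the broken predicate and the parameterized one, using JDBC with hypothetical table and column names (the original code and schema aren't shown in the comment):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.util.ArrayList;
    import java.util.List;

    public class OrderQueries {
        // Buggy version, paraphrased: comparing a column to itself ("x = x") is
        // always true, so the WHERE clause filters nothing and every row comes back.
        //   "SELECT o.id FROM orders o WHERE o.customer_id = o.customer_id"

        // Fixed version: bind the value as a parameter ("x = :x" / "x = ?") so the
        // database actually filters by it.
        public static List<Long> orderIdsForCustomer(Connection conn, long customerId)
                throws Exception {
            List<Long> ids = new ArrayList<>();
            try (PreparedStatement ps = conn.prepareStatement(
                    "SELECT o.id FROM orders o WHERE o.customer_id = ?")) {
                ps.setLong(1, customerId);
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) ids.add(rs.getLong(1));
                }
            }
            return ids;
        }
    }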
Both issues had been in our code for many, many months, yet the thousands of users we have across our several hundred customers somehow didn't report or experience them.
I fixed both issues in less than 15 minutes.
This isn't the first time. Sometimes I'm amazed how long such glaring issues manage to survive out in the wild amongst our customers.
Not saying they didn't know. Just saying that sometimes these things just happen.
I spent so much time in cities.exe I'm scared to check out the second one. How do you feel about the modding scene for it? I had probably 70GB of assets loading, converting the game into something altogether different.
After seeing this happen time and time again, it's kind of a wild decision to make. So many negative reviews I see these days are about performance issues.
You would think a little more time would be put into reaching at least some reasonable performance level.
> So many negative reviews I see these days are about performance issues.
Unfortunately, negative publicity from bad performance doesn't really stop these games from selling well, as proven by most AAA releases in the past few years.
Skylines isn't an AAA release though. And the first game was only really successful because the prior SimCity was full of user-hostile changes. The new game similarly banks on the accumulated positive image of the first one. To risk all that with a user-hostile, awfully optimized early release is a risky game to play.
The studio already killed a series once with a bad second entry: Cities in Motion 2...
Exactly, and what the other replies to your comment don't understand is that: a) they'll patch performance over time; b) humans forget things very quickly, initial bad reviews don't matter.
Oh yeah, everyone remember the Panama Papers? Yeah, seems the public forgot about them a long time ago. What was done about em? Humans only remember some things.
yep, performance issues are just less important than functionality issues in the lead-up to a release. You can fix performance with a patch. It's hard to walk back "game crashes when I trigger the main menu"
I think they released it without having a discussion about how bad an impression poor default settings could make. With a few adjustments it looks great and is very playable on my 3080 at 4K even before the patch. Really big blunder for sure.
They announced in advance of launch that it would have these issues, so it's not unexpected. I guess they knew about the issue but had no time to get the fix into the final release yet.
I feel like this is not only a massive technical mess-up but also betrays a deep misunderstanding of their customers. Most C:S fans were more excited for a more realistic traffic AI and mixed-use zoning. Yet this experience was made unplayable by an attempt to make it look prettier.
As a C:S fan, I want realistic traffic, mixed zoning, and expect it to be prettier. I don't find it unplayable at all even when it is turning my PC into a space heater. Optimizations will come, and I'd rather be waiting for optimizations than features.
Definitely not from Paradox. Europa Universalis IV is 10 years old now, still getting regular DLC releases, and late game is still unacceptably laggy on 2023 computers. Like, 20 to 0.2 fps.
I have an absurd amount of hours in EU4 and I have absolutely no lag in EU4 on my desktop. And I can think of at least two times where Paradox has had a patch that really significantly increased performance.
How often do you play past 1750? That’s when the 5 second month ticks happen for me. Also 2-3 seconds between clicks in the macro builder when you get 3000 provinces or so, methinks some algorithm is quadratic. Restarting the game helps.
I have a 1050ti and I can play it on low graphics settings at 1080 with no depth of field. I've gotten to about city stage #5 or so and I feel that it's playable. It's not pretty but good enough to get a feel for the game and learn about the new stuff.
For me - not at all. I don't have a computer that meets their minimum system requirements. I really feel like the cycle of newer games requiring more hardware is pointless and wasteful. I wish publishers would focus on good game design and make their games accessible to people without high-powered gaming hardware.
It takes a lot of time and effort to pare down the assets and add tricks to make a game look good on lower hardware. Targeting current-gen GPUs means the devs get a better game out in the same time and budget.
> betrays a deep misunderstanding of their customers. Most C:S fans were more excited for a more realistic traffic AI and mixed-use zoning. Yet this experience was made unplayable by an attempt to make it look prettier.
That's not fair. This is a new engine that they probably expect to support for maybe as long as 10 or 15 years. As a AAA publisher, Paradox doesn't get to hide behind indie-style aesthetics and needs to keep up with where their peers are headed. It's not aiming to be prettier just for the heck of it, but because it needs to hit a certain mark to keep the franchise relevant.
Knowing Paradox, more rich gameplay enhancements probably are on the update and DLC agenda, and we can assume that their designers really care about that kind of stuff. But for AAA publishers like them, there are also other factors that matter and that may need to take priority.
I don't think that's true at all. Nothing else Paradox publishes has AAA graphics, and that's not the target audience for their games. The first Cities Skylines didn't beat Sim City because it had better graphics. It was because of gameplay. Since EA gave up on city sims, they don't even have peers, they're only competing with themselves.
>and that's not the target audience for their games.
nor was it for Baldur's Gate. But studios are constantly seeking to try and expand their audience, so graphics are an inevitable first impression.
And of course, they need to show that this sequel is a deserved step up instead of staying in CS1. If it just looked like CS1 but with more plugins, that might not draw enough people away from 1.
The cards have a certain level of performance. They can change the settings of the game to match what they bought. If they have a $2,000 card, it's going to look good.
I believe that a lot of the pushback on the game is due to poor defaults. Lots of unnecessary graphics settings are enabled, leading to bad first-time experiences. After disabling elements that most users won't care about (motion blur, volumetric clouds, global illumination) and following some tips from people online, I have no issue enjoying the game at >60fps without my GPU fans screaming. (to be fair, using 7900x / 3090)
Yes the graphics need optimization, but they could have rolled out the release with good defaults and just admitted that high end graphics needed more time... but you can enjoy the game in the meantime! Instead, their steam reviews will remain marred.
Yay, finally a comment that is similar to my experience and isn't speculation or kneejerk reaction. I'm enjoying the game at 4k and it looks better than CS now that I've adjusted things.
They totally screwed up the default settings, not sure why as it seems like an unforced error.
It actually picked a resolution with a 24Hz refresh rate when it first loaded for me. I think their "virtual texturing" optimization also kicks off on first run, making the initial menu experience stuttery and slow.
They even wrote about it beforehand. They knew that there are a few individual settings that kill performance, which they wanted to address with a guide. Instead of changing the default settings accordingly, it seems! Madness, a completely self-inflicted launch disaster.
I'm getting ~60fps on my 4090 at 1440p. Haven't had any framerate issues. My GPU has been pinned at 100% and my temps are sky high, but the game is smooth. They definitely need to fix this, but it's not as dire for everyone as it's being made out to be.
Same for me with a 3090 + 5950x, after the first patch. It's hammering the GPU with almost constant 100% load, while CPU gets by with ~40% or something like that.
The entire history of video cards has been "games that were hard on graphics cards"... leading to... a new generation of graphics cards that run the game with ease... leading to... a new generation of games. It's an ongoing battle.
Making a comparison to a AAA game at 4k ultra is exactly why the simulation genre is niche and doesn't bother to appeal to that audience anymore. People underestimate how much of AAA development is aggressively culling out behaviors and minimizing calculations for things that aren't (or aren't likely to be) on screen, and for a simulation you can't do this. But people don't value what they don't see.
What? It's a city building game. Houses, basically boxes. A landscape mesh, trees. And some voxel traffic. Without camera movement it should be infinite FPS; the scenes are ideal for all of the usual standard game engine optimizations, from z-culling to simply streaming in low-poly shit for distant houses and trees.
I don't even understand why it's not basically caching most of the rendered scene and then using it as a background to move traffic on it, and update it as the simulated Sun moves every 10 seconds or so.
Yup, it's boxes that need to run logic every frame and don't just get removed from existence if you turn the camera away. It's doing a lot more than just rendering.
Compared to something like God of War that is trying to turn off anything not in the currently loaded level chunk (which itself has mostly static assets), a simulation game needs to spend its budget moving around assets all over its scene, or at least prepare approximations for where an asset is and what it's doing once it comes back in view. This is why such a game tends to be more CPU intensive. But I wouldn't be surprised if the GPU is being used for compute in such a game.
----
With all that said: I haven't played but to answer your specific question it sounds like the game has a very heavy and/or suboptimal GI scheme. People here have said to turn off GI and volumetric and they get much better performance. That kind of dynamic lighting may be the main culprit for a game already taking up a lot of cpu/GPU budget from the above factors.
What is new is that 80% of current gamers are playing on a GPU on par with the highest-end model of about 7 years ago. Moore's law might not be dead, but it is at least economically forgotten. So older games might have gotten over the performance bump in a year or two; now it's more like seven. Paradox failed to correct for the economics here.
(My 1060 6GB is still above average on userbenchmark.com. That's a 2016 card, and it was never top of the line.)
That was before Moore's law was dead. Now devs need to write code that is at least not terrible. (Insert link to casey muratori ranting about modern software)
This is not a new-generation game; the graphics are at least 10 years behind the rest of the industry. (No PBR or other modern rendering techniques are used.)
I don't think that's accurate. There's tons of complaints and legit issues with performance for sure, but I don't think making up numbers (likely grossly exaggerated) is helpful.
It doesn't require a 4090 to run, or to run well. 99% of customers in fact do not consider the situation dire. That's completely fake and made up. By tweaking a couple default settings (depth of field and fog are the big ones) the game runs fine.
So it's safe to say the game ships with bad defaults and the performance could be better all around. But to say 99% of customers might consider the situation dire, to be completely honest, is fucking stupid and I'm disappointed to see it here.
>But to say 99% of customers might consider the situation dire, to be completely honest, is fucking stupid and I'm disappointed to see it here.
It's definitely the biggest effect of the reddit migration. Many technical topics here still feel fine for the most part (if not slightly more political), but game topics on here feel like I'm back on r/games or r/gaming.
I could have worded it better. Basically, we've been inundated with a bunch of pre-release players saying the game was basically unplayable on any GPU.
Post-launch, it's perfectly playable for a lot of people. Even on older cards. Performance is by no means good, but it's not a slideshow like everyone was worried it would be pre-release.
(fwiw I also tested this on a 2070. It wasn't great at 1440p but was performing just as well as my 4090 on 1080p)
It's strange to think about Consoles, PCs, and Ports. Consoles were nothing like PCs for a long time, so ports took actual effort and reduced cross-platform releases a bit. Then consoles started to look and act more like PCs and porting was relatively easier until the hardware started to diverge into the powerful, combined SoC systems we have now compared to the sturdy-but-separated PCs that haven't had a major form-factor change in 20 years.
So PC ports are less efficient because they don't have such continuous, low latency access to memory and the best solution (cheapest?) is usually to just dump everything into VRAM and require more memory. It's frustrating when you can buy a game on a PS5 and it plays great but the PC port needs at least double the specs to run well.
I guess PC gaming was kind of always expensive on the high end but the low end is just disappearing entirely when devs aren't allowed to spend time optimizing at all.
There have been so many Unity games which do not manage to throttle GPU usage. Battletech was another one where at the menu it would keep the GPU at 100%.
Not sure why devs allow this, but with Battletech it was a known issue that was never fully resolved.
You'll have to enable v-sync to limit framerate. I enable it always (or fps limit) because I don't like blasting my GPU at 100% which makes it hot and loud.
It's a Unity game. Why are people genuinely surprised that performance is absolute garbage? Both CS2 and KSP2 (Kerbal) REQUIRE incredibly specific tuning that Unity simply does not offer: the latter being rigid bodies at scale (which is semi-impossible even with a dedicated engine) and the former being "almost-Factorio-level" low-level cell processing. The graphics of both are secondary to that.
This is a tangent, but I think using the acronym CS2 is really confusing because now there's Counter-Strike 2, which is a lot more popular than Cities Skylines 2.
The person mentioned other games in the post about Unity. While we know Counter-Strike is not built on Unity, it would be confusing for those who only have surface-level knowledge of those games.
Do you have any credibility to make this claim? I'm an actual game dev who has shipped lots of stuff in Unity. I've never worked on a AAA game but I'm pretty sure I could improve the performance in this game if given the time.
>I've never worked on a AAA game but I'm pretty sure I could improve the performance in this game if given the time.
That's the neat part, I'm an actual game dev and I know you're never "given the time". I'm sure the talent here could and will fix many of these issues, but at the end of the day publishers want to ship something (especially in this economy).
Plenty of Unity games exist with acceptable performance. Hell, City Skylines 1 also used Unity and the performance only got bad with big cities, not new cities.
It's also odd to call out Unity as the issue when talking about CS:2 and KSP2, when both of their predecessors also used Unity and run better.
Only because you're reading the damning into it. Acceptable isn't a euphemism for bad, I just reserve "great" for things that are, well, great. Like Factorio levels of optimization (even though that game is capped at 60fps). 100+ fps with 7-year-old hardware is perfectly adequate in my book. Skimming some popular games that use Unity and run fine: RimWorld, Outer Wilds, Gunfire Reborn, Enter the Gungeon, Risk of Rain 2, Hollow Knight. I see some more, but I haven't personally played or watched someone play them so I hesitate to list them.
Humankind, the Civ rip-off made with Unity, is unplayable. I bought it hoping it would be kinda like Cities Skylines 1 was to Civ IV, but I've never been able to finish a campaign. I either get frustrated with the performance and quit, or the game runs out of memory and crashes. Steam won't give me a refund because I played too many hours, but it is a freaking simulation game; 2 hours is absolutely nothing in a decent sim. This is also how I learned about the 2 hour limit for refunds.
Unfortunately, I keep installing it every so often to see if anything has been fixed, and nothing has been fixed. I've completely given up now. It was a waste of money and an even bigger waste of time.
That's probably because the game is made with a Unity Personal license, for small or cheap studios only. With other licenses the splash can be disabled.
One of the best games ever made, Outer Wilds, was done in Unity. Dyson Sphere Program is an excellent factorio style game that's very well optimized. I do not share your same trepidation on seeing the logo (doesn't move me in any direction, both some of the best and some truly awful games have been made with it).
Usually that hesitance is because a lot of amateur stuff has been made using Unity. In the right hands it can feel like its own thing, but it is easy to be hesitant after shoveling through so much trash.
Yeah, because it means the devs don't have a few thousand in the budget to support removing the logo, so of course they don't have the money to pay people to performance tune the game.
Simulators, especially ones of this size, are usually much more CPU intensive than GPU intensive. So a lot of reviews are obsessive about how beefy their GPU card is, but I suspect a lot of people have "top-heavy" rigs.
I play a lot of simulator games. Most of them are not paragons of optimization (well, except maybe Factorio). But these are not action games so FPS is not really something I personally optimize for and I don't necessarily understand the importance of benchmarking it against, like, Cyberpunk.
I wouldn't be surprised if Paradox's optimization fix will be just to nerf the depth of field.
The issues in question here surface even with an empty city, at higher populations the CPU will almost certainly become the issue but right now most of the complaints are unrelated to sim performance and it even seems like for most people the sim performs quite well even at higher populations.
The speed of your simulation in a game like this should have minimal impact on the framerate of the game. You can paint the screen 144 times a second, but only tick the simulation 10 times a second and get a much better experience than painting the screen and updating the simulation 20 times a second. Maybe your city runs slowly, but you can still look around at the various bits moving slowly.
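A minimal sketch of that decoupling, as a fixed-timestep loop in Java (this isn't how CS2 or Unity actually structure their update loops, just the general pattern being described):

    public class DecoupledLoop {
        // Render as fast as the display allows, but advance the simulation in fixed
        // ticks (here 10 per second). If the sim can't keep up, the world slows down
        // while the camera and UI stay smooth.
        static final double TICK_SECONDS = 1.0 / 10.0;

        public static void main(String[] args) {
            double accumulator = 0.0;
            long previous = System.nanoTime();
            boolean running = true;

            while (running) {
                long now = System.nanoTime();
                accumulator += (now - previous) / 1_000_000_000.0;
                previous = now;

                // Run however many fixed ticks have accrued (capped so a long stall
                // doesn't trigger a spiral of catch-up ticks).
                int ticks = 0;
                while (accumulator >= TICK_SECONDS && ticks < 5) {
                    updateSimulation(TICK_SECONDS);
                    accumulator -= TICK_SECONDS;
                    ticks++;
                }

                // Render every pass through the loop, interpolating between the last
                // two sim states so motion still looks smooth at high frame rates.
                render(accumulator / TICK_SECONDS);
            }
        }

        static void updateSimulation(double dt) { /* advance agents, traffic, economy */ }
        static void render(double alpha)        { /* draw using the interpolation factor */ }
    }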
Not trying to be rude, but this honestly reads like you don’t actually play simulation/4X games.
> Maybe your city runs slowly
If the simulation is running slow, that has a much more detrimental effect on the quality of gameplay than jumping from 45 fps to 75 fps. Sure, it's a simulation, but it's primarily a game, not a weather model. And I say this as an early adopter of high refresh rate monitors and a frequent FPS player. I absolutely need high frames and low input latency in a competitive PvP game, but in a strategy game, it's much more important that the tick rate is fast enough to stay interesting. A slow sim is boring, and it's not just me; people complain about this all the time in (late-game) Stellaris (another Paradox title).
If the issue is occurring even on the main menu and a completely new, bare level, that possibility can be safely eliminated. It's also highly doubtful that someone with a 4090 would have a CPU pitiful enough to excuse 20FPS.
FWIW I don't think that was the implication. Even when the simulation and rendering are asynchronous having a busy CPU will result in lower rendering performance, regardless of how busy the GPU is.
They probably should have mentioned the CPU specs at least, but it's very easy to look at utilization while the game is running and see that it's GPU bound even with high-end video cards.
Not to mention that this is on an empty map, so there's very little simulation even happening yet, and the big FPS gains come from turning down GPU effects like DoF and motion blur.
If it has embarrassingly parallel tasks that it can dispatch to a massively parallel subsystem dedicated to solving embarrassingly parallel tasks, is that abuse or smart use of resources?
That being said most simulation games are usually memory-latency and memory-bandwidth limited, not compute limited.
It looks a perfect match. At first. Then you realize that you are not alone.
Very much like how using Java's parallel stream API in a webserver does wonders in dev but not in production, since you have many other threads serving other requests that are also starved for CPU cores.
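A short sketch of why that happens: every parallel stream in the JVM shares the single common ForkJoinPool unless you explicitly run it inside your own pool, so on a loaded server all requests end up fighting over the same few workers.

    import java.util.List;
    import java.util.concurrent.ForkJoinPool;
    import java.util.stream.Collectors;
    import java.util.stream.IntStream;

    public class ParallelStreamPitfall {
        public static void main(String[] args) {
            // All parallel streams in the JVM share one common ForkJoinPool, sized to
            // roughly (cores - 1). On a dev laptop with nothing else running that looks
            // great; in a web server every request handler competes for the same pool.
            System.out.println("Common pool parallelism: "
                    + ForkJoinPool.commonPool().getParallelism());

            List<Integer> squares = IntStream.rangeClosed(1, 1_000)
                    .parallel()              // fans work out across the shared common pool
                    .map(n -> n * n)
                    .boxed()
                    .collect(Collectors.toList());

            System.out.println("Computed " + squares.size() + " results");
        }
    }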
Most people who have the money to invest in a 4090 are also going to invest in a beefy CPU. It's not a secret that CPUs are bottlenecks for games within the gaming community.
I mean, it's only common enough that the headline-grabbing feature of the market's undisputed largest GPU vendor is a feature literally designed to reduce its impact as a bottleneck.
Disagree. Many games are ultimately limited by the performance of a single hot thread (often physics). Some things just don't parallelize, so throwing 8/12/16/24 cores at them doesn't help.
> Most of them are not paragons of optimization (well, except maybe Factorio). But these are not action games so FPS is not really something I personally optimize for and I don't necessarily understand the importance of benchmarking it against, like, Cyberpunk.
In many of these games, FPS is correlated with simulation speed. So when the fps starts to chug, the simulations starts going slower too.
It makes perfect sense. Simulations are done in discrete time chunks, as the framerate drops and those time chunks grow larger there are two choices: make the simulation bad, or make the world slow down.
If you don't adjust simulation rate you start seeing higher instances of objects phasing through each other, collisions not preserving energy correctly, pathfinding just fundamentally breaking.
Well, ideally sim "ticks" are completely separate from rendering, but this is not the reality in many games where they share the same thread, or where cross communication (like UI stuff) blocks enough to slow the other component down.
Even in Minecraft, with its completely separate server/client, the rendering can bog down TPS due to JVM pauses and other reasons I don't even understand.
if it was cpu bound because of world sim, changing the resolution would not improve the performance. also, there's no reason for the world to be that much more complicated than the first game.
The Tropico games are a perfectly pleasant place to start. They get progressively easier and more casual.
The Anno games are my all around favorite. 1404 is my favorite, but honestly 1800 is probably the best.
Banished spawned an entire genre unto itself even as it hasn't aged gracefully. These survival sims have a lot more "bite" to them. Try Planetbase for a more streamlined experience. Timberborn if you like physics. Also have heard good things about Farthest Frontier.
And then there's Frostpunk which is an all-around amazing experience. The theming and mood rivals any first-person cinematic shooter.
Your people in Banished die. A lot. So unlike most other economic sims, building your city is not the primary goal, population survival is. And you have to balance population and resources pretty intimately.
Calling Banished a tarted up Settlers would be similar to calling Dark Souls a tarted up Zelda game.
I'm still an old blowhard and think Tropico 1 is the best. It was so tough that you had to become a fascist every time. Which made it a harder game but also a much sharper commentary.
Sure it uses the CPU, but the desktop CPUs that gamers use are beasts. Even a few-years-old mid-level CPU is better than any console or mobile device by a long shot. They have dedicated tower or liquid coolers, so they can push a lot of power into the chips.
With sim games it's not purely about clock speed or TDP. The previous-generation 5800X3D outperformed the newest chips in simulation games for a while thanks to its very large cache.
I have played a little bit and I'm a bit confused about all the negative feedback regarding the performance.
I have an AMD Ryzen 9 3900X and NVIDIA GeForce RTX 3090, so not too bad, but also not the best.
And the game runs totally fine for me. I just reduced the screen resolution (I think 1920x1024 or so) and left the other settings as they were (I think all set to High). And I get very constant 45 FPS without hiccups. It feels very smooth and playable. Maybe it will get a bit worse when my city gets bigger, but so far, I don't see any problems.
This is even before the first patch, which was released today. This patch is supposed to optimize the performance more. I will try it later.
Note, this is with Linux. I run it on Linux with Proton 8. So maybe it actually runs better on Linux than on Windows?
It will keep happening because people rush to preorder or buy shitty, unfinished games on a release day- just look at Jedi Survivor from this year. The game was in a laughable state for months after release. I thought waiting for so long was enough for the patch shitshow to be over.
No incentives to ship good quality games, especially looking at the larger companies. Luckily there are still sincere reviewers who will give you a heads up about it.
I want to give shout-out to the studio behind Lies of P. Game was super well optimized since release and I can't recall a single bug during dozens of hours of gameplay. It's a bit sad that it went by without a lot of hype. This is the kind of game dev that should be praised.
LOL, that's such a disrespectful take… the gameplay design in those games is second to none. That's something that needs to be fine-tuned from scratch each time.
That is an understatement. The amount of refinement they have done now on various games is starting to feel like it is beyond just pure iteration but is a result of intuition that cannot be taught.
Not OP, and a bit of an aside, but funnily enough the leap from BotW to TotK was actually the first time in a while that I felt like Nintendo didn't really bring that kind of refinement and innovation. It wasn't a bad game, not by a long shot, but it lacked the usually tight gameplay and experience that you come to expect. This is in large part due to the open world, as it felt like they used scale rather than refinement to get it across the line.
For those that don't play other Paradox-published games... newly launched Paradox games are early access. Maybe perpetual early access, with on-and-off performance woes, like Stellaris and other Clausewitz games. As much as I like Paradox games, that and the endless DLC are the steep price you pay for their sim games.
We appear to have slashdotted the site, so I'm just reacting to the headline, but is this a big problem? It is a Paradox-published game; presumably we'll be playing it in 2030 on RTX 9070s or whatever (and buying expansions).
It seems like a shame if the ultimate quality is limited by the hardware available now. Turn down the settings till you hit 30fps (this isn't a twitch shooter).
Yeah it seems like a ripoff, I’m sure in a couple years more capable, cheaper cards will come out (and then people still playing this game can enjoy it with all the bells and whistles).
It also issues 10k+ draw calls per frame. It's amateur hour over there, bordering on fraud. Even more frustrating to see how successful that model is, and how feverishly people defend them online.
this is what happens when you focus on content before performance. they'll get that number way down over time.
games are data problems, like every other computer program. you must know your data before you can optimize how it is moved around efficiently. draw calls are a very good benchmark for this. as they optimize, it will go way down, if it hasn't already.
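One concrete example of "know your data" in this context: draw calls drop when instances that share a mesh and material are grouped into a single batch instead of being submitted one by one. A toy sketch of the grouping logic only, with made-up mesh and material names and no actual GPU API calls (real engines do this with instanced draws and per-instance transform buffers):

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    public class DrawCallBatching {
        // Hypothetical scene object: which mesh/material it uses and where it sits.
        record Instance(String mesh, String material, float x, float y, float z) {}

        public static void main(String[] args) {
            List<Instance> scene = List.of(
                    new Instance("house_a", "brick",   0, 0, 0),
                    new Instance("house_a", "brick",  10, 0, 0),
                    new Instance("tree_01", "foliage", 5, 0, 3),
                    new Instance("tree_01", "foliage", 6, 0, 9));

            // Naive renderer: one draw call per object -> scene.size() calls.
            // Batched renderer: group by (mesh, material) and issue one instanced
            // call per group, uploading the per-instance transforms in a buffer.
            Map<String, List<Instance>> batches = new HashMap<>();
            for (Instance i : scene) {
                batches.computeIfAbsent(i.mesh() + "|" + i.material(),
                        k -> new ArrayList<>()).add(i);
            }

            System.out.println("Naive draw calls:   " + scene.size());
            System.out.println("Batched draw calls: " + batches.size());
        }
    }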
I've always had a soft spot for city building games, having played since SimCity 4 first came out in 2003. IMO the genre does not need great graphics; it needs great content. Song of Syx is one of my more recent (~2 years) favorites. Having a city building game unable to run on the best video card is baffling at best.
I still consider SimCity 4 with full mods and addons the best ever, but for some reason the industry doesn't seem to make similar titles any more; the newer city building games focus on details but give up on scale.
It's likely they had a financial agreement with their publisher that stated that the game would launch prior to November 2023 or they would lose some % of the revenue. Publisher contracts often also mandate a certain Metacritic score must be met in order to receive full payment.
While the game should not have been released in this state (or at least not with broken graphics settings, such as motion blur and depth of field, enabled by default), CO released a patch today and users are saying that it has dramatically increased performance.
> If you're having issues with performance, we recommend you reduce screen resolution to 1080p, disable Depth of Field and Volumetrics, and reduce Global Illumination while we work on solving the issues affecting performance.
I tried it and had no problems. Everything is smooth, no visual problems. I have a mid-tier Radeon from 3 years ago.
Yeah, they fucked up the launch and it's unfortunate they're getting such negative press, but it's a city simulator, it doesn't need 120FPS. What are we talking about?
If you want to just have fun and go build cities, there's a clear path for it
I would more compare it to "The air conditioner is broken, and the radio crashes. But it still gets me from home to work."
The stuff needs to be fixed, but the core purpose is absolutely met.
The core purpose of a city builder/simulator is to let you creatively build well-simulated non-buggy beautiful cities. You can absolutely do that. The fact that some volumetric effects are missing, or that you can't do it in 4K is absolutely a problem, and the devs should fix it.
But let's not pretend like this is the equivalent of going 10mph on the road without being able to shift gears, come on.
There are certain problematic graphics settings in CS2 right now. If you have a potato GPU and put it on low settings (as you should with a low-end GPU), the performance is fine.
My card is below the recommended specs and it's perfectly playable with a city of 50k.
This is armchair game developer sleuthing and I don't buy it for a dollar. Mainly because performance issues with this game occur even on an empty map. There is absolutely no way the game is rendering 10,000+ characters at that level of detail.
You'd be wise to ignore Redditors when they claim to be making some kind of breakthrough that eluded professional game developers over years of development.
In another comment there was a link to https://twitter.com/AtkosKhan/status/1717525097626349696?s=2... with a note that FPS can be increased by 100% just by removing cim rendering using the game's developer mode, so it *does* seem like cim rendering has a major impact on game performance, if you have cims to begin with :).
> “Citizen lifepath feature does not tie to citizen geometry and does not affect the performance figures of the characters. We know the characters require further work, as they are currently missing their LODs which affect some parts of performance. We are working on bringing these to the game along general LODs improvements across all game assets. Characters feature a lot of details that, while seemingly unnecessary now, will become relevant in the future of the project.“
So I guess sometimes these things can elude professional game developers over years of development.
Watching the preview content they were hawking with YouTubers and on their main channel, I did get the sinking feeling it would be DOA. I didn't even hear that it came out. Not initially because of performance issues, but because it seemed like they were just adding a few incremental things to the already brutally sluggish engine, and the window from announcement to release seemed short. None of it seemed very ambitious for a second major release 7 years later or whatever. No shade at the actual devs, mind you; they're just working with the constraints they have, and it's better than what I could achieve almost certainly, but it didn't seem buyable.
Looks like this is the straw that'll finally get me to upgrade my 12 year old CPU and 7 year old GPU. Has anybody figured out what a usable GPU is with today's patch? My initial research indicated that there's a sharp performance hit for 8GB of VRAM or less, so I was thinking the 12GB 6700XT might be the cheapest card with usable performance. Is that still the case with Friday's patch?
Do modern AAA games support dual (or multiple) GPUs?
In my day we had SLI and Crossfire, but I don't think there's anything like that nowadays.
For sure better coded games with proper software driver support will be needed, but I think this trend of having bigger and more power hungry GPUs needs to de-escalate.
There are many configurations with workstations having multiple GPUs for rendering, AI/ML training, password cracking and so on.
Surely if the game code and the display driver supported it, there would be speedups.
I’ve been playing Pillars of Eternity and I’m amazed just how poorly it runs given what it is.
I’d love a no-judgment, no-blame post mortem on a game like that, which tries to dig into the technical reasons why it’s so poorly optimized and the non-technical reasons for how they get there.
I’d imagine it usually boils down to having a deadline and a budget, but I’m quite interested in understanding the in-between portion much more intimately.
They kept on adding visual features that work independently, making all their own API calls, one after another, got it looking good, and moved on. That's as opposed to creating a cohesive rendering pipeline that deduplicates effort and accomplishes the same.
On top of that, and most importantly, I doubt they allotted time to refactor and optimize, or... they allotted it after release.
If you thrash Unity's APIs hard enough, which in turn thrashes the underlying vulkan/DX level APIs, clever drivers aren't quite enough to keep the GPU loaded with useful work.
There's probably a long tail of other stuff they need to profile and tweak to get a better balance of performance to visual quality, like changing the precision of shader calculations, tuning level of detail, and so on.
Virtually every Unity game ever that does a non-trivial level of background simulation has considerable jank. There's just a point where you have to say: look, Unity games consistently have performance issues; at some point it's legit to just call it a poorly performing platform.
The same Cities: Skylines where traffic hasn't worked properly since day 1? It may not be as bad as some, but C:S has plenty of issues of its own if you try to play it as an actual game and not a scenery painter, which is all it's really good for. It's always been ocean wide, pond deep.
Is this on maximum settings? Does the game still look good when the settings are turned down?
If so (!), I see no issue here, and on the contrary I applaud them for including options. It gives players more options (the person who doesn't mind aliasing can play at 1080p and get incredible lighting effects), and makes the game more future proof.
Can't confirm this at all with my 4090 and i9-13900KF. Even with highest settings on Windows 11 23H2 (25977) with GeForce 545.92, I get 30 FPS -- the stuttering started only when my city reached a population of around 80k. (Display is a LG 4K HDR 32", connected via DisplayPort.)
The bigger issue is that the actual graphics themselves look the same as the previous game. I’m sure you can do side-by-side stuff and show it’s better, but I was hoping for a tour-de-force of amazing looking cities. A no buy for me until something changes.
I'm a bit confused about the performance numbers I've seen about cities II
I "only" have a 3090 and get consistent frames around 40-50
And yeah that's low, I'd prefer 60 or above but I'm nowhere near 10-20
There are CPU/GPU meters on the screenshots. Highest CPU load I saw is 78% on one core and lower on other cores, but GPU is always 100% or close to it.
How hard is it to debug shaders? I admit I’ve only written like two shaders for WebGL and I found it to feel like a black box I feed code into and see what I get out the other end.
I admit to not knowing much about the history behind this decision, but it does seem like an odd choice. If I were making a game that was both GPU-intensive as well as CPU-intensive as the simulation scales, I would target performance when selecting my game engine. Seems like C++ is a better choice compared to C#.
AFAIK Unity is not written in C#; only the scripting is C#. The engine itself is coded in C++, just like most other game engines. It's kinda like complaining that id Software has been using Lua for scripting inside their engine since RAGE; probably not a good factor for engine performance.
I'm guessing they're rendering those trees without any LODs, i.e. they're always pushing millions of polygons. That's why the settings don't make a huge difference.
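Whether or not that's what's happening here, LODs are usually the difference between a tolerable and an absurd triangle count for vegetation. A rough back-of-the-envelope sketch with invented polygon budgets and switch distances (Unity normally drives this through LODGroup components rather than hand-rolled code like this):

    public class TreeLod {
        // Hypothetical polygon budgets per detail level for a single tree model.
        static final int[] TRIANGLES = { 20_000, 4_000, 600, 2 };   // LOD0..LOD3 (last is a billboard)
        static final double[] SWITCH_DISTANCE = { 30, 120, 400 };   // metres at which to drop a level

        static int lodFor(double distanceToCamera) {
            for (int lod = 0; lod < SWITCH_DISTANCE.length; lod++) {
                if (distanceToCamera < SWITCH_DISTANCE[lod]) return lod;
            }
            return TRIANGLES.length - 1; // far away: billboard
        }

        public static void main(String[] args) {
            long withLods = 0, withoutLods = 0;
            int treeCount = 100_000;
            for (int i = 0; i < treeCount; i++) {
                double distance = 10 + (i % 1000);   // crude spread of camera distances
                withLods += TRIANGLES[lodFor(distance)];
                withoutLods += TRIANGLES[0];         // no LODs: always the full-detail mesh
            }
            System.out.printf("Triangles per frame: with LODs %,d vs without %,d%n",
                    withLods, withoutLods);
        }
    }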
I checked the IGN review for comparison. The IGN reviewer also has an RTX 4090 but plays at a lower 1440p resolution:
> I have a 13900k, 64GB of RAM, and a RTX 4090, playing on a 1440p ultrawide monitor. I got 35fps at the main menu and in game on a brand new map w/o building a single thing. Turning off motion blur and depth field increased this from 35 to 50fps. Not a single other graphics setting changed the performance at all. I turned off every single setting I could or set it to the lowest possible, and still only got 50fps.
How can a game even get to this point? Everyone in the development process must have noticed that it was running poorly. Even if all the developers had the most expensive consumer GPU available, they still would have seen poor performance.
And even if such filters are somehow so intensive, they should be turned off by default. I am playing Cities Skylines 2 and there is so much unnecessary eye candy that is turned on even with a mid-range graphics card. The game runs fine for me at 1440p when I turn off the really intensive post processing.
They screwed up the defaults. When I first loaded it was set to a resolution with a 24Hz refresh rate. Also the game looks bad with the default SMAA, it looks great if you change the advanced graphics settings to use TAA. I'm on a 3080 at 4K and after fiddling with settings it looks wonderful and is very playable. Unbelievable that they hosed the first impressions so much.
But no TV maxes out at 24Hz; that is a lower bound meant to support things like Blu-ray players for judder-free movie playback.
60Hz has always been the bare minimum any display supports, so if any game is picking something below that then something has gone horribly wrong. But really, games should be using the current desktop refresh rate as the default, because you know it's supported and it makes things like alt+tabbing considerably faster even in exclusive fullscreen.
Agreed. First time I launched it on my lowly 6600XT it defaulted to high settings at 4k, completely unplayable (like 3fps).
After reading Reddit and going down to 1080p medium-ish with depth of field turned off, it's fine: 30fps. It's a very fun game!
I have absolutely no idea why they didn't turn the defaults down. It would have been a 5 min config change surely and the impressions of the game would have been vastly vastly better.
It does look terrible for me though, despite being fun. The shadows are terrible and there are loads of rendering glitches.
I'm also a bit concerned now my city is getting bigger at how poor the traffic wayfinding is. Seems to be as dumb as CS1 (with no mods) which is really bad, with the added nuisance that the traffic cycle seems to be basically 0 cars until rush hour then a massive flood of traffic.
It's actually quite ironic that all I really wanted from CS was faster performance in bigger cities (huge fail from CS2 here), better traffic simulation (jury is out but isn't looking great) and better road tools (this is what cs2 is great at).
I do not understand motion blur. I am paying top dollar for these pixels, let me see them all. That it has a negative performance impact makes my decision all the easier.
Apparently it makes the game feel "faster" in racing games because it adds to the movement illusion. I don't know why you would use it in other games, though
It was used to try and make 30fps console racing games look less juddery, but didn't really work. Racing games are a genre that really does need 60fps+ to feel good.
Even in the 90s, the developers of 3D arcade racing games were well aware of this, ensuring that the original arcade versions of games like Ridge Racer, Daytona, and Sega Rally ran at 60fps, rather than sacrificing that smoothness to add more detail. And those games looked spectacularly good for their time.
In real life your eyes do the blurring on fast moving objects. Real life is nothing like a game without motion blur. Swipe your hand fast in front of your face. Do you really see it sharply through the whole movement? Or just at the beginning and the end?
At 24fps, motion blur in games or 3D movies is bad. But at 120fps or more, motion will never look right without it.
Nowhere are they replacing models, replacing textures, or anything of the sort. I suspect the issue here is that a box blur with a 3x3 sample kernel gets far more expensive at 4K than at 1080p, by something like a factor of 8x once you count all the extra sampling lookups.
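Quick back-of-the-envelope on that (my numbers, nothing from the game): a fixed 3x3 kernel is 9 texture reads per output pixel, so the cost of a full-screen blur pass scales linearly with pixel count, roughly 4x going from 1080p to 4K; you only get to 8x and beyond if the kernel itself also grows with resolution.

    def blur_cost(width, height, kernel=3):
        """Texture reads for one full-screen pass with a kernel x kernel box blur."""
        return width * height * kernel * kernel

    cost_1080p = blur_cost(1920, 1080)
    print(blur_cost(3840, 2160) / cost_1080p)     # 4.0   - pixel count alone
    print(blur_cost(3840, 2160, 5) / cost_1080p)  # ~11.1 - only if the kernel also grows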
My cynical take is that everybody noticed, but did not fix. They shipped ASAP. New updates, where they address more tech debt, may fix some of this, claim major performance improvements, and are thus guaranteed to generate positive press / posts / tweets.
> My cynical take is that everybody noticed, but did not fix.
Before it was released, they told everyone that the game was not optimized graphically, and the minimum specs were already high. It was quite clear that there is still more work to be done to get it in a better state. So, not sure why you have this "cynical" take. This isn't something they are trying to deceive people with. I'm not even in the market to buy the game, and I know this.
The cynical part here is that they expected the experience to be so-so and the rendering clearly problematic, but they still sold the beta-quality product, certain that gamers would buy it anyway and wouldn't demand something well-made.
They published an announcement on Steam, right next to the buy button, sharing the current issues. People are free to make their own choices. This is made-up drama by people making ignorant knee-jerk reactions.
Unless this disclosure is listed at the top of the Steam product page, feels pretty rotten for uninformed consumers who are excited to play a city builder.
When you posted this, it was the top announcement. It's still in the top announcements row 5 days later, easily seen. All it took to find was a Google search for "steam city skylines 2". This announcement is as close to the purchase button as you can get it right now on Steam.
My personal opinion: yes. As 160+Hz monitors are finally becoming commonplace, it's becoming less defensible for games to run anywhere below 60fps.
But it's the fact that these figures are being seen with the most powerful consumer graphics card (on a brand new unpopulated game map) that's the problem.
City Skylines 1 has been played and supported non-stop for over 8 years. Why should they limit the highest possible graphics settings only to existing technology?
Again, this is a simulation game. If I ask the game to spit out more detail and graphics than can be physically supported by today's technology, I guess I don't necessarily see this as a design fault.
> If I ask the game to spit out more detail and graphics than can be physically supported by today's technology, I guess I don't necessarily see this as a design fault.
To be clear: I had to turn the quality down to much worse than I recall Cities Skylines 1 looking, and even then the performance was still worse than Cities Skylines 1.
The major complaint here is not "I cannot max the settings on my mediocre hardware", but rather "I cannot mediocre the settings on my max hardware"; the performance is bad at any level, but my hardware, while not top-of-the-line, should be able to run a game like this at 4K playable, or at 1080p while looking awesome and running great. Instead, I get 1080p with a mix of medium/low settings in order to get it looking pretty okay in most aspects.
> Why should they limit the highest possible graphics settings only to existing technology?
This assumes Cities Skylines 2 is using some next gen graphics technology when in practice it is a Unity game without any cutting edge graphics. Another comment mentioned they are doing 10k+ draw calls per frame - it's just poorly optimized.
How does developing another engine help at all? It is very likely its graphics are so under-optimized because they spent most of their time on the simulation aspect, which is more than challenging enough. Dyson Sphere Program is a game with lots of stuff going on at a time, the engine isn't the issue. Cities Skylines 2 is just a very complex game.
As a city builder it has not just dynamic geometry (which makes graphics optimization more challenging) and lots going on on screen like DSP, but also a complex agent based simulation of a city, its transportation, its economy, and individual agent AIs. It is far more complex than most games despite not being graphically intensive. And do note that as far as cities builders go, it's also one of the most ambitious graphically. My personal opinion is they deserve some slack as long as the game continues to improve.
If it were doing a ton of compute-shader simulation on GPU to actually increase the fidelity of the simulation that would be one thing, but having a badly implemented graphics pipeline isn't that.
this is a trope that is spit out by the developers of every poorly optimized sim in recorded history.
Yes, scalability towards the future is good -- but not to the detriment of player performance now.
If the player base doesn't stick around during our current dark-times medieval technology stack, there won't be a player base when we have whatever future tech makes it playable -- see the problem?
Somewhat yes, I didn't try it yet, but a few things don't add up then:
Why enable those heavy, performance crippling post processing filters by default?
From what I can find in this comment section, and I guess this has to be taken with a grain of salt, performance doesn't scale. It seems we max out at 50fps on a beefy 13th gen Intel with the fastest GPU available, but then there's a couple comments with mid-range hardware where performance is nearly identical.
To shoot from the hip on this one: it sounds like it could be limited by a single thread. Clock rates between mid- and high-end CPUs aren't significantly different, but core counts are. So if a single thread is holding up the works, that would explain the stagnant performance profile despite the big differences in total compute.
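A quick Amdahl's-law sanity check on that theory, with purely illustrative numbers: if a big chunk of the frame lives on one thread, extra cores barely move the needle, which would match mid-range and high-end CPUs landing on nearly the same fps.

    def fps(serial_ms, parallel_ms, cores):
        """Frame rate when only the parallel part of the frame scales with core count."""
        return 1000.0 / (serial_ms + parallel_ms / cores)

    # Hypothetical frame: 18 ms stuck on one thread, 12 ms that parallelizes well.
    for cores in (6, 8, 16, 24):
        print(cores, "cores ->", round(fps(18, 12, cores), 1), "fps")
    # 6 cores -> 50.0, 24 cores -> 54.1: the serial 18 ms dominates either way.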
> As 160+Hz monitors are finally becoming commonplace
The Steam hardware survey suggests otherwise to my eyes; working in tech, it's easy to forget that a lot of Steam customers have relatively shit hardware.
While they don't track a monitor's peak supported refresh rate, just looking at the rest of the results suggests to me that high-refresh gaming (over 60FPS) is going to be relatively niche at best. Lots of PC gaming specs I take for granted on my own builds, or consider "average", are actually not that common.
I absolutely agree 60fps should be considered the floor for most modern PC games running on "reasonable" hardware though, 30fps belongs to the past.
I don't think that's true. Most people who play PC games are Steam users, and they're playing a huge variety of games: pretty much any game that's not exclusive to some other store.
It’s the cheapest headset that also offers a reasonable experience. For around $250 you get a well designed device that includes controllers and does not require a separate computer to use. It’s a very good value, and if you’re unsure about VR probably the best place to dip your toe in the water.
For CPU heavy games it totally makes sense to target 30fps, but that better be 30fps at 1080p with integrated graphics, (or 30fps 1440p with 1060 class gpu). Add eye candy if you want, but it really should run well on any remotely modern system.
If they said "The game is locked to 30fps" I'd probably have been ok with that. It's not a fast paced game, and capping the framerate at a low value is probably a decent enough design choice for that particular game. As you say, it's enough that you won't really notice it in gameplay.
The problem is that we're talking about the absolute highest end consumer card sold by nVidia rendering a blank map, without such a framerate lock in place. If it's not locked, then that framerate is the product of a performance bottleneck somewhere. With a modern gaming PC, "performance bottleneck" also means you're sitting next to a moderately powerful space heater.
Locking a game to 30hz would not have been a "decent design choice" given that's half the refresh rate of nearly every computer monitor out there, especially for a game that involves a lot of scrolling.
Games get mocked for locking to 60hz, 30hz would get them laughed at outright.
Yes, but this is an unrelated issue. The last game I've seen without a "hardware cursor" option/default was a badly made DirectX 8 one that had trouble running on Windows 7.
>In my experience 30 FPS is perfectly fine for a game like this, I'm not sure what do you mean about "scrolling" ?
A top down city sim 'scrolls' up/down/left/right around a map, which is the kind of movement strongly associated with screen tearing -- which I'm not really sure is a relevant thing to bring up, given the variety of v-sync options available.
Even a game locked at 30fps that 'scrolls' often shouldn't experience significant tearing with the options out there.
Even without tearing, a high framerate is very desirable for scrolling around a map or web page. For instance, iphones are widely praised for excelling in this regard. It's just easier on the eyes if things on screen move smoothly when you scroll/pan around, it's less fatiguing.
50 FPS certainly doesn't make the game unplayable, it's true.
But if that's with an empty map, it's an ominous sign for people who plan to build a city in this city building game, as rendering usually slows as polygon count increases.
And if it's with an $1800 GPU, it's an ominous sign for the 99.3% of gamers who don't have an $1800 GPU.
And if that's at 1440p - well, I'd wager a lot of the folks with a $1800 GPU also have a 4k screen.
It's not the average that's the problem, it's that it has hard stutters when you do something like swing or rotate the camera around and force a lot of new things to load. Also there's no fps limiter, so I can't stop the game from attempting to match my monitor's 165 FPS with an overheating laptop 3060, which makes the stuttering even more visible.
It's a huge issue, considering the game doesn't look like a revolution in graphics AND you're essentially playing it on a super-computer by 5-6 year old standards.
> I put 8 hours into Frostpunk before realizing I was limited to 30FPS.
Yes, and I have beaten games that rendered improperly at sub-30FPS as a kid, because the requirement was 16MB of VRAM, while my card only had 8. But that's just as irrelevant to the point being made.
It's like playing in black and white, kind of. But yes 50 FPS is completely fine if you have not gotten used to better. Also helps if the game doesn't rely on high levels of detail (which Skylines, unfortunately, does), so that moving the camera around doesn't just make everything blurry.
If moving the camera around makes things blurry, then they must be using some kind of weird post-processing on top, can't blame that on the FPS alone !
It's unrelated to post-processing. Moving the camera around makes things blurry at low frame rate (and 50fps is low) because of sample-and-hold blur. This can be mitigated by strobing the screen at the same speed as the frame rate, but few monitors support strobing at 50Hz (it would look annoyingly flickery) and if the game ever dropped below 50fps it would break the effect.
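The amount of smear is easy to estimate: on a sample-and-hold display each frame stays lit for the whole frame time, so while your eye tracks a panning object it smears over roughly the pan speed divided by the frame rate. Rough numbers, purely illustrative:

    def smear_px(pan_speed_px_per_s, fps):
        """Approximate smear, in pixels, while the eye tracks an object panning
        across a sample-and-hold display (each frame is held for the full frame time)."""
        return pan_speed_px_per_s / fps

    # Panning a city map at 2000 px/s:
    for rate in (30, 50, 120, 240):
        print(rate, "fps ->", round(smear_px(2000, rate), 1), "px of smear")
    # 30 fps -> 66.7 px, 50 -> 40.0, 120 -> 16.7, 240 -> 8.3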
Try moving file explorer windows with your display set to different refresh rates. “Blur” might be an incorrect description, but it’s how I would describe it at least.
I feel like this is grasping at straws? Plenty of unorthodox games go mainstream. Can’t expect consumer hardware to the stay the same forever. This game is just poorly optimized.
I think I'm giving people the opposite impression that I intended.
Well-compensated people (like engineers) are more likely to own a 4K monitor and high-end GPU, so I meant to convey that there's probably a large overlap with the type of people who play city builders (like engineers), so they really should have accounted for that.
It was only after I bought my 4K 120Hz OLED that I found out how popular they are for sims, including pinball. It's been awesome.
It is just a matter of preference, I don’t see why people get bent out of shape about it.
30fps is fine for lots of games. Especially for stuff that doesn’t require super twitchy gameplay.
If somebody really cares about 60fps or 120fps, fine. But the developers have some effects they want to implement, and if some of them don’t fit into 8.3ms, let those of us who don’t mind 30fps enjoy them, right?
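For reference, the frame budgets those targets imply (trivial arithmetic, but worth keeping in front of you; 8.3 ms is the 120fps budget):

    for fps in (30, 60, 120, 165):
        print(fps, "fps ->", round(1000 / fps, 1), "ms per frame")
    # 30 -> 33.3, 60 -> 16.7, 120 -> 8.3, 165 -> 6.1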
Because this means the game runs even worse on pretty much every PC out there, since the majority of gamers don't use the most powerful GPU on the market. And it's not like this game has some outstanding visuals which would somehow justify such performance.
The problem is the UI won't be smooth on VRR displays unless they implemented some clever workarounds. Your mouse cursor will only update at the same framerate as the rest of the game. Even budget gaming monitors these days support G-Sync, FreeSync, or both, so it's no longer a problem that only affects a small % of users with deep pockets.
The only workaround I can think of that wouldn't also defeat the point of having VRR would be to add some in-engine low framerate compensation to artificially boost the reported framerate to a multiple of the real framerate, but I have yet to see any games that actually do this.
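That "multiple of the real framerate" idea is essentially what FreeSync's low framerate compensation already does at the driver level: repeat each rendered frame enough times that the presented rate stays inside the panel's variable-refresh window. A toy sketch of the arithmetic (not any engine's actual API):

    import math

    def lfc_present(game_fps, vrr_min_hz, vrr_max_hz):
        """Repeat each rendered frame k times so the presented rate lands inside
        the panel's variable-refresh window."""
        if game_fps >= vrr_min_hz:
            return game_fps, 1                    # already inside the window
        k = math.ceil(vrr_min_hz / game_fps)      # how many times to show each frame
        return min(game_fps * k, vrr_max_hz), k

    print(lfc_present(35, 48, 165))  # (70, 2): each frame shown twice
    print(lfc_present(20, 48, 165))  # (60, 3): each frame shown three times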
The problem with these kinds of statements is that... well, they're not true. I have an RTX 2070 running at 3440x1440. I'm at 35fps on low settings with some special stuff turned off, e.g. GI and fog. And that's still in a small town with 4k inhabitants, so things can get worse for me.
Regardless, the problem is blown out of proportion. I think we can give it some weeks and 4090 performance issues will be in the past.
As someone who can safely say that watching a video game in 30 FPS is equivalent to watching a slideshow, yes. It gives me intense motion sickness immediately. So I can't play something below 60 FPS
> How can a game even get to this point? Everyone in the development process must have noticed that it was running poorly
A lot of games run poorly during most of the development process, performance and bugs get fixed in the last stages. But when the publisher forces a premature release this means the fixes have to be done in the months afterwards. See Cyberpunk 2077, Mass Effect Andromeda, Battlefield 2042 and countless others. And this will continue to happen as long as gamers keep preordering.
Display pixel counts for reference, a 1440p ultrawide (assuming 21:9 and not 32:9) is about 60% the pixel count of 4K
4K 16:9 3840 x 2160 = 8.29 million pixels (1.00x)
1440p 21:9 3440 x 1440 = 4.95 million pixels (0.60x)
1440p 16:9 2560 x 1440 = 3.69 million pixels (0.44x)
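For anyone who wants to sanity-check those ratios, or plug in their own resolution:

    base = 3840 * 2160
    for name, (w, h) in {
        "4K 16:9": (3840, 2160),
        "1440p 21:9": (3440, 1440),
        "1440p 16:9": (2560, 1440),
    }.items():
        px = w * h
        print(f"{name:12} {px / 1e6:.2f} Mpx ({px / base:.2f}x of 4K)")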
Lots of ways. It is very easy to do things which affect performance negatively.
Once you're sure about what you're doing and the data that you're drawing to the screen, then you can start optimizing things so that performance goes up.
Colossal chose to release when they did because they promised to release on that day. The game isn't done, and they've said this.
The big ones are paid cash, and before they stream they've already had a team working with them, covering what exactly they do or don't want shown and talked about - and if there are any technical problems, they've been worked out, which may even include special builds of the game.
The smaller ones are "partners" who get access to the game earlier than purchasers, zero-cost items for giveaways, and some access to game publisher staff - but only if they play by the (NDA'd) rules of the publisher, and those rules usually explicitly say "do not say anything negative about the game."
Not as powerful of a setup, but I have a 5800X, 32 GB of RAM, and an RX 6800 XT on a 4K monitor, and I got 5 FPS on the main menu (which was showing an empty grass field).
I had to set my resolution to 1080p in order to get anything remotely resembling performance (this change alone took main menu FPS from 5 to ~55), plus disabling motion blur, depth of field, and volumetrics, and turning model fidelity down. I've also heard that disabling vsync can make a huge difference, but it didn't affect things for me.
This brings to mind fond memories of simulation games from back in the 80s/early 90s sporting the most primitive of graphics, if they had graphics at all, but which allowed the player to fill in the blanks with their own imaginations. Updated games of this sort would be astonishing to play on modern hardware.
It _seems_ okay around 1080, but then you realize there's almost nothing going on and it's only doing ~60FPS. It'll choke on anything remotely complex.
If you notice how fast the PR and the guides for optimizing performance have come out, I'd say not only the developers but everyone short of the very top knew the game's pain points.
When I was a game dev the argument went something like this: "Well, computers will be faster once we finish the game, so don't worry too much about the performance right now."
Two years later: "WE NEED TO RELEASE THIS TOMORROW. JUST MAKE IT BUILD AND SHIP IT!"
It's satire; people are reaching way too far in this thread (e.g. IGN's review used a 3080 / Ryzen 7 and they were happy with the performance).
The site none of us can read probably has more info about this being a RTX 4090 quirk. And it's not exactly surprising the dev team wasn't optimizing for RTX 4090, and the quirky reaction here would be justified if they were.
> It's not like Nvidia completely rebuilt the GPU between revisions
NVidia absolutely changes the GPU's native instruction set rather dramatically between generations. That's why shader code ships as an intermediate representation (NVidia's PTX, or DXIL, the DirectX Intermediate Language compiled from HLSL) and is then recompiled to actual GPU assembly each time you install a game or run the code on a different system.
I have not seen an instance where a GPU series that has been out for over 12 months has regressed in performance relative to an older one.
Perhaps back in the very early accelerator days, when everyone was making GPU-specific hacks of Quake 1, but even that was smoothed over by the transition to DirectX/OpenGL/Glide at the time.
Performance problems like this are usually complex dumb things. Like calculating a thing a million times when it only needs to be done once, doing one thing and context switching instead of batching, dumb locking that leaves threads spending most of their time waiting, creating and destroying a thing millions of times when it should be reused, etc etc.
It will just be lots of these things that need to be found and corrected.
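As a toy illustration of the first one (a completely made-up example, not the game's actual code): hoisting per-frame-invariant work out of a per-agent loop is exactly the kind of fix a profiler-driven pass turns up.

    import math, time

    AGENTS = [{"x": i % 500, "y": i // 500} for i in range(200_000)]

    def slow_frame(sun_angle):
        # Anti-pattern: per-agent work that doesn't depend on the agent at all.
        total = 0.0
        for a in AGENTS:
            shadow = (math.cos(sun_angle), math.sin(sun_angle))  # recomputed 200k times
            total += a["x"] * shadow[0] + a["y"] * shadow[1]
        return total

    def fast_frame(sun_angle):
        # Hoist the invariant: compute it once per frame, reuse it for every agent.
        cx, cy = math.cos(sun_angle), math.sin(sun_angle)
        return sum(a["x"] * cx + a["y"] * cy for a in AGENTS)

    for fn in (slow_frame, fast_frame):
        t0 = time.perf_counter()
        fn(1.0)
        print(fn.__name__, f"{(time.perf_counter() - t0) * 1000:.1f} ms")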
I fault the people complaining about performance of a city builder running at 1440p more than the developer. You're looking at concrete and asphalt and brick textures and shit like that. You don't need anything higher than 720p.
Dude, we're talking about an RTX 4090. A GPU that can run current-gen AAA games on max settings at 4k60fps should not slow to a crawl when rendering "concrete and asphalt and brick textures and shit", to use your words.
So if I have a budget PC, I should be content with running this game at Nintendo 64 resolution? Or I guess it's on me to fork over $1,000 for a high-end GPU if I want to run it at 2/3 the resolution of a midrange mobile phone, as you suggest.
OpenTTD graphics are just not even remotely comparable. It looked so ugly when I tried it that I couldn't bother to continue for more than 2 minutes. If graphics don't matter, then maybe even drop OpenTTD and play a text-based game. It will run smoothly even on a Raspberry Pi.
The game is originally from 1994, what do you expect? We didn't even have 3dfx 3d accelerator cards back then. OpenTTD will run smooth on a Pi. It ran smooth on my 486 when it first came out.
I'm not a huge gamer, but I did spend 14 hours a day for a week playing this game once. Never been sucked in to a game like this. And even better if you can find the original music and graphics. The music is excellent, and will play in your dreams.
First time I played Transport Tycoon was on my 486, I got a demo of the original game on a floppy disk out of some PC game magazine. Looking back, it's amazing how much game play fit onto a 1.44MB floppy disk. I ended up purchasing the Deluxe version on a CDROM. I ripped the CD and have kept an image of it ever since, which came in handy when I found OpenTTD and wanted to use the original graphics and content. Chris Sawyer and the OpenTTD devs are legendary in my mind.
Not quite a city building but I think possibly comparable - factorio you can at least get to endgame on pretty low end hardware with decent fps. It even runs on the switch.
I say comparable because of the number of things moving around on belts, machines, etc. I think would be a similar workload. Although it is 2D...
I run Stellaris on an aging i5-3750K and a somewhat newer GeForce GTX 1070 Ti at 1080p and also think that the complaints about performance are overblown. I even leave the planet generation settings turned up fairly high (I know a lot of people online reduce them to something like 0.25 for performance reasons). That's not to say I never have slowdowns; enormous doomstack-on-doomstack fleet battles in the endgame hurt the FPS, but they are fairly uncommon. Fighting the 4x-strength crisis fleets is the one other time the game tends to chug, but those tend to be fairly late as well.
That said, I don't have a ton of expansions installed and no mods. I have a strong suspicion that some mods are harder on performance than others, especially the over-the-top ones like Gigastructural Engineering.
I used to play a game called Mobility back in ye day, early 00s, it was city building with a scientific background - IIRC it came out of a funding grant of the German government.
Gotta look if I have an ISO flying around somewhere.
Assuming you can generate 24fps of "movie quality perfect images", but you can't.
Let's take motion blur as an example: it's very expensive to do as a real-time VFX, so we cheat and approximate it, and that doesn't look good most of the time, because we have to cut too many corners and end up with a poor approximation.
You're generating, very slowly, images that a computer believes are a facsimile of reality but that, unfortunately, don't represent motion blur very well. So they might be "24fps", but they look wrong compared to reality, and you probably can't consciously tell what is wrong, just that something seems slightly off.
So it's easier to attack the problem from the other side instead: generate (or frame-generate, that's a thing now) and show 480fps of crisp images, and let human persistence of vision do the actual motion blur for you.
Even when viewing a movie instead of playing a game there is a huge difference between the blurry mess that fills the screen every time there is motion at 24 FPS vs. the sharper details of higher framerates.
The crazy thing is people are actually trained to hate it. When you see smooth motion your brain goes "oh no, this is some cheaply shot camcorder shit, not a quality movie" even though what is really happening is the picture looks better than you expect.
Even ignoring all of the scientific evidence to the contrary, there is not a chance you're parroting this nonsense without having noticed a rather substantial difference when viewing something (say a movie) at 30fps versus 60+fps. It's literally night and day. A few coworkers and I noticed the difference between 60 and 120 when upgrading phones recently, and that difference is much more nuanced than the difference between 30 and 60.