
I checked the IGN review for comparison. The IGN reviewer also has an RTX 4090 but plays at a lower resolution, 1440p:

> I have a 13900k, 64GB of RAM, and a RTX 4090, playing on a 1440p ultrawide monitor. I got 35fps at the main menu and in game on a brand new map w/o building a single thing. Turning off motion blur and depth field increased this from 35 to 50fps. Not a single other graphics setting changed the performance at all. I turned off every single setting I could or set it to the lowest possible, and still only got 50fps.

How can a game even get to this point? Everyone in the development process must have noticed that it was running poorly. Even if all the developers had the most expensive consumer GPU available, they still would have seen poor performance.



> Turning off motion blur and depth field increased this from 35 to 50fps

What's more, such simple postprocessing filters should not tank FPS so dramatically, even with a severe CPU bottleneck.

That alone is not just unoptimized; it's a severe bug.


And even if such filters are somehow so intensive, they should be turned off by default. I am playing Cities Skylines 2 and there is so much unnecessary eye candy turned on by default, even with a mid-range graphics card. The game runs fine for me at 1440p when I turn off the really intensive post-processing.


> even if such filters are somehow so intensive

They should never be that intensive.

Some shaders unintentionally scale nonlinearly with resolution, and I'm thinking that may be the case here.
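As a toy illustration, here is a hypothetical blur pass whose kernel radius is scaled up with the output resolution (none of this is CS2's actual code); the cost grows much faster than the pixel count alone:

  # Toy cost model: a full-screen blur touches roughly (pixels * kernel taps) texels.
  def blur_cost(width, height, radius):
      taps = (2 * radius + 1) ** 2
      return width * height * taps

  cost_1080p = blur_cost(1920, 1080, radius=4)   # assumed radius at 1080p
  cost_4k = blur_cost(3840, 2160, radius=8)      # radius doubled along with resolution
  print(round(cost_4k / cost_1080p, 1))          # ~14.3x, not the ~4x from pixel count alone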


They screwed up the defaults. When I first loaded it, it was set to a resolution with a 24Hz refresh rate. Also the game looks bad with the default SMAA, it looks great if you change the advanced graphics settings to use TAA. I'm on a 3080 at 4K and after fiddling with settings it looks wonderful and is very playable. Unbelievable that they hosed the first impressions so much.


Being set to a default of 24hz sounds like the idea of someone forced to ship the game, hoping users wouldn't notice.


It's probably just detecting a 24p TV resolution in fullscreen, if I were to guess.


But no TV maxes out at 24Hz; that is a lower bound meant to support things like Blu-ray players for judder-free movie playback.

60Hz has always been the bare minimum any display supports, so if a game is picking something below that, then something has gone horribly wrong. But really, games should be using the current desktop refresh rate as the default, because you know it's supported, and it makes things like alt-tabbing considerably faster even in exclusive fullscreen.


Agreed. The first time I launched it on my lowly 6600XT, it defaulted to high settings at 4K and was completely unplayable (around 3fps).

After reading Reddit and going down to 1080p at medium-ish settings with depth of field turned off, it's fine: 30fps. It's a very fun game!

I have absolutely no idea why they didn't turn the defaults down. Surely it would have been a five-minute config change, and the impressions of the game would have been vastly better.

It does look terrible for me though, despite being fun. The shadows are terrible and there are loads of rendering glitches.

I'm also a bit concerned, now that my city is getting bigger, at how poor the traffic wayfinding is. It seems to be as dumb as CS1 (with no mods), which is really bad, with the added nuisance that the traffic cycle seems to be basically zero cars until rush hour, then a massive flood of traffic.

It's actually quite ironic that all I really wanted from CS was faster performance in bigger cities (huge fail from CS2 here), better traffic simulation (the jury is out, but it isn't looking great) and better road tools (this is what CS2 is great at).


> Also the game looks bad with the default SMAA, it looks great if you change the advanced graphics settings to use TAA

Well that's very subjective, as I usually can't stand TAA and like SMAA.


More importantly, why the heck does a city-builder game need motion blur?

It's usually a pretty obnoxious effect even in action games, the first thing to turn off if the option exists...


I do not understand motion blur. I am paying top dollar for these pixels; let me see them all. That it also has a negative performance impact makes my decision all the easier.


Apparently it makes the game feel "faster" in racing games because it adds to the movement illusion. I don't know why you would use it in other games, though


It was used to try and make 30fps console racing games look less juddery, but didn't really work. Racing games are a genre that really does need 60fps+ to feel good.

Even in the 90s, the developers of 3D arcade racing games were well aware of this, ensuring that the original arcade versions of games like Ridge Racer, Daytona, and Sega Rally ran at 60fps, rather than sacrificing that smoothness to add more detail. And those games looked spectacularly good for their time.


I don't really get it. Games without motion blur (especially first-person) look so choppy and nasty to me. Even at 4K, 120fps/120Hz.

Motion blur looks like a movie. Without it, it's just a slideshow no matter how fast.


Real life doesn't look like a movie.

Fast action doesn't turn into a shaky smeary mess, and there's no '24fps camera panning' judder as you look around


In real life your eyes do the blurring on fast-moving objects. Real life is nothing like a game without motion blur. Swipe your hand quickly in front of your face. Do you really see it sharply all the way through the movement? Or just at the beginning and the end?

At 24fps, motion blur in games or 3D movies is bad. But at 120fps or more, motion will never look right without it.


Depth of field is often more nuanced than simple postprocessing filtering; it generally also swaps textures and models based on distance from camera.


>it generally also swaps textures and models based on distance from camera.

Isn't that just LOD, and is done regardless of whether dof is enabled or not?


Yeah, normally this is LOD and is a separate subsystem


I didn't know this, interesting.

I don't really like DoF anyway :P


Because it’s factually false.


>”it generally also swaps textures and models based on distance from camera.”

What?!? No it doesn’t. It uses the depth buffer to progressively box blur or gaussian blur the frame post-render.

https://lettier.github.io/3d-game-shaders-for-beginners/dept...

Nowhere are they replacing models, replacing textures, or anything of the sort. I suspect the issue here is that a full-screen blur is a superlinear performance hit at 4K vs 1080p: there are roughly 4x as many pixels, and typically a wider kernel to match, multiplying the sampling lookups.
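For reference, a minimal CPU-side sketch of that depth-buffer-driven blur (numpy, hypothetical buffer sizes and focus parameters; a real game does this in a fragment shader, as in the linked tutorial):

  import numpy as np

  def depth_of_field(color, depth, focal_depth=0.5, focal_range=0.2, radius=2):
      # color: (H, W, 3) frame, depth: (H, W) depth buffer in [0, 1].
      # Box blur: average a (2*radius+1)^2 neighborhood around each pixel
      # (np.roll wraps at the edges, which is fine for a sketch).
      blurred = np.zeros_like(color)
      for dy in range(-radius, radius + 1):
          for dx in range(-radius, radius + 1):
              blurred += np.roll(np.roll(color, dy, axis=0), dx, axis=1)
      blurred /= (2 * radius + 1) ** 2
      # Circle-of-confusion weight: 0 at the focal plane, 1 far away from it.
      coc = np.clip(np.abs(depth - focal_depth) / focal_range, 0.0, 1.0)[..., None]
      return (1 - coc) * color + coc * blurred

  # Hypothetical 1080p frame; the same pass at 4K touches ~4x as many pixels.
  frame = depth_of_field(np.random.rand(1080, 1920, 3), np.random.rand(1080, 1920))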


A severe CPU bottleneck normally implies that your CPU may need to be upgraded, though? If you are running a 13900K, there isn't much room for that.


It really just means that that's the component that's limiting performance, not that there's necessarily any way to fix the bottleneck by upgrading.


I can understand bottlenecking the CPU mid-game, but doing so on the main menu is a sign that something is fundamentally wrong here.


My cynical take is that everybody noticed, but did not fix. They shipped ASAP. New updates, where they address more tech debt, may fix some of this, claim major performance improvements, and are thus guaranteed to generate positive press / posts / tweets.


> My cynical take is that everybody noticed, but did not fix.

Before it was released, they told everyone that the game was not optimized graphically, and the minimum specs were already high. It was quite clear that there was still more work to be done to get it in a better state. So I'm not sure why you have this "cynical" take. This isn't something they are trying to deceive people with. I'm not even in the market to buy the game, and I know this.


The cynical part here is that they expected the experience to be so-so and the rendering to be clearly problematic, but they still sold the beta-quality product, certain that gamers would buy it anyway and wouldn't expect something well-done.


They published an announcement on Steam, right next to the buy button, that shares the current issues. People are free to make their own choices. This is made-up drama by people making ignorant knee-jerk reactions.


Unless this disclosure is listed at the top of the Steam product page, it feels pretty rotten for uninformed consumers who are excited to play a city builder.


When you posted this, it was the top announcement. It's still in the top announcements row 5 days later, easily seen. All it took was a Google search of "steam city skylines 2" to find it. This announcement is as close to the purchase button as you can get it right now on Steam.

This is made up drama.


Their publisher is Paradox, so probably. They don't have an "only ship when it's ready" culture.


I don’t think Paradox has a particularly urgent shipping culture either? It’s whatever their QA department does that sucks.


Is 50 FPS a huge issue?

I play a lot of city builders and so long as the UI is still smooth I really don't notice any gameplay difference between 30fps and 60fps.

I put 8 hours into Frostpunk before realizing I was limited to 30FPS.


My personal opinion: yes. As 160+Hz monitors are finally becoming commonplace, it's becoming less defensible for games to run anywhere below 60fps.

But it's the fact that these figures are being seen with the most powerful consumer graphics card (on a brand new unpopulated game map) that's the problem.


City Skylines 1 has been played and supported non-stop for over 8 years. Why should they limit the highest possible graphics settings only to existing technology?

Again, this is a simulation game. If I ask the game to spit out more detail and graphics than can be physically supported by today's technology, I guess I don't necessarily see this as a design fault.


> If I ask the game to spit out more detail and graphics than can be physically supported by today's technology, I guess I don't necessarily see this as a design fault.

To be clear: I had to turn the quality down to much worse than I recall Cities Skylines 1 looking in order to get worse performance than Cities Skylines 1.

The major complaint here is not "I cannot max the settings on my mediocre hardware", but rather "I cannot mediocre the settings on my max hardware"; the performance is bad at any level, but my hardware, while not top-of-the-line, should be able to run a game like this at 4K playable, or at 1080p while looking awesome and running great. Instead, I get 1080p with a mix of medium/low settings in order to get it looking pretty okay in most aspects.


> Why should they limit the highest possible graphics settings only to existing technology?

This assumes Cities Skylines 2 is using some next gen graphics technology when in practice it is a Unity game without any cutting edge graphics. Another comment mentioned they are doing 10k+ draw calls per frame - it's just poorly optimized.
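As a rough back-of-the-envelope (the per-call CPU overhead below is a hypothetical figure, not a measurement from CS2), that draw-call count alone can become a CPU-side ceiling:

  # Why 10k+ draw calls per frame can bottleneck the CPU regardless of GPU power.
  draw_calls_per_frame = 10_000
  cpu_us_per_call = 10            # assumed driver/engine submission overhead, microseconds
  frame_budget_ms = 1000 / 60     # ~16.7 ms for 60fps

  submission_ms = draw_calls_per_frame * cpu_us_per_call / 1000
  print(f"{submission_ms:.0f} ms of draw-call submission vs a {frame_budget_ms:.1f} ms frame budget")
  # 100 ms of submission work alone caps the frame rate near 10fps, which is
  # why engines batch and instance draws instead of issuing them one by one.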


Sticking with Unity probably killed them. They might as well have developed another engine.


How does developing another engine help at all? It is very likely its graphics are so under-optimized because they spent most of their time on the simulation aspect, which is more than challenging enough. Dyson Sphere Program is a game with lots of stuff going on at a time; the engine isn't the issue. Cities Skylines 2 is just a very complex game.

As a city builder it has not just dynamic geometry (which makes graphics optimization more challenging) and lots going on on screen like DSP, but also a complex agent-based simulation of a city, its transportation, its economy, and individual agent AIs. It is far more complex than most games despite not being graphically intensive. And do note that, as far as city builders go, it's also one of the most ambitious graphically. My personal opinion is they deserve some slack as long as the game continues to improve.


If it were doing a ton of compute-shader simulation on GPU to actually increase the fidelity of the simulation that would be one thing, but having a badly implemented graphics pipeline isn't that.


This is a trope that is spit out by the developers of every poorly optimized sim in recorded history.

Yes, scalability towards the future is good -- but not at the detriment of player performance now.

If the player base doesn't stick around during our current dark-times medieval technology stack, there won't be a player base when we have whatever future tech makes it playable -- see the problem?


Somewhat, yes. I haven't tried it yet, but a few things don't add up then:

Why enable those heavy, performance crippling post processing filters by default?

From what I can find in this comment section, and I guess this has to be taken with a grain of salt, performance doesn't scale. It seems we max out at 50fps on a beefy 13th-gen Intel with the fastest GPU available, but then there are a couple of comments with mid-range hardware where performance is nearly identical.


To shoot from the hip on this one: it sounds like it could be limited by a single thread. Clock rates between mid- and high-end CPUs are not significantly different, but core count is. Thus if a single thread is holding up the works, that would explain the stagnant performance profile despite the difference in potential overall performance.
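A rough Amdahl's-law sketch of that idea (the serial/parallel split is hypothetical, just to show the shape of the problem):

  # If most of a frame is one serial thread, extra cores barely move the frame time.
  serial_ms = 15.0     # assumed single-threaded work per frame
  parallel_ms = 10.0   # assumed perfectly parallelizable work per frame

  for cores in (6, 16, 24):
      frame_ms = serial_ms + parallel_ms / cores
      print(f"{cores:2d} cores: {frame_ms:.1f} ms/frame (~{1000 / frame_ms:.0f} fps)")
  # 6 cores -> ~60 fps, 24 cores -> ~65 fps: a 13900K can't buy its way past the
  # serial 15 ms, which would explain near-identical numbers across CPU tiers.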


> As 160+Hz monitors are finally becoming commonplace

The Steam hardware survey suggests otherwise to my eyes; it's easy to forget, working in tech, that a lot of Steam customers have relatively shit hardware:

https://store.steampowered.com/hwsurvey/Steam-Hardware-Softw...

While they don't track monitors' peak supported refresh rate, just looking at the rest of the results suggests to me that high-refresh gaming (over 60FPS) is going to be relatively niche at best. Lots of PC gaming specs I take for granted on my own builds or consider "average" are actually not that common.

I absolutely agree 60fps should be considered the floor for most modern PC games running on "reasonable" hardware though, 30fps belongs to the past.


Most Steam customers play games like Counter Strike, Dota, Team Fortress, PUBG, GTA, Call of Duty which have modest hardware requirements.

I wouldn't read too much into these stats.


I don't think that's true. Most people who play PC games are Steam users, and they're playing a huge variety of games: pretty much any game that's not exclusive to some other store.


> I wouldn't read too much into these stats.

This is probably one of the best public sources of data we have on the topic of what hardware actual PC gamers are using.


Off topic but 42% of VR headsets are Quest 2

Woahhh


It’s the cheapest headset that also offers a reasonable experience. For around $250 you get a well designed device that includes controllers and does not require a separate computer to use. It’s a very good value, and if you’re unsure about VR probably the best place to dip your toe in the water.


>As 160+Hz monitors are finally becoming commonplace, it's becoming less defensible for games to run anywhere below 60fps.

You're forgetting the increasing popularity of the Steam Deck where even 60 FPS is a luxury so game devs should definitely have that as a target.


CS2 devs reportedly had 30 FPS as a target, which is actually fine for that kind of game, but it sounds like that is NOT for low-end hardware?


For CPU-heavy games it totally makes sense to target 30fps, but that had better be 30fps at 1080p with integrated graphics (or 30fps at 1440p with a 1060-class GPU). Add eye candy if you want, but it really should run well on any remotely modern system.


If they said "The game is locked to 30fps" I'd probably have been ok with that. It's not a fast paced game, and capping the framerate at a low value is probably a decent enough design choice for that particular game. As you say, it's enough that you won't really notice it in gameplay.

The problem is that we're talking about the absolute highest end consumer card sold by nVidia rendering a blank map, without such a framerate lock in place. If it's not locked, then that framerate is the product of a performance bottleneck somewhere. With a modern gaming PC, "performance bottleneck" also means you're sitting next to a moderately powerful space heater.


Locking a game to 30hz would not have been a "decent design choice" given that's half the refresh rate of nearly every computer monitor out there, especially for a game that involves a lot of scrolling.

Games get mocked for locking to 60Hz; 30Hz would get them laughed at outright.


The devs supposedly aimed for 30 FPS (but missed it, because that would be for low end hardware?)

In my experience 30 FPS is perfectly fine for a game like this; I'm not sure what you mean about "scrolling"?

It's the much more fast paced, mostly first person games that really need those 60+ FPS...


I notice when my mouse updates at 30fps. The movement feels noticeably choppier.


That is a fair call. The game's visuals updating slowly is not a huge issue, but if the input cursor is slow, it is very apparent.

Consider something like SimCity 2000/3000: the visual update is probably in single-digit FPS most of the time, but the input cursor is very smooth.


Yes, but this is an unrelated issue. The last game I have seen without a "hardware cursor" option/default was a badly made DirectX 8 one that has trouble running on Windows 7.


>In my experience 30 FPS is perfectly fine for a game like this; I'm not sure what you mean about "scrolling"?

A top-down city sim 'scrolls' up/down/left/right around a map, which is a movement strongly associated with screen tearing -- which I'm not really sure is a relevant thing to bring up, given the variety of v-sync options available.

Even a game locked at 30fps that 'scrolls' often shouldn't experience significant tearing with the options out there.


Even without tearing, a high framerate is very desirable for scrolling around a map or web page. For instance, iPhones are widely praised for excelling in this regard. It's just easier on the eyes if things on screen move smoothly when you scroll/pan around; it's less fatiguing.


50 FPS certainly doesn't make the game unplayable, it's true.

But if that's with an empty map, it's an ominous sign for people who plan to build a city in this city building game, as rendering usually slows as polygon count increases.

And if it's with an $1800 GPU, it's an ominous sign for the 99.3% of gamers who don't have an $1800 GPU.

And if that's at 1440p - well, I'd wager a lot of the folks with a $1800 GPU also have a 4k screen.


It's not the average that's the problem; it's that it has hard stutters when you do something like swing or rotate the camera around and force a lot of new things to load. Also, there's no FPS limiter, so I can't stop the game from attempting to match my monitor's 165Hz with an overheating laptop 3060, which makes the stuttering even more visible.


You can set a frame limit per-program in the NVIDIA control panel.

Undervolting also reportedly works really well in laptop nvidia chipsets, FYI.


It's a $3,000 computer playing at low resolution; of course it is.


Heh, in New Zealand, the card alone is $3500NZD


4090 is the fastest available graphics card. That’s the problem.


It's a huge issue, considering the game doesn't look like a revolution in graphics AND you're essentially playing it on a super-computer by 5-6 year old standards.

> I put 8 hours into Frostpunk before realizing I was limited to 30FPS.

Yes, and I have beaten games that rendered improperly at sub-30FPS as a kid, because the requirement was 16MB of VRAM, while my card only had 8. But that's just as irrelevant to the point being made.


It's like playing in black and white, kind of. But yes 50 FPS is completely fine if you have not gotten used to better. Also helps if the game doesn't rely on high levels of detail (which Skylines, unfortunately, does), so that moving the camera around doesn't just make everything blurry.


If moving the camera around makes things blurry, then they must be using some kind of weird post-processing on top; you can't blame that on the FPS alone!


It's unrelated to post-processing. Moving the camera around makes things blurry at low frame rate (and 50fps is low) because of sample-and-hold blur. This can be mitigated by strobing the screen at the same speed as the frame rate, but few monitors support strobing at 50Hz (it would look annoyingly flickery) and if the game ever dropped below 50fps it would break the effect.

Blur Busters explains:

https://blurbusters.com/faq/oled-motion-blur
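As a rough worked example (the pan speed is an assumed figure):

  # On a sample-and-hold display each frame is held for the whole frame time,
  # so your eye tracks across it and smears it by (pan speed / frame rate).
  pan_speed_px_per_s = 2000      # assumed camera pan speed across the screen
  for fps in (50, 120, 240):
      print(f"{fps:3d} fps -> ~{pan_speed_px_per_s / fps:.0f} px of perceived smear")
  # 50 fps -> ~40 px, 240 fps -> ~8 px: higher frame rates shrink the smear,
  # which is the Blur Busters point about frame rate and motion clarity.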


Try moving file explorer windows with your display set to different refresh rates. “Blur” might be an incorrect description, but it’s how I would describe it at least.


20 fps is playable for a city builder, but that's on a $2000 GPU

A bit unfair since most people aren't gaming at 4K (Steam hardware survey puts it at 3.29%), but still


Ok, so around 4 million users. I wonder what the overlap of 4K owners and people who play city builders is?


I feel like this is grasping at straws? Plenty of unorthodox games go mainstream. You can't expect consumer hardware to stay the same forever. This game is just poorly optimized.


I think I'm giving people the opposite impression that I intended.

Well-compensated people (like engineers) are more likely to own a 4K monitor and high-end GPU, so I meant to convey that there's probably a large overlap with the type of people who play city builders (like engineers), so they really should have accounted for that.

It was only after I bought my 4K 120Hz OLED that I found out how popular they are for sims, including pinball. It's been awesome.


Anecdotal evidence: I play at 4k if I can manage, and I sport a 2060S.


50fps is okay. 50fps on a $1600 GPU is definitely not okay.


It is just a matter of preference, I don’t see why people get bent out of shape about it.

30fps is fine for lots of games. Especially for stuff that doesn’t require super twitchy gameplay.

If somebody really cares about 60fps or 120fps, fine. But the developers have some effects they want to implement, and if some of them don’t fit into 8.3ms, let those of us who don’t mind 30fps enjoy them, right?


30FPS on the world's #1 fastest GPU is totally not OK. Heck, I'd be unhappy with 30FPS on my ancient Radeon Pro 580.

I remember easily getting frame rates far in excess of my monitor's top refresh rate back in 1999.


Why isn’t it ok?


Because this means the game runs even worse on pretty much every PC out there, since the majority of gamers don't use the most powerful GPU on the market. And it's not like this game has some outstanding visuals which would somehow justify such performance.


The problem is the UI won't be smooth on VRR displays unless they implemented some clever workarounds. Your mouse cursor will only update at the same framerate as the rest of the game. Even budget gaming monitors these days support G-Sync, FreeSync, or both, so it's no longer a problem that only affects a small % of users with deep pockets.

The only workaround I can think of that wouldn't also defeat the point of having VRR would be to add some in-engine low-framerate compensation to artificially boost the reported framerate to a multiple of the real framerate, but I have yet to see any games that actually do this.
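A minimal sketch of what that in-engine compensation could look like (the VRR range below is hypothetical; in practice the GPU driver or display usually does this frame doubling itself):

  # Present each rendered frame several times so the presented rate stays
  # inside the display's variable-refresh window.
  VRR_MIN_HZ, VRR_MAX_HZ = 48, 165   # assumed monitor VRR range

  def lfc_multiplier(render_fps):
      # Smallest whole-number multiplier that lifts the presented rate into range.
      m = 1
      while render_fps * m < VRR_MIN_HZ:
          m += 1
      return m

  for fps in (30, 45, 70):
      m = lfc_multiplier(fps)
      print(f"render {fps} fps -> present each frame {m}x ({fps * m} Hz)")
  # render 30 fps -> present each frame 2x (60 Hz), so VRR stays engaged
  # instead of the display falling back to fixed-refresh judder.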


Yes, yes it is. I have a 4k TV with 120hz. I want to be able to use it. If Anno 1800 can run at 100+ framerates, surely this game can too.


The issue is that if it runs at 50 FPS on an RTX 4090, it's going to run at 5 FPS on an average GPU.


The problem with these kinds of statements is that... well, they're not true. I have an RTX 2070 running at 3440x1440. I'm at 35fps on low settings with some special stuff turned off, e.g. GI and fog. That's in a small town with 4k inhabitants still, so things can get worse for me.

Regardless, the problem is blown out of proportion. I think we can give it a few weeks and the 4090 performance issues will be in the past.


> I have an RTX 2070 running at 3440x1440. I'm at 35fps on low settings

LOL. You just proved my point.


> 5 FPS on an average GPU.

> I'm at 35fps on low settings

You can admit that you exaggerated.


It's not just 50 fps. It's 50 fps with the best possible hardware and lowest possible settings.

But also, yes, even in a city builder low fps is very annoying when you're used to the smoothness of 100+.


Yes, particularly if you become accustomed to higher refresh rates. This is even worse if the game doesn't support VRR in any form natively.


As someone who can safely say that watching a video game in 30 FPS is equivalent to watching a slideshow, yes. It gives me intense motion sickness immediately. So I can't play something below 60 FPS


> How can a game even get to this point? Everyone in the development process must have noticed that it was running poorly

A lot of games run poorly during most of the development process; performance and bugs get fixed in the last stages. But when the publisher forces a premature release, this means the fixes have to be done in the months afterwards. See Cyberpunk 2077, Mass Effect Andromeda, Battlefield 2042 and countless others. And this will continue to happen as long as gamers keep preordering.


Display pixel counts for reference: a 1440p ultrawide (assuming 21:9 and not 32:9) is about 60% of the pixel count of 4K.

  4K    16:9   3840 x 2160 = 8.29 million pixels (1.00x)
  1440p 21:9   3440 x 1440 = 4.95 million pixels (0.60x)
  1440p 16:9   2560 x 1440 = 3.69 million pixels (0.44x)


also 5120x1440 (32:9) is 7.37 million pixels (.89x)


> How can a game even get to this point?

lots of ways. It is very easy to do things which affect performance negatively.

Once you're sure about what you're doing and the data that you're drawing to the screen, then you can start optimizing things so that performance goes up.

Colossal chose to release when they did because they promised to release on that day. The game isn't done, and they've said this.


> How can a game even get to this point?

Quill18 had a live stream the day of release (Oct 19) and it seemed fine (2h10m in, after some Galactic Civilizations 4: Supernova) from what I saw:

* https://www.twitch.tv/videos/1954873419

Various other live stream days, as well as a separate Youtube series:

* https://www.twitch.tv/quill18/schedule

* https://www.youtube.com/@quill18/videos


Twitch streamers stream in full HD and will usually run games at full HD.


...and they're not even remotely objective.

The big ones are paid cash, and before they stream they've already had a team working with them, covering what exactly they do or don't want shown and talked about - and if there are any technical problems, they've been worked out, which may even include special builds of the game.

The smaller ones are "partners" who get access to the game earlier than purchasers, zero-cost items for giveaways, and some access to game publisher staff - but only if they play by the (NDA'd) rules of the publisher, and those rules usually explicitly say "do not say anything negative about the game."


I don't think CS2 is exactly the kind of game that lacked an abundance of streamers waiting for the release, to be honest.


Not as powerful of a setup, but I have a 5800X, 32 GB of RAM, and an RX 6800 XT on a 4K monitor, and I got 5 FPS on the main menu (which was showing an empty grass field).

I had to set my resolution to 1080p in order to get anything remotely resembling performance (this change alone took main menu FPS from 5 to ~55), plus disabling motion blur, depth of field, and volumetrics, and turning model fidelity down. I've also heard that disabling vsync can make a huge difference, but it didn't affect things for me.


This brings to mind fond memories of simulation games from back in the 80s/early 90s sporting the most primitive of graphics, if they had graphics at all, but which allowed the player to fill in the blanks with their own imaginations. Updated games of this sort would be astonishing to play on modern hardware.


This isn't accurate. Review here: https://www.ign.com/articles/cities-skylines-2-review

"My Ryzen 7 3700X and RTX 3080 were able to handle it okay on just shy of max settings"


It is accurate. Check the Steam reviews.


I found this test of Cities Skylines 2 with the somewhat faster non-laptop version of my GPU. It's so bad. https://www.youtube.com/watch?v=eNHqs6HYI0I

It _seems_ okay around 1080, but then you realize there's almost nothing going on and it's only doing ~60FPS. It'll choke on anything remotely complex.


It's also a 3070 Ti playing at 1080p, which is sort of ridiculous.


Would me reading Steam reviews make the IGN review use a 4090?


Perhaps it was edited after the fact. The claim about performance in the OP has a link to a YouTube video recording the performance.


(insert my comment here, it's not important, I won't waste my time writing one. Let's see how many comments you can go before saying whoops. My bet is infinity / you never will)


Oh, they noticed. They complained. They said "we need to fix this".

The project manager said "Not high priority for launch", then closed the ticket.


Given how fast the PR and the guides for optimizing performance have come out, I'd say not only the developers but everyone short of the very top knew the game's pain points.


I mean, if you had time to make guides, you had time to implement some presets in the UI...


More like the C-suite; there's no chance PMs had any say in this, and they would have reported the issues upstairs only to be told "you can fix it after launch".


Is this satire? If this is real, I'd love to know more.


It's probably just a very good guess, since that's how it always goes.


When I was a game dev the argument went something like this: "Well, computers will be faster once we finish the game, so don't worry too much about the performance right now."

Two years later: "WE NEED TO RELEASE THIS TOMORROW. JUST MAKE IT BUILD AND SHIP IT!"


It's satire; people are reaching way too far in this thread (e.g. IGN's review used a 3080 / Ryzen 7 and they were happy with the performance).

The site none of us can read probably has more info about this being a RTX 4090 quirk. And it's not exactly surprising the dev team wasn't optimizing for RTX 4090, and the quirky reaction here would be justified if they were.


How would a 4090 perform worse than a 3090? It's not like Nvidia completely rebuilt the GPU between revisions


> It's not like Nvidia completely rebuilt the GPU between revisions

Nvidia absolutely changes the GPU's assembly language rather dramatically between architectures. That's why shader code ships as an intermediate representation (Nvidia's PTX for CUDA, or DXIL, the DirectX Intermediate Language compiled from HLSL) and is then recompiled to the actual GPU assembly when you install a game or run the code on a different system.


I have not seen an instance where a GPU series that has been out for over 12 months has regressed in performance relative to an older one.

Perhaps back in the very early accelerator days, when everyone was making GPU-specific hacks of Quake 1, but even that was smoothed over by the transition to DirectX/OpenGL/Glide at the time.


Perhaps it is limited by something other than the GPU after 50fps. It could be CPU bound at that point?


Same question then, if it is CPU bound on a 13900k


Performance problems like this are usually a pile of dumb things: calculating a thing a million times when it only needs to be done once, doing one thing at a time and context switching instead of batching, dumb locking that leaves threads spending most of their time waiting, creating and destroying a thing millions of times when it should be reused, and so on.

It will just be lots of these things that need to be found and corrected.
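A toy illustration of the first two of those (hypothetical simulation code, nothing to do with CS2's actual codebase):

  import functools, math

  # Dumb: recompute an expensive, unchanging value for every agent, every frame.
  def commute_cost_slow(origin, dest):
      return sum(math.sqrt(i) for i in range(200_000))   # stand-in for heavy work

  # Better: compute it once per (origin, dest) pair and reuse the result.
  @functools.lru_cache(maxsize=None)
  def commute_cost_cached(origin, dest):
      return sum(math.sqrt(i) for i in range(200_000))

  agents = [("suburb", "downtown")] * 10_000
  # Slow path: 10,000 identical heavy computations per frame.
  # Cached path: 1 heavy computation plus 9,999 cheap cache hits.
  costs = [commute_cost_cached(o, d) for o, d in agents]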


> 35 to 50fps. Not a single other graphics setting changed the performance at all.

Vsync locking frame rate to half of 100Hz?


Maybe they're going for the Crysis plan and designing for future tech?


Except Crysis looked incredible at the time. This is just Unity 3D graphics.


I fault the people complaining about performance of a city builder running at 1440p more than the developer. You're looking at concrete and asphalt and brick textures and shit like that. You don't need anything higher than 720p.


Dude, we're talking about an RTX 4090. A GPU that can run current-gen AAA games on max settings at 4k60fps should not slow to a crawl when rendering "concrete and asphalt and brick textures and shit", to use your words.

So if I have a budget PC, I should be content with running this game at Nintendo 64 resolution? Or I guess it's on me to fork over $1,000 for a high-end GPU if I want to run it at 2/3 the resolution of a midrange mobile phone, as you suggest.


>You don't need anything higher than 720p.

I don't want the UI to look horrible on my 27" 1440p monitor.


I do hope you're being sarcastic.



