
Uhhh, this doesn't even make sense. Simulation speed is usually controlled by a button.


It makes perfect sense. Simulations run in discrete time chunks; as the framerate drops and those chunks grow larger, there are two choices: let the simulation degrade, or make the world slow down.

If you don't adjust the simulation rate, you start seeing more objects phasing through each other, collisions that don't conserve energy, and pathfinding just fundamentally breaking.
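
To make the tradeoff concrete, here's a minimal sketch of the "make the world slow down" option (illustrative names throughout, not any particular engine's API): clamp the frame delta so one long frame never turns into one huge physics step.

    #include <algorithm>

    const double MAX_STEP = 1.0 / 20.0;  // largest step the physics tolerates

    void updateSimulation(double dt) { /* advance game state by dt seconds */ }
    void render() { /* draw the current state */ }

    // Called once per frame with the real elapsed time since the last frame.
    void frame(double realDt) {
        // Clamp: below ~20 FPS the world runs in slow motion instead of
        // letting objects tunnel through geometry in one giant step.
        double simDt = std::min(realDt, MAX_STEP);
        updateSimulation(simDt);
        render();
    }

    int main() { frame(0.1); }  // a 100 ms frame gets clamped to a 50 ms step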


Well, ideally sim "ticks" are completely separate from rendering, but that isn't the reality in many games, where they share the same thread, or where cross-communication (like UI work) blocks one component long enough to slow the other down.

Even in Minecraft, with its completely separate server/client, the rendering can bog down TPS due to JVM pauses and other reasons I don't even understand.
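
For reference, the usual way to get that separation on a single thread is a fixed-timestep loop with an accumulator (the "Fix Your Timestep" pattern). A minimal sketch, with all names made up:

    #include <chrono>

    const double TICK = 1.0 / 60.0;  // fixed simulation step, in seconds

    void simulate(double dt) { /* advance game state by exactly dt */ }
    void render(double alpha) { /* draw, blending last two states by alpha */ }
    bool running() { return false; }  // stub so the sketch compiles

    int main() {
        using clock = std::chrono::steady_clock;
        double accumulator = 0.0;
        auto prev = clock::now();

        while (running()) {
            auto now = clock::now();
            accumulator += std::chrono::duration<double>(now - prev).count();
            prev = now;

            // Run however many fixed ticks the elapsed real time demands.
            // The step size never changes, so slow rendering only means
            // more ticks per frame, not larger (less stable) ones.
            while (accumulator >= TICK) {
                simulate(TICK);
                accumulator -= TICK;
            }

            // Render once per loop, interpolating between the previous
            // and current sim states so motion stays smooth.
            render(accumulator / TICK);
        }
    }

In practice you also cap how many ticks can run per frame; otherwise a simulation that can't keep up with real time spirals, since each slow frame demands more ticks, which makes the next frame slower still. That feedback loop is roughly what JVM pauses trigger in the Minecraft case above.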


> Well, ideally sim "ticks" are completely separate from rendering

I have a counterexample: there was a game called Der Omnibussimulator in which the simulation and rendering were coupled 1:1.


But the simulation can run as fast as the CPU dictates. If the GPU has to drop frames to keep up, that doesn't impact the simulation.
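
A sketch of that arrangement (hypothetical names throughout): the sim runs on its own thread at whatever rate the CPU sustains and publishes snapshots; the renderer just reads the newest one, so GPU stalls cost frames, never ticks.

    #include <atomic>
    #include <chrono>
    #include <mutex>
    #include <thread>

    struct World { double t = 0.0; };     // stand-in for real game state

    std::mutex mtx;
    World snapshot;                        // newest completed sim state
    std::atomic<bool> quit{false};

    // Simulation thread: paced by the CPU, never waits on the GPU.
    void simLoop() {
        World world;
        while (!quit) {
            world.t += 1.0 / 60.0;         // stand-in for real sim work
            {
                std::lock_guard<std::mutex> lock(mtx);
                snapshot = world;          // publish for the renderer
            }
            std::this_thread::sleep_for(std::chrono::milliseconds(16));
        }
    }

    int main() {
        std::thread sim(simLoop);
        for (int frame = 0; frame < 30; ++frame) {  // pretend render loop
            World copy;
            {
                std::lock_guard<std::mutex> lock(mtx);
                copy = snapshot;           // read whatever is newest
            }
            // drawFrame(copy);  -- a slow GPU only stalls this thread;
            // simLoop above keeps ticking at full rate regardless.
            std::this_thread::sleep_for(std::chrono::milliseconds(33));
        }
        quit = true;
        sim.join();
    }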



