At what point does server memory start causing meaningful latency and physics problems? If it depends on some parameters, outline them.
I’m not sure what OS, if any, the servers run for memory management, and I don’t think we’ll be allowed to find out. Still, afaik the most common practice when a machine approaches its memory cap is to compress memory, which adds CPU overhead both when writing to and reading from it.
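To illustrate the CPU overhead I mean, here’s a rough sketch using Python’s `zlib` as a stand-in for an OS-level memory compressor (e.g. zram). This is just to show the cost model; how the actual servers behave is unknown:

```python
import time
import zlib

# Simulate a chunk of "memory" that gets compressed on write and
# decompressed on read, the way an OS-level compressor would.
data = bytes(range(256)) * 4096  # ~1 MiB of mildly repetitive data

start = time.perf_counter()
compressed = zlib.compress(data)          # CPU cost on the write path
restored = zlib.decompress(compressed)    # CPU cost on the read path
elapsed = time.perf_counter() - start

assert restored == data  # the round trip is lossless
print(f"original: {len(data)} B, compressed: {len(compressed)} B")
print(f"round trip took {elapsed * 1000:.2f} ms of pure CPU time")
```

The point is that every access to compressed memory pays that CPU tax, which is where extra latency could come from even before the machine runs out of memory outright.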
That’s speculation though. I have no evidence.
I think you’d see latency if it’s paging, but you’d probably need to fill the memory first. I don’t know how the server manages memory at all, so I’d guess that’s improbable.
It depends on what your game is trying to run, of course, and what system it’s running on. In most instances, keeping client memory under 450 MB in game is generally preferable.
Live servers crash after 4 GB of memory is in use. Edit: apparently it was 3.5 GB, and it’s now 8 GB on some servers.
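Since there’s no warning before the cutoff, one workaround is to track your own rough memory budget and refuse big allocations that would push past it. A minimal sketch, assuming the 4 GB cap reported above (the `MemoryBudget` class and the cap value are illustrative, not an actual server API):

```python
class MemoryBudget:
    """Track estimated allocations against a hard cap, since the
    server gives no warning before the hard cutoff."""

    def __init__(self, cap_bytes):
        self.cap = cap_bytes
        self.used = 0

    def try_alloc(self, nbytes):
        """Return True and record the allocation if it fits,
        otherwise refuse rather than risk the hard crash."""
        if self.used + nbytes > self.cap:
            return False
        self.used += nbytes
        return True

budget = MemoryBudget(4 * 1024**3)        # 4 GB cap from this thread
ok = budget.try_alloc(3 * 1024**3)        # 3 GB fits -> True
too_big = budget.try_alloc(2 * 1024**3)   # would exceed the cap -> False
print(ok, too_big)
```

Your own estimates will be approximate at best, but even a coarse budget lets you degrade gracefully (e.g. stop spawning content) instead of hitting the crash.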
I haven’t noticed latency/physics problems while approaching it. Just the hard cutoff.
Doing something specific, like instancing/cloning 10,000 parts at once or placing 200 small loose spheres next to each other, is what slows it down temporarily.
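A common mitigation for that kind of one-shot spike is to amortize the work over several frames instead of doing it all at once. A language-agnostic sketch in Python (the frame loop and the part-creation callback are hypothetical stand-ins for whatever your engine provides):

```python
def spread_over_frames(items, per_frame, process):
    """Process at most `per_frame` items, then yield control back
    to the frame loop, so no single frame does all the work."""
    for i in range(0, len(items), per_frame):
        for item in items[i:i + per_frame]:
            process(item)
        yield  # one simulated frame boundary

# Hypothetical example: "clone" 10,000 parts, 250 per frame.
created = []
frames = 0
for _ in spread_over_frames(list(range(10_000)), 250, created.append):
    frames += 1

print(frames, len(created))  # 40 frames, all 10,000 parts created
```

The total work is the same, but spreading it across 40 frames avoids the single long stall that cloning everything in one step causes.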