Roblox internal memory leak causes my game servers to crash undiagnosably every 3 hours

Some additional information I would like to provide:

I created a test place that acts as a memory stress test: a script creates 100k parts inside Lighting every ~25 seconds, then clears Lighting a few seconds later, all in a loop.
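Roughly, the loop looks like this (a minimal sketch based on the description above; the exact script is in the place linked at the end of this post, and the timings here are approximate):

local Lighting = game:GetService("Lighting")

while true do
	-- Create a large burst of parts under Lighting (not rendered there,
	-- but they still consume instance memory).
	for _ = 1, 100000 do
		local part = Instance.new("Part")
		part.Anchored = true
		part.Parent = Lighting
	end
	task.wait(5) -- let the parts sit for a few seconds
	Lighting:ClearAllChildren() -- then release them all at once
	task.wait(20) -- roughly one iteration every ~25 seconds
end

These were the results I got: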


Memory stats were sourced from a custom-made memory tracker that is also included in the place.

As seen in the graph, total memory almost never went down, while untracked memory accumulated on almost every loop.
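For context, a tracker along these lines can be built on the Stats service (a minimal sketch, not the exact tracker from the place; treating the sum of all developer memory tags as "tracked" is an approximation):

local Stats = game:GetService("Stats")

while true do
	local total = Stats:GetTotalMemoryUsageMb()
	-- Sum every developer memory tag; whatever the engine does not
	-- attribute to a tag shows up in the difference as "untracked".
	local tracked = 0
	for _, tag in ipairs(Enum.DeveloperMemoryTag:GetEnumItems()) do
		tracked += Stats:GetMemoryUsageMbForTag(tag)
	end
	print(string.format("total %.1f MB | tracked %.1f MB | untracked %.1f MB",
		total, tracked, total - tracked))
	task.wait(5) -- sampling interval is illustrative
end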

The place: BUG REPORTING - Roblox

3 Likes

Seems suspiciously similar to what’s been ruining one of my games?

Memory rises initially as things are loaded up on the server, then plateaus between sudden large jumps.
There's no correlation with player count or any in-game event that I could find, but I haven't gone as in-depth as you have. I just can't figure out how I could possibly be triggering sudden GB-scale spikes in server memory within the span of a minute.

4 Likes

Yep, this is pretty much the exact profile I get too. The closest I could tie it to is player respawns, but I haven't been able to narrow it down past that. Part of my problem was obscured by a now-fixed HumanoidDescription memory leak.

Have you tried plotting respawns against server memory?

5 Likes

It's funny you should say that, actually: I asked @Orlando777 to try to find any clues as to what was causing the memory leak, and he said it happens when people reset. The thing is, I couldn't replicate it, and there was very little code involved in character death and loading, all of which was easily reviewed and ruled out, so I wrote it off as coincidence.

I’ll put in some code to plot respawns and memory when I have some time and report back.
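Something along these lines should do it (a hedged sketch; the counter name and sampling interval are illustrative, not from an actual game):

local Players = game:GetService("Players")
local Stats = game:GetService("Stats")

local respawnCount = 0

-- Count every character spawn (initial spawns and respawns alike).
Players.PlayerAdded:Connect(function(player)
	player.CharacterAdded:Connect(function()
		respawnCount += 1
	end)
end)

-- Emit (time, respawns, memory) triples that can be graphed externally.
while true do
	print(string.format("t=%d respawns=%d totalMB=%.1f",
		os.time(), respawnCount, Stats:GetTotalMemoryUsageMb()))
	task.wait(10)
end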

5 Likes

I've done some long-term plotting of the various memory items against uptime for one of my places that has been most affected by this.

Max Players: 50

In all three instances, there are one or more sudden spikes in Core Memory that climb uncontrollably and for no explainable reason, increasing by several hundred MB before falling sharply back to roughly the previous level, while total and untracked memory retain some of that amount and never fall.

Having been fortunate enough to be in the first server when the memory spike began, I was able to determine that the increase is almost solely caused by network/replicator:

It is odd that it increases so rapidly while everything else non-core stays the same, instances especially.

5 Likes

I believe the same thing has been happening to my game constantly for the past few months. I posted about it before and thought I had figured it out, but entire servers continue to crash due to random spikes of untracked memory, just as you describe.

It's also incredibly hard for me to replicate.

2 Likes

Any update on this reported bug at this time?

Our games are still suffering from this bug, which we have so far been unable to diagnose or fix.

2 Likes

If you want a (temporary) fix: max out your game at 700 players and then limit the actual count in a custom server browser. That raises your server memory cap to 12 GB, which has mitigated the issue for us.
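The custom server browser is the involved part; as a minimal illustration of just the limiting half (an assumed approach, not necessarily the setup described above), the effective player count can be capped server-side while MaxPlayers stays at 700:

local Players = game:GetService("Players")
local SOFT_CAP = 50 -- the player count you actually want per server

-- MaxPlayers stays at 700 (for the 12 GB memory budget); anyone who
-- joins beyond the soft cap is turned away.
Players.PlayerAdded:Connect(function(player)
	if #Players:GetPlayers() > SOFT_CAP then
		player:Kick("This server is full, please try another one.")
	end
end)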

7 Likes

Are there any updates on this? I was about to start a new thread but figured I'd add here, as our symptoms sound very similar.
We went live with our game a month ago, and we've been trying to track down the server crashes with no luck. Same thing: nothing else seems to be leaking, but Internal/Total Memory increases and has huge, unexplainable spikes. When Total Memory reaches 6 GB, the server eventually crashes.

Some servers have uptime as low as an hour, others may stay up for a whole day.

Our game is here: Wings-of-Fire-Seven-Thrones

We have custom character models (skinned-mesh dragons) and a large world, but it is not a building game, so there's no accumulation of parts over time.
However, there is a constant flux of parts (fire breathing, items, etc.) that get spawned and destroyed (usually via the Debris service), and it is a survival/PvP game, so characters are regularly killed and respawned.
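For reference, the spawn/destroy pattern is the usual Debris one (an illustrative sketch, not our actual code; the function name and lifetime are made up):

local Debris = game:GetService("Debris")

-- Short-lived effect parts are scheduled for automatic cleanup at spawn
-- time, so in principle nothing should accumulate.
local function spawnFireball(position)
	local fireball = Instance.new("Part")
	fireball.Shape = Enum.PartType.Ball
	fireball.Position = position
	fireball.Parent = workspace
	Debris:AddItem(fireball, 3) -- destroyed automatically after 3 seconds
end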

@unix_system - could you explain your workaround for getting the 12 GB servers? Is that only for private-server games? Ours is free-to-play and public.

If anyone, staff included, can give some guidance on what the internal memory leak/spikes may be, it would help us understand what is going on. If this is in fact a confirmed Roblox leak, any information on its status or on how to work around it would be helpful too.
Thanks.


Figure 1 - Each line represents a unique JobId and is tracking the Internal Memory value as obtained with:
game:GetService("Stats"):GetMemoryUsageMbForTag(Enum.DeveloperMemoryTag.Internal)


Figure 2 - Server Total Memory. Same timeframe; server Total Memory from Stats:GetTotalMemoryUsageMb().
(Lines that reach the right edge are servers that are still up.)


Figure 3 - Players per server over the same timeframe (the x-axis is just UTC on this one)

3 Likes

This is exactly the same profile I am experiencing. We have not had any resolution on it, but getting our server memory capped at 12 GB has at least delayed the issue, so we can get a good 6-12 hours of server uptime.

Hi gigagiele. Could we please get a quick status update on this ticket? Even just knowing whether or not it is being looked at would be helpful! The crashes have been very frustrating for both the team and the user base. Any technical information you could provide to help us understand the nature of the leak would be greatly appreciated. I don't mind adjusting the game to work around it, but I feel like I'm taking shots in the dark at the moment.
Thanks

7 Likes

Adding my voice to this thread since I’m experiencing similar symptoms.
My game Dragon Blade also runs for a few hours and then crashes. I am as yet unable to pinpoint the cause, but I also see over 1 GB used by "megaReplicationData" in particular, and 2 GB+ of CoreMemory.
My game also uses custom skinned-mesh avatars and features a very large smooth terrain (8k by 8k studs).
Any update on this would be very helpful. At the moment a server runs for 3-5 hours before crashing.

5 Likes

Did you guys ever figure out this problem? I've been reading through the forums trying to get an answer to this long-standing issue with my game and saw your posts in this thread.

4 Likes

Still waiting on an update to this myself.

3 Likes

Can confirm this is also happening to me.

4 Likes

This is happening to me as well. Untracked memory reaches 3GB+ and touchReplication is at 1GB+ for some reason.

5 Likes

I read several other threads that described a similar problem. The memory leak seems to have something to do with character models being cached without the cache ever being purged. Someone suggested adding this code to the server to help fend off the leak:

local playerService = game:GetService("Players")

-- When a player leaves, destroy their character model explicitly so the
-- engine releases it (and, reportedly, the cached character appearance)
-- rather than keeping it around.
playerService.PlayerRemoving:Connect(function(player)
	if player.Character ~= nil then
		player.Character:Destroy()
	end
end)

That supposedly forces the engine to destroy the character and purge the character appearance from the cache. Several individuals have said that it helps.

5 Likes

I’ve tried this to no avail, unfortunately.


This bug report has been open for nearly two years. At this point I've just accepted that my game can't be fixed, since the problem is totally undiagnosable, and it is absolutely destroying my user base and retention. I'm not sure what I can do about it anymore.

4 Likes

I reported this problem again, this time for a different area of the leak, one that also has a very large impact on performance.

This issue is still open; I will keep an eye on it and share everything I have.

5 Likes

I'm pretty sure the Touched event broke on the client around May; that could somehow be related, though it seems unlikely.

1 Like