100+MB client-side memory leak on empty baseplate

0 LocalScripts/Scripts running

1 ReplicatedFirst LocalScript running (ChatService SetCallback())


Description:

After idling in any game for anywhere between half a minute and two minutes (the longest time observed), client-side untracked memory spikes to 100+ MB and then keeps growing at a slower but still significant rate.

I’m not sure exactly when this started happening; I only noticed it when my memory usage skyrocketed from the mid-350s to the low-500s (MB) without any significant changes to my game. I assumed it was a mistake on my end until I noticed it also occurring in an empty baseplate with no scripts running.

Issue began:

08 August 2021 ~09:00 CST

Replication:

  1. Play any game in the Roblox client or Studio
  2. Wait 1–2 minutes, monitoring client-side untracked memory in the Developer Console (see the monitoring sketch below)
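
For anyone watching the numbers, here is a minimal LocalScript sketch (my own rough approximation, not an official tool; it assumes it runs somewhere like StarterPlayerScripts) that estimates the Developer Console’s “untracked” figure as the engine’s total memory minus the sum of all tagged categories:

```lua
-- Rough monitoring sketch: approximate "untracked" memory as the engine's
-- total memory minus the sum of every tagged (tracked) category.
local Stats = game:GetService("Stats")

local function untrackedEstimateMb()
	local tracked = 0
	for _, tag in ipairs(Enum.DeveloperMemoryTag:GetEnumItems()) do
		tracked += Stats:GetMemoryUsageMbForTag(tag)
	end
	return Stats:GetTotalMemoryUsageMb() - tracked
end

-- Print an estimate every 5 seconds while idling in the baseplate.
while true do
	print(("untracked (approx.): %.1f MB"):format(untrackedEstimateMb()))
	task.wait(5)
end
```

Note that this adds one LocalScript of its own, so it is only for watching the numbers alongside the Developer Console, not for a strictly zero-script repro.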

Hello! I’ve tried to reproduce the issue a few times, but have never been able to. Is this an issue you still see consistently? Thanks!


Hello! Thank you for getting back to me. This has been consistently happening in both of these games (DBS Test Servers and warehouse).

warehouse has zero running scripts and I’m getting the memory leak.

https://streamable.com/1inguf

Here is a video of it happening. The memory leak occurs at around 00:55.

The strangest thing is that I don’t get it in my other games, which would seem to rule out a faulty CoreScript/DefaultScript. In DBS Test Servers (DBS Test Servers - Roblox), where I first noticed the issue, I played with ALL LocalScripts disabled and still got the memory leak.

https://www.roblox.com/games/7287355463/DBS-Test-II

Here is a duplicate of the game with ALL LocalScripts disabled. When I join (Players.CharacterAutoLoads is disabled in this place by default), I get the same memory leak issue.
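
In case anyone wants to replicate that setup, a rough Studio command-bar sketch for bulk-disabling LocalScripts looks something like this (only an illustration of what “ALL LocalScripts disabled” means here, not necessarily how I did it; the container list is an assumption about where client scripts usually live):

```lua
-- Command-bar sketch (run in Studio edit mode): disable every LocalScript
-- in the containers where client-side scripts normally run.
local containers = {
	game:GetService("StarterPlayer"),
	game:GetService("StarterGui"),
	game:GetService("StarterPack"),
	game:GetService("ReplicatedFirst"),
	workspace,
}

for _, container in ipairs(containers) do
	for _, descendant in ipairs(container:GetDescendants()) do
		if descendant:IsA("LocalScript") then
			descendant.Disabled = true
		end
	end
end
```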

https://www.roblox.com/games/7287454336/Untitled-Game

This is an empty baseplate where I got a 130MB memory leak.

Removing all lighting modifiers and switching between all of the lighting technologies still yields the memory leak for me.
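
To be concrete about the modifiers, a command-bar sketch along these lines will clear them (an illustration only, not necessarily how I removed them; as far as I know the Technology setting itself can only be switched in the Properties panel, since it isn’t scriptable):

```lua
-- Command-bar sketch: remove post-processing "lighting modifiers"
-- (Bloom, Blur, ColorCorrection, DepthOfField, SunRays) and any Atmosphere.
local Lighting = game:GetService("Lighting")

for _, child in ipairs(Lighting:GetChildren()) do
	if child:IsA("PostEffect") or child:IsA("Atmosphere") then
		child:Destroy()
	end
end
```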

I don’t get it on my baseplates either, though I’m using the newer baseplate.
Are you sure this couldn’t be a plugin you have installed, maybe?

After testing on the Classic Baseplate game itself, I can confirm this behaviour.

I can also confirm this behaviour on the new baseplate.

As these experiences are private, can you please send the .rbxl files of these experiences to Logs / crash dumps / other bug files - DevForum | Roblox? Thanks!


This happens on normal baseplates too; these are both public template places, so they’re open to download or play.

https://www.roblox.com/games/95206881/Baseplate
https://www.roblox.com/games/6560363541/Classic-Baseplate

Also, something I checked: none of the behaviour settings under Workspace change anything; the issue is the same whether they’re on their defaults or not. It’s still possible one of them is a factor, but I would expect something to change when I switch them all to their newer counterparts, and nothing does.


When you join, it is already using a bit of memory (5–7 MB), but it goes up pretty quickly. My other baseplates from before don’t have this issue; they always stay at 0.


Is the repro literally just waiting for a few minutes until, at some point, the memory consumption jumps instantaneously?


Ohh, can someone please confirm that this only happens on low quality levels? I see something odd where untracked memory jumps if you’re on low quality in the baseplate, but goes back down if you dial the quality all the way to 10.


Yep, can confirm this.


However, it does seem like when there is one of those instantaneous spikes in memory usage, a bit of “leaked” memory is still left over afterwards; even then, yes, a higher quality setting does seem to decrease that number by a large amount.

Changing back to lower graphics settings seems to make it jump back up to a higher number instantly.
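
If it helps line the two up, here is a small LocalScript sketch (assuming SavedQualityLevel is readable from a LocalScript, which I believe it is) that prints the saved graphics quality whenever it changes, so it can be compared against the untracked graph in the Developer Console:

```lua
-- Sketch: log the saved graphics quality level whenever the user changes it,
-- to correlate quality changes with jumps in "untracked" memory.
local UserGameSettings = UserSettings():GetService("UserGameSettings")

local function report()
	print("SavedQualityLevel is now:", UserGameSettings.SavedQualityLevel.Name)
end

UserGameSettings:GetPropertyChangedSignal("SavedQualityLevel"):Connect(report)
report()
```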

I play Roblox at max quality level.

Yes, this is exactly what happens, though in some very rare outliers I will join and get an instant 30–40 MB leak, and then it will jump again to the usual 100–200 MB.

Here is a recording that I sent to @OuterspaceNemo a few hours ago; this place has zero scripts running on either the client or the server.

https://streamable.com/1inguf

I’m not sure if Discourse gives me permission to add you to a private thread, but I’m sure something can be arranged if you need more samples of this issue, or I can post them in this thread. (Never mind, I thought this was a separate public thread.)

Also my friend @PoppaPengo is getting the same issue in some of his games.

Upon debugging this further I believe this problem, or at least my reproduction of this problem, doesn’t represent a memory leak – it’s simply the case that “untracked” memory is computed completely incorrectly.

It would help to get two microprofile dumps to confirm - one before the problem happens (when untracked memory is low) and one after (when it’s high). But our computation of untracked memory, I believe, is simply broken.

The good news is that we plan to rework the dev. console memory reporting tool next year!


I will provide as many samples as possible; the first one will be from the same place the recording was taken in, then as many more as you need.

Thank you so so much for looking into this issue.

Pre-spike / when I first saw memory jump (from my observations, the Memory usage performance stat and the untracked figure seem to update at different rates, so this may be inaccurate):

log_B2274_microprofile_20210820-182539.html (1.1 MB)
log_B2274_microprofile_20210820-182526.html (1.1 MB)

When I first saw the spike:

log_B2274_microprofile_20210820-182635.html (1.1 MB)

2 logs post-spike:

log_B2274_microprofile_20210820-182655.html (1.1 MB)
log_B2274_microprofile_20210820-182649.html (1.1 MB)

These dumps sort of confirm my suspicion, although they are a bit odd in a different way :sweat_smile: I’d also recommend checking whether the memory consumption reported by Task Manager changes at all. If it doesn’t, then it’s definitely not a leak, just a bug in the “untracked memory” calculation.
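
As a rough in-engine cross-check (a sketch, not an official tool, and it does add one LocalScript to the place), you could also log the engine’s own total memory figure and compare its trend with the Task Manager number over the same window:

```lua
-- Sketch: periodically log the engine's total memory so its trend can be
-- compared against Task Manager while the "untracked" stat spikes.
local Stats = game:GetService("Stats")

while true do
	print(("total engine memory: %.0f MB"):format(Stats:GetTotalMemoryUsageMb()))
	task.wait(30)
end
```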


I hope this isn’t a bad thing?

On the dot: memory stayed in the mid-230s (MB) both before and after the ‘leak’ started.

Thank you for putting my anxiety to rest X)


Hey, I’d just like to add that this issue still persists on the Roblox Windows Store version of the client.

Here is a memory usage comparison of an empty baseplate:

Roblox client: (screenshot not included)

Roblox Windows Store client: (screenshot not included)


This topic was automatically closed after 5 days. New replies are no longer allowed.