Why is this causing a memory leak?

Hey, so recently I made this module that I called Cluster. It's a global table-like structure that allows me to create segments, a pretty simple idea, but there is a problem.

local Cluster = {}
local Segments = {}

--/ Private functions
local function doSegmentExist(SegmentName: string): {}?
	return Segments[SegmentName]
end

--/ Segments management

function Cluster:createSegment(Name: string)
	if doSegmentExist(Name) then return end
	
	Segments[Name] = {}
end

function Cluster:deleteSegment(Name: string)
	local Segment = doSegmentExist(Name)
	if not Segment then return end
	
	table.clear(Segment)
	Segments[Name] = nil
end

--/ Segments data management

function Cluster:addToSegment(SegmentName: string, Index: any, Value: any)
	local Segment = doSegmentExist(SegmentName)
	if not Segment then return end
	
	Segment[Index] = Value
end

function Cluster:removeFromSegment(SegmentName: string, Index: any)
	local Segment = doSegmentExist(SegmentName)
	if not Segment then return end
	
	Segment[Index] = nil
end

function Cluster:insertToSegment(SegmentName: string, Value: any)
	local Segment = doSegmentExist(SegmentName)
	if not Segment then return end
	
	table.insert(Segment, Value)
end

function Cluster:retrieve(SegmentName: string, Index: any): any?
	local Segment = doSegmentExist(SegmentName)
	if not Segment then return end
	
	return Segment[Index]
end

return Cluster

Module’s code ^

Let’s say we have code below

local mod = require(script.Parent)


mod:createSegment("A")

print("a")

task.wait(8)

print("b")

for i = 1, 100 do
	mod:addToSegment("A", i, i * 2)
end

task.wait(1)
for i = 1, 100 do
	mod:removeFromSegment("A", i)
end
print("c")

Now, after the deletion, Luau Heap shows that Cluster's A upvalue (which, according to Roblox, is a reference to something outside the script) grew, which means it wasn't removed. The strange thing is that when I remove the entire segment it magically goes back to normal. I tried everything, but nothing has worked to fix this, any ideas?

Note: task.wait(8) is there so I can capture Luau Heap metrics
Note: I tested it with more than 100 iterations and the number grew; for 1000 it was 16k bytes
Note: Lastly, I've used task.wait(0) to give the GC some time, and that didn't fix the problem either

EDIT: I also forgot to mention that it only happens when there are two loops; if I remove something from the segment in the same loop that added it, that also solves the problem, which seems strange to me.
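For example, a rough sketch of the same-loop variant that doesn't show the growth (same Cluster module assumed):

local mod = require(script.Parent)

mod:createSegment("A")

task.wait(8) -- settle time before taking Luau Heap snapshots

-- adding and removing in the same loop: no leftover growth shows up for me
for i = 1, 100 do
	mod:addToSegment("A", i, i * 2)
	mod:removeFromSegment("A", i)
end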

5 Likes

I could be wrong, but I think this is what you're looking for.
The issue here stems from how Lua handles memory management and garbage collection, especially when working with references to tables. While table.clear removes the contents of a table, the table itself still exists in memory until it is explicitly dereferenced, at which point garbage collection can reclaim the memory.
Kinda how I'm looking at this:

  • Table References in Lua: Even after clearing a table with table.clear, the table’s memory remains allocated because the table object still exists. Any reference to the table prevents garbage collection.
  • Segments Reference: Your Segments table holds references to all created segments. Clearing a segment’s contents does not remove its reference, so the memory footprint persists.
Your solution could probably be something like this:

collectgarbage("collect")
This should allow Lua’s garbage collector to reclaim the memory. However, garbage collection doesn’t occur immediately; it runs periodically or when triggered.
I'm not that much of an expert in this field, but this might do the trick:

local Cluster = {}
local Segments = {}

--/ Private functions
local function doSegmentExist(SegmentName: string): {}?
    return Segments[SegmentName]
end

--/ Segments management

function Cluster:createSegment(Name: string)
    if doSegmentExist(Name) then return end
    Segments[Name] = {}
end

function Cluster:deleteSegment(Name: string)
    if not doSegmentExist(Name) then return end
    Segments[Name] = nil -- Remove the reference to the segment
    collectgarbage("collect") -- Trigger garbage collection explicitly for testing
end

--/ Segments data management

function Cluster:addToSegment(SegmentName: string, Index: any, Value: any)
    local Segment = doSegmentExist(SegmentName)
    if not Segment then return end
    Segment[Index] = Value
end

function Cluster:removeFromSegment(SegmentName: string, Index: any)
    local Segment = doSegmentExist(SegmentName)
    if not Segment then return end
    Segment[Index] = nil
end

function Cluster:insertToSegment(SegmentName: string, Value: any)
    local Segment = doSegmentExist(SegmentName)
    if not Segment then return end
    table.insert(Segment, Value)
end

function Cluster:retrieve(SegmentName, Index: any): any?
    local Segment = doSegmentExist(SegmentName)
    if not Segment then return end
    return Segment[Index]
end

return Cluster

Then the test script:

local mod = require(script.Parent)

mod:createSegment("A")
print("Before Adding:", collectgarbage("count") * 1024, "bytes") -- Memory before additions

-- Add data to the segment
for i = 1, 100 do
    mod:addToSegment("A", i, i * 2)
end
print("After Adding:", collectgarbage("count") * 1024, "bytes") -- Memory after adding data

-- Remove data from the segment
for i = 1, 100 do
    mod:removeFromSegment("A", i)
end
print("After Removing Data:", collectgarbage("count") * 1024, "bytes") -- Memory after removing data

-- Delete the segment entirely
mod:deleteSegment("A")
print("After Deleting Segment:", collectgarbage("count") * 1024, "bytes") -- Memory after segment deletion

Best of luck

2 Likes

Sadly you can't use modes other than count. But the problem is that it doesn't happen if I delete the data in the same loop where it's created; even with a delay that still works. The problem occurs when I move the deletion to a second loop.
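For reference, the only measuring tools available from a script seem to be collectgarbage("count") and gcinfo(), which report the heap size in KB; a minimal sampler sketch:

-- Minimal heap sampler for Roblox Luau; gcinfo() returns the heap size in KB,
-- the same number collectgarbage("count") gives.
local function sampleHeapKB(label: string)
	print(label, gcinfo(), "KB")
end

sampleHeapKB("before")
-- ... work under test ...
sampleHeapKB("after")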

1 Like

I didn't realize it got deprecated, my apologies.
We can try this:

local Cluster = {}
local Segments = {}

--/ Private functions
local function doSegmentExist(SegmentName: string): {}?
    return Segments[SegmentName]
end

--/ Segments management
function Cluster:createSegment(Name: string)
    if doSegmentExist(Name) then return end
    Segments[Name] = {}
end

function Cluster:deleteSegment(Name: string)
    if not doSegmentExist(Name) then return end
    Segments[Name] = nil -- Dereference the segment
end

--/ Segments data management
function Cluster:addToSegment(SegmentName: string, Index: any, Value: any)
    local Segment = doSegmentExist(SegmentName)
    if not Segment then return end
    Segment[Index] = Value
end

function Cluster:removeFromSegment(SegmentName: string, Index: any)
    local Segment = doSegmentExist(SegmentName)
    if not Segment then return end
    Segment[Index] = nil
end

function Cluster:insertToSegment(SegmentName: string, Value: any)
    local Segment = doSegmentExist(SegmentName)
    if not Segment then return end
    table.insert(Segment, Value)
end

function Cluster:retrieve(SegmentName: string, Index: any): any?
    local Segment = doSegmentExist(SegmentName)
    if not Segment then return end
    return Segment[Index]
end

return Cluster

And the test script:

local mod = require(script.Parent)

-- Helper to print memory usage
local function printMemoryUsage(label)
    local usedMemory = collectgarbage("count") * 1024 -- Convert KB to bytes
    print(label .. " Used memory: " .. usedMemory .. " bytes")
end

-- Create a segment
mod:createSegment("A")
printMemoryUsage("Before Adding Data")

-- Add data to the segment
for i = 1, 100 do
    mod:addToSegment("A", i, i * 2)
end
printMemoryUsage("After Adding Data")

-- Remove data from the segment
for i = 1, 100 do
    mod:removeFromSegment("A", i)
end
printMemoryUsage("After Removing Data")

-- Delete the segment entirely
mod:deleteSegment("A")
printMemoryUsage("After Deleting Segment")

I'm kinda counting on this:

-- Delete the segment entirely
mod:deleteSegment("A")

and

I changed this:

function Cluster:deleteSegment(Name: string)
    local Segment = doSegmentExist(Name)
    if not Segment then return end
    
    table.clear(Segment)
    Segments[Name] = nil
end

I could be wrong, but if I understand correctly:
What it does: clears the table's contents (table.clear(Segment)) before removing the reference with Segments[Name] = nil. This is redundant, because once the reference is removed (Segments[Name] = nil), the table is eligible for garbage collection regardless of whether it was cleared.

I added this:

local function printMemoryUsage(label)
    local usedMemory = collectgarbage("count") * 1024 -- Convert KB to bytes
    print(label .. " Used memory: " .. usedMemory .. " bytes")
end

It should help observe how memory usage changes at each step.

1 Like

It sadly didn't fix the problem. When I delete the segment it magically fixes it entirely. Also, when I don't remove the data there is more of it left than when I destroy it, which means removing data from the segment clears maybe 95% of it, and the rest can cause a memory leak.

EDIT: When I made it repeat the code twice, the memory leak itself stayed the same as it was with one run, even when the data was different.

1 Like

We could try to replace the table instead of clearing it:

function Cluster:deleteSegment(Name: string)
    if not doSegmentExist(Name) then return end
    Segments[Name] = {} -- Replace with a new table
end

If that doesn't help:

function Cluster:deleteSegment(Name: string)
    if not doSegmentExist(Name) then return end
    Segments[Name] = nil -- Remove reference
    Segments[Name] = {} -- Immediately replace with a new table
end

I feel like I'm running out of ideas and need sleep.

local recycledTables = {}

function Cluster:deleteSegment(Name: string)
    if not doSegmentExist(Name) then return end
    table.insert(recycledTables, Segments[Name]) -- Store for reuse
    Segments[Name] = nil -- Remove reference
end

function Cluster:createSegment(Name: string)
    if doSegmentExist(Name) then return end
    if #recycledTables > 0 then
        Segments[Name] = table.remove(recycledTables) -- Reuse a recycled table
    else
        Segments[Name] = {}
    end
end

Maybe, if possible :confused:

function Cluster:replaceSegment(Name: string, newData)
    if not doSegmentExist(Name) then return end
    Segments[Name] = newData
end

Best of luck once again XD

2 Likes

We are talking about a different scenario; using deleteSegment() fixes the problem, but if I do this:

for i = 1, 100 do
	mod:addToSegment("A", i, i * 2)
	task.wait(0)
end

task.wait(1)

for i = 1, 100 do
	mod:removeFromSegment("A", i)
	task.wait(0)
end

there is a small memory leak that appears only once and depends on the number of repeats I use to test it. If I remove everything, for, let's say, some sort of maid-class storage, it will leave a memory leak. We're talking about a scenario where I can't delete the segment for some reason.
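(The only workaround I can think of for that case would be a hypothetical clearSegment that swaps the inner table for a fresh one instead of niling keys, sketched below, but I'd still like to understand why removing the keys alone leaves memory behind.)

-- Hypothetical helper, not part of the module above: keep the segment name but
-- drop the old inner table so its retained capacity can eventually be collected.
function Cluster:clearSegment(SegmentName: string)
	if not doSegmentExist(SegmentName) then return end

	Segments[SegmentName] = {}
end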

1 Like

My apologies XD, I'm sleepy, but I am trying to help tho.
So the removeFromSegment function isn't fully releasing the memory of certain elements…
Sometimes I feel like Roblox is just wild XD, though an actual expert on this would know better than me XD

Instead of looping and removing one item at a time, perhaps we can do this:

Segments["A"] = {}

I'd like to think this would completely clear the segment without needing to call removeFromSegment repeatedly.

local mod = require(script.Parent)

local function printMemoryUsage(label)
    local usedMemory = collectgarbage("count") * 1024 -- Convert KB to bytes
    print(label .. " Used memory: " .. usedMemory .. " bytes")
end

mod:createSegment("A")
printMemoryUsage("Before Adding Data")

-- Add data to the segment
for i = 1, 100 do
    mod:addToSegment("A", i, i * 2)
    task.wait(0) -- Simulate processing delay
end
printMemoryUsage("After Adding Data")

-- Remove data from the segment
for i = 1, 100 do
    mod:removeFromSegment("A", i)
    task.wait(0) -- Simulate processing delay
end
printMemoryUsage("After Removing Data")

-- Simulate keeping the segment but clearing its contents
-- (Segments is local to the module, so clear it through the module; with the
-- deleteSegment version below, this clears and reassigns a fresh table)
mod:deleteSegment("A")
printMemoryUsage("After Reassigning Segment")
local Cluster = {}
local Segments = {}

--/ Private functions
local function doSegmentExist(SegmentName: string): {}?
    return Segments[SegmentName]
end

--/ Segments management
function Cluster:createSegment(Name: string)
    if doSegmentExist(Name) then return end
    Segments[Name] = {}
end

function Cluster:deleteSegment(Name: string)
    if not doSegmentExist(Name) then return end
    table.clear(Segments[Name]) -- Clears all entries in the table
    Segments[Name] = {}        -- Reassigns to a new empty table
end

--/ Segments data management
function Cluster:addToSegment(SegmentName: string, Index: any, Value: any)
    local Segment = doSegmentExist(SegmentName)
    if not Segment then return end
    Segment[Index] = Value
end

function Cluster:removeFromSegment(SegmentName: string, Index: any)
    local Segment = doSegmentExist(SegmentName)
    if not Segment then return end
    Segment[Index] = nil
end

function Cluster:insertToSegment(SegmentName: string, Value: any)
    local Segment = doSegmentExist(SegmentName)
    if not Segment then return end
    table.insert(Segment, Value)
end

function Cluster:retrieve(SegmentName: string, Index: any): any?
    local Segment = doSegmentExist(SegmentName)
    if not Segment then return end
    return Segment[Index]
end

return Cluster

So, something like this? Idk, I don't think we tried this yet. Hope it helps.

1 Like

After testing, it seems like even removing one item at a time doesn't work. Idk what to do, because I nullify the index and there is no other reference or code touching it.

1 Like

I tried to ask someone more qualified than me. This isn't their main field either, but this was the response I got:
Lua’s garbage collector doesn’t immediately reclaim memory after nullifying references or clearing a table—it waits until it determines that memory is no longer needed.
What they’re observing may not be a true leak but rather Lua holding onto memory for future reuse. This behavior is common when repeatedly adding/removing data to/from a table.

I'm not sure if you tried to see what's being left behind:

local mod = require(script.Parent)

local function printMemoryUsage(label)
    local usedMemory = collectgarbage("count") * 1024 -- Convert KB to bytes
    print(label .. " Used memory: " .. usedMemory .. " bytes")
end

local function printSegmentContents(segmentName)
    local segment = mod:retrieve(segmentName)
    if segment then
        for k, v in pairs(segment) do
            print("Remaining in segment:", k, v)
        end
    else
        print("Segment does not exist or is empty.")
    end
end

-- Create the segment
mod:createSegment("A")
printMemoryUsage("Before Adding Data")

-- Add data to the segment
for i = 1, 100 do
    mod:addToSegment("A", i, i * 2)
end
printMemoryUsage("After Adding Data")

-- Print contents of the segment
print("Contents after adding data:")
printSegmentContents("A")

-- Remove data from the segment
for i = 1, 100 do
    mod:removeFromSegment("A", i)
end
printMemoryUsage("After Removing Data")

-- Print contents of the segment again
print("Contents after removing data:")
printSegmentContents("A")

-- Optional: Replace the table
mod:deleteSegment("A")
printMemoryUsage("After Deleting Segment")

-- Force a manual garbage collection (only for testing)
collectgarbage("collect")
printMemoryUsage("After Garbage Collection")
function Cluster:retrieve(SegmentName: string, Index: any): any?
    if Index then
        return Segments[SegmentName] and Segments[SegmentName][Index]
    end
    return Segments[SegmentName] -- Return the full segment if no index is provided
end

…Right as I was about to hit send I got this, so I will add it as well.
Steps to Test for a True Memory Leak:

  • Run multiple iterations:
    • Create and clear segments repeatedly in a controlled loop.
    • Track memory usage after each iteration to see if it steadily increases.
  • Force garbage collection:
    • Trigger garbage collection explicitly using collectgarbage("collect") to ensure unreferenced memory is cleaned up after every iteration.
  • Compare initial and final memory usage:
    • Measure memory before starting the loop and after all iterations are complete.
    • If memory usage stabilizes and doesn't grow indefinitely, it's likely not a memory leak.

The process, I think, should look something like this:

local mod = require(script.Parent)

local function printMemoryUsage(label)
    local usedMemory = collectgarbage("count") * 1024 -- Convert KB to bytes
    print(label .. " Used memory: " .. usedMemory .. " bytes")
end

-- Test Parameters
local iterations = 100
local segmentSize = 100

print("Starting Memory Leak Test")
printMemoryUsage("Initial Memory")

for i = 1, iterations do
    mod:createSegment("A")
    
    -- Add data to the segment
    for j = 1, segmentSize do
        mod:addToSegment("A", j, j * 2)
    end

    -- Remove data from the segment
    for j = 1, segmentSize do
        mod:removeFromSegment("A", j)
    end

    -- Optionally delete the segment (uncomment to test this case)
    -- mod:deleteSegment("A")

    -- Force garbage collection
    collectgarbage("collect")
    
    -- Print memory usage
    printMemoryUsage("After Iteration " .. i)
end

printMemoryUsage("Final Memory")

I was told that if the memory usage stops growing after a few iterations, it's not a true memory leak and you can rest easy XD. However, if the memory usage grows consistently after each iteration, then it might be a leak.

And last but not least, if it's negligible differences… then you shouldn't stress about it at all.
For example:
If the difference in memory after all iterations is minimal (e.g., a few KB), it's likely not an issue for practical purposes.

XD instead of Best of luck,
Ima go with “Please work this time” XD

2 Likes

I decided to try it on a module script, indexing and then un-indexing it, and the same problem happened there. With GC info I managed to get this:

  1. GC grows when I add/remove indexes
  2. GC then decreases 5x
  3. GC increases when the GC thread is measured
  4. GC decreases again when it ends
  5. There is still memory going around

1 Like

I'm not sure if I answered this part about your edit, but from my understanding:
During the first loop, as data is added, Lua assigns memory to store it. However, when the second loop begins, the data is removed incrementally. Lua may defer or batch reclaiming the memory for efficiency, meaning that the memory footprint might not decrease immediately, even though the data is logically removed.
Also:
Lua's garbage collector doesn't immediately free memory as soon as objects become unreachable. It works on a schedule or is triggered by specific memory pressure thresholds. This delayed reclamation might give the appearance of a "leak" when there's actually none.
When adding and removing data within the same loop, Lua might optimize memory usage more aggressively since it can predict that the table isn't holding persistent data across iterations.
Kinda feeling like this might be the case:
this behavior isn't necessarily a true memory leak but rather a characteristic of how Lua balances memory allocation and performance.
At least for that edit part, hopefully it makes sense.
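A rough way to watch that from a script (just a sketch, using the gcinfo() heap counter; numbers will be noisy):

-- Niling keys usually doesn't shrink the heap right away, but dropping the
-- table reference and letting the GC run eventually does.
local t = {}
print("empty:", gcinfo(), "KB")

for i = 1, 1000 do
	t[i] = i
end
print("filled:", gcinfo(), "KB")

for i = 1, 1000 do
	t[i] = nil
end
print("keys niled:", gcinfo(), "KB") -- often still close to the filled size

t = nil
task.wait(1) -- give the incremental GC a chance to run
print("table dropped:", gcinfo(), "KB")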

2 Likes

My theory is that because it happened only on the first cycle (add/remove 100 indexes) but didn't occur on the second, Lua may know this table will be reused and keeps the memory until the table itself is removed. Idk what you think about it, but with my current measurements it makes some sense.

EDIT:

BTW, I used this code and the same thing happens, so it's maybe not really a memory leak, but rather a smart optimization or smth :confused:

local mod = require(script.ModuleScript)

task.wait(8)

for i = 1, 100 do
	mod.A[i] = i * 3
end

task.wait(0)

for i = 1, 100 do
	mod.A[i] = nil
end

1 Like

Another thing:

local t = require(script.ModuleScript)

task.wait(8)
print("h")
for i = 1, 100 do
	t.A[i] = i * 2
end

for i = 1, 100 do
	t.A[i] = nil
end

task.wait(0)

for i = 1, 100 do
	t.A[i * 2] = i * 3
end

for i = 1, 100 do
	t.A[i * 2] = nil
end

Memory is twice as much as in the case below:

local t = require(script.ModuleScript)

task.wait(8)
print("h")
for i = 1, 100 do
	t.A[i] = i * 2
end

for i = 1, 100 do
	t.A[i] = nil
end

task.wait(0)

for i = 1, 100 do
	t.A[i] = i * 3
end

for i = 1, 100 do
	t.A[i] = nil
end

Memory doesn't change even after the second cycle.

1 Like

I think it's very likely that it's not a memory leak. From my understanding you're pretty much spot on. It's not actually locking the memory, but its behavior will make it seem like Lua anticipates the table's reuse; it's really just the allocator avoiding unnecessary overhead by retaining memory. It's a smart and intentional design choice.
Performance/optimization wise:
Avoiding repeated allocations: memory allocation and deallocation are expensive operations. By retaining memory, Lua avoids the cost of frequently resizing tables or reallocating memory when tables grow or shrink.
Example: if a table frequently fluctuates in size (e.g., adding and removing elements in cycles), reallocating memory every time would waste CPU time. Retaining memory ensures better performance for these scenarios.
I'm quite confident the 16k you observed is negligible in most practical scenarios, especially in Roblox game development; I'm pretty sure it's actually normal as well.
A true memory leak should show significant growth, in my opinion.
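As an aside on the repeated-allocations point: Luau also exposes table.create for preallocating the array part up front, which leans on the same idea of paying the allocation cost once (a sketch, sizes arbitrary):

-- Preallocating the array part avoids the repeated grow steps that happen when
-- a table is filled one index at a time.
local preallocated = table.create(1000)    -- room for 1000 array entries
local filled = table.create(1000, 0)       -- same, but pre-filled with 0

for i = 1, 1000 do
	preallocated[i] = i * 2
end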

3 Likes

Breaking It Down:

  1. Case 1: t.A[i * 2] = i * 3
  • Here, the keys are non-contiguous (e.g., 2, 4, 6, ...).
  • Lua treats these as sparse table entries and may allocate more memory to manage the sparse index mapping.
  • Sparse tables have overhead because Lua uses additional structures to store non-sequential keys.
  2. Case 2: t.A[i] = i * 3
  • In this case, the keys are contiguous (e.g., 1, 2, 3, ...).
  • Lua optimizes memory usage for contiguous keys by using an array-like structure internally, which is much more memory efficient than sparse mappings.

Why This Happens:

  • Sparse vs. dense storage:
    • Lua tables dynamically adjust between array-like (dense) and hash-like (sparse) storage based on key patterns.
    • Non-contiguous keys (as in i * 2) force Lua to switch to a hash-like representation, consuming more memory.
  • Memory retention:
    • Once the table switches to sparse storage, Lua may retain the additional memory allocated for managing the sparse keys, even after they are removed. This is part of the allocator's behavior to avoid frequent resizing (a rough way to measure this is sketched below).
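A rough sketch of measuring the two cases side by side (assuming the gcinfo() heap counter; exact numbers will vary run to run):

-- Contiguous keys can live in the array part; the i * 2 keys spill into the
-- hash part, which keeps extra capacity around even after the keys are niled.
local function fillAndClear(step: number): number
	local before = gcinfo()
	local t = {}
	for i = 1, 100 do
		t[i * step] = i
	end
	for i = 1, 100 do
		t[i * step] = nil
	end
	return gcinfo() - before
end

print("contiguous keys grew the heap by", fillAndClear(1), "KB")
print("sparse keys grew the heap by", fillAndClear(2), "KB")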
1 Like

I would recommend taking advantage of contiguous indices whenever possible. The optimized array storage for the win.

1 Like

Maybe, I'm far too sleepy, but those aren't backwards, right? XD
Or maybe what I said was backwards… Tbh I think I'ma call it a night. If I wasn't able to help, hopefully someone else can. I got a feeling I'ma wake up, read all this, and probably decide not to help people while half awake XD

2 Likes

In the end I asked ChatGPT and made some tests. From both sources, I'm 100% sure Lua has index memory; when I tested it with a hash, it was almost instantly removed, and in the end I think this is what happens:

  1. I assign a hash map key to the array
  2. I remove it, so 1 memory box (or smth like that) is kept because it was there
  3. I repeat the process, and because there is a memory box, it's reused and freed again
  4. I repeat the process a third time and the same happens

When I did this:

  1. I assign a hash map key to the array
  2. I assign another hash map key to the array, so a second memory box is created
  3. I free both of them
  4. I create another hash map key, and this time no memory is added to storage because it reused one of those 2 boxes
  5. I free this hash map key, and because no more are needed, the existing boxes just get reused

And finally, when I remove the table, it can all be freed (rough code version below).

Thx for the talk, because I learned something crazy and also very useful, thx for the help.

EDIT: It also seems like those memory boxes need to come in powers of two, because when I created 3 it was the same as 4, and as we know 4 is a power of 2.
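Rough code version of those steps (just illustrative):

-- A hash-part slot that was allocated once gets reused by later add/remove
-- cycles instead of growing the heap again; only dropping the whole table
-- lets the GC give the capacity back.
local t = {}

t["a"] = 1   -- first hash key: capacity for it gets allocated
t["a"] = nil -- the capacity is kept around (the "memory box")

t["b"] = 2   -- fits in the capacity that is already there, heap doesn't grow
t["b"] = nil

t["c"] = 3   -- same again
t["c"] = nil

t = nil      -- now the table and its retained capacity can be collected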

2 Likes

Glad to help. I just tried to use Roblox AI to see if I need sleep, or if I got that backwards, or what.
It pretty much told me I was wrong:
"The code provided by the user seems to be a demonstration of how to properly remove references to objects in Lua to allow for garbage collection. However, the second code snippet appears to be a reversal of the first one, intentionally introducing a memory leak.

In the second code snippet, the line t.A[i * 2] = i * 3 assigns a new value to the A array at an index that has already been set to nil in the previous loop. This means that the object at that index is still being referenced, preventing it from being garbage collected."
to quote it

I told it something along the lines of this:

When keys in a table are set to nil, Lua does not immediately reclaim the memory but retains it for potential reuse. This is an intentional optimization to avoid the overhead of frequent allocations and deallocations. While it might appear that memory is not being released, this behavior is different from a true memory leak, where unused memory is permanently inaccessible.

and then it changed its mind…

and gave me this:

"Based on the provided code and the discussion, it appears that the second code snippet consumes more memory due to the use of sparse keys (i * 2) in the table. This is because sparse keys are stored in the hash map portion of the table, which uses more memory than contiguous keys.

To clarify, this is not a memory leak, but rather a normal behavior of Lua’s table implementation. When keys in a table are set to nil, Lua retains the memory for potential reuse, which might appear to be a memory leak, but is actually an optimization to avoid frequent allocations and deallocations.

The second code snippet consumes more memory because it uses sparse keys, which require the use of Lua’s hash map structure instead of the more efficient array part of the table."

I could've asked it from the start, but in hindsight I'm glad I didn't.

ChatGPT would probably know better than me on all your issues as well, and whether the 16k is negligible. I got a feeling it's going to say don't stress it XD
This was a good chat for remembering how things work; I appreciate it as well.

1 Like