Does this utility function help optimize the code?

I made a GetDictionaryLength function to check how many items are in a dictionary. When it's first called, it makes a new cache table, then checks whether the dictionary was already processed. If so, it returns the cached length; otherwise, it loops over the dictionary to count its entries and returns the result.

Feel free to give feedback and suggestions.

local dictionary_cache = {}
local utils = {}


function utils.GetDictionaryLength(dict : {[any] : any})
	local new_dictionary_cache = {}
	if dictionary_cache[dict] then return dictionary_cache[dict] end
	
	new_dictionary_cache[dict] = 0
	for _,_ in dict do new_dictionary_cache[dict] += 1 end
	
	dictionary_cache = new_dictionary_cache
	return dictionary_cache[dict]
end


return utils

Judging from the name, you want to GetDictionaryLength of the input dict. Now trace what the following is doing:

local tableA = { tA1 = "1", tA2 = "2" }
local tableB = { tB1 = "1", tB2 = "2", tB3 = "3" }
local tableACount = utils.GetDictionaryLength(tableA)

-- expand and substitute dict with tableA
local dictionary_cache = {}
local utils = {}
--function utils.GetDictionaryLength(dict : {[any] : any})

	local new_dictionary_cache = {}
	if dictionary_cache[tableA] then return dictionary_cache[tableA] end
	-- didn't find tableA in cache, so we proceed
	new_dictionary_cache[tableA] = 0
	for _,_ in tableA do new_dictionary_cache[tableA] += 1 end
	
	dictionary_cache = new_dictionary_cache
	-- notice we replaced the old cache here
	-- dictionary_cache is now { [tableA] = 2 }
	return dictionary_cache[tableA]
--end

local tableBCount = utils.GetDictionaryLength(tableB)
-- expand again but with tableB this time
--function utils.GetDictionaryLength(dict : {[any] : any})

	local new_dictionary_cache = {}
	-- notice this is a new dictionary cache again
	if dictionary_cache[tableB] then return dictionary_cache[tableB] end
	-- didn't find tableB in cache, so we proceed
	new_dictionary_cache[tableB] = 0
	for _,_ in tableB do new_dictionary_cache[tableB] += 1 end
	
	dictionary_cache = new_dictionary_cache
	-- notice we also replaced the previous cache here
	-- dictionary_cache is now { [tableB] = 3 } , we lost the tableA cached count
	return dictionary_cache[tableB]
--end

Although you could try to fix it, unless you need the count of a table really often, and the tables have a very large number of elements, it isn't worth bookkeeping the counts in another dictionary.
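For the common case, a plain uncached count is usually all you need. A sketch in the same Luau style as the original:

```lua
-- counts entries on every call; no cache, no stale data, no leaks
function utils.GetDictionaryLength(dict: {[any]: any})
	local count = 0
	for _ in dict do
		count += 1
	end
	return count
end
```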

Also, since this appears to be a ModuleScript, the cache itself persists for the whole lifetime of the program. Any table added as a key in the cache will never be garbage collected, which means that if you don't remove the entries, it will cause a memory leak.
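One way to mitigate that leak in Lua/Luau is to give the cache weak keys, so a cached table that is no longer referenced anywhere else can still be garbage collected together with its cache entry. A sketch, not a drop-in fix for the other problems:

```lua
-- __mode = "k" marks the keys as weak references: once a table used as a
-- key becomes otherwise unreachable, the GC may remove it (and its count)
local dictionary_cache = setmetatable({}, { __mode = "k" })
```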

Does that mean it replaces the cached entry for tableA with one for tableB?

Yes. Simply put, with the code in its current state, every time you cache the length of another table, all the other tables' counts are lost.

And even worse: if you insert an item into tableB and then call the function again, it will return a stale, wrong count.
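Concretely, with the cache in the state from the trace above:

```lua
tableB.tB4 = "4" -- tableB now has 4 entries
-- the cache still holds [tableB] = 3, so the stale value is returned
print(utils.GetDictionaryLength(tableB)) --> 3, not 4
```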

I tried using the cache table itself without making a new cache table inside the function. Does this work, and does it optimize the code as well?

Some fixes can make it work as you intended, but in general it won't gain you much just for a table count.
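For reference, a minimal sketch of such a fix: write into the shared cache instead of replacing it with a fresh table on every call. Note it still returns a stale count if a cached table is mutated afterwards:

```lua
local dictionary_cache = {}
local utils = {}

function utils.GetDictionaryLength(dict: {[any]: any})
	-- return the cached count if this exact table was counted before
	local cached = dictionary_cache[dict]
	if cached then return cached end

	local count = 0
	for _ in dict do
		count += 1
	end

	-- store into the shared cache instead of overwriting the whole table
	dictionary_cache[dict] = count
	return count
end

return utils
```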
