A Basic Guide To Caching For Roblox Lua

Caching

Hello, my fellow developers! I’m Boston, and today I am going to teach you about the magic of caching and how easily it can be implemented in your scripts to improve runtime efficiency! By the end of this tutorial you should see that it isn’t really magic at all, and that it’s super useful with many different applications in programming.

However, before we get technical and dive into the code part of this tutorial, we need to define what caching is. After all, what good is knowing the syntax for something if we can’t define it?

Definition of caching in regards to how it applies to programming:

  • Caching is the systematic storage of data returned by a request that has already been processed, so that when the same (or a similar) request is made again, the result is readily available much faster than re-making the request over and over.
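In code, the simplest form of this is memoization: store each computed result in a table, keyed by its input. Here is a minimal sketch in plain Lua, where `slowSquare` is just a stand-in I made up for any expensive operation (a web request, a file read, and so on):

local resultCache = {} -- computed results, keyed by input

local function slowSquare(n)
	-- stand-in for an expensive operation
	return n * n
end

local function cachedSquare(n)
	if resultCache[n] == nil then
		resultCache[n] = slowSquare(n) -- first request: do the work and store it
	end
	return resultCache[n] -- every later request: instant table lookup
end

print(cachedSquare(4)) -- computed, then stored
print(cachedSquare(4)) -- served straight from the cache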

Great! Now that we know what caching is, here are some practical applications of it that you interact with on a day-to-day basis.

Practical Applications of Caching:

  • Web Development: Every day you use the internet, you interact with systems that use a cache to speed up different processes; even the forum you are reading this article on uses one. These caches store the interface components that make up a particular page. You may notice this when a page you visit for the first time takes longer to load than on subsequent visits: on the later visits, the elements are served from the cache!

Despite the advantages of caching covered so far, it’s important to highlight its downsides, since they do exist. Knowing them lets you make a well-informed decision when deciding whether to implement a cache in your program.

Downsides to Caching

  • Caching uses memory to store the request data, so you need to assess whether the cache’s memory usage is justified by the performance gain to your program.
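One common way to keep that memory usage in check on Roblox is to evict cache entries once they can no longer be requested. As a sketch (assuming a hypothetical `dataCache` table keyed by `UserId`), you could clear a player’s entry when they leave:

local Players = game:GetService("Players")

local dataCache = {} -- per-player cached data, keyed by UserId

Players.PlayerRemoving:Connect(function(player)
	dataCache[player.UserId] = nil -- free the entry once it can't be requested again
end)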

Now that we know the concept of caching and are armed with examples of how it’s used in everyday life, we are going to get technical! :slight_smile:

Basic Form of Caching for Roblox Lua:

local avatarThumbnailCache = {} -- holds the requested thumbnail URLs so repeat lookups are just a table index

local playerIDS = {
	55605782, --@coefficients
	1364639953, --@JoshSedai
	554895233, --@M_caw
	491773343, --@CrazedBrick1
	8403307, --@railworks2
	129574502, --@Incapaz
	2231221, -- @TheGamer101
	287654442, --@TheCarbyneUniverse 
	1273918, --@EchoReaper
	41018588, --@itsKoiske / Krunnie
	308165, -- @Sleitnick
}

local PlayerService = game:GetService("Players")

local function getThumbnailURL(userId) -- yields on the first request for a userId, then serves repeat requests straight from the cache
	if avatarThumbnailCache[userId] then
		return avatarThumbnailCache[userId] -- return the URL from the cache
	else
		local thumbnailURL, isReady = PlayerService:GetUserThumbnailAsync(userId, Enum.ThumbnailType.HeadShot, Enum.ThumbnailSize.Size420x420)
		if isReady then
			avatarThumbnailCache[userId] = thumbnailURL -- cache the URL under this userId
			return thumbnailURL -- return the freshly requested data
		end
	end
end

local function timedThumbnailAllocation()
	local startInterval = os.clock()
	for i = 1, #playerIDS do
		print(getThumbnailURL(playerIDS[i]))
	end
	warn("it took " .. os.clock() - startInterval .. " seconds to get the thumbnail URLs for " .. #playerIDS .. " user IDs")
end

timedThumbnailAllocation()
timedThumbnailAllocation()

Above I have included an example of how you can combine caching with the requests made by most yielding functions on the Roblox platform: the returned data is stored keyed by the query parameters, so the same request never needs to be made twice. I even included some prominent community figures, in no particular order, and used their user IDs to test this system. :sunglasses:

To prove that caching makes retrieving previously requested data much faster, here are the results of running the script above, in this image for all of you to see. :slight_smile:

[Image: ThumbnailExecTime — execution times for the first (uncached) and second (cached) runs]

Much better!

One thing worth noting here is that some of Roblox’s yielding functions will automatically cache the data collected after its initial return. In those cases, creating your own cache for that data is unnecessary.


More advanced implementations of caching:

What An Object-Oriented Implementation of Caching Could Look Like:

Code
local cache = {}
cache.__index = cache

function cache.New(cacheName, cacheType, cacheSize)
	assert(cacheName and cacheType, "ERR (Cache) : Nil arguments on instantiation")
	local self = {
		cName = cacheName, -- the name of the cache
		cType = cacheType, -- type of data stored in the cache

		cacheData = {}, -- the actual cache table that stores all of the data
		cacheDataCount = 0, -- number of entries currently stored (the # operator doesn't count dictionary keys)
		cacheDataLimit = cacheSize or 300, -- maximum number of entries, defaulting to 300

		signalHandlers = {} -- for optional signals, such as the overflow event
	}

	return setmetatable(self, cache)
end

function cache:ConnectOverflow(callback) -- registers a callback that runs whenever the cache overflows and is cleared
	self.signalHandlers[1] = Instance.new("BindableEvent")

	self.signalHandlers[2] = self.signalHandlers[1].Event:Connect(function()
		warn("WARNING : " .. self.cName .. " overflow, clearing")

		callback() -- run the supplied callback after the clear
	end)
end

function cache:AppendData(index, data)
	assert(index ~= nil and data ~= nil, "ERR (Cache) : Nil arguments")
	if self.cacheDataCount >= self.cacheDataLimit then
		self.cacheData = {} -- clears the cache out when it reaches its limit
		self.cacheDataCount = 0

		if self.signalHandlers[1] and self.signalHandlers[1]:IsA("BindableEvent") then
			self.signalHandlers[1]:Fire() -- fire the bindable event on overflow so listeners can repopulate
		end
	end

	if self.cacheData[index] == nil then
		self.cacheDataCount += 1 -- only count new keys, not overwrites
	end
	self.cacheData[index] = data -- store the data under the given index
end

function cache:HasDataAtIndex(index) 
	assert(index, "ERR (Cache) : Nil index argument")
	return self.cacheData[index] ~= nil 
end

function cache:GetDataAtIndex(index)
	assert(index, "ERR (Cache) : Nil index argument")
	return self.cacheData[index]
end

return cache
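For reference, here is what using the module above might look like from another script. The `require` path and the cache name are assumptions; adjust them to wherever you place the ModuleScript:

local Cache = require(script.Cache) -- hypothetical path to the ModuleScript above

local thumbnails = Cache.New("ThumbnailCache", "string", 100)

thumbnails:ConnectOverflow(function()
	print("cache was cleared, repopulate it here if needed")
end)

thumbnails:AppendData(55605782, "rbxthumb://placeholder") -- placeholder data

if thumbnails:HasDataAtIndex(55605782) then
	print(thumbnails:GetDataAtIndex(55605782))
end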

Here are some other caching systems which have been made by members of the community:

- Kache:
@lewisakura 's Kache : The Kache module. · GitHub
Article: Kache - it's a caching system!


Why I wrote this article:

  • I know people might say that there is already information about caching on the forum; however, it’s scattered across many threads and easy to miss unless you know what you’re looking for, so I decided to centralize it into one article. If you have any feedback about this article, please reply here or message me on the forums; I am definitely open to constructive criticism. I also recommend checking out the different caching systems above. I hope this article helped you learn about caching or furthered your understanding of it.

How helpful do you feel this article is?
  • still confused
  • somewhat understand
  • good understanding


Thank you for reading and have a good day! :smiley:


Thanks for the help! I have removed it from the contents of the article.

[EDIT : 9 / 29 / 2022]

  • Removed the second practical-application example regarding CPUs’ use of caching
  • Removed personal caching usage examples (wasn’t necessary)
  • Added information about Roblox’s automatic caching process

Print some actual timestamps using os.time() so it’s easier for the people seeing this to understand.

local startTime = os.time()

-- run stuff

print("Process Completed: " .. (os.time() - startTime))

os.time() wouldn’t be well suited for this purpose. os.time() is only precise to whole seconds, so the difference would be negligible even for the uncached execution. That is why a more precise method, os.clock(), is what is used in this case.
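To illustrate the difference, os.time() returns whole seconds, so any sub-second operation measures as 0, while os.clock() returns a fractional value:

local t1 = os.time()
local c1 = os.clock()

for i = 1, 1e6 do end -- some quick work, well under a second

print(os.time() - t1)  -- almost always 0: whole-second resolution
print(os.clock() - c1) -- a small fraction of a second: sub-second resolution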


Yeah, it was something like that. I haven’t opened Studio or used Lua in a hot minute.


Not to combat your point or anything; I totally agree that finding the most visually convenient way to display the performance impact of caching is a great idea. However, from my original image, I think the immediate difference between the first result (~1e-1 s) and the second (~1e-3 s, which is still relatively slow, though this code example is about a year old and was written for the purposes of this article) is already striking. If you would like to offer any other ideas for this, I’m all for it though!

Now that you’re bringing it up, however, this weekend I’ll work on providing a more informative analysis of caching and its performance enhancement, including some graph benchmarks.
