Caching
Hello, my fellow developers! I’m Boston, and today I’m going to teach you about the magic of caching and how you can easily implement it in your scripts to improve runtime efficiency! By the end of this tutorial you should see that it isn’t really magic at all, that it’s super useful, and that it has many different applications in programming.
However, before we get technical and dive into the code portion of this tutorial, we need to define what caching is. After all, what good is knowing the syntax for something if we can’t define it?
Definition of caching as it applies to programming:
- Caching is the storage of data returned by a request that has already been processed, so that when the same request is made again (or the same data is asked for), it is available much more quickly than if the request had to be made over and over. (See the sketch below.)
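To make that definition concrete, here is a minimal sketch of the pattern in plain Lua. The expensiveLookup function is a hypothetical stand-in for any slow request; the point is the check-then-store shape, not the specific work being done.

local resultCache = {} -- stores results keyed by the request parameters

-- hypothetical stand-in for any slow request (web call, datastore read, etc.)
local function expensiveLookup(key)
    return "value for " .. key
end

local function cachedLookup(key)
    if resultCache[key] == nil then
        -- first time we see this key: do the slow work once and remember the answer
        resultCache[key] = expensiveLookup(key)
    end
    return resultCache[key] -- every later call for this key is just a table index
end

Every real cache in this article follows this same shape: look the value up, and if it isn’t there yet, do the work once and remember the answer.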
Great! Now that we know what caching is, I thought I would include some practical applications of caching that you interact with on a day-to-day basis.
Practical Applications of Caching:
- Web Development: Every day you use the internet, you interact with systems that use a cache to speed up different processes; even the forum you are reading this article on uses one. These caches store the different interface components that make up a particular page. You may notice this when you visit a page for the first time: it takes longer to load, on average, than subsequent visits, because by then all of the elements are stored in the cache!
Despite the advantages of caching highlighted so far, it’s important to cover its downsides, since they do exist. Knowing them lets you make a better-informed decision when deciding whether to add a cache to your program.
Downsides to Caching
- Caching uses memory to store the requested data, so you need to assess whether the memory usage of the cache is justified by the performance gain to your program (see the sketch below for one way to measure this).
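One way to judge that trade-off is to measure roughly how much memory your cache occupies. This is a rough sketch using gcinfo(), which reports the script’s heap size in kilobytes; the payload here is made up, so substitute whatever your real cache stores.

local function estimateCacheMemory(entryCount)
    local before = gcinfo() -- heap size in kilobytes before building the cache
    local cache = {}
    for i = 1, entryCount do
        -- hypothetical payload; replace with whatever your real cache would store
        cache[i] = string.rep("x", 200)
    end
    local after = gcinfo() -- heap size after the cache is populated
    print(("~%d KB used to cache %d entries"):format(after - before, entryCount))
    return cache -- returning the table lets a caller keep the cache alive if needed
end

estimateCacheMemory(10000)

The numbers will vary from run to run (the garbage collector may run in between), but it gives you a ballpark figure to weigh against the speed you gain.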
We now know the concept of caching, and we are armed with information about how it’s used, along with examples of everyday caching that you take for granted. Now we are going to get technical!
Basic Form of Caching for Roblox Lua:
local avatarThumbnailCache = {} -- this is going to be our table which will hold our requested data, which can be indexed from the cache much quicker once it's added

local playerIDS = {
    55605782, -- @coefficients
    1364639953, -- @JoshSedai
    554895233, -- @M_caw
    491773343, -- @CrazedBrick1
    8403307, -- @railworks2
    129574502, -- @Incapaz
    2231221, -- @TheGamer101
    287654442, -- @TheCarbyneUniverse
    1273918, -- @EchoReaper
    41018588, -- @itsKoiske / Krunnie
    308165, -- @Sleitnick
}

local PlayerService = game:GetService("Players")

local function getThumbnailURL(userId) -- this function makes the classic yielding call to get the URL of a player's thumbnail, using the cache to speed things up if that userId has already been searched for
    if avatarThumbnailCache[userId] then
        return avatarThumbnailCache[userId] -- return the URL from the cache
    else
        local thumbnailURL, didReturn = PlayerService:GetUserThumbnailAsync(userId, Enum.ThumbnailType.HeadShot, Enum.ThumbnailSize.Size420x420)
        if didReturn then
            avatarThumbnailCache[userId] = thumbnailURL -- add the thumbnailURL to that index in the table
            return thumbnailURL -- return the data you got from the request
        end
    end
end

local function timedThumbnailAllocation()
    local startInterval = os.clock()
    for i = 1, #playerIDS do
        print(getThumbnailURL(playerIDS[i]))
    end
    warn("it took " .. os.clock() - startInterval .. " seconds to get the ThumbnailURL for " .. #playerIDS .. " userIDS")
end

timedThumbnailAllocation()
timedThumbnailAllocation()
Above I have included an example of how you can combine caching with the requests made by most yielding functions on the Roblox platform, storing their returned information so future requests can simply be indexed by the query parameters instead of making the same request twice. I even included some prominent community figures, listed in no particular order and named off the top of my head, and used their user IDs to test this system.
To prove that caching makes retrieving previously requested data much faster, I have included the results of running the script above in the image below for all of you to see.
Much better!
One thing worth noting here is that some of Roblox’s yielding functions automatically cache the data they collect after the initial return, which means you don’t need to create your own cache for that data.
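If you are unsure whether a particular yielding function already caches its results internally, one quick (if unscientific) check is to time two identical calls back to back without any cache of your own: if the second call returns almost instantly, the engine is likely doing the caching for you. Here is a sketch using the same thumbnail call from earlier.

local Players = game:GetService("Players")

local function timeCall(userId)
    local start = os.clock()
    Players:GetUserThumbnailAsync(userId, Enum.ThumbnailType.HeadShot, Enum.ThumbnailSize.Size420x420)
    return os.clock() - start -- how long this single call took, in seconds
end

local userId = 55605782 -- @coefficients, reused from the list above
print(("first call: %.4fs, second call: %.4fs"):format(timeCall(userId), timeCall(userId)))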
More advanced implementations of caching:
What an Object-Oriented Implementation of Caching Could Look Like:
local cache = {}
cache.__index = cache

function cache.New(cacheName, cacheType, cacheSize)
    assert(cacheName and cacheType, "ERR (Cache) : Nil arguments on instantiation")
    local self = {
        cName = cacheName, -- the name of the cache
        cType = cacheType, -- type of data stored in the cache
        cacheData = {}, -- this is the actual cache, which stores all of the data
        cacheDataLimit = cacheSize or 300, -- maximum number of entries, defaulting to 300 if no size is provided
        cacheEntryCount = 0, -- how many entries are stored; tracked manually because the # operator only counts array-style keys
        signalHandlers = {} -- this is for if you want a cache with different signals and such
    }
    return setmetatable(self, cache)
end

function cache:ConnectOverflow(callback) -- method used if you want an overflow system for the cache, so when it resets a callback will be run
    self.signalHandlers[1] = Instance.new("BindableEvent")
    self.signalHandlers[2] = self.signalHandlers[1].Event:Connect(function()
        warn("WARNING : ResultCache Overflow, clearing")
        callback() -- run the callback once the overflow happens
    end)
end

function cache:AppendData(index, data)
    assert(index and data, "ERR (Cache) : Nil arguments")
    if self.cacheEntryCount >= self.cacheDataLimit then
        self.cacheData = {} -- clears the cache out once it reaches its limit
        self.cacheEntryCount = 0
        if self.signalHandlers[1] and self.signalHandlers[1]:IsA("BindableEvent") then
            self.signalHandlers[1]:Fire() -- fires the bindable event on overflow so the connected callback can repopulate the cache
        end
    end
    if self.cacheData[index] == nil then
        self.cacheEntryCount = self.cacheEntryCount + 1 -- only count brand new indices; overwriting an existing entry doesn't grow the cache
    end
    self.cacheData[index] = data -- set the index in the table to the data you give it
end

function cache:HasDataAtIndex(index)
    assert(index, "ERR (Cache) : Nil index argument")
    return self.cacheData[index] ~= nil
end

function cache:GetDataAtIndex(index)
    assert(index, "ERR (Cache) : Nil index argument")
    return self.cacheData[index]
end

return cache
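Assuming you save the module above as a ModuleScript (I’m calling it Cache here, which is my own naming choice), using it from another script could look something like this:

local cache = require(script.Parent.Cache) -- path assumes a ModuleScript named "Cache" next to this script

local thumbnailCache = cache.New("Thumbnails", "string", 100)

thumbnailCache:ConnectOverflow(function()
    print("thumbnail cache was cleared, repopulate it here if you need to")
end)

thumbnailCache:AppendData(55605782, "rbxthumb://example") -- placeholder value just for the demo

if thumbnailCache:HasDataAtIndex(55605782) then
    print(thumbnailCache:GetDataAtIndex(55605782))
end

The overflow callback gives you a hook to warm the cache back up after it clears itself, which is handy if the data it holds is cheap to rebuild in bulk.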
Here are some other caching systems which have been made by members of the community:
- Kache by @lewisakura:
  - Source: The Kache module. · GitHub
  - Article: Kache - it's a caching system!
Why I wrote this article:
- I know people might say that there is already information about caching on the forum; however, it’s scattered across many posts and easy to miss unless you know what you’re looking for, so I decided to centralize information about caching into one article. If you have any feedback about this article, please reply here or message me on the forums; I am definitely open to constructive criticism. I also recommend checking out the caching systems linked above. I hope this article helped you learn about caching or furthered your understanding of it.
Thank you for reading and have a good day!