Kache is a simple* caching system. This tiny module is designed to work with very little effort from the developer.
*Simple if you want it to be. You can make shared caches or cross-server caches!
Well, let's say you run a game that uses group data quite often. You don’t want to constantly call the Roblox API; that takes time and can cause delays in script execution. Instead, you fetch the data once and cache it temporarily, only refetching it once it expires. This means fewer hits to the Roblox API and faster script execution, in exchange for the data possibly going out of date whilst it sits in the cache (although this can be mitigated with a shorter TTL).
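For instance, here’s a rough sketch of that pattern (the group ID 1234, the key name, and the 60-second TTL are all made up for illustration; adjust them for your game):
local groupCache = Kache.new(60) -- default TTL of 60 seconds
local function getRole(player)
    -- Only calls the Roblox API when the cached role is missing or has expired
    return groupCache:Get("role_" .. player.UserId, function()
        return player:GetRoleInGroup(1234)
    end, true)
end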
Here’s a detailed example showing ways of using Kache:
local cache = Kache.new()
-- With no default TTL, Kache acts as a regular dictionary:
cache.test = "whatever!"
print(cache.test) -- whatever!
cache.test = nil
-- But you can create more complex behaviour:
cache:Set("test", "whatever!", 10)
print(cache.test) -- whatever!
wait(10)
print(cache.test) -- nil
-- You can even skip out on :Set() completely with default values:
print(cache:Get("test", "whatever!")) -- whatever!
print(cache.test) -- nil
-- And even commit default values to cache if so desired (which uses default TTL):
print(cache:Get("test", "whatever!", true)) -- whatever!
print(cache.test) -- whatever!
-- You can even do function calls!
local userId = cache:Get("Lewis_Schumer_userId", function()
return game.Players:GetUserIdFromNameAsync("Lewis_Schumer")
end, true)
print(userId) -- 25704749
print(cache.Lewis_Schumer_userId) -- 25704749
-- Let's create a cache with a default TTL of 10 seconds:
local expiringCache = Kache.new(10)
-- And now we have created a dictionary that automatically expires:
expiringCache.test = "wew!"
print(expiringCache.test) -- wew!
wait(10)
print(expiringCache.test) -- nil
-- At any point, you can override the TTL using :Set:
expiringCache:Set("test", "wew!", 15)
print(expiringCache.test) -- wew!
wait(10)
print(expiringCache.test) -- wew!
wait(5)
print(expiringCache.test) -- nil
-- Once again, using :Get, you can set default values, but now they expire:
print(expiringCache:Get("test", "wew!", true)) -- wew!
print(expiringCache.test) -- wew!
wait(10)
print(expiringCache.test) -- nil
-- If you don't like the dictionary style, you can use the function calls instead:
cache:Set("test", 123)
print(cache:Get("test")) -- 123
-- Also, Kache has a built-in shared cache system which works cross-script:
print(Kache.shared("test") == Kache.shared("test")) -- true
-- It even works cross-server!
-- Although it only works correctly with :Get() (metamethods can't yield), so it's advised to use :Set() too.
local csCache = Kache.crossServer("test")
csCache:Set("test", true)
-- ...then on another server (or upon a server restart)
print(csCache:Get("test")) -- true
-- And the cross server caches can have a default TTL (or override TTLs) as usual, much like a shared cache.
csCache:Set("test", true, 10)
print(csCache:Get("test")) -- true
wait(10)
print(csCache:Get("test")) -- nil
Here are the three methods on the Kache module itself:
Kache.new(defaultTTL?, passiveExpiry?) -- creates and returns a cache
Kache.shared(name, defaultTTL?, passiveExpiry?) -- gets a shared cache by name, or creates it if necessary
Kache.crossServer(name, defaultTTL?) -- gets a cross server cache by name, or creates it if necessary
And the methods you can use on the cache instances themselves:
Cache:Set(key, value, ttl?) -- add an item to the cache
Cache[key] = value -- is equivalent to Cache:Set(key, value)
Cache:Unset(key) -- unset a key (convenience function, mainly)
Cache[key] = nil -- is equivalent to Cache:Unset(key), NOT Cache:Set(key, nil)
Cache:Get(key, default?, persistDefault?) -- get an item from the cache, if present, or return the default
Cache[key] -- is equivalent to Cache:Get(key)
Cache:Clear() -- clears the cache
Cache:Clean() -- cleans the cache of expired entries
Cache:Count() -- counts how many items (expired or unexpired) are in the cache
Cache:Connect(callback) -- connects to the cache's internal event handler (see Kache.Enum.Event for event types)
Cache:Wait() -- yields until an event occurs
The module contains comments describing what each method does, its parameters, and what it returns in detail.
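The detailed example above doesn’t touch :Count, :Clean, or :Clear, so here’s a short sketch of those (whether the first count is 2 or 1 depends on whether the passive expiry sweep has already dropped the expired entry by then):
local stats = Kache.new(5) -- default TTL of 5 seconds
stats:Set("short", "a") -- uses the default TTL
stats:Set("long", "b", 60) -- overrides the TTL
wait(6)
print(stats:Count()) -- 2 (or 1 if the passive sweep already removed "short")
stats:Clean() -- explicitly drop expired entries
print(stats:Count()) -- 1
stats:Clear() -- empty the cache entirely
print(stats:Count()) -- 0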
Notes
- Kache implements the passive expiry system that Redis uses. Every second, it takes 20 random keys and expires any that have passed their TTL. If over 25% of those keys were expired, it reruns the loop until fewer than 25% expire or all keys have been expired. This is pretty efficient and is written like this so Kache can gauge how much of your data has expired.
- If you need a cache but don’t access its keys often enough (or have many unique keys) and you disable passive expiry, you may cause a memory leak. Kache only expires keys upon access (similar to Redis), since it expects you to use the keys often enough for this to happen. You may want to call Cache:Clean() on a loop (with delays depending on your TTLs); see the sketch after these notes.
- Cross-server caches are a BETA feature at present. Please report any bugs in this thread.
- Cross-server caches do not work correctly with the __index metamethod (as stated in the example). If you use it, you will receive metamethod/C-call boundary warnings saying that Kache failed to retrieve information.
- Cross-server caches are backed by a datastore, so writing/reading quickly may cause datastore requests to be queued.
- Usually, a cross-server cache only contains a portion of the overall cached data rather than all of it at once. This means :Clean may not function as expected and will only clean up the local copy’s data (and then commit it to the datastore). That said, I wouldn’t recommend cleaning a cross-server cache anyway, due to the number of requests it has the potential to make, especially on large caches.
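As mentioned above, if you disable passive expiry (or rarely touch your keys), you can clean the cache on a loop yourself. A minimal sketch; the 30-second interval is arbitrary, so pick one that suits your TTLs:
local bigCache = Kache.new(60)
spawn(function()
    while true do
        wait(30)
        bigCache:Clean() -- drop any expired entries
    end
end)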
Grab the latest release from GitHub: Release 1.0.0 "To Your Eternity" · lewisakura/Kache · GitHub