Kache - it's a caching system!


Kache is a simple* caching system. The tiny module is designed to require very little effort from the developer.

*Simple if you want it to be. You can make shared caches or cross-server caches!

Well, let’s say you run a game that uses group data quite often. You don’t want to call the Roblox API constantly; that takes time and can delay script execution. Instead, you fetch the data once and cache it temporarily, only refetching it once it expires. This means fewer hits to the Roblox API and faster script execution, in exchange for data possibly going stale while it sits in the cache (although a shorter TTL mitigates this).
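For instance, a hedged sketch of that group-data use case might look like this (the module path and group ID are placeholders; it leans on the `:Get` default/generator behaviour demonstrated further down):

```lua
-- Sketch only: cache each player's group rank for 5 minutes so
-- repeated permission checks don't hit the Roblox group API each time.
local Kache = require(script.Parent.Kache) -- path is a placeholder

local GROUP_ID = 1234567 -- placeholder group ID
local rankCache = Kache.new(5 * 60) -- default TTL of 5 minutes

local function getRank(player)
    -- The generator only runs on a cache miss; `true` commits the
    -- fetched value to the cache with the default TTL.
    return rankCache:Get(player.UserId, function()
        return player:GetRankInGroupAsync(GROUP_ID)
    end, true)
end
```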

Here’s a detailed example showing ways of using Kache:

local cache = Kache.new()

-- With no default TTL, Kache acts as a regular dictionary:
cache.test = "whatever!"
print(cache.test) -- whatever!
cache.test = nil

-- But you can create more complex behaviour:
cache:Set("test", "whatever!", 10)
print(cache.test) -- whatever!
wait(10)
print(cache.test) -- nil

-- You can even skip out on :Set() completely with default values:
print(cache:Get("test", "whatever!")) -- whatever!
print(cache.test) -- nil

-- And even commit default values to cache if so desired (which uses default TTL):
print(cache:Get("test", "whatever!", true)) -- whatever!
print(cache.test) -- whatever!

-- You can even do function calls!
local userId = cache:Get("Lewis_Schumer_userId", function()
    return game.Players:GetUserIdFromNameAsync("Lewis_Schumer")
end, true)
print(userId) -- 25704749
print(cache.Lewis_Schumer_userId) -- 25704749

-- Lets create a cache with a default TTL of 10 seconds:
local expiringCache = Kache.new(10)

-- And now we have created a dictionary that automatically expires:
expiringCache.test = "wew!"
print(expiringCache.test) -- wew!
wait(10)
print(expiringCache.test) -- nil

-- At any point, you can override the TTL using :Set:
expiringCache:Set("test", "wew!", 15)
print(expiringCache.test) -- wew!
wait(10)
print(expiringCache.test) -- wew!
wait(5)
print(expiringCache.test) -- nil

-- Once again, using :Get, you can set default values, but now they expire:
print(expiringCache:Get("test", "wew!", true)) -- wew!
print(expiringCache.test) -- wew!
wait(10)
print(expiringCache.test) -- nil

-- If you don't like the dictionary style, you can use the function calls instead:
cache:Set("test", 123)
print(cache:Get("test")) -- 123

-- Also, Kache has a built in shared cache system which works cross script:
print(Kache.shared("test") == Kache.shared("test")) -- true

-- It even works cross-server!
-- However, it only works correctly with :Get() (metamethods can't yield), so it's advised to use :Set() too.
local csCache = Kache.crossServer("test")
csCache:Set("test", true)

-- ...then on another server (or upon a server restart)
print(csCache:Get("test")) -- true

-- And the cross server caches can have a default TTL (or override TTLs) as usual, much like a shared cache.
csCache:Set("test", true, 10)
print(csCache:Get("test")) -- true
wait(10)
print(csCache:Get("test")) -- nil

Here are the three constructors on the Kache module itself:

Kache.new(defaultTTL?, passiveExpiry?) -- creates and returns a cache
Kache.shared(name, defaultTTL?, passiveExpiry?) -- gets a shared cache by name, or creates it if necessary
Kache.crossServer(name, defaultTTL?) -- gets a cross server cache by name, or creates it if necessary

And the methods you can use on the cache instances themselves:

Cache:Set(key, value, ttl?) -- add an item to the cache
    Cache[key] = value -- is equivalent to Cache:Set(key, value)
Cache:Unset(key) -- unset a key (convenience function, mainly)
    Cache[key] = nil -- is equivalent to Cache:Unset(key), NOT Cache:Set(key, nil)
Cache:Get(key, default?, persistDefault?) -- get an item from the cache, if present, or return the default
    Cache[key] -- is equivalent to Cache:Get(key)
Cache:Clear() -- clears the cache
Cache:Clean() -- cleans the cache of expired entries
Cache:Count() -- counts how many items (expired or unexpired) are in the cache
Cache:Connect(callback) -- connects to the cache's internal event handler (see Kache.Enum.Event for event types)
Cache:Wait() -- yields until an event occurs
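As a quick sketch of the event API above (the exact enum members and callback signature aren't shown here, so treat the `event, key, value` arguments as assumptions and check Kache.Enum.Event plus the module's comments for the real shape):

```lua
-- Sketch only: the callback parameters are assumptions; see
-- Kache.Enum.Event and the module's comments for the real signature.
local cache = Kache.new(10)

cache:Connect(function(event, key, value)
    print("cache event:", event, key, value)
end)

cache:Set("hello", "world") -- the listener above should fire for this
```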

The module contains comments describing what each method does, its parameters, and what it returns in detail.

Notes

  1. Kache implements the passive expiry system that Redis uses. Every second, it takes 20 random keys and expires any that are due. If over 25% of the sampled keys were expired, it reruns the sample until fewer than 25% expire or all keys have expired. This is reasonably efficient, and it lets Kache gauge how much of your data has expired.
  2. If you need a cache but don’t access its keys often enough (or have many unique keys) and you disable passive expiry, you may cause a memory leak. With passive expiry disabled, Kache only expires keys upon access (similar to Redis), since it expects you to touch keys often enough for that to happen. In that case, you may want to call Cache:Clean() on a loop (with a delay appropriate to your TTLs).
  3. Cross-server caches are a BETA feature at present. Please report any bugs in this thread.
  4. Cross-server caches do not work correctly with the __index metamethod (as stated in the example). You will receive metamethod/C-call boundary warnings saying that Kache failed to retrieve information if you do.
  5. Cross-server caches are backed by a datastore so writing/reading quickly may cause datastore requests to be queued.
  6. Usually, cross-server caches only hold a portion of the overall cached data rather than all of it at once. This means :Clean may not function as expected: it will only clean the local copy’s data (and then commit it to the datastore). That said, I wouldn’t recommend cleaning a cross-server cache due to the number of requests it can potentially make, especially on large caches.
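As a rough illustration of the sampling loop from note 1 (a simplified sketch, not Kache's actual internals; the `store` table and `expiresAt` field are invented for the example):

```lua
-- Sketch of Redis-style passive expiry. `store` maps keys to entries
-- of the form { value = ..., expiresAt = <clock time or nil> };
-- these names are assumptions, not Kache's real internals.
local SAMPLE_SIZE = 20
local RERUN_THRESHOLD = 0.25

local store = {}

local function passiveExpiryPass()
    local keys = {}
    for key in pairs(store) do
        table.insert(keys, key)
    end
    if #keys == 0 then
        return 0
    end

    -- Sample up to 20 random keys and drop any that are past expiry.
    local expired = 0
    local sampled = math.min(SAMPLE_SIZE, #keys)
    for _ = 1, sampled do
        local key = keys[math.random(#keys)]
        local entry = store[key]
        if entry and entry.expiresAt and os.clock() >= entry.expiresAt then
            store[key] = nil
            expired = expired + 1
        end
    end
    return expired / sampled
end

-- Every second, run a pass; if over 25% of the sample was expired,
-- immediately run another pass before waiting again.
while true do
    repeat until passiveExpiryPass() < RERUN_THRESHOLD
    wait(1)
end
```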

Grab the latest release from GitHub: Release 1.0.0 "To Your Eternity" · lewisakura/Kache · GitHub


This is great! Thank you so much!


So why should we use this module over just making a module that holds the data anyway? I just don’t see why I or anyone would use this, as it seems rather stupid in my opinion. Also, you should remove the data after the time expires instead of checking whether it’s still there when it’s indexed again; that’s just a massive memory leak for the people that “would” use this.


No one is forcing you to use it. I just wanted to release it to the community in case people wanted to. I sure found it useful. I gave a use case in the post, and my personal use case is temporarily caching group data to prevent hitting the API constantly (which is, in fact, what I based the example use case on). Regardless, it’s purely just a convenience module to prevent you from reimplementing similar systems over and over.

Let’s take an example of a popular in-memory data store: Redis. Redis doesn’t actively delete every key on a timer, which prevents thread buildup; instead, it removes expired data when it is accessed. It’s just more efficient. Yes, it means expired data is held until it is accessed, but if you aren’t accessing the data frequently enough that it builds up massively and causes a severe memory leak, a caching system probably isn’t the best fit for whatever you’re doing.

Despite this, I will most likely add an active expiry mode that runs every second, enumerates the cache, and destroys expired keys.


If you don’t want it then don’t use it.


wow thanks for this! i will definitely use this in my future projects

This could be a cool shared library, but I mostly don’t see the point. It’s cool if you don’t know caching and stuff and wanna make a module, but I just don’t see it.

Since you said it only deletes expired data when accessed, I think a function like Cleanup() or Optimize() would be cool.


Hey everyone, it’s been a while since Kache was updated but I have finally released the update containing some new goodies.

  1. I’ve redone the module to use a class system similar to how the Nevermore Engine’s classes are done. This is mainly for convenience.
  2. You can now use Kache like a dictionary! This reduces the number of ugly :Get and :Set calls you may have needed to make before, especially with a default TTL. Now your code can look pretty with this module!
  3. Kache’s :Get now supports default values, and values that are generated by function calls. You can optionally commit these to cache too.
  4. :Clean has been implemented for those who need to optimise their caches.
  5. The other things I’ve done are minor changes (generally related to #1).

I hope you get some use out of Kache! The Gist has already been updated.

Oh, also, I redid the thread with these banners I made a while back. Tell me how they look!

Fixed a bug with Kache.shared not returning the created cache on the first call. Minor bugfix so no fancy banner.

Two updates in one day, and within a couple hours of each other!

This update introduces cross-server caches as a beta feature. These are caches backed by datastores which allow you to cache data… across servers!

There are a couple issues with the current system that I’m aware of:

  • This makes a lot of datastore requests, but you should’ve expected that. However, if you write to a cross-server cache often enough, it can cause datastore requests to queue, so have some restraint.
  • Caches can fall out of sync if two servers set data. Since a cross-server cache keeps a local copy of the data (for performance and to reduce datastore requests), a write only updates that server’s local copy while also updating the datastore. New servers will receive the newer information, but the two existing servers won’t sync up. I have a plan to use MessagingService to force-update local copies when a server writes to a cross-server cache.
  • Have I already said to have some restraint with cross-server caches? I have? Good.
  • You can’t access data using cache.key as this causes a metamethod/C-call boundary error. There’s nothing I can do to fix this, it’s a limitation of metamethods.
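For reference, the MessagingService plan mentioned above could be sketched roughly like this (entirely hypothetical: the topic name, payload shape, and `localCopy` table are invented, and none of this is implemented in Kache yet):

```lua
-- Hypothetical sketch of cross-server sync; not part of Kache today.
local MessagingService = game:GetService("MessagingService")

local TOPIC = "KacheSync_test" -- invented topic name
local localCopy = {} -- stands in for the cache's local copy

-- Every server listens for writes made elsewhere.
MessagingService:SubscribeAsync(TOPIC, function(message)
    local data = message.Data -- e.g. { key = ..., value = ... }
    localCopy[data.key] = data.value
end)

-- A write updates the local copy, commits to the datastore,
-- then notifies the other servers.
local function setAndBroadcast(key, value)
    localCopy[key] = value
    -- ...datastore commit would happen here...
    MessagingService:PublishAsync(TOPIC, { key = key, value = value })
end
```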

You can create them by using Kache.crossServer with the same API as a shared cache.

As this feature is in an early beta, please provide feedback and bug reports. Of course, provide general feedback and bug reports too!

I understand the purpose of this, but to my understanding it would just be a root cause of memory leaks in a project. For this to be of any use, I would want to see a forced garbage-cleanup system.

I’m confused, can you elaborate?

:Clean exists to provide a way for the developer to remove any expired keys whenever they want. The point of the Redis-style passive expiry system is to not add any extra overhead with running loops. It provides more control to the developer to clean up the cache when they want. Are you looking for Kache to also be able to do this by default without any extra implementation?

@iGottic I’ve implemented a passive expiry in the latest Gist; can you give it a go and see what you think? Since it’s based on the Redis algorithm, you need more than 20 items for it to be effective. :Count() is now available so you can check how many items are in the cache (expired or unexpired), which should be a good gauge of whether it’s leaking. In theory, if you had over 1000 items with a TTL of 1, you should be left with 19 or fewer by the end of the passive expiry cycle.

I think this is what you wanted. Try this:

local cache = Kache.new(3)

for i = 1, 1000 do
    cache[i] = true
end

print(cache:Count()) -- 1000
wait(3)
print(cache:Count()) -- 0

Oop, I didn’t see this. Thanks for the heads up then!

Kache 1.0.0 “To Your Eternity”

Kache will now start finally following a proper versioning system (specifically SemVer)! All old versions of Kache will be retroactively labeled 0.0.0 to indicate their unfinished nature.

What’s new?

  • :Connect and :Wait methods are now exposed for listening to Kache events.
    • Check out Kache.Enum.Event to see what events you can listen to.
  • Redis-style passive expiry added to reduce memory usage.
  • Unit tests added.
  • Using Rojo and Roblox LSP for future development.
  • Kache will now follow SemVer with different names for new major versions.

Kache has been moved from a GitHub Gist to a proper GitHub repository for future development, mainly for my own convenience.

Thank you for your continued support. Please suggest new features using the Issues tab or commenting on the DevForum post.


What do you mean? It already can be used with a datastore:

local DataStoreService = game:GetService("DataStoreService")
local dataStore = DataStoreService:GetDataStore("testStore")

local datastoreCache = Kache.new(10 * 60)

function GetFromDataStore(key)
    return datastoreCache:Get(key, function()
        return dataStore:GetAsync(key)
    end, true)
end

print(GetFromDataStore("someKey")) -- yields
print(GetFromDataStore("someKey")) -- cached