MemoryStoreService limits are an inherently flawed system that hurts developers

The usage limit is global and scales by daily concurrent players.

The maximum expiration time is 45 days (3,888,000 seconds).

When users leave the experience, the quota doesn’t reduce immediately. There’s a grace period of 24 hours before the quota reevaluates to a lower value.
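For context, the expiration is set per write, and nothing stops you from requesting the full 45 days. A minimal Luau sketch (the map name and values are just illustrative):

```lua
local MemoryStoreService = game:GetService("MemoryStoreService")

-- Any sorted map write takes an expiration in seconds, up to the 45-day maximum.
local map = MemoryStoreService:GetSortedMap("ExampleMap")

local FORTY_FIVE_DAYS = 45 * 24 * 60 * 60 -- 3,888,000 seconds, the documented maximum

-- This entry counts against the experience's global quota until it expires,
-- even if the player who generated it left long ago.
local ok, err = pcall(function()
	map:SetAsync("SomeKey", { coins = 100 }, FORTY_FIVE_DAYS)
end)
if not ok then
	warn("MemoryStore write failed:", err)
end
```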

The problem here is not immediately obvious, so developers are likely to run headfirst into a brick wall they didn’t see coming. Allow me to walk you through a scenario.

Builderman wants to make a game system that utilizes MemoryStoreService. The expiration max is 45 days, so to play it very safe Builderman sets his data to expire in just 3 days, using less than 7% of the allowed time. Very cautious and mindful, this Builderman! Well, his game is released and everything is running smoothly. Great! Then, Builderman runs an event. A large influx of players join, and they do activities that create some MemoryStoreService usage. No problems so far. Then the event ends, and the concurrent count drops back down to normal. One day later, the limit drops dramatically. Now, Builderman is way over the limit! All requests to MemoryStoreService are now failing and his game is suffering from an outage. He did nothing wrong, but the service ruined his game anyway, punishing him for having a spike in players.
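To put rough numbers on that (the exact quota formula is Roblox's to define; I'm assuming the base-plus-per-user shape the docs describe, and the player counts below are made up):

```lua
-- Rough illustration only. Assumes a quota of the form base + perUser * concurrents
-- (the docs describe roughly 64 KB plus 1 KB per concurrent user);
-- the exact numbers are Roblox's and may change.
local BASE_KB, PER_USER_KB = 64, 1

local eventConcurrents = 10_000  -- during Builderman's event
local normalConcurrents = 500    -- a day after the event ends

local eventQuotaKB = BASE_KB + PER_USER_KB * eventConcurrents   -- 10,064 KB
local normalQuotaKB = BASE_KB + PER_USER_KB * normalConcurrents -- 564 KB

-- If the event left behind even ~2,000 KB of 3-day data, Builderman was comfortably
-- inside the 10,064 KB quota, but is now ~3.5x over the 564 KB quota,
-- and that data won't expire for two more days.
```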

Oh, you say to yourself, that’s not a likely situation. Well, it happens every single weekend! Player counts are high on Saturday, leaving behind a lot of MemoryStoreService usage, and by Monday that limit is now lower again and the data is still there eating up quota.

Why is the limit reduced 24 hours after a player leaves if data from that player is permitted to stick around for 1,080 hours??? That is a disaster waiting to happen. The limit declines after 24 hours, making it extremely easy to go over the limit after a player surge.

The limit should decline after 45 days instead, so that it is impossible for a player spike to create data under the higher limit that then outlives that limit once it drops.

Issue Area: Engine
Issue Type: Other
Impact: High

31 Likes

Hi, thanks for sharing your feedback about the Memory Stores limit. We understand that the player count, and thus your quota, fluctuates. Could you explain more about your use case for the service? What features do you want to build with it? If the player data is not needed anymore after they leave the game, would it be possible to set a shorter expiration time, i.e. less than 24 hours?

1 Like

Why can developers set a 45-day expiration if you do not actually support it? What a weird footgun.

Either way, the use cases are straight up listed by Roblox in the documentation for MemoryStores, right at the very top.

Global leaderboards - Store and update user rankings on a shared leaderboard inside a map with key-value pairs.

A leaderboard should be able to hold data for longer than 24 hours; what good is a leaderboard that resets every day?
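A minimal sketch of the kind of leaderboard the docs are describing, using a sorted map (names, fields, and the 7-day window are illustrative, and it assumes the sorted map's sortKey parameter for ordering by score):

```lua
local MemoryStoreService = game:GetService("MemoryStoreService")
local leaderboard = MemoryStoreService:GetSortedMap("WeeklyLeaderboard")

local SEVEN_DAYS = 7 * 24 * 60 * 60

-- Record a player's score; every entry written here consumes global quota
-- for the full seven days, regardless of how many players are online later.
local function submitScore(userId: number, score: number)
	local ok, err = pcall(function()
		-- The score is passed as the sort key so range queries come back ordered by score.
		leaderboard:SetAsync(tostring(userId), score, SEVEN_DAYS, score)
	end)
	if not ok then
		warn("Failed to submit score:", err)
	end
end

-- Read the top 50 entries for display.
local function getTopScores()
	local ok, result = pcall(function()
		return leaderboard:GetRangeAsync(Enum.SortDirection.Descending, 50)
	end)
	return ok and result or {}
end
```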

Auction houses - A global marketplace where users from all servers list and bid for available goods. Store marketplace data inside a map as key-value pairs.

Being able to keep an auction up for more than 24 hours is vital for a good trading house. eBay, for example, allows listings to remain live and accept bids for up to 10 days.
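For the auction-house case, a hedged sketch of what a multi-day listing might look like (map name, item shape, and the 10-day duration are illustrative, echoing the eBay comparison above):

```lua
local MemoryStoreService = game:GetService("MemoryStoreService")
local listings = MemoryStoreService:GetSortedMap("AuctionListings")

local TEN_DAYS = 10 * 24 * 60 * 60 -- well under the 45-day maximum, well over a 24-hour "safe" window

-- A listing has to survive its full duration even if the seller logs off,
-- which is exactly why a quota tied to *current* concurrents is a problem here.
local function createListing(listingId: string, sellerId: number, itemId: number, startingBid: number)
	local ok, err = pcall(function()
		listings:SetAsync(listingId, {
			seller = sellerId,
			item = itemId,
			highestBid = startingBid,
		}, TEN_DAYS)
	end)
	return ok, err
end

-- Place a bid atomically with UpdateAsync so two bidders can't overwrite each other.
local function placeBid(listingId: string, bidderId: number, amount: number)
	local ok, err = pcall(function()
		listings:UpdateAsync(listingId, function(listing)
			if listing and amount > listing.highestBid then
				listing.highestBid = amount
				listing.highestBidder = bidderId
				return listing
			end
			return nil -- abort the update if outbid or the listing is gone
		end, TEN_DAYS)
	end)
	return ok, err
end
```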

13 Likes

Hi boatbomber,
Thanks for providing these details on how you are using Memory Stores and its pitfalls. Do you think the quota dropping so soon after the player count drops is a big problem for many devs who might consider using Memory Stores? We are discussing this as a team and trying to figure out how best to improve it. Please keep the feedback coming!

When I've personally used Memory Stores, I was not aware of this pitfall and would have ended up learning about it the hard way. It is a very big problem that this exists, because developers like myself would run into this roadblock without understanding why they can't reliably keep data around for longer than 24 hours across player spikes. I would be extremely annoyed running into this rate limit, especially since whenever I use Memory Stores I tend to use them for important systems that need to work reliably.

1 Like

I believe, as my original post states, that this is a hidden footgun that developers blindly run into.

3 Likes

As the developer who asked for more than 30 days of memory data retention: leaderboards are one of the primary use cases I've had for memory stores, and they were one of my strongest motivations for asking for an extension on expiry. This is supported by the documentation as well, which, as pointed out earlier, suggests leaderboards as a good use for this feature.

I also use memory stores for matchmaking data. I typically remove matchmaking memory data almost immediately after creating an instance and teleporting visitors, but my team wants to build a more robust matchmaking queue that will require longer data retention (“longer” meaning I may move away from removing memory data immediately after confirming players have arrived) and a budget to accommodate it. This means more global quota consumed, and disaster during player spikes (which for us often means updates). We've even had to estimate how many concurrents we can have before systems start failing, and that's not counting future use cases of memory stores.
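Roughly, the current pattern looks like this (a sketch with illustrative names, using a MemoryStoreQueue; the "more robust" version would delay or drop the RemoveAsync step, which is exactly where quota consumption grows):

```lua
local MemoryStoreService = game:GetService("MemoryStoreService")
local matchQueue = MemoryStoreService:GetQueue("MatchmakingQueue")

local FIVE_MINUTES = 5 * 60

-- Enqueue a player looking for a match; the entry is small and short-lived today.
local function enqueuePlayer(userId: number)
	pcall(function()
		matchQueue:AddAsync({ userId = userId }, FIVE_MINUTES)
	end)
end

-- Matchmaker loop: read a batch, hand the players off to a reserved server,
-- then remove the batch so it stops counting against quota right away.
local function processBatch()
	local ok, items, readId = pcall(function()
		return matchQueue:ReadAsync(8, false, 30)
	end)
	if ok and items and #items > 0 then
		-- ... reserve a server and teleport the players here ...
		matchQueue:RemoveAsync(readId) -- today: removed almost immediately after teleport
	end
end
```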

I was not even aware of this brick wall until I happened upon this thread by chance, and that worries me. Prior to the implementation of more aggressive limits, the service would frequently experience outages, and during maintenance our stores would be wiped out entirely. Just recently, a team member let me know that we could brick our systems by simply… having too many players?

I can’t even begin to imagine how this would impact heavier memory store usage if a low-concurrents experience like mine has to be wary about memory store usage for a few small features and can't expand outwards because of it. MemoryStoreService is highly valued for its high throughput and ephemeral nature. Most of the features I want to build involving saved data fall into that category, rather than the low-throughput permanence that DataStores offer.

4 Likes