Is MemoryStoreService synchronous or asynchronous? I already have a global matchmaking system built on HttpService that relies on my external server handling requests synchronously, so if it is synchronous I can make the switch from HttpService to MemoryStoreService quite quickly.
We have been trialling storing the keys like this, noting the use of 0 for padding:
0000127_Player123456
Which represents a score of 127 for Player 123456.
When we store the current score, we cache the key value we used. Then, we have a loop that runs every N seconds and checks if the score for player 123456 has changed. If so, we delete the old key and replace it with a new key, like: 0001024_Player123456
So when you sort the keys in descending order, you get the highest values first, and each leaderboard should have one entry per user.
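In code, the pattern looks roughly like this (a minimal sketch; the map name, pad width, and expiration are placeholders rather than our real values):

    local MemoryStoreService = game:GetService("MemoryStoreService")

    -- Illustrative names only
    local leaderboard = MemoryStoreService:GetSortedMap("GlobalLeaderboard")
    local PAD_WIDTH = 7
    local EXPIRATION = 3600 -- seconds

    -- Cache of the last key written for each player
    local lastKey = {}

    local function buildKey(userId, score)
        -- e.g. score 127 -> "0000127_Player123456"
        return string.format("%0" .. PAD_WIDTH .. "d_Player%d", score, userId)
    end

    local function updateScore(userId, score)
        local newKey = buildKey(userId, score)
        local oldKey = lastKey[userId]
        if oldKey == newKey then
            return -- score unchanged, nothing to do
        end
        if oldKey then
            leaderboard:RemoveAsync(oldKey) -- drop the stale entry
        end
        leaderboard:SetAsync(newKey, userId, EXPIRATION)
        lastKey[userId] = newKey
    end

    -- Reading the board: descending order puts the highest padded scores first
    local top = leaderboard:GetRangeAsync(Enum.SortDirection.Descending, 100)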
About to release this to prod today, so hopefully it all works
Can we please have a way to toggle this to production in Studio? DataStores allow this, which is super helpful for debugging issues. Being able to communicate with production data will allow us to create tools that help developers see into the data more easily than trying to write code in a finicky UI within a game server.
Thanks for the detailed response. The use case makes lots of sense. I’m wondering why this can’t be done with just HttpService plus the local game server’s memory. What’s the value of accessing data in MemoryStore?
It absolutely can be done this way (although there isn’t any way to directly communicate with RCC instances from the outside world - all network traffic must be initiated by the Roblox server, and it only supports RESTful methods, which means we’re limited to long polling).
My view, though, is that if it isn’t practical or feasible to facilitate communication with specific game instances, a better approach may be to have externally facing APIs that let us interact with the same global data & memory stores our servers can talk to, which would still enable useful applications!
Example: let’s say a popular front-page Game A is hosting a live concert in collaboration with an artist, and the best tickets for the performance are sold via an in-game bidding system powered by memory stores. The advertisers for the band would like to embed live ticket bidding information on their website.
Without direct external access to the memory stores, the game server(s) would have to either send their latest bid information via HTTP request (which would be expensive if we had many servers!), or elect amongst themselves which server sends this information (which is hard).
With direct external access to the memory stores, the game servers can handle their bidding and purchase flows while the advertisers’ site queries the memory store at its own pace without having to proxy through active game servers.
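On the game-server side that flow is cheap either way; here’s a minimal sketch of what publishing the latest bid into a sorted map might look like (the map name, key scheme, and expiration are made up for illustration, and the externally facing API that would read it doesn’t exist yet):

    local MemoryStoreService = game:GetService("MemoryStoreService")

    -- Hypothetical map holding the current top bid per ticket
    local bids = MemoryStoreService:GetSortedMap("ConcertTicketBids")

    local function recordBid(ticketId, userId, amount)
        -- Atomically keep only the highest bid for this ticket
        bids:UpdateAsync(ticketId, function(old)
            if old ~= nil and old.amount >= amount then
                return nil -- cancel the update; the existing bid is higher
            end
            return { userId = userId, amount = amount }
        end, 3600)
    end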
Yep! Both in this way (long polling eats up nearly the entire HttpService budget pretty easily), and also the fact that most of the big cloud companies charge heavily for these kinds of requests (particularly when you have a big response).
Memory stores are going to help a ton when doing matchmaking, but I find that some functionality to help us control this is missing, for two reasons. I made feature requests for both, but the use case I have for them is closely connected, so I think it’s worth detailing here:
There’s no way to tell if a player in queue should still be in queue
You can only tell if the server processing the player happens to be the same server the player is/was in, which is very unlikely
The only way to remove items from the queue is through the string identifier returned by ReadAsync, but due to the limits it would be impossible to read items one by one just to get an identifier for removal
Even if not for the limits, you’d have to make items go invisible to achieve this, which is incredibly inefficient (illustrated in the sketch below).
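To illustrate why, this is roughly the only removal path the current API offers (standard MemoryStoreQueue calls, nothing custom):

    local MemoryStoreService = game:GetService("MemoryStoreService")
    local queue = MemoryStoreService:GetQueue("MatchmakingQueue")

    -- ReadAsync hands back a batch of items plus one id for the whole batch;
    -- RemoveAsync only accepts that id, so there is no way to target a single item.
    local items, readId = queue:ReadAsync(10, false, 5)
    if items ~= nil and #items > 0 then
        -- Everything read becomes invisible until the invisibility timeout expires,
        -- and the only options are to remove the whole batch or let it reappear.
        queue:RemoveAsync(readId)
    end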
Destination servers have no way to see which players are still on their way over, which have left the game, cancelled joining, or failed to teleport for whatever reason.
Being able to see from the destination server which players are still mid-teleport would let us wait for everyone to arrive without having to assume that players who are no longer teleporting are still on their way.
In a case where a player is no longer on their way, we currently can’t see that immediately and react to it by pulling a replacement player from the queue. This keeps our players waiting longer than necessary.
In a case where a player is still on their way but taking far too long to arrive, we currently can’t know whether they are still coming. This isn’t as critical, since we could pull a replacement player and, should the original player arrive, send them back to the lobby. But that’s less ideal than simply being able to cancel their teleport from the destination server.
Overall, we need more control over our queues and over our teleportation traffic. Without these, matchmaking will end up being slower and players will end up in cases where they must wait longer before playing.
Some of my thoughts on this. Currently it’s a little weird to work with.
There’s no way to clear all memory easily, short of reading the entire queue and then removing everything with the returned identifier, which doesn’t really make sense.
The given example of matchmaking isn’t possible with MemoryQueue. (I’m working on a MatchmakingService right now and ran into this problem; I plan on making it open source and posting it in #resources:community-resources when it’s done, but I don’t have a ton of time to work on it right now due to college.)

It’s not possible because we can’t dequeue a specific player unless they’re at the front of the queue, which is not something you can guarantee. We would need a MemoryQueue:RemoveAsync(Variant item) that removes a specific item from the queue regardless of its position. Otherwise, what happens if a player disconnects while they’re queued? Their id just sits in the queue with no way to remove it until it actually comes up in the queue.

I switched to using a sorted map holding a table of all user ids that are queued in a specific skill group. I spent a long time trying to get this to work with a memory queue, but I could not think of a good way to do it. With a sorted map I can just do this:
local MemoryStoreService = game:GetService("MemoryStoreService")

function MatchmakingService:RemovePlayersFromQueueId(players, skillLevel)
    local memoryQueue = MemoryStoreService:GetSortedMap("MATCHMAKINGSERVICE_QUEUE")
    memoryQueue:UpdateAsync(tostring(skillLevel), function(old)
        if old == nil then return nil end
        for _, v in ipairs(players) do
            local index = table.find(old, v)
            if index then -- guard against players who aren't in the queue
                table.remove(old, index)
            end
        end
        return old
    end, 86400) -- 86400 is for testing but isn't really necessary
end

function MatchmakingService:RemovePlayerFromQueueId(player, skillLevel)
    local memoryQueue = MemoryStoreService:GetSortedMap("MATCHMAKINGSERVICE_QUEUE")
    memoryQueue:UpdateAsync(tostring(skillLevel), function(old)
        if old == nil then return nil end
        local index = table.find(old, player)
        if index then -- guard against a player who isn't in the queue
            table.remove(old, index)
        end
        return old
    end, 86400) -- 86400 is for testing but isn't really necessary
end
which is all done in one update operation. I can call RemovePlayerFromQueueId when they disconnect and guarantee they won’t accidentally be found in the queue.
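For completeness, here’s roughly how that disconnect hook could look on the lobby server (GetPlayerSkillLevel is a hypothetical lookup standing in for however the queue tracks skill groups):

    local Players = game:GetService("Players")

    Players.PlayerRemoving:Connect(function(player)
        -- Hypothetical lookup of the skill group this player queued into
        local skillLevel = MatchmakingService:GetPlayerSkillLevel(player.UserId)
        if skillLevel then
            MatchmakingService:RemovePlayerFromQueueId(player.UserId, skillLevel)
        end
    end)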
Basically all I would like to see is a way to clear all the memory at once, clear specific sorted maps and queues in one operation, and a way to remove a specific item from a memory queue without reading it first.
I didn’t see that @Extuls had basically the same grievances as I did with a matchmaking system. They made a great suggestion: return an identifier when you add an item so you can later remove it from the queue, which could also work if a RemoveAsync(Variant item) isn’t feasible.
If you’re worried about the security of teleport data in your case, then this is a great option for bringing players’ data to a new server: save their data to a sorted map with a short expiration time.
If security isn’t a concern in your case, then teleport data should be sufficient.
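A minimal sketch of that pattern, assuming the destination place can look the player up by UserId (the map name and the 10-minute expiration are just illustrative):

    local MemoryStoreService = game:GetService("MemoryStoreService")
    local transfer = MemoryStoreService:GetSortedMap("TeleportPlayerData")

    -- On the source server, just before teleporting:
    local function stashData(userId, data)
        transfer:SetAsync(tostring(userId), data, 600) -- expires after 10 minutes
    end

    -- On the destination server, once the player arrives:
    local function fetchData(userId)
        local data = transfer:GetAsync(tostring(userId))
        transfer:RemoveAsync(tostring(userId)) -- clean up after reading
        return data
    end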
I’ve been reading the requests quota for this service and I’m confused: it says that API requests have a rate limit of 1000 + 100 × [number of users], but below that it says “the rate of requests to any single queue or sorted map is limited to 100,000 requests per minute.”
What is the maximum requests per minute? If you have over 1k players, will the limit still be 100,000 requests per minute?
Sorry if this was specified earlier, but I couldn’t find any information. By the way, this is a nice service to be added!
The upper limit for your total requests is 1000 + 100 × [number of users]. We also have the 100K requests/min limit for any single data structure, no matter how many users you have. It’s similar to the per-key request limit for DataStores. Throttling will be triggered when either limit is reached.
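As a worked example: with 2,000 concurrent users, the overall quota would be 1000 + 100 × 2000 = 201,000 requests per minute, but any single sorted map or queue would still be capped at 100,000 requests per minute.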