MessagingService for matchmaking limitations?

I have quickly coded a module that allows cross-server group creation, so players in any server can join a group, and then initialises private servers for that group. Currently I have a “master server” which does all the legwork, with other servers posting data to it for join/leave/start requests and adding/removing entries when servers are created/destroyed.

However, I have just noticed that the wiki page for MessagingService states the limit for ‘Messages sent per game server per minute’ is 100 + 50 * (number of connected game servers). With 2 servers that’s only 200 messages/min… and so, I’m very concerned about scalability.

  1. Does anyone have any experience in using this service for matchmaking, and if so, have they had to account for the limitations and throttle the requests, or has it been successful thus far without many issues?

  2. There doesn’t seem to be a method to determine the remaining request budget for MessagingService. Any workarounds? I’m doubtful there are, but I may as well ask.

Sadly this isn’t something I can really test due to MessagingService being limited in Studio, so any information is greatly appreciated.
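For anyone else doing the arithmetic, here is a quick back-of-envelope sketch of the per-minute budget, assuming the formula quoted above (100 + 50 per connected server) is accurate; there is no official API to query this, so the function below is purely illustrative:

	local function estimateBudgetPerMinute(connectedServers)
		-- Formula as quoted from the wiki page; not an official API.
		return 100 + 50 * connectedServers
	end
	
	print(estimateBudgetPerMinute(2))  --> 200
	print(estimateBudgetPerMinute(20)) --> 1100

So the budget does scale with server count, but a chatty matchmaking design (one message per join/leave) could still eat through it quickly on small games.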


Maybe instead of using your current system, you could send a dictionary of data every few minutes?
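A minimal sketch of that batching idea, accumulating pending updates and flushing them as a single PublishAsync call on a fixed interval (the topic name, payload shape, and 60-second interval here are all placeholder assumptions):

	local MessagingService = game:GetService("MessagingService")
	
	local pendingUpdates = {} -- updates accumulated since the last flush
	
	local function queueUpdate(key, value)
		pendingUpdates[key] = value
	end
	
	task.spawn(function()
		while true do
			task.wait(60) -- flush interval is a tunable assumption
			if next(pendingUpdates) ~= nil then
				-- Swap in a fresh table so updates queued during the
				-- publish aren't lost.
				local batch = pendingUpdates
				pendingUpdates = {}
				pcall(function()
					MessagingService:PublishAsync("GroupUpdates", batch)
				end)
			end
		end
	end)

This trades one message per event for one message per interval, at the cost of the latency discussed below.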

I think my only issue with this method is that there would be a significant delay between [event] → [result], which would likely be incredibly frustrating for the end users.

Also, sadly, since there’s a request limit I imagine there’s also a limit on the amount of data you can send per request.

Any further thoughts on this would be appreciated :slight_smile:

The information provided on the developer hub says nothing about limitations of data sent per request, so I don’t think that will be an issue.

Regarding your concerns with limitations on requests sent, I don’t think it’s a major issue unless you plan on sending a large amount of requests every minute.

I just wrote this ‘throttling’ module for you, which queues failed requests and retries them once per second whenever MessagingService hits its limit. I have no idea how efficient it is, but feel free to test it out and let me know if there are any issues.

ModuleScript:

local MessagingService = game:GetService("MessagingService")

local MSModule = {}

local throttleQueue = {}
local throttleManagerEnabled = false
local throttleManager = coroutine.create(function()
	while true do		
		print("Retrying throttled requests; queue length: " .. #throttleQueue)
		
		-- Swap in a fresh queue so retries that fail again are re-queued
		-- without removing entries from the table being iterated over.
		local currentQueue = throttleQueue
		throttleQueue = {}
		for _, throttled in ipairs(currentQueue) do
			MSModule.Throttle(throttled.topic, throttled.data)
		end
		
		if #throttleQueue == 0 then
			throttleManagerEnabled = false
			coroutine.yield()
		end
		
		task.wait(1)
	end
end)

function MSModule.EnableThrottleManager()
	if throttleManagerEnabled then
		return
	end
	
	throttleManagerEnabled = true
	coroutine.resume(throttleManager)
end

function MSModule.Throttle(topic, data)
	local success, returnData = pcall(function()
		return MessagingService:PublishAsync(topic, data)
	end)
	
	if not success then
		table.insert(throttleQueue, {topic = topic, data = data})		
		MSModule.EnableThrottleManager()
	end
end

return MSModule

Server script:

-- might want to change the game.ReplicatedStorage.MSModule reference 
-- to be whatever you name the ModuleScript
local MSModule = require(game.ReplicatedStorage.MSModule)
MSModule.Throttle("TestTopic", "TestData")

There are limitations, they just aren’t documented.


Interesting that Ozzy’s notes on limitations differ from the API page; if it scales by number of players per server, though, that’s incredible. Assuming there are 50 players per server, even with the maximum of 10,000 subscriptions you could have roughly 100,000 people playing your game and it’ll still be good - noice. Thanks for this.

This looks as good an implementation of your own throttle as any, considering there’s no budget method; I will look into this further - thanks. Going to give you the solution as you put a fair amount of effort into that reply.


Thanks for providing this, very useful as I wasn’t aware it differed from the documentation page.
