GetAsync Limitation (2.5k?)

Hey,
I’m trying to GetAsync every single key of my datastore, but after EXACTLY 2.5K GetAsyncs, it stops and puts them into a queue.

Is there a way to bypass the 2.5K limit? Otherwise I can’t GetAsync all the data.

For anyone curious, this is my code:

local DSS = game:GetService("DataStoreService")

while true do
	task.wait(1)
	local DataStore = DSS:GetDataStore("Backup_PlayerData")
	local KeysPages = DataStore:ListKeysAsync()
	while true do
		local CurrentKeysPage = KeysPages:GetCurrentPage()
		for j, Key in pairs(CurrentKeysPage) do
			local KeyStore
			repeat
				local success, msg = pcall(function()
					KeyStore = DataStore:GetAsync(Key.KeyName)
				end)
				if not success then
					task.wait(3)
					warn("uhh?")
				end
			until success
			print("Scanning Player: " .. Key.KeyName .. "...")
		end
		if KeysPages.IsFinished then warn("break") break end
		KeysPages:AdvanceToNextPageAsync()
	end
end

You cannot bypass the budget for request types. The budget is set by DataStoreService, and all it means is that requests made after the budget is exhausted are subject to throttling.

However, you can manage the rate at which you send requests to the store by keeping an eye on how many requests you have left before throttling kicks in. DataStoreService:GetRequestBudgetForRequestType tells you exactly that, and you can use its result to adjust things like the save interval of an autosave.

Example:

local ds = game:GetService("DataStoreService")

local getRequests = ds:GetRequestBudgetForRequestType(Enum.DataStoreRequestType.GetAsync)
if getRequests <= 100 then
    warn("Only "..getRequests.." GetAsync requests remaining.")
else
    print(getRequests.." GetAsync requests remaining.")
end
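
If you want to avoid the throttle entirely, you could also yield until the budget has recovered before each call. A minimal sketch, assuming a one-second polling interval (the helper name is made up):

local DataStoreService = game:GetService("DataStoreService")

-- Hypothetical helper: yields until at least one request of the given type is available.
local function waitForBudget(requestType)
	while DataStoreService:GetRequestBudgetForRequestType(requestType) < 1 do
		task.wait(1) -- arbitrary polling interval
	end
end

waitForBudget(Enum.DataStoreRequestType.GetAsync)
-- now it should be safe to call GetAsync without it being queued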

Why are you constantly listing all data entries? What is the need for it? There might be a more efficient way to do what you’re trying to do.

I’m looping through all players’ data to count all of their pets, and in the future possibly to export their data to an external database. But how is that possible with a request limit of 2.5K? I’ve had way more than 2.5K unique players.

Why do you want to use an external database? Roblox data stores are fine; an external one would be quite a lot more hassle. I’m not sure how you could efficiently iterate over all players’ data whilst staying within the request budget; you might need to look elsewhere.

Here’s the documentation about what I said in my previous post:
DataStoreService:GetRequestBudgetForRequestType | Documentation - Roblox Creator Hub

Also, make sure to close this topic once it is solved.

So the maximum I can do is 2.5K requests? How will I count everyone’s pets if I can only request 2.5K times? The code you gave me only checks the limit; it doesn’t help me go past 2.5K. This is exactly why I want to switch to something external; this just doesn’t seem to be possible at all on Roblox itself.

You can track the number of new pets hatched in running servers, and then on server shutdown (or every couple of minutes) save them to a separate datastore that contains a table of all pet types and how many of each exist. As for the pets that already exist, I don’t know.
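
If you go down that route, a rough sketch of the tracking-and-flushing part could look like this (the "PetCensus" store, its "GlobalCounts" key, and the pet type names are all made up; UpdateAsync is used so multiple servers don’t overwrite each other’s increments):

local DataStoreService = game:GetService("DataStoreService")

-- "PetCensus" / "GlobalCounts" are hypothetical names for this sketch
local CensusStore = DataStoreService:GetDataStore("PetCensus")

-- pets hatched on this server since the last flush, e.g. { Dragon = 3, Cat = 1 }
local pendingHatches = {}

-- call this wherever a pet gets hatched
local function recordHatch(petType)
	pendingHatches[petType] = (pendingHatches[petType] or 0) + 1
end

-- merge this server's pending counts into the shared table
local function flushCounts()
	local toFlush = pendingHatches
	pendingHatches = {}

	local ok, err = pcall(function()
		CensusStore:UpdateAsync("GlobalCounts", function(current)
			current = current or {}
			for petType, amount in pairs(toFlush) do
				current[petType] = (current[petType] or 0) + amount
			end
			return current
		end)
	end)

	if not ok then
		warn("Failed to flush pet counts: " .. tostring(err))
		-- put the counts back so they are not lost
		for petType, amount in pairs(toFlush) do
			pendingHatches[petType] = (pendingHatches[petType] or 0) + amount
		end
	end
end

-- flush every couple of minutes and on server shutdown
task.spawn(function()
	while true do
		task.wait(120)
		flushCounts()
	end
end)
game:BindToClose(flushCounts)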

Like I said before, you cannot get around the budget for requests. The budget does not mean that you can only make that amount of requests; it means that requests made after the budget has been used are subject to throttling. Budgets exist for all DataStore request types. Also, bear in mind that the budget can change throughout runtime.

The best way to stay within the budget is to send fewer requests; that’s the only way. Before deciding to use an external database, compare what you would have to do there with what the Roblox data stores already give you.

More info is in the documentation.

Agreed, this is the only way AFAIK.

@RaffaDevs - it should be noted that @12345koip isn’t suggesting that you don’t do it. If you read their comment again, they’re just telling you that you’re going to have to limit the rate at which you send requests and/or yield until the throttle is lifted. You can still do it; it’s just going to take much longer than you would’ve hoped.

My bad, now I understand, thank you very much.
After my 2,500th request has been made, how long do I have to wait until I can run GetAsync again? Is that specified somewhere?

Sadly, DataStoreService has its limits. The only way to somehow bypass GetAsync limits is to find a module, or make your own, that doesn’t use the same framework as DataStoreService. Just try not to call GetAsync too often and you’ll be fine.

A quick fix might be to reset your datastore by changing its name; that way you can start fresh.

local DataStoreService = game:GetService("DataStoreService")
-- changing the name "MyData" will create a brand new datastore for you to use
DataStoreService:GetDataStore("MyData")

I don’t think I want to reset 100,000 people’s data.

Your game is actually that big? Well then, you’ve sorta got yourself in a sticky situation here. You’re gonna have to reset SOME people’s data, because for some reason you’ve literally filled it to the core. Apologize to the community and hope they forgive you, and maybe compensate the players who lose their data with a reward or something. After you erase some data to make room for your requests, focus on reducing your GetAsync requests so you don’t run into the same problem again, because you must’ve been doing something to reach that limit.

It’s not a big deal that I have to reset almost everyone’s data; I’m just trying to count ALL pets in the ENTIRE GAME to get an accurate existence count for each pet, you know?

And Roblox only allowing me to do 2.5K GetAsync requests is the problem; after that it just throttles forever.

Each player = 1 request that has to be made, but the game has had way more than 2.5K unique players.

Ohh, I see. Yeah, maybe try to find a module, or make your own, that doesn’t use the same framework as DataStoreService.

I’m not entirely certain; I can’t find any documentation that discusses this, unfortunately. These resources might interest you though: the limits documentation and an informative thread found here.

In reality though, that probably doesn’t even matter. You just need to wait until you’re able to call ::GetAsync again as @12345koip has told you.

For example, you could do something like this:

Example Code
local DatastoreService = game:GetService('DataStoreService')

--[!] CONST
local INTERVAL_BUDGET = 1       -- time interval, in seconds, between attempts to check if we have request budget
local INTERVAL_ADVANCE = 1      -- time interval, in seconds, between attempts to get the next page
local INTERVAL_REQUEST = 7      -- time interval, in seconds, between attempts to call ::GetAsync

local MAX_PAGE_SIZE = 100       -- default page size for ::ListKeysAsync(pageSize: number)
local MAX_REQUEST_ATTEMPTS = 5  -- maximum number of request retries for a specific key
local MAX_ADVANCE_FAILURES = 5  -- maximum number of times we attempt to call ::AdvanceToNextPageAsync


--[!] UTILS
local function parseParameter(value, typing, defaultValue)
  local t = typeof(typing)
  if t == 'string' and typeof(value) == typing then
    return value
  elseif t == 'table' then
    t = typeof(value)

    for _, desired in next, typing do
      if t == desired then
        return value
      end
    end
  end

  return defaultValue
end

local function collectAllKeys(pages, page, length, results)
  page = page or 0
  length = length or 0
  results = results or { }

  local this, size, finished, attempts = nil, nil, false, 0
  repeat
    this = pages:GetCurrentPage()
    finished = pages.IsFinished

    page += 1
    size = #this

    table.move(this, 1, size, length + 1, results)
    length += size

    if not finished then
      local success
      while not success and attempts < MAX_ADVANCE_FAILURES do
        success = pcall(pages.AdvanceToNextPageAsync, pages)
        if not success then
          attempts += 1
          task.wait(INTERVAL_ADVANCE)
          continue
        end

        attempts = 0
      end
    end
  until finished or attempts >= MAX_ADVANCE_FAILURES

  if attempts >= MAX_ADVANCE_FAILURES then
    warn(string.format('Failure attempts exceeded when attempting to ::AdvanceToNextPageAsync at page %d', page))
  end

  return { Keys = results, TotalPages = page, Length = length }
end

local function getDatastoreKeys(name, scope, prefix, pageSize, cursor, excludeDeleted)
  name = parseParameter(name, 'string', '')
  scope = parseParameter(scope, 'string', nil)
  prefix = parseParameter(prefix, 'string', nil)
  cursor = parseParameter(cursor, 'string', nil)
  excludeDeleted = parseParameter(excludeDeleted, 'boolean', false)

  pageSize = parseParameter(pageSize, 'number', MAX_PAGE_SIZE)
  pageSize = math.floor(pageSize + 0.5)

  local success, store, pages, results
  success, store = pcall(DatastoreService.GetDataStore, DatastoreService, name, scope)
  if not success then
    return false, string.format('Failed to call ::GetDataStore(name: %q, scope: %q), got exception: %s', name, scope or '[NULL]', tostring(store))
  end

  success, pages = pcall(store.ListKeysAsync, store, prefix, pageSize, cursor, excludeDeleted)
  if not success then
    return false, string.format(
      'Failed to call ::ListKeysAsync(store: %s, prefix: %q, pageSize: %d, cursor: %q, excludeDeleted: %s), got exception: %s',
      store, prefix or '[NULL]', pageSize, cursor or '[NULL]', tostring(excludeDeleted), tostring(pages)
    )
  end

  success, results = pcall(collectAllKeys, pages)
  if not success then
    return false, string.format(
      'Failed to retrieve pages from Datastore<name: %q, scope: %q>, got exception: %s',
      name, scope or '[NULL]', tostring(results)
    )
  end

  results.Store = store
  results.PageSize = pageSize

  return true, results
end

local function iterateDatastoreKeys(results)
  local keys = results.Keys
  local size = results.PageSize
  local length = results.Length

  local item, page, position = nil, 1, 1
  return coroutine.wrap(function ()
    while position <= length do
      page = math.ceil(position / size)

      item = keys[position]
      if not item then
        break
      end
      coroutine.yield(page, position, item.KeyName)

      position += 1
    end
  end)
end

local function tryGetKeyValue(store, keyName)
  -- re-check the budget on every iteration; yield until a GetAsync request is available
  while DatastoreService:GetRequestBudgetForRequestType(Enum.DataStoreRequestType.GetAsync) < 1 do
    task.wait(INTERVAL_BUDGET)
  end

  local attempts = 0
  while attempts < MAX_REQUEST_ATTEMPTS do
    local success, result = pcall(store.GetAsync, store, keyName)
    if success then
      return true, result
    end

    attempts += 1
    task.wait(INTERVAL_REQUEST)
  end

  return false, nil
end


--[!] EXAMPLE USAGE
local success, results = getDatastoreKeys('SomeDatastoreName')
if not success then
  return warn(results or 'Unknown error occurred')
end

local store = results.Store
for pageNumber, keyIndex, keyName in iterateDatastoreKeys(results) do
  local succ, res = tryGetKeyValue(store, keyName)
  if not succ then
    -- failed to get result from `store::GetAsync(key: keyName)`
    continue
  end

  if res then
    -- do whatever you want with the player's data ...
    print(string.format(
      'Row<key: %q, dataType: %s, page: %d, index: %d>',
      keyName, typeof(res), pageNumber, keyIndex
    ))
  end
end


Note: this is just something I threw together quickly; it’s quite likely there’s a better way than this.

I decided to collect the keys from each of the pages in advance in this example, just in case you wanted to store the key names somewhere and do this in batches. Though it should be noted that I’m not completely certain there isn’t a throttle on the ::AdvanceToNextPageAsync method, so you may have to change this example if that’s the case.

Have you checked out MessagingService? The limits seem quite high, with the exception being that it’ll only work for players that are online, so you might have to do the count over the course of a few days to catch as many players as possible.
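
As a sketch of that idea (the topic name and payload shape are made up, and MessagingService messages have a size limit, so you may need to split large reports):

local MessagingService = game:GetService("MessagingService")

local TOPIC = "PetCountReports" -- hypothetical topic name

-- On every live server: publish how many pets of each type its players own.
local function reportLocalCounts(localCounts)
	-- localCounts is assumed to look like { Dragon = 12, Cat = 5 }
	local ok, err = pcall(function()
		MessagingService:PublishAsync(TOPIC, localCounts)
	end)
	if not ok then
		warn("Failed to publish pet counts: " .. tostring(err))
	end
end

-- On the server doing the tally: collect reports as they arrive.
local totals = {}
MessagingService:SubscribeAsync(TOPIC, function(message)
	for petType, amount in pairs(message.Data) do
		totals[petType] = (totals[petType] or 0) + amount
	end
end)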

Maybe, every time someone’s pets are loaded, if they haven’t been added to a count yet, add them to a running count in a datastore? This would also be affected by the players-need-to-join-to-be-counted caveat.
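
Roughly like this, as a sketch (the "PetCensus_Counted" and "PetCensus_Tally" store names are made up; the idea is to flag each player once so they aren’t double-counted):

local DataStoreService = game:GetService("DataStoreService")

-- hypothetical store names for this sketch
local CountedStore = DataStoreService:GetDataStore("PetCensus_Counted")
local TallyStore = DataStoreService:GetDataStore("PetCensus_Tally")

-- `pets` is assumed to be an array of pet type names from the player's loaded data
local function countPlayerOnce(player, pets)
	local key = tostring(player.UserId)

	local ok, alreadyCounted = pcall(function()
		return CountedStore:GetAsync(key)
	end)
	if not ok or alreadyCounted then
		return
	end

	for _, petType in ipairs(pets) do
		pcall(function()
			TallyStore:IncrementAsync(petType, 1) -- one counter key per pet type
		end)
	end

	pcall(function()
		CountedStore:SetAsync(key, true)
	end)
end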

You could also stagger the datastore requests, like a maximum of 1k every minute, to avoid hitting the requests-per-minute limit (see the sketch below), but that would increase the amount of time it takes to do a total count. How many datastore keys do you have?
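
For the staggering approach, something like this (the batch size, interval, and the `keyNames` list are placeholders; the key list could come from ListKeysAsync as in the earlier example):

local BATCH_SIZE = 1000   -- GetAsync calls per batch (placeholder value)
local BATCH_INTERVAL = 60 -- seconds to wait between batches

-- `store` is a DataStore and `keyNames` an array of key names, e.g. from ListKeysAsync
local function processInBatches(store, keyNames, handleValue)
	for index, keyName in ipairs(keyNames) do
		local ok, value = pcall(function()
			return store:GetAsync(keyName)
		end)
		if ok then
			handleValue(keyName, value)
		else
			warn("GetAsync failed for key " .. keyName)
		end

		-- pause after every full batch to stay under the per-minute limit
		if index % BATCH_SIZE == 0 then
			task.wait(BATCH_INTERVAL)
		end
	end
end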