What's the limit of a table in a DataStore?

If I store something like this

local t = {}
DataStore:SetAsync(key, t)

How much stuff can I put in t before I run out of Data?

print("Within limits", string.len(game:GetService("HttpService"):JSONEncode(t)) < 260000)

Is this the correct way?


Yes, that is the correct way.


Do we have confirmation from Roblox Staff that the encoding for tables in data stores is the exact same as HttpService’s JSONEncode? Just using JSON is not the same as using HttpService’s JSONEncode. A minor difference could cause a difference in length and a failed save.

If not then you should not rely on that assumption. Do the JSON encoding yourself using HttpService and save that string so you know for sure whether or not it’ll save properly.
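A minimal sketch of that approach, where you encode the table yourself and store the resulting string (the store name, key, and helper names here are illustrative, not an official pattern):

```lua
local HttpService = game:GetService("HttpService")
local DataStoreService = game:GetService("DataStoreService")
local store = DataStoreService:GetDataStore("PlayerData") -- illustrative name

local MAX_LENGTH = 260000 -- the per-key character limit discussed above

local function saveTable(key, t)
	-- Encode to JSON yourself, so the stored length is exactly what you measured
	local encoded = HttpService:JSONEncode(t)
	if #encoded >= MAX_LENGTH then
		warn("Data for key '" .. key .. "' is too large to save: " .. #encoded)
		return false
	end
	-- Save the string, not the table, so no hidden re-encoding can change its size
	local ok, err = pcall(function()
		store:SetAsync(key, encoded)
	end)
	if not ok then
		warn("SetAsync failed: " .. tostring(err))
	end
	return ok
end

local function loadTable(key)
	local encoded = store:GetAsync(key)
	return encoded and HttpService:JSONDecode(encoded) or nil
end
```

Since the value saved is the very string whose length you checked, there's no guesswork about how the service will encode it.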


Wasn’t there a recent update saying JSONEncode is RFC-compliant or something like that? I ran some tests on it and it still bugs out with math.huge, but I’m not sure how data stores handle that value either.

You bring up a good point, there should be further testing on this.

Thought I’d mention text compression here for cross reference

I’m using this method in one of my games and it works fine for saving tables around 240k characters long (leaving some margin). I can’t remember whether I tested going above the limit, though I should have.

Incorrect. It would be fair to say that this method will give you a reasonable estimate, but HttpService and DataStoreService do not use the same method of encoding.

You can test this by round-tripping a table that contains a Vector3. The value is lost when encoded by HttpService, while it remains when using DataStoreService.
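A rough sketch of that round-trip test (run in Studio with studio API access enabled; the store and key names are placeholders, and the DataStore half is wrapped in pcall in case the service rejects the value):

```lua
local HttpService = game:GetService("HttpService")
local store = game:GetService("DataStoreService"):GetDataStore("EncodingTest") -- placeholder name

local original = { position = Vector3.new(1, 2, 3) }

-- HttpService round trip: JSON has no Vector3 type, so expect the value to be lost
local viaJson = HttpService:JSONDecode(HttpService:JSONEncode(original))
print("HttpService:", typeof(viaJson.position))

-- DataStore round trip: see what the service itself does with the same value
local ok, err = pcall(function()
	store:SetAsync("roundTripTest", original)
	local viaStore = store:GetAsync("roundTripTest")
	print("DataStore:", typeof(viaStore.position))
end)
if not ok then
	warn("DataStore round trip failed: " .. tostring(err))
end
```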

As long as you aren’t storing any Roblox datatypes this is a reasonable way to calculate the size of your data structure.


Make sure to check out the “Size limits” subsection under section 3 here:
(The link goes to the right section of the thread)

Most of the answer has been given already in this thread, but watch out if your data contains strings with irregular characters made up of multiple UTF-8 bytes. The # operator and string.len won’t give you the character count then — they count bytes.
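For example, # and string.len count bytes, while Luau’s utf8 library counts codepoints:

```lua
local s = "héllo" -- "é" is two bytes in UTF-8
print(#s)          --> 6 (bytes)
print(utf8.len(s)) --> 5 (codepoints)
```

When checking against the size limit, be explicit about whether you are counting bytes or characters, since they diverge as soon as non-ASCII text shows up.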


If you want to split it into separate tables, you can try encoding it all to JSON beforehand, saving separate JSON sections as strings with SetAsync, and then concatenating and decoding them when you load.
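A rough sketch of that chunking idea (the chunk size, store name, and key scheme are illustrative):

```lua
local HttpService = game:GetService("HttpService")
local store = game:GetService("DataStoreService"):GetDataStore("ChunkedData") -- illustrative name

local CHUNK_SIZE = 200000 -- stay well under the per-key limit

local function saveChunked(baseKey, t)
	local encoded = HttpService:JSONEncode(t)
	local chunkCount = math.ceil(#encoded / CHUNK_SIZE)
	for i = 1, chunkCount do
		-- Note: string.sub works on bytes, so boundaries could split a
		-- multi-byte character; fine if the encoded JSON is ASCII-only
		local chunk = string.sub(encoded, (i - 1) * CHUNK_SIZE + 1, i * CHUNK_SIZE)
		store:SetAsync(baseKey .. "_" .. i, chunk)
	end
	store:SetAsync(baseKey .. "_count", chunkCount)
end

local function loadChunked(baseKey)
	local chunkCount = store:GetAsync(baseKey .. "_count")
	if not chunkCount then return nil end
	local parts = {}
	for i = 1, chunkCount do
		parts[i] = store:GetAsync(baseKey .. "_" .. i) or ""
	end
	return HttpService:JSONDecode(table.concat(parts))
end
```

Each chunk then gets its own per-key budget, at the cost of extra requests and the need to keep the chunks consistent with each other.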