With the new update, I noticed that there’s a character count, etc., like you said, but it shows 4,000,000 characters rather than the more specific actual limit, which is 4,194,301.
I wonder if this was a deliberate decision, something you didn’t know, or whether you thought it was better to show a rounder number so people don’t rely on the more specific limit. I don’t know.
Just wondering whether it was intentional, and if not, whether you plan to fix it.
If this is the case, then it’s possible an engineer miscommunicated with a wiki writer, causing them to write 4 MB instead of 4 MiB. Might be worth a report on the typos thread.
That’s fine; that number is meant to be the reference point for how much data people can save. However, like I said, if you want to avoid people being confused about why they’re able to save more than 4 million characters, then I suggest changing it to the more specific limit.
You could also show the more specific limit only once the amount of data saved is >= 4,000,000.
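A rough sketch of what I mean (purely hypothetical; savedData stands in for whatever string the plugin is measuring, and the display logic is obviously up to you):

local DISPLAY_LIMIT = 4000000
local EXACT_LIMIT = 4194301 --// effective limit for a plain string value

local function formatCount(savedData)
    --// switch to the exact limit once the rounded figure has been passed
    local shownLimit = #savedData >= DISPLAY_LIMIT and EXACT_LIMIT or DISPLAY_LIMIT
    return ("%d / %d characters"):format(#savedData, shownLimit)
end

print(formatCount(string.rep("a", 4100000))) --> 4100000 / 4194301 characters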
It doesn’t really matter, but anyhow, it’s roughly a 194,000-character (about 190 KiB) difference, which is a lot.
Therefore I believe you should use the more accurate, higher actual limit.
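For reference, the gap between the two figures (a quick check of the numbers above):

print(4194301 - 4000000) --> 194301 characters, roughly 190 KiB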
Note: Like the person who shared this info said, his test only worked after he removed ~3 characters from the string, so ¯\_(ツ)_/¯
local DataStore = game:GetService("DataStoreService"):GetDataStore("test") --// any store name works here

local CharacterCount = 4000000 --// edit this with the more specific limit
local Str = {}
for _ = 1, CharacterCount do
    table.insert(Str, "a")
end
DataStore:SetAsync("key", table.concat(Str))
The limit is 4*1024*1024-1 (4,194,303) bytes of JSON data. For a string containing no characters that need escaping, the maximum length is 2 less, to account for the double quotes of the JSON string.
local limit = 4*1024*1024 - 1 --// 4,194,303 bytes of encoded JSON
local ds = game:GetService("DataStoreService"):GetDataStore("test")
--// limit-2 leaves room for the two double-quote bytes the JSON encoding adds
print(pcall(function() ds:SetAsync("test", string.rep("A", limit-2)) end))
--> true
--// one more character and the encoded value no longer fits
print(pcall(function() ds:SetAsync("test", string.rep("A", limit-1)) end))
--> false 105: Serialized value exceeds 4MB limit.
ds:RemoveAsync("test")
print("DONE")
That’s showing limit-2, rather than just limit. The length of the final encoded data has a limit of 4,194,303 bytes.
Lua value --> (JSON encode) --> JSON data --> DataStore
                                    ^
                                    |
                             Limit enforced
4,194,301, on the other hand, is the effective limit of a simple Lua string before it is encoded. The difference is caused by the overhead produced by the JSON encoding. Other Lua values will produce different amounts of overhead, but the final result of the encoding will always have a limit of 4,194,303 bytes. Sorry that I wasn’t clear on this.
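To make the overhead concrete, here’s a small example using HttpService’s JSON encoder (the DataStore serializer isn’t necessarily identical, but the idea is the same): a plain string gains two bytes for its surrounding quotes, while other values gain different amounts.

local HttpService = game:GetService("HttpService")

print(#HttpService:JSONEncode("aaa"))           --> 5, i.e. "aaa" plus two quote bytes
print(#HttpService:JSONEncode({"aaa"}))         --> 7, encodes as ["aaa"]
print(#HttpService:JSONEncode({value = "aaa"})) --> 15, encodes as {"value":"aaa"}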
Since others have asked about this, here’s my reasoning:
From a developer’s perspective, 350 R$ is roughly equivalent to $1 USD. The fact that I make less than $1 per sale means this is a pretty good deal for buyers, given the work I continuously put into it. I know that the Robux purchase rate is higher, but it’s still quite cheap for a utility plugin.
Current price breakdown:
User spends 200 R$
I get 70% of that (after the 30% marketplace fee), so I get 140 R$
Convert to USD: 140 * 0.0035 = $0.49
So I make less than 50 cents per sale. To me, I’m selling this for way less than the work I’ve put into it, especially since I continuously upgrade it at no extra cost to users who have already purchased it.
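A quick check of that arithmetic (the 0.0035 USD-per-Robux figure is the rate used above):

local price = 200           --// R$ spent by the user
local creatorShare = 0.70   --// what’s left after the 30% marketplace fee
local usdPerRobux = 0.0035  --// conversion rate used above
print(price * creatorShare * usdPerRobux) --> 0.49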
Any chance we could get a recently used keys feature? It’s very time-consuming to constantly have to go back and get my DataStore keys, considering the majority of them use user IDs.
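In the meantime, a possible workaround (just a sketch; it assumes your keys share a prefix such as "Player_" followed by the user ID, and "PlayerData" is a hypothetical store name) is to enumerate keys with ListKeysAsync:

local DataStoreService = game:GetService("DataStoreService")
local store = DataStoreService:GetDataStore("PlayerData") --// hypothetical store name

local pages = store:ListKeysAsync("Player_") --// list every key starting with this prefix
while true do
    for _, key in ipairs(pages:GetCurrentPage()) do
        print(key.KeyName)
    end
    if pages.IsFinished then
        break
    end
    pages:AdvanceToNextPageAsync()
end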