As the OP stated, this is to help relieve some of the complicated processes that large games such as Adopt Me have to go through to store data without hitting the limit.
Wow. That's a lot of data. I ain't complaining but holy heck.
Is this update possible due to better/bigger servers or better data compression?
There's a high chance it was 256KB since that's a perfect round from 64. But yes, you misread that.
Also they say 4MB but isn't it actually 4096KB? It would also make sense due to it being 64 bit based.
But well, let's just say this is a 16x (1,500%) increase in the amount of storable data, which is absolutely incredible! Good job again, Roblox.
Since that post, I checked a web archive. The wiki indeed said 260k chars:
Now I'm confused on whether the documentation was wrong or if they just decided to round in a way that wasn't clear.
Looks like I might've confused a one letter difference of KB and K lol.
A character should take up a byte (at least for ASCII text), so 260K chars is roughly 260KB.
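To make the "one character, one byte" claim concrete, here's a quick Python check (illustrative only, not Roblox code): ASCII characters are one byte each in UTF-8, but other characters can take up to four bytes, so the chars-equals-bytes equivalence only holds for ASCII data.

```python
# ASCII text: one byte per character in UTF-8.
ascii_text = "a" * 260_000
print(len(ascii_text.encode("utf-8")))  # 260000 bytes: 260K chars == 260KB

# Non-ASCII characters can take up to 4 bytes each in UTF-8.
emoji = "😀"
print(len(emoji.encode("utf-8")))  # 4 bytes for a single character
```

So a 260K-character string is 260KB only if it is pure ASCII; strings with multibyte characters hit the byte limit sooner.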
This is awesome. I can actually have a lot of player leaderstats stored in my games now.
Thanks staff, very cool!
I gotta say this update caught me off guard. Lovely…
Never expected this… but I welcome it. I've only had one case from 3 years ago where I needed to use more data than allocated per key, which still exceeded 4MB. Even so, this will make the processing for my app light years more efficient. Can't wait to see the numbers I'll crunch.
I love seeing how quickly this came out compared to my expectations. I know it was mentioned recently but I wasn't expecting it to be this soon!
Super cool for any game looking to store inventories with metadata on items or very large unique item counts. Just thinking about how I was compressing data before and how many items I could fit… Now… Jeez.
As Todd Howard once, sort of, said:
You can now hold SIXTEEN times the data (or around that point)
This is actually pretty exciting, hope it helps people out.
I canāt wait to see what people do with this.
Doesn't this mean that we can save more data in 1 datastore? Nice update, anyway.
This is HUGE for My Restaurant. Big thanks to everybody that worked on this.
I can't believe this update was pushed out this quickly. I'm now awaiting the many other great changes to DataStores, especially atomic operations. Nothing would tickle my fancy more than being able to perform several writes in one go.
This will also allow me to avoid splitting data across multiple DataStores (I use different scopes when I expect a single scope's values to grow too large), and I can avoid compression if I'm not going overboard with what kinds of data I'm storing, meaning less overhead and fewer worries for me.
I'm going to have a lot of fun with this later.
This is huge! Tons of building games need to split data between keys because the limit was too small, and now they won't have to with all this storage space!
This is a miracle! How can I give thanks to those who made this possible?
If I remember correctly, it was 256 kibibytes (262,144 bytes), so it was wrong to say 260,000 was the maximum number of characters.
The new documentation is still incorrect; the maximum is 4 mebibytes (4,194,304 bytes). For instance, you can store a string of 4,194,301 repeats of the letter a.
I presume the inaccuracy is to give themselves some leeway in lowering the limit for whatever reason.
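As a quick sanity check on the binary-unit arithmetic in these posts (a sketch in Python; only the 256 KiB and 4 MiB figures come from the thread itself):

```python
# The old and new limits discussed above, in binary (IEC) units.
old_limit = 256 * 1024        # 256 KiB
new_limit = 4 * 1024 ** 2     # 4 MiB

print(old_limit)              # 262144 bytes
print(new_limit)              # 4194304 bytes
print(new_limit // old_limit) # 16 -> exactly a 16x increase
```

The popular "1562.5%" figure comes from mixing decimal megabytes with binary kilobytes; in consistent units the jump is exactly 16x.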
Will there ever be support for saving binary data? Currently you can only save valid UTF-8 strings, and control characters (except 7F) as well as \ and " need to be escaped, so they take extra space. Currently I save values using base64, and I mangle strings in a weird way to turn them into base64 digits, which avoids saving \ and ", although it does come at a performance cost (around 90% of encoding/decoding time is spent on strings). This makes a 100-character string become 117 characters, but without it, it could be 200 characters after JSON encoding. If there were a way to save binary data I wouldn't have to mangle strings, so it would be a lot more efficient and could save a small amount of space too.
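For context on the overhead being described: standard base64 turns every 3 bytes into 4 characters, so 100 bytes become 136 characters. (The poster's custom mangling reaches only 117 characters, so it is tighter than plain base64; the exact scheme isn't shown.) A minimal Python illustration of the plain base64 case:

```python
import base64

# Plain base64: every 3 input bytes -> 4 output characters (plus '=' padding).
raw = bytes(range(100))      # 100 arbitrary binary bytes
encoded = base64.b64encode(raw)
print(len(encoded))          # 136 characters for 100 bytes (~4/3 overhead)

# The encoded form uses only [A-Za-z0-9+/=], so it needs no JSON escaping
# and round-trips losslessly.
assert base64.b64decode(encoded) == raw
```

This is why binary-safe storage would help: the 4/3 (or worse, with JSON escaping) expansion disappears entirely.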
4 mebibytes should be far more than enough even without compressing the data, so should I just not worry about compression and store the table as-is? That seems like the better option with the increased maximum datastore size, considering the time cost of what I'm doing.
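One way to weigh that trade-off before deciding (a sketch; the sample data is made up for illustration and real savings depend entirely on how repetitive your serialized table is) is to measure what compression actually buys you on your own data:

```python
import json
import zlib

# Hypothetical, repetitive player data; substitute your real table.
data = {"items": [{"id": i % 50, "count": 1} for i in range(1000)]}
raw = json.dumps(data).encode("utf-8")
compressed = zlib.compress(raw, level=9)

print(len(compressed) < len(raw))  # True here: repetitive JSON compresses well
# If len(raw) already sits far below the 4 MiB cap, skipping compression
# (and its encode/decode CPU cost) is a reasonable choice.
```

If the uncompressed size is nowhere near the limit, the simpler store-the-table approach avoids both the CPU cost and a class of decode bugs.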
Is that like 4194301 characters of a JSON string? YES WE DO!
That's really nice. Thank you to whoever is responsible; I am forever grateful. You literally made my day!
@goldzun Just curious if this change also affects OrderedDataStore?