It is quite interesting to learn that our users in Furana, despite being able to modify and colour their characters, only use about 200 GB of the 5 TB we are given.
On top of that, we used to use DataStore2, which created a huge number of ordered copies of data.
I would like a tool to remove keys in bulk with a certain filter; then we could remove all of the old DataStore keys and use the additional features. It would also help a lot to have a filter for handling GDPR requests automatically, without an external server.
On top of that, it has already been suggested, but I would love a higher per-key limit for users. Loading and saving could be done more reliably with a higher, or infinite, limit on key size.
Could it be handled internally?
If UpdateAsync were changed internally to span multiple keys, a single request could treat several keys as one.
That way, I could make a sandbox game without worrying about hitting the data limit while keeping data reliable (no auxiliary key needed to coordinate the extra keys), and long-term users wouldn't hit a hard limit or reliability issues.
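For what it's worth, here is a rough sketch of the auxiliary-key workaround being described, just to illustrate the mechanism that would be internalized; the helper names, chunk size, and key scheme are all made up, and unlike a native solution the writes are not atomic across keys:

```lua
local DataStoreService = game:GetService("DataStoreService")
local store = DataStoreService:GetDataStore("PlayerData")

local CHUNK_SIZE = 3_000_000 -- bytes per physical key (illustrative)

-- Split one large serialized value across several physical keys.
local function saveSharded(baseKey: string, json: string)
	local count = math.ceil(#json / CHUNK_SIZE)
	for i = 1, count do
		local piece = string.sub(json, (i - 1) * CHUNK_SIZE + 1, i * CHUNK_SIZE)
		store:SetAsync(baseKey .. "_" .. i, piece)
	end
	store:SetAsync(baseKey .. "_count", count) -- written last, acts as a commit marker
end

-- Reassemble the logical value from its shards.
local function loadSharded(baseKey: string): string?
	local count = store:GetAsync(baseKey .. "_count")
	if not count then
		return nil
	end
	local pieces = table.create(count)
	for i = 1, count do
		pieces[i] = store:GetAsync(baseKey .. "_" .. i)
	end
	return table.concat(pieces)
end
```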
If you had 100,000 blocks in a chunk, you would mainly need to store each block's position and its enum block type.
I believe Roblox added support for storing buffers in a DataStore. If you stored each block's position as a buffer of 2 bytes × 3 and its block-type enum as 1 byte, 100,000 blocks would take up around 0.7 MB.
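As a rough sketch of that math (the record fields and layout here are my own assumptions, not a standard format):

```lua
-- 7 bytes per block: three u16 coordinates plus one u8 block type.
local BYTES_PER_BLOCK = 7

-- `blocks` is assumed to be an array of { x, y, z, blockType } records.
local function packBlocks(blocks)
	local buf = buffer.create(#blocks * BYTES_PER_BLOCK)
	for i, b in blocks do
		local offset = (i - 1) * BYTES_PER_BLOCK
		buffer.writeu16(buf, offset, b.x)
		buffer.writeu16(buf, offset + 2, b.y)
		buffer.writeu16(buf, offset + 4, b.z)
		buffer.writeu8(buf, offset + 6, b.blockType)
	end
	return buf
end

-- 100,000 blocks * 7 bytes = 700,000 bytes, i.e. ~0.7 MB.
```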
Hey, the DataStore manager update is actually a good one, and it would've been better if we could delete an entire DataStore (mentioned multiple times^).
About the limits, I have mixed opinions. If you look at this situation from a different view, it's good for reducing storage, since devs actually have to compress their data. On the other hand, because most simulation games don't use that much storage, it mainly affects open-world (or at least loosely bounded) games; and for creation games, if the game is huge, it will absolutely kill them.
It would be really awesome if there were an option for open-world and creation games to increase the limit.
I guess this will affect games that use berezaa's method of saving data a bit; it's been infamously known for using a lot of storage on Roblox's end while being practically error-proof.
Fortunately, data versioning is now directly supported by Roblox, so I guess it can be an easy migration, as normal saving consumes far less.
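For anyone migrating, a quick sketch of the built-in versioning API that replaces berezaa-style manual version keys (the store and key names here are made up):

```lua
local DataStoreService = game:GetService("DataStoreService")
local store = DataStoreService:GetDataStore("PlayerData")

-- List the most recent versions of a key, newest first.
local versions = store:ListVersionsAsync("Player_12345", Enum.SortDirection.Descending)
for _, info in versions:GetCurrentPage() do
	print(info.Version, info.CreatedTime, info.IsDeleted)
	-- Read back a specific historical version if you ever need to roll back.
	local value = store:GetVersionAsync("Player_12345", info.Version)
	print(value)
	break -- the newest version is enough for this example
end
```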
As for the limits, I think people are overreacting. I've got a game that could be considered "dead" or "zombie": 30k keys of around 50 strings each are just ~500 KB, and the game got a 46.3 GB limit. So I can assume I'm pretty safe; now imagine what the limit is going to be for big games.
You wouldn't even need to store the position if you structured your data correctly. You could derive a block's position in the chunk from its index in that chunk's block list, meaning you'd only spend ~4 bytes representing an air block for any blank spaces in between blocks in the list.
No need, since we know the exact number of blocks in a chunk; just use the index to derive the position.
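A minimal sketch of that derivation, assuming a fixed 32×32 chunk base (the dimensions are illustrative):

```lua
local SIZE_X, SIZE_Y = 32, 32

-- Recover a block's chunk-local position from its index in the block list.
local function indexToPosition(index: number): (number, number, number)
	local i = index - 1 -- zero-based for the arithmetic
	local x = i % SIZE_X
	local y = math.floor(i / SIZE_X) % SIZE_Y
	local z = math.floor(i / (SIZE_X * SIZE_Y))
	return x, y, z
end

print(indexToPosition(1))    --> 0 0 0
print(indexToPosition(1058)) --> 1 1 1 (index 1 + 32 + 1024, one-based)
```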
About block type: it uses N amount of data, but blocks can carry their own data too: a stair's orientation; whether a slab is upper, lower, or maybe full; a chest's contents; a wool block's color.
This is only somewhat true. You can "store" buffers in a DataStore, but everything stored in a DataStore is a string, no matter what; Roblox simply converts the buffer to a string automatically when you call SetAsync(). There's a huge amount of overhead from this, so buffers actually end up less efficient on average than just using your own optimized data format.
You can actually see this pretty well with the new DataStore manager:
This was generated with this code:
```lua
local DataStoreService = game:GetService("DataStoreService")
local ds = DataStoreService:GetDataStore("datatest")

local buf = buffer.create(8)
buffer.writeu8(buf, 0, 255)

ds:SetAsync("TestKey", { ["TestValue"] = buf })
```
If there is some change to this in the future, like allowing developers to write custom binary file formats (which I really doubt, for security reasons), then buffers would definitely be by far the best way to store a mass amount of data.
I don't think even 50% of the people in this thread are making UGC creation experiences, despite citing that as their problem scenario. It's not good to wax hypothetical and fearmonger; you should actually provide exact data from your observability dashboards. If you try to do that to prove the doomsaying, you'll realise that even for a building game you aren't getting near 1 MB, even without your own compression strategies applied before Roblox's, and that there's nothing to be scared of.
I am the programmer of a data-heavy dungeon-crawling RPG that still runs on a mix of modern optimisations and outdated practices from 2017, due to the difficulty of converting over and a lack of attention to it yet. Our strategies are not great, yet I wouldn't even sneeze at 4 MB. One of my top players, who has a save file from 2020, doesn't even push 100 KiB of data.
Just trying to put things into perspective by opening their entire data profile in JSON format in Visual Studio Code. There are also tons of (better) resources for DataStores than the "berezaa" method, which was helpful for its time, but Roblox has made so many strides since then that it doesn't take much effort anymore to design good data infrastructure for your experiences.
Suggesting that the only possible games to make now are "cash grabs" and "simulators" is crazy, unfounded, and entirely untrue. Don't spread misinformation if all you have to go off of is hypotheticals. You need to show real data and real problems. Go to your dashboards now and see that you barely crack the limit, and if you do, then review your data systems. Only if you've tried everything and are still crunching should you actually start to worry; and the people who need to be worried have already been contacted for expedited support.
A DataStore should be considered "deleted" when it has no keys left. Try running a job to clear out the keys you no longer need (migrating where necessary) and see if that impacts your DataStore count. DataStore2 really did a number here, huh.
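To that end, a rough sketch of such a cleanup job (the store name is made up; in practice you would also wrap the calls in pcall and throttle them):

```lua
local DataStoreService = game:GetService("DataStoreService")
local oldStore = DataStoreService:GetDataStore("OldDataStore2Keys")

-- Page through every key in the old store and remove it.
local pages = oldStore:ListKeysAsync()
while true do
	for _, entry in pages:GetCurrentPage() do
		oldStore:RemoveAsync(entry.KeyName)
	end
	if pages.IsFinished then
		break
	end
	pages:AdvanceToNextPageAsync()
end
```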
I mean, if Roblox isn't gonna store the data, why not just let the client play with the data?
Seriously though, 1 MB isn't a lot of data points. If an integer is 4 bytes (a common default int size): 1,048,576 / 4 = 262,144 integers.
Seriously, a disappointingly low number. Sure, everyone who decides to only make a standard player inventory, standard player housing, or a standard leaderboard… great. I was kinda hoping developers would use data to start doing interesting things, but I guess that's out the window.
To be honest, in my opinion this is one of the riskiest updates to DataStores. It will add vulnerabilities and destroy some games. What if exploiters decide to flood requests and the game hits the limit? Also, if the CCU counter isn't always up to date, updating immediately when someone leaves or joins, it will cause a lot of trouble with server migration when developers update a game. Imagine a game with a lot of leaderboards: when servers are migrated, the leaderboards start to load, and if the CCU counter and request limits haven't updated yet, that will most likely cause rate limiting and broken leaderboards, along with broken player data.
I may be wrong somewhere, but that's my current opinion.
Yes, of course Luau uses 8-byte numbers, but with a 1 MB limit are you seriously going to store native 8-byte numbers? That alone would blow 50% of your budget on precision that isn't used. I guess I haven't really had a need to explore how you're supposed to compress data on Roblox, because previously the DataStore was without limits… there's no way they just introduced limits without a way to store values at lower precision, right? I assume something like pack?
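There is: Luau ships `string.pack`/`string.unpack` (and more recently the `buffer` library), so you can choose your precision per field. A toy example with made-up values:

```lua
local health, coins, level = 87.5, 12345, 42

-- "f" = 4-byte float, "I4" = 4-byte unsigned int, "B" = 1 unsigned byte
local packed = string.pack("fI4B", health, coins, level)
print(#packed) --> 9 (versus 24 bytes for three native 8-byte numbers)

local h, c, l = string.unpack("fI4B", packed)
print(h, c, l) --> 87.5 12345 42
```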
I completely agree with your perspective: this update does introduce significant risks to DataStores, particularly vulnerabilities that could impact the stability of certain games. Exploiters flooding requests is a serious concern; once the limit is hit, it could cause massive disruptions, breaking critical features like leaderboards and potentially players' data. That would be a devastating consequence for experiences reliant on seamless data flow.
Moreover, the CCU (Concurrent Users) counter not updating immediately adds another layer of uncertainty. During server migrations or rapid player transitions, delays in updating the CCU counter could lead to inaccurate request limits being applied, triggering rate limits unnecessarily. As you’ve pointed out, the scenario where leaderboards start loading during server migration is a perfect example of how this oversight could escalate problems, resulting in broken leaderboards and compromised gameplay data.
While Roblox's intentions to provide optimization tools are commendable, the potential for unexpected issues arising from these limits feels like an unnecessary burden on developers who already juggle plenty of challenges. Introducing such a change without ensuring precise metrics like CCU, or robust safeguards against exploitation, leaves a lot of room for error. I hope Roblox revisits these changes and considers feedback like yours to prevent negative outcomes for developers and players alike.
In a nutshell, the dashboard is a good idea, but the limits are not, frfr.