I think changing the request limits to per-experience is much better for cross-server interactions, such as messaging, MemoryStore, or matchmaking, where some datastore requests may be "offloaded" to one server on behalf of the cross-server interaction without eating into that server's own limits and potentially impacting the players on it. This eliminates any need to look into custom implementations of when and where to offload requests. Awesome change.
I really love this! I love that this allows creators to see the datastores, the keys in each one, and even the versions of each key. It really helps if a player experiences data loss.
However, I noticed that I have a lot of “trash” data stores from old scripts/modules I haven’t used in years. Will there ever be a way to delete entire DataStores in the future?
I used to pray for times like this
I think I completely missed what makes this so awesome the first time I read it - so any key in the datastore can now be Get/Set the maximum times the universe allows? No more per-key rate limits?
Thank god, I hate rate limits.
I don’t know why it took me so long to question this, but will Enum.DataStoreRequestType be adjusted to suit the new approach? As in the layout itself, since SetAsync and RemoveAsync are grouped under Enum.DataStoreRequestType.SetIncrementAsync, but they will have different request limits.
I can make assumptions, such as Enum.DataStoreRequestType.SetIncrementAsync and Enum.DataStoreRequestType.SetIncrementSortedAsync being combined, Enum.DataStoreRequestType.RemoveAsync being added, and the Enum.DataStoreRequestType.ListAsync documentation being adjusted to include ListDataStoresAsync. But I cannot act on assumptions, as my approach to datastores relies heavily on these request types to ensure I have enough budget to splurge when a server goes down, or enough to continue looking through keys/versions (mainly for this).
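For context, the kind of budget gating I rely on today looks roughly like this; a minimal sketch using GetRequestBudgetForRequestType, where the request type and threshold are just illustrative:

```lua
local DataStoreService = game:GetService("DataStoreService")

-- Yield until this server has at least `minimumBudget` requests left of the given
-- type, so a burst (e.g. saving everyone when a server goes down) can't exhaust it.
local function waitForBudget(requestType, minimumBudget)
	while DataStoreService:GetRequestBudgetForRequestType(requestType) < minimumBudget do
		task.wait(1)
	end
end

-- Illustrative usage: make sure there is room for 5 more writes before saving.
waitForBudget(Enum.DataStoreRequestType.SetIncrementAsync, 5)
```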
i mean, they host everything for you, they have to make money somehow… it doesn’t just grow on trees man. i get 73% is a pretty unfair amount, but they gotta do what they gotta do in order to even pay you in the first place
Now that the 100MB visual bug has finally been resolved for me, my “Total Size” still remains above what I’ve been granted for the game.
My main problem is caused by the old data format my game used, based on “Datastore2”, which would create a new key within a scope for each “version”, inflating my datastores significantly.
Could we please get the ability for devs to select individual datastores, and entire scopes within a selected datastore, and wipe all keys except the latest one added to each scope? That would prevent complete data loss for players whose data is still saved via that old system.
If we can’t get that, could we get the ability to run a script over all created datastores in the game, fetch the latest key saved in every scope without rate restrictions, and perform a one-time data conversion, without needing older players who haven’t played in a while (or were one-time players) to rejoin the game?
With a converter like that implemented for my game, it would delete the datastores left over from the old Datastore2-based system with minimal issues, free up all of that space, and bring the total size far lower, likely no more than 400MB at an extreme over-estimation.
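For illustration, a rough sketch of what such a one-time cleanup pass could look like with today's API, assuming the leftover keys can be enumerated with ListDataStoresAsync and ListKeysAsync; the isLegacyKey/isLatestKey checks are placeholders for game-specific logic, and scope handling is left out:

```lua
local DataStoreService = game:GetService("DataStoreService")

-- Placeholder checks: only you know which keys came from the old Datastore2-style
-- system and which one is the latest version that must be kept.
local function isLegacyKey(keyName)
	return false
end
local function isLatestKey(keyName)
	return true
end

-- Walk every page of a Pages object and run a callback on each item.
local function forEachPage(pages, callback)
	while true do
		for _, item in ipairs(pages:GetCurrentPage()) do
			callback(item)
		end
		if pages.IsFinished then
			break
		end
		pages:AdvanceToNextPageAsync()
	end
end

forEachPage(DataStoreService:ListDataStoresAsync(), function(storeInfo)
	local store = DataStoreService:GetDataStore(storeInfo.DataStoreName)
	forEachPage(store:ListKeysAsync(), function(keyInfo)
		if isLegacyKey(keyInfo.KeyName) and not isLatestKey(keyInfo.KeyName) then
			store:RemoveAsync(keyInfo.KeyName) -- frees the bytes stored at that key
		end
	end)
end)
```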
I personally believe these kinds of features would be critical for any games that have been essentially blocked from benefiting from these changes by an open-source module used years ago.
And no, I’ve not received any DMs about my options to resolve this, despite that being mentioned multiple times by members of the Roblox team here.
I hope someone at Roblox can answer this and provide some transparency on how cases like these will be able to resolve issues caused by flawed systems such as the Datastore2 example in this reply.
Edit: Could we also get the ability to view how much “Size” each datastore key uses individually?
Thanks for the report, I will DM you about your case
Okay. I’ve been mulling this over, looking at numbers, and making careful considerations. I’ve come to a verdict.
I wouldn’t be entirely upset with this update, if and only if there were a way to effectively delete and remove datastores from our games. I mean completely removing entire datastores, not just deleting all the keys. I say this because old, unused datastores will still take up space regardless of whether we even need them anymore. This is completely unacceptable.
If you guys add methods to effectively clean up datastores (or better yet: just add an option to clear & remove datastores from the new datastore manager!), then I wouldn’t be too upset with this update. But since there is presently no way to do that, I only see this being an enormous headache. Especially for my game, which has to store data for potentially THOUSANDS of procedurally-generated, persistent star systems.
Until you do that, this update remains one that I will staunchly oppose.
p.s. if security or accidental deletion is a concern with deleting datastores, you can mitigate that:
- Require a verified phone, authenticator app code, and/or security key to remove a datastore.
- Require completing a captcha to remove a datastore.
- Have a warning pop-up show up that does not let you close it for 5 or so seconds. This would force people to slow down and think about what they’re doing, and I wouldn’t mind it for such a dangerous feature.
- Have a countdown timer that allows a dev to cancel a datastore removal.
Can you please explain how you are calculating that out of 55M experiences there are only 30 over the limit? If you are calculating it as (100MB + 1MB × unique visits) < TotalSize, then that is not the correct way to calculate it; it is a simplified but unreliable answer. You should instead have checked (100MB + 1MB × unique visits) < (max key size × total keys), since not all users will have used their full data allotment yet.
Let’s say there are 3 users. We won’t count the 100MB, since that acts more like a buffer or space for developer objects.
User 1: 400KB
User 2: 2MB
User 3: 30KB
With the first calculation this is in range. But obviously User 2 is using more data than Roblox allocates per user, which means that once the changes take effect, if User 1 and User 3 keep playing and gain more data, the experience will go over the limit. I really feel the data allotment per user should be 4MB, as that is what developers usually design around because a key is 4MB. Did you also check how often games update users’ data, to make sure some games don’t break from being rate limited?
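To make that concrete, here is a quick sketch of both checks for those three users, ignoring the 100MB base as above and assuming the 1MB-per-visit and 4MB-per-key figures; the numbers are just this example:

```lua
-- Quick sketch of the two checks from this example (illustrative numbers only).
local perVisitQuotaMB = 1   -- allotment per unique visit
local maxKeySizeMB = 4      -- maximum size of a single key
local userSizesMB = { 0.4, 2, 0.03 } -- User 1, User 2, User 3

local totalUsedMB = 0
for _, size in ipairs(userSizesMB) do
	totalUsedMB += size
end

local allottedMB = perVisitQuotaMB * #userSizesMB  -- 3 MB
local potentialMaxMB = maxKeySizeMB * #userSizesMB -- 12 MB

print(totalUsedMB <= allottedMB)    -- true: "in range" by the announced check
print(potentialMaxMB <= allottedMB) -- false: users could outgrow the quota later
```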
Why is there a throughput limit? Are the GetAsync, SetAsync, etc. limits not good enough?
It’s like MessagingService. There are limits for “messages received per topic”, “subscribe requests per game server”, and “messages sent per game server”, which could all just be turned into experience-wide limits: “messages received per topic” and “messages sent per game server” could become “messages received for the entire game”, and “subscribe requests per game server” could become “subscriptions allowed per game server”. Giving users a limit on how many messages there can be per topic just means they are going to split things into different topics in order to bypass it.
Meshes are another example. The 10K triangle limit for a single mesh (no matter its size) means developers just bypass it by splitting their meshes into chunks, importing them like that, and welding them together, which is probably less efficient for performance than importing it as one. Why do meshes have limits? For lower-end devices? That should be the developer’s concern, not Roblox’s. I’m not sure why these changes are being made to reduce the amount of data developers can have, as they could break many games if users’ data starts to grow.
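To illustrate the MessagingService point, here is a minimal sketch of the topic-splitting workaround I mean; the shard count and topic names are made up, and note it also burns more subscribe requests per server, which is exactly the kind of inefficiency these per-topic limits push people into:

```lua
local MessagingService = game:GetService("MessagingService")

local SHARD_COUNT = 4

local function topicFor(shard)
	return ("GlobalEvents_%d"):format(shard) -- made-up topic naming scheme
end

-- Every server subscribes to all shards...
for shard = 1, SHARD_COUNT do
	MessagingService:SubscribeAsync(topicFor(shard), function(message)
		print("received", message.Data)
	end)
end

-- ...and publishers spread messages across them, so no single topic
-- absorbs the whole message volume.
local function publishSharded(data)
	local shard = math.random(1, SHARD_COUNT)
	MessagingService:PublishAsync(topicFor(shard), data)
end
```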
Let’s say I have a code system that uses a datastore, and for whatever reason the codes total exactly 2.5MB.
Roblox’s read limit is 25MB per minute per key. This means you can only read that code database 10 TIMES per minute. Now if I restarted all the servers for something like an update, I could only have 10 servers access the codes before getting throughput limited. If I had 50 servers, that means the codes would take 5 minutes to load, which is bad. And if you do have more than 10 servers, you would need to either:
- Duplicate the key (servers/10) times
- Create a system to delay servers so they load it spread out over time (without rate limiting another service)
Both of these are really bad.
A game on Roblox right now has 700k players and a 16-player server limit. That means there are 43,750 servers (assuming every server is full). If that game triggered an update to restart all servers, it would take about 72 hours (3 whole days) for every single server to retrieve those codes.
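For reference, the back-of-the-envelope math, plus a crude sketch of the second option (delaying servers); the 25MB/min per-key read limit is assumed from the announcement, and the store and key names are made up:

```lua
-- Rough math from the scenario above.
local keySizeMB = 2.5
local readLimitMBPerMinute = 25
local readsPerMinute = readLimitMBPerMinute / keySizeMB -- 10 servers per minute

local serverCount = 43750
print(serverCount / readsPerMinute / 60) -- ~72.9 hours for every server to read once

-- Crude stagger (option 2): each server waits a random slice of a window before
-- reading, trading instant availability for staying under the throughput cap.
-- A fixed window like this only works for small fleets, which is the problem.
local DataStoreService = game:GetService("DataStoreService")
local codesStore = DataStoreService:GetDataStore("Codes") -- made-up name

task.delay(math.random(0, 300), function()
	local codes = codesStore:GetAsync("AllCodes") -- made-up key
	print("loaded", codes and "codes" or "nothing")
end)
```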
ngl this just seems like it would encourage devs to bot their experiences
and it wouldn’t even be caught because the bots need to stay in the game for 10 seconds then never join again
also the 30 out of 55 million experiences thing seems misleading when a ton of those experiences are probably slightly modified default places or those weird randomly generated games that steal thumbnails of existing games
So, the max limit per user is still 4MB, but the total storage limit is lowered.
I never used any compression and I store a lot, but it seems I can still use about 1000 times more data for each player (without any compression).
I feel like the limit could have been lowered a lot more, with the option for games that need more to request an increase up to 1MB per player (for free), or more with payment. I guess you guys don’t need that much space for now.
I’m honestly not too thrilled about having datastore per-player data be reduced from 4MB to 1MB.
1 megabyte is enough for majority of games, but not for games that might have large, open-world systems where players can build anywhere for instance.
Forget about saving voxels: if you have a game that uses voxels or block building similar to Minecraft, you’re effectively screwed.
Even the best data compression algorithms can only compress data so much before it just becomes infeasible.
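For example, here is a minimal sketch of run-length encoding a flat array of voxel block IDs; it only pays off when blocks repeat a lot, and the string format is made up for illustration, which is exactly why compression alone can’t save a dense open-world game:

```lua
-- Run-length encode a flat array of voxel block IDs into a string like "0:500,3:20".
local function encodeVoxels(blockIds)
	if #blockIds == 0 then
		return ""
	end
	local parts = {}
	local current, runLength = blockIds[1], 1
	for i = 2, #blockIds do
		if blockIds[i] == current then
			runLength += 1
		else
			table.insert(parts, ("%d:%d"):format(current, runLength))
			current, runLength = blockIds[i], 1
		end
	end
	table.insert(parts, ("%d:%d"):format(current, runLength))
	return table.concat(parts, ",")
end

-- Expand the encoded string back into the original array of block IDs.
local function decodeVoxels(encoded)
	local blockIds = {}
	for id, count in encoded:gmatch("(%d+):(%d+)") do
		for _ = 1, tonumber(count) do
			table.insert(blockIds, tonumber(id))
		end
	end
	return blockIds
end

print(encodeVoxels({ 0, 0, 0, 3, 3, 0 })) -- "0:3,3:2,0:1"
```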
In a different post I read that creators might soon be able to purchase more data storage for money, but if you’re a small indie developer then this is basically a terrible option.
While some things about this update are good, please reconsider storage limits.
Maybe change the limit per genre, or estimate how much an experience might need.
Let’s say you have a Data Store named Store1 and then another named Store2.
Do both have a separate limit, or do they share the same one?
Is it per experience or per universe? The wording is correct, right?
Because I’m not sure about Catalog Avatar Creator, but let’s say they run out of storage for avatars; couldn’t they just create a sub-place and then use the API to write and read from there?
seems like it’s per universe, experiences are considered universes