How to use DataStore2 - Data Store caching and data loss prevention

I’m getting this warning in my output:
Request was throttled. Try sending fewer requests. Key =
And now my main script seems to be running abnormally slow (although I don’t really use the module in there; that’s handled in another script).

Any idea why this is happening, or if it’s even connected?

18 Likes

How many unique keys are you using for DataStore2? That warning is the throttle for ordered data stores.

15 Likes

Quick question: would I store multiple inventories under one variable, or would I make multiple stores?

My logic in place:

local Item1Store = DataStore2("Item1Store", Plr)
local Item2Store = DataStore2("Item2Store", Plr)
local Item3Store = DataStore2("Item3Store", Plr)
18 Likes

That seems like it’d be up to preference rather than anything related to DataStore2. Neither approach will use more memory than the other.
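
For illustration, a minimal sketch of both layouts, assuming a simple item-count inventory (the store names and fields here are placeholders, not anything DataStore2 requires):

-- Option A: one DataStore2 store per inventory type
local Item1Store = DataStore2("Item1Store", Plr)
local Item2Store = DataStore2("Item2Store", Plr)

-- Option B: a single store holding every inventory in one table
local InventoryStore = DataStore2("Inventory", Plr)
local inventory = InventoryStore:Get({
	Item1 = 0,
	Item2 = 0,
	Item3 = 0,
})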

17 Likes

An update has been pushed with a new feature: combined data stores. Documentation has been added. Tell me if you have any issues with or without using these.
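
For anyone trying these out, usage looks roughly like the sketch below; the master key name “DATA”, the sub-keys, and the module path are placeholders, so check the documentation for the exact setup:

local DataStore2 = require(game:GetService("ServerScriptService").DataStore2) -- adjust to wherever the module lives

-- Combine must run before any DataStore2("coins", player) style calls
DataStore2.Combine("DATA", "coins", "inventory")

game.Players.PlayerAdded:Connect(function(player)
	-- Both of these now save under the single combined "DATA" key
	local coinStore = DataStore2("coins", player)
	local inventoryStore = DataStore2("inventory", player)
	print(player.Name, "joined with", coinStore:Get(0), "coins")
end)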

14 Likes

Sorry for taking so long to reply. I had another script somewhere using an OrderedDataStore, but it was updating every 60 seconds. Is that too often when this module is saving data as well?

13 Likes

You should be fine, but I’d recommend only using one unique key in that case (or using the new combined data stores feature, for projects that don’t have existing data or where you’re willing to port old data).

11 Likes

Did someone say tutorial? Alright, give me a week. Keep an eye on dutchdeveloper on YouTube for the tutorial.

18 Likes

I believe you missed a .OnServerEvent here :yum:

10 Likes

You’re right, fixing.

11 Likes

I just made a tutorial on this module:

275 Likes

Does this module set itself up correctly for garbage collection? (i.e. clearing data when a player leaves, removing additional stored data that isn’t used, or providing a garbage collection function to optimize memory)

10 Likes

Yes.

14 Likes

Does it work correctly with (or have a preference for) storing all the data in a single table instead of spreading it across multiple different saves, or does it automatically store the “coinstore” or whatever as part of the save dictionary? Does it also support data that is not up to date with the current format (replacing missing fields with defaults or with newly calculated values)?

6 Likes

Saving all the data in one table is the point of combined data stores, which I have a tutorial for in the post. By default, all keys will be split among different data stores; however, it is recommended to use combined data stores so that you aren’t throttled.

I’m not sure how that would be a built-in feature, but the closest thing it has is GetTable, which is the same as Get except that if the saved table is missing a key that the default you gave GetTable has, it’ll add it.
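
A rough sketch of that behavior, with placeholder keys:

local statsStore = DataStore2("stats", player)

-- If the saved table was written before "Gems" existed, GetTable
-- adds Gems = 0 from the default before returning the table.
local stats = statsStore:GetTable({
	Coins = 0,
	Gems = 0,
})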

7 Likes

You should add default tables with functions, so that nonexistent elements (or saves) are appended on load (or the function is called with the saved data to calculate them).
An example being:

local default = {
	Cash = 1000,
	Exp = 0,
	Level = function(t) return math.floor((t.Exp or 0) / 100) end,
}

-- later
local data = module:LoadWithDefault(name, key, default) -- or whatever the equivalent is

and if the old table had no Level, it would add the key with the value returned by the function.

6 Likes

I’m not sure what you’re suggesting here; can you provide an example?

You added an example, I see. This seems like something you’d either just write your own wrapper for or use BeforeInitialGet, the latter being more extensible.
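
Roughly, the BeforeInitialGet route could look like the sketch below; the field names are placeholders, and the callback runs on the raw saved value before the first :Get():

local statsStore = DataStore2("stats", player)

-- Older saves without a Level field get one computed from Exp on load
statsStore:BeforeInitialGet(function(savedData)
	if type(savedData) == "table" and savedData.Level == nil then
		savedData.Level = math.floor((savedData.Exp or 0) / 100)
	end
	return savedData
end)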

9 Likes

I just think, in my opinion, that it should’ve been implemented so that it only uses a different data store name when specified, or when the data size is too large. It could store that data in one dictionary, and it could also store it in separate JSON tables as a stream if the data limit is reached. This would help with managing space and also make it easier to deal with (i.e. going from dss:GetDatastore(name) when making a new item, to datastoretbl, so that all new keys go in it as opposed to individual keys).

6 Likes

I’m not sure you understand how DataStore2 works? Either way, I’ve never seen data in the wild go over the 260k character limit, and if yours does, you have serious data-saving problems.

5 Likes

Yeah, just to put that in perspective: the Bible has about 3.5 million letters (not including spaces or punctuation), so 260k characters is over 7% of the Bible. That’s quite enough space for the vast majority of things.

6 Likes