The main datastore for my game is just one datastore named “PlayerData”.
local datastore = game.DataStoreService:GetDataStore("PlayerData")
However, I want to split the player’s data into different categories within the main PlayerData datastore. It seems like I can use scopes to achieve this. For example:
local statsDatastore = game.DataStoreService:GetDataStore("PlayerData", "Stats")
local inventoryDatastore = game.DataStoreService:GetDataStore("PlayerData", "Inventory")
This way, if I only wanted to get the player’s level, for example, I could get the much smaller data from the “Stats” datastore, rather than getting all of the player’s data at once.
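A minimal sketch of what that scoped read might look like (the key scheme and the `Level` field are just assumptions for illustration):

```lua
local DataStoreService = game:GetService("DataStoreService")
local statsDatastore = DataStoreService:GetDataStore("PlayerData", "Stats")

local function getLevel(userId)
	-- Only the small "Stats" record is fetched, not the whole player file.
	local stats = statsDatastore:GetAsync(userId)
	return stats and stats.Level
end
```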
I would recommend against using scopes. From personal experience, they hurt more than they help. If you’re not running the risk of exceeding the limit of a DataStore entry or hitting throughput ratelimits, it’s preferable to store all data in a single table.
There isn’t a performance impact you need to worry about that would be avoided by only loading a small amount of data, and in most scenarios you’re more likely to run into request rate limits than data-size limits. Highly fragmented data requires more reads and writes to interact with and is just harder to manage overall.
I was pushing the limits of a single DataStore entry, and the size ended up being over 2MB. While that’s definitely an exaggerated figure, would it still be the best option to fetch all of that data, even for a small value?
Notice that each stat would need a separate request, and there is a request limit for DataStores.
It is better to group them up, even if that means sending more data in one go.
You might think this increases unnecessary data transfer by a lot, and you’d be right.
That is why ProfileStore only gets the data from the DataStore once and (optionally) replicates it to the client side.
You then access and modify the local copy on the server (and read the local copy on the client), and the data is saved back to the DataStore when the player leaves (or periodically).
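A minimal sketch of that “load once, cache, save on leave” pattern with the plain DataStore API (this is not ProfileStore’s actual API, just the idea behind it; the default fields are made up, and a real system would also use UpdateAsync, retries, and session locking):

```lua
local DataStoreService = game:GetService("DataStoreService")
local Players = game:GetService("Players")

local playerData = DataStoreService:GetDataStore("PlayerData")
local cache = {} -- [player] = their data table

Players.PlayerAdded:Connect(function(player)
	-- One read per session instead of one per stat.
	local ok, data = pcall(function()
		return playerData:GetAsync(player.UserId)
	end)
	cache[player] = (ok and data) or { Level = 1, Coins = 0, Inventory = {} }
end)

Players.PlayerRemoving:Connect(function(player)
	local data = cache[player]
	cache[player] = nil
	if data then
		-- One write per session (plus periodic autosaves in practice).
		pcall(function()
			playerData:SetAsync(player.UserId, data)
		end)
	end
end)
```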
If you’re running into data throughput limits from interacting with large files, I’d look into how often you read and write and see if you can reduce the frequency, but as long as you’re 100% certain a file cannot exceed the 4MB single-key limit, I would recommend keeping everything in one place so you can write to any and all parts of the file in a single call, reducing the risk of a file being put into an unhealthy state if some writes succeed while others fail.
If your certainty of not hitting the 4MB limit is ever less than 100%, though, then you will have to do something about it, but I’d recommend trying to compress your data before fragmenting it. The most general-purpose approach would be encoding it in JSON and then performing generic string compression on it with whatever algorithm you prefer. I don’t really like this option because it’s difficult to estimate how much space you’ll save ahead of time, but it’s a bit more drag-and-drop than coming up with a custom system. You are more likely to see better results from a custom system, though, since it can be tailored to the specifics of your data; you have the information about what everything means. Either option will likely render your data un-human-readable, though, which can be a minor debugging annoyance.
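The shape of that “JSON then compress” pipeline, sketched in Luau. The run-length encoder here is only a stand-in for a real compressor (a community LZW/deflate module would do much better), and the example data is invented:

```lua
local HttpService = game:GetService("HttpService")

local function rleEncode(s)
	-- "aaab" -> "3a1b"; only wins on long runs, unlike real compressors,
	-- and a matching decoder is needed on load.
	return (s:gsub("((.)%2*)", function(run, ch)
		return #run .. ch
	end))
end

local data = { Level = 12, Inventory = { "Sword", "Sword", "Shield" } }
local json = HttpService:JSONEncode(data)
local compressed = rleEncode(json)
-- SetAsync the compressed string; on load, decode then JSONDecode.
```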
If you do end up needing to fragment your data, I’d recommend keeping as many “small” things together as possible and only shipping out bigger structures to new entries. For instance, I’ve worked on systems where players can have creations whose individual files can exceed a single DataStore entry. To manage this, we keep an identifier for the file in the main player data table and save the file as however many fragments we need to. Reading and writing takes a lot of requests, but it does work at least.
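A hedged sketch of that fragmentation scheme, splitting one large string across multiple keys with a chunk count kept in the main player data as the manifest (key names and the chunk size are invented; every call here would need pcall and retry handling in practice):

```lua
local CHUNK_SIZE = 3_500_000 -- stay safely under the ~4MB per-key limit

local function saveFragmented(store, baseKey, blob)
	local chunkCount = math.ceil(#blob / CHUNK_SIZE)
	for i = 1, chunkCount do
		local chunk = blob:sub((i - 1) * CHUNK_SIZE + 1, i * CHUNK_SIZE)
		store:SetAsync(baseKey .. "_part" .. i, chunk)
	end
	return chunkCount -- record this in the main data table
end

local function loadFragmented(store, baseKey, chunkCount)
	local parts = table.create(chunkCount)
	for i = 1, chunkCount do
		parts[i] = store:GetAsync(baseKey .. "_part" .. i)
	end
	return table.concat(parts)
end
```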
There’s always the matter of fields with the potential for infinite growth, though. An example that springs to mind is recording Developer Product transactions. While you don’t necessarily need to keep track of all of them indefinitely, if you want to offer a “restore purchases” option or something where players can wipe their data but keep purchases they’ve made, you’d need to keep track of them all. In a situation like this, I’d recommend reserving some space in the main file to store at least some receipts so that saving the effects on the player data and recording the receipt can happen in the same call, but then you could periodically flush receipts to a “cold storage” key to reclaim the space and then set a flag in the main file to indicate that more receipts exist elsewhere.
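A sketch of that “reserve space, then flush to cold storage” idea for receipts (the key names, field names, and threshold are all invented for illustration):

```lua
local MAX_INLINE_RECEIPTS = 50

local function recordReceipt(data, receipt)
	-- Lands in the same file as the purchase's effects, so both
	-- are saved in one call.
	table.insert(data.Receipts, receipt)
end

local function flushReceiptsIfNeeded(store, userId, data)
	if #data.Receipts < MAX_INLINE_RECEIPTS then return end
	-- Append the inline receipts to a cold-storage key, then reclaim space.
	store:UpdateAsync("Receipts_" .. userId, function(old)
		old = old or {}
		for _, r in ipairs(data.Receipts) do
			table.insert(old, r)
		end
		return old
	end)
	data.Receipts = {}
	data.HasArchivedReceipts = true -- flag: more receipts exist elsewhere
end
```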
This post is a bit ramble-y, but I mostly want to convey that I’ve only ever regretted fragmenting player data. Even in cases where it is necessary, it’s still a source of complexity that has to be maintained that I’d always prefer to not have to deal with.
This sounds like some voxel building games,
where the player is granted a patch of land and then uses different blocks and props to build.
Assume the space is 40x40 blocks, and it usually supports building very high, say 100 blocks.
Then merely recording whether each voxel has or hasn’t a block needs 160,000 “yes/no” bits in some fixed order = 19.53kb. If we also want to store which block occupies each voxel, we might need a byte per voxel, which is 160,000 bytes = 156.25kb. Notice this is a compact format.
If we instead store it in another format, such as the character string “{0,155,155,0,0,254…}” (sequential voxel block types), then since a block type ranges from 0-255, we need at most 3 characters = 3 bytes, plus 1 byte for the comma. So we can say each block needs up to 4 bytes, putting us at 625kb.
If we want 60x60x100 in the 4-byte format, it is about 1.4mb.
If we need to store props, like a door or table that is not a cube, then we also need to store their orientations. We could store just 0-3 for the four 90-degree turns. If positions are not snapped to voxel centers, we need 3 floats for that, and if rotations are not limited to 90 degrees, we need even more bytes.
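The compact one-byte-per-voxel format above can be sketched with Luau’s buffer library (the 40x40x100 grid and the indexing order are just the assumptions from the numbers above):

```lua
local SX, SY, SZ = 40, 40, 100
local voxels = buffer.create(SX * SY * SZ) -- 160,000 bytes, all zero (= empty)

local function index(x, y, z) -- 1-based coordinates, fixed order
	return (z - 1) * SX * SY + (y - 1) * SX + (x - 1)
end

local function setBlock(x, y, z, blockType) -- blockType: 0-255
	buffer.writeu8(voxels, index(x, y, z), blockType)
end

local function getBlock(x, y, z)
	return buffer.readu8(voxels, index(x, y, z))
end

setBlock(1, 1, 1, 155)
```

For saving, the buffer can be converted to a string with `buffer.tostring` and then Base64-encoded (DataStores expect UTF-8/JSON-safe data), which adds roughly 33% overhead but is still far smaller than 4 bytes per voxel.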
Yes, single entries make life much easier, especially when dealing with Developer Product transactions since I can know for sure whether or not the transaction was handled successfully.
As for what I worked on that would hit the limit, it was a 3D drawing game. Drawing on its own would take quite a bit of work to overflow a single key, but then we added copy-and-paste tools, at which point it became fairly trivial…