How to load massive data efficiently?

I have a map made of 100k+ parts. I want to use a DataStore to save it and load it on new servers. I can already save it, because I save every block when it changes; however, loading the data for every single block takes hours or even days if I want to load the entire map.

All I save is the name of each part and a table with 3 numbers that I use later to load them. What would be an efficient way of loading an entire map in less than a minute?

Use serialization to convert parts into a table (storing each part's color, size, position, etc. in the table) and HttpService:JSONEncode to turn that table into a single string (a string takes less storage in the DataStore), then use HttpService:JSONDecode to convert that JSON string back into a table.
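Here's a minimal sketch of what that could look like, assuming the map sits in one Folder and everything is saved under a single key "Map" in a DataStore called "MapStore" (both names are just placeholders, not from the original post). If the encoded string gets too large for one key, you'd need to split it across several keys.

```lua
local HttpService = game:GetService("HttpService")
local DataStoreService = game:GetService("DataStoreService")
local mapStore = DataStoreService:GetDataStore("MapStore") -- assumed store name

-- Turn one part into a plain table of numbers and strings
local function serializePart(part)
	return {
		Name = part.Name,
		Position = {part.Position.X, part.Position.Y, part.Position.Z},
		Size = {part.Size.X, part.Size.Y, part.Size.Z},
		Color = {part.Color.R, part.Color.G, part.Color.B},
	}
end

-- Serialize every part in the map folder and save it as one JSON string
local function saveMap(mapFolder)
	local serialized = {}
	for _, part in ipairs(mapFolder:GetChildren()) do
		if part:IsA("BasePart") then
			table.insert(serialized, serializePart(part))
		end
	end
	mapStore:SetAsync("Map", HttpService:JSONEncode(serialized))
end

-- Load the JSON string back and decode it into a table of part data
local function loadMapData()
	local encoded = mapStore:GetAsync("Map")
	return encoded and HttpService:JSONDecode(encoded) or {}
end
```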


Once you get the data from the datastore, I would then use Heartbeat to load small portions of data at a time.

Now, in terms of actually generating and loading the data into parts: rather than scaling up the number of parts per Heartbeat, I would have multiple Heartbeat connections that each load a small number of parts - maybe 5 connections.

Heartbeat runs at the server's FPS, so for example if you're at 40 FPS, and each of your 5 Heartbeat connections loads 6 parts, that's 6 × 5 × 40 = 1,200 parts per second. That's about 83 seconds for a 100k part map. So, if you need it to be under a minute, you can either do more parts per connection, or you can do more connections. You can experiment to see which gives you more performance.
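For the loading side, here's a rough sketch of spreading the work over several Heartbeat connections. The decoded table is passed in as mapData, and the connection count, parts per frame, and the deserialize helper are all assumptions you can tune against the math above.

```lua
local RunService = game:GetService("RunService")

local CONNECTIONS = 5      -- number of parallel Heartbeat connections (assumed)
local PARTS_PER_FRAME = 6  -- parts created by each connection per frame (assumed)

-- Rebuild one part from the serialized table produced when saving
local function deserializePart(data)
	local part = Instance.new("Part")
	part.Name = data.Name
	part.Anchored = true
	part.Size = Vector3.new(data.Size[1], data.Size[2], data.Size[3])
	part.Position = Vector3.new(data.Position[1], data.Position[2], data.Position[3])
	part.Color = Color3.new(data.Color[1], data.Color[2], data.Color[3])
	part.Parent = workspace
end

local function loadMap(mapData)
	local index = 0 -- shared cursor so connections never load the same entry twice
	for _ = 1, CONNECTIONS do
		local connection
		connection = RunService.Heartbeat:Connect(function()
			for _ = 1, PARTS_PER_FRAME do
				index += 1
				local entry = mapData[index]
				if not entry then
					connection:Disconnect() -- no entries left, this connection is done
					return
				end
				deserializePart(entry)
			end
		end)
	end
end
```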

Hope this helps!

Seems like the best solution; I'll give it a try and see how it goes.

Well, after working on it for a few days, I've tried that, but I've still been struggling to load the data. I've also tried splitting the map into chunks and loading the data for each one separately, but it was really hard for me to suddenly start working with tables and DataStores, since I'm not good with either.

Thanks for trying to help, and also thanks to @Icee444 for the other reply, even though it didn’t make sense to me
