Fast way to store a lot of data?

Hello.

Recently I have been trying to store a lot of data in data stores; the data I'm saving is just a very long string. I attempted converting it into a string of hex numbers, but converting to and from hex takes way too long.
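
For context, the hex conversion I tried was roughly like this (a simplified sketch, not my exact code):

```lua
-- every byte becomes two hex characters, doubling the size of the data
local function toHex(str)
    return (str:gsub(".", function(char)
        return string.format("%02X", string.byte(char))
    end))
end

-- reverse: turn each pair of hex digits back into a byte
local function fromHex(hex)
    return (hex:gsub("%x%x", function(pair)
        return string.char(tonumber(pair, 16))
    end))
end
```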

Anyone have any ideas?


In my opinion, using a table to store multiple pieces of data is the fastest and most effective way.

I'm not trying to store a table, I'm trying to store a very very very (insert a very amount of veries) long string.

What type of string are you trying to store? Can you show me?

example:


but a lot longer

Have you tried using this?

Just tested it, too slow; took 13 seconds to encode the data.

I found a good solution: I used this module on GitHub, Base64/Base64.lua at master · Reselim/Base64 · GitHub, which quickly encodes the data into Base64. For me it took about 1 second for 16 million random characters.
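
For anyone else who finds this, usage looks roughly like this. I'm assuming here that the module is placed as a ModuleScript named "Base64" in ReplicatedStorage and exposes string-in/string-out encode/decode functions; check the module's README for the exact API, since newer versions may differ.

```lua
local ReplicatedStorage = game:GetService("ReplicatedStorage")
-- assumes Base64.lua was placed as a ModuleScript named "Base64" in ReplicatedStorage
local Base64 = require(ReplicatedStorage.Base64)

local raw = string.rep("A", 1000) -- stand-in for the real data
local encoded = Base64.encode(raw) -- Base64 text is safe to put in a DataStore
print(Base64.decode(encoded) == raw) --> true
```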


Instead of storing a very long string, why not store it in chunks?

Well, the main problem is the encoding: I have to convert the data into another base because the JSON serialization used by data stores can't handle arbitrary byte values.

Additionally, there was a bit of a lag spike when the data got encoded, so I added a few task.wait calls in the encoding process, which made it run pretty smoothly. There was still a little lag when sending the data to the server, so I just sent it slowly in about 2,000 packets.
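
Roughly what that looked like; the RemoteEvent name, the packet size, and the string-based Base64 API are all just assumptions for this sketch:

```lua
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local Base64 = require(ReplicatedStorage.Base64)
local DataPacket = ReplicatedStorage.DataPacket -- hypothetical RemoteEvent

local PACKET_SIZE = 8192 -- raw bytes per packet, tune to taste

-- client side: encode and send the data one packet at a time
local function sendInPackets(raw)
    local total = math.ceil(#raw / PACKET_SIZE)
    for i = 1, total do
        local chunk = string.sub(raw, (i - 1) * PACKET_SIZE + 1, i * PACKET_SIZE)
        DataPacket:FireServer(i, total, Base64.encode(chunk))
        task.wait() -- yield each packet so the encoding/sending work is spread over frames
    end
end
```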

I would be interested in hearing more about your solution though, and how I could possibly implement it.

I save the encoded data in chunks, and each chunk is encoded in Base64. This works because the chunks are all stored in an array, which you can save through DataStoreService.

I have no idea what you are trying to save, but I guess it's a file of some sort… I'd just recommend working in chunks.
Make a loop that takes every 2048 bytes, compresses it, converts it to Base64, and adds it to the final array. Then you can just save the array in a DataStore.
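
A minimal sketch of that loop; compress() is just a placeholder for whatever compression module you use, the store name is made up, and I'm assuming a string-based Base64 API:

```lua
local DataStoreService = game:GetService("DataStoreService")
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local Base64 = require(ReplicatedStorage.Base64)

local CHUNK_SIZE = 2048

-- split the raw data into 2048-byte pieces, encode each one, and collect them in an array
local function buildChunks(raw)
    local chunks = {}
    for i = 1, #raw, CHUNK_SIZE do
        local piece = string.sub(raw, i, i + CHUNK_SIZE - 1)
        -- piece = compress(piece) -- plug your compression module in here
        table.insert(chunks, Base64.encode(piece))
    end
    return chunks
end

-- "PlayerFiles" is a made-up datastore name
local store = DataStoreService:GetDataStore("PlayerFiles")

local function saveChunks(key, raw)
    store:SetAsync(key, buildChunks(raw)) -- an array of Base64 strings serializes fine to JSON
end
```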