Is it a good idea to have a ModuleScript with tons of data? Like a lot?!
Not all the data will be used at once, and the script is organized as hell, so if another script needs something it can easily access the table with all the data. Inside that table there are categories, each representing another table, and so on.
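For reference, here's a rough sketch of the kind of layout I mean (all the names are just placeholders):

```lua
-- DataModule (a ModuleScript, e.g. in ReplicatedStorage); placeholder names throughout
local Data = {
    Weapons = {
        Swords = {
            IronSword = { Damage = 12, Cost = 100 },
            SteelSword = { Damage = 18, Cost = 250 },
        },
    },
    Enemies = {
        Goblins = {
            Scout = { Health = 40, WalkSpeed = 18 },
        },
    },
}

return Data
```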
Don’t ask me why I want to know this! And another question: if I let a function go over all of the tables, how long would that take? If we have, like, 1k tables per table inside the main table? So probably somewhere around 1,000 * 1,000,000 tables…
Translated to lines of code, probably 5 * (the amount of tables), so around 1 billion lines of code. Oops…
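To be clear, this is roughly what I mean by letting a function go over all the tables (just a sketch, the module path is a placeholder); the time it takes scales with how many entries it actually visits, not with how many lines of code the module is:

```lua
-- Rough sketch of a full walk over the nested data.
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local Data = require(ReplicatedStorage.DataModule) -- placeholder module name/path

-- Recursively count every nested table reachable from tbl.
local function countTables(tbl)
    local count = 0
    for _, value in pairs(tbl) do
        if type(value) == "table" then
            count += 1 + countTables(value)
        end
    end
    return count
end

print("Nested tables found:", countTables(Data))
```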
Technically speaking, if it’s well organized and you’re using an efficient algorithm to store and retrieve the data, you should be good to go, although admittedly 1 billion lines of code is a bit excessive.
@henrydanger5472 If you meet the requirements above, then memory is the most likely limit you’ll run into once the data grows well beyond its current size. Dividing the data into separate modules might prevent that issue, since each module’s data would only be loaded into memory when that module is required.
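As a rough sketch (the loader name and module layout are just placeholders, not a fixed pattern), something like this only pulls a category into memory the first time anything asks for it:

```lua
-- DataLoader (ModuleScript); each category lives in its own child ModuleScript.
-- require() only runs a module the first time it's called, so a category's data
-- stays out of memory until something actually requests it.
local DataLoader = {}

function DataLoader.get(categoryName)
    local child = script:FindFirstChild(categoryName)
    assert(child, "No data module named " .. tostring(categoryName))
    return require(child)
end

return DataLoader
```

Then another script just calls `DataLoader.get("Weapons")` (or whatever the category is called) and only that one module gets loaded.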
However, there is only code in this game, nothing else, just a BasePart and some UIs.
And all of it scales together with the data in the ModuleScript. So basically I could buy servers capable of handling huge datasets and then just use HTTP to talk to Roblox. The data would just sit externally, so that problem would be gone.
At least… I hope. The only remaining problem is probably the several million dollars I’m eventually going to be spending on this project.
EDIT: Or I just don’t use Roblox at all anymore and go to something more reliable.
Hosting an API is a good option, since it would make it possible to retrieve one part of the data at a time, although if you need to update it very frequently you might hit the rate limit on HttpService’s methods.
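As a rough example, assuming an external endpoint that serves one category at a time (the URL and function names here are placeholders), a server Script could fetch data like this; just keep in mind HttpService allows roughly 500 requests per minute per game server, last I checked:

```lua
-- Server Script: fetch one slice of the dataset from an external API.
-- The URL and JSON shape are placeholders; "Allow HTTP Requests" must be enabled.
local HttpService = game:GetService("HttpService")

local BASE_URL = "https://example.com/api/data" -- placeholder endpoint

local function fetchCategory(categoryName)
    local ok, result = pcall(function()
        return HttpService:GetAsync(BASE_URL .. "/" .. categoryName)
    end)
    if not ok then
        warn("Request failed:", result)
        return nil
    end
    return HttpService:JSONDecode(result)
end

local swords = fetchCategory("Swords") -- returns a plain Lua table, or nil on failure
```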
Yeah, that would work, but I know from experience that Roblox Studio will struggle to stay alive and break down at around 10,000 lines of code within a single script.
Maybe, but if you have that much data you should always use an external database. How do you even end up with trillions of characters’ worth of data to begin with?
I’m gonna take a gamble and say this has something to do with AI; I can’t think of anything else that would need so much data, but then again I’m likely wrong.