I’m here to share my implementation for using Luau in parallel as seamlessly as possible!
DISCLAIMER!
After some testing, it has become clear that the use case for a setup like this is very limited. Only use this module if the computing time far exceeds the time it takes to transfer data (each load takes about .002 seconds to send and receive). It takes extra effort and careful design to make parallel Lua worth it.
Place the module in ReplicatedStorage (make sure the Actor and ClientActor prefabs exist as children of the module).
Do the following:

```lua
-- Require the Instance of the ModuleScript
local ParallelModule = require(_path.ParallelLua)(_path.MyModule)

-- Create an array of parameters (can be an array of tables as well)
local paramsList = table.create(100, true) -- Just an example

-- Returns an array of the returned results
local returns = ParallelModule("FunctionName", paramsList)
print(returns)
```
Upon calling “ParallelModule”, the code will yield until every set of parameters in ‘paramsList’ has been processed. Internally, we’re simply iterating through the array and executing the function within coroutine.wrap; the calling thread stays yielded until this process has finished.
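To make that scatter-gather pattern concrete, here is a minimal sketch of the idea, not the module's actual internals. The `invoke` parameter is a stand-in I've assumed for the per-actor BindableFunction call, so the coroutine bookkeeping is visible on its own:

```lua
-- Sketch: run `invoke` once per entry of paramsList inside coroutine.wrap,
-- and yield the caller until every result has come back.
local function dispatchAll(invoke, paramsList)
	local results = {}
	local remaining = #paramsList
	local waiting = nil

	for i, params in ipairs(paramsList) do
		coroutine.wrap(function()
			results[i] = invoke(params)
			remaining = remaining - 1
			if remaining == 0 and waiting then
				coroutine.resume(waiting) -- wake the caller once all calls returned
			end
		end)()
	end

	if remaining > 0 then
		waiting = coroutine.running()
		coroutine.yield() -- block the caller until every result is in
	end

	return results
end
```

With a synchronous `invoke` the coroutines finish immediately; with an asynchronous one (like a real BindableFunction invocation) the caller yields until the last result lands.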
Example
Here’s an example of the difference. More details can be found on the GitHub page.
It contains an Actor. Inside the Actor are a BindableFunction, a Script, and an ObjectValue. When you call the ParallelLua module (the module returns a function, so you just call it) with the parameters (ModuleScriptInstance, NumberOfActors), it internally clones the actor prefab that many times, and the clones sit ready to be used by the function that gets returned.
The returned function is what you then call with the parameters (FunctionName, ParametersArray).
This function pauses the running thread, gets the next actor (each call cyclically moves on to the next one), iterates over all the parameters, invokes that actor’s BindableFunction once per set of parameters, and adds the results of each call to another table of return values.
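The "cyclically moves to the next actor" part is just a wrapping index. Here is a small sketch of that round-robin selection; the names (`ActorPool`, `getNext`) are my own, not the module's:

```lua
-- Sketch: hand out actors in round-robin order, wrapping back to the
-- first actor after the last one has been used.
local ActorPool = {}
ActorPool.__index = ActorPool

function ActorPool.new(actors)
	return setmetatable({ actors = actors, nextIndex = 1 }, ActorPool)
end

function ActorPool:getNext()
	local actor = self.actors[self.nextIndex]
	-- `% #self.actors + 1` wraps the 1-based index back to 1 after the end
	self.nextIndex = self.nextIndex % #self.actors + 1
	return actor
end
```

Each call to the returned function would grab `pool:getNext()` and send that actor the whole parameter batch.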
What I will say is kinda off topic, but I wish Roblox would just add:
task.runParallel(function, …) → yields the current thread until the parallel execution finishes and returns whatever the passed function returned.
task.isParallel() → returns a bool indicating whether or not the current code is being run in parallel.
The first one can be achieved by making your own custom function using a combination of task.desynchronize() and task.synchronize(), but as far as I know, Roblox has stated that using more actors is better.
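For illustration, a custom `runParallel` built that way might look like the sketch below. Note this is an assumption about how you'd wire it up, and the script would have to live under an Actor for task.desynchronize() to have any effect:

```lua
-- Sketch: run `fn` in the parallel execution phase, then rejoin the
-- serial phase before handing its results back to the caller.
-- Must run inside a script parented under an Actor.
local function runParallel(fn, ...)
	task.desynchronize() -- switch this thread into the parallel phase
	local results = table.pack(fn(...))
	task.synchronize() -- switch back to serial before touching shared state
	return table.unpack(results, 1, results.n)
end
```

Even with this, all the work still happens on one actor's VM, which is presumably why Roblox recommends spreading work across more actors instead.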
All this actor stuff drives me crazy and is why I have avoided using the parallel functions and actors.
There are also functions that, in my opinion, should be totally safe to call, but they aren’t, for example: GetPivot(), GetBoundingBox(), GetExtentsSize().
Now, I know parallel Lua was just released and they said they will keep working on it, but jeesh. From my perspective, it would have been better to just have the two functions I mentioned at the beginning to ease things. With them you could do something like:
```lua
local HugeTable = workspace.MainInstances:GetChildren() --> Imagine this has 5000 instances that you want to run an expensive check on.
local SmallerTables = list.split(HugeTable, 5) --> Imagine we split HugeTable into 5 parts, 1000 instances in each table.
local InstanceChecked = {}

for i, Table in ipairs(SmallerTables) do
	task.spawn(function()
		local PassedCheck = task.runParallel(function()
			--------------------------
			--> Do some expensive code
			--------------------------
			return GoodInstances --> A table containing all instances (out of the 1000) that passed the check.
		end)
		table.insert(InstanceChecked, PassedCheck)
	end)
end

-- BOOM! You now have 5 functions running in parallel and the work is getting done faster.
-- (You could create a BindableEvent that fires whenever they all finish.)
```
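The `list.split` used above is a hypothetical helper, not a real Roblox API; a plain-Lua sketch of it could be:

```lua
-- Sketch of a hypothetical list.split: divide one array into `n`
-- roughly equal chunks, front-loading any remainder.
local list = {}

function list.split(array, n)
	local chunks = {}
	local total = #array
	local baseSize = math.floor(total / n)
	local remainder = total % n
	local index = 1

	for chunkIndex = 1, n do
		-- The first `remainder` chunks take one extra element
		local size = baseSize + (chunkIndex <= remainder and 1 or 0)
		local chunk = {}
		for _ = 1, size do
			chunk[#chunk + 1] = array[index]
			index = index + 1
		end
		chunks[chunkIndex] = chunk
	end

	return chunks
end
```

Splitting 5000 instances with `list.split(HugeTable, 5)` would give five tables of 1000 each, one per parallel worker.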
Yeah, no doubt the parallelism available to us is extremely limiting. I tried my best to at least get some use out of it; however, I’m finding that in most cases, the time lost transferring data between VMs through BindableEvents outweighs any benefit of the extra processing power.
The only time this is really useful is if the processing time greatly outweighs the time it takes to transfer data.
Having those functions wouldn’t work as-is, since a variable could be captured by the function running in parallel while also being used outside the function, in serial, in a different Lua VM. Though maybe they could make it throw an error if a variable is not local to the function, or something.