Given this (apparently) awesome update, I was trying to see in what circumstances one should use it. Examples and explanations are welcome!
Just for optimization, that’s the point.
Yes, however I don't know in what situations it is most appropriate to use; that is what I would like to know. Obviously I will not be using it for every single line of code.
Actually, you can’t run most actions in parallel due to memory safety.
So it’s useful to use it in heavy scripts wherever you can.
From what I’ve read from here, parallel Luau is just multi-threaded behaviour: the ability to execute several pieces of code from the same program simultaneously, with the added twist from ROBLOX that those pieces have to belong to different actors.
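In practice that looks something like this (a minimal sketch of my own, not code from the announcement; it assumes the Script is a descendant of an Actor, and the attribute name is invented):

```lua
-- Minimal sketch. Assumed hierarchy:
--   Workspace
--     Actor        -- an Actor instance
--       Script     -- this script
-- Only code under an Actor can enter the parallel phase.

local RunService = game:GetService("RunService")

-- ConnectParallel runs this callback in the parallel phase each frame.
RunService.Heartbeat:ConnectParallel(function(deltaTime)
	-- heavy, self-contained work happens here, alongside other Actors
	local total = 0
	for i = 1, 100000 do
		total += math.sin(i * deltaTime)
	end

	task.synchronize() -- switch back to serial before touching any Instance
	script:SetAttribute("LastResult", total) -- "LastResult" is an invented attribute
end)
```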
In the past, when developers have wanted to change the properties of instances asynchronously, they’ve had to use coroutines, which are, in practice, callable sub-programs allowing two segments of code (in Luau, two functions) to run in parallel.
By introducing this in the way ROBLOX has stated in the linked post, they allow developers to streamline this problem without having to appreciate the entire scope of coroutines and asynchronicity. Whilst I prefer lobbing everything into a Script / ModuleScript and editing one portion of code, this method allows developers to place scripts (if needed, where code isn’t repeatable) within models like NPCs and have functions within those scripts run at the same time.
I can imagine this update will be useful for games simulating hordes of zombies, where each individual zombie has an AI. Instead of iterating through every zombie and updating their pathfinding, hordes can now change their behaviour quickly, proportionate to the number of cores being allocated to their associated VM.
Hope this clarifies a little. As mentioned in the post, this update is still in beta - please forgive (and correct) me for any assumptions / misunderstandings I have made.
In the past, when developers have wanted to change the properties of instances asynchronously, they’ve had to use coroutines, which are, in practice, callable sub-programs allowing two segments of code (in Luau, two functions) to run in parallel.
Coroutines and yielding functions are not parallel; they are “evented”, meaning scripts get queued to run and take turns on the thread.
Here is a graphical example of what happens when a coroutine is run or a yielding function is called. It also demonstrates why your player variables may be invalid after a yielding function finishes, and why task.wait() can take longer than the time specified if other heavy code is running as Script B.
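If it helps to see the same thing in code, here is a made-up sketch (not taken from the graphic) of two “scripts” sharing one thread; only one of them runs at any instant, and they swap whenever the running one yields:

```lua
-- Sketch of the "taking turns" behaviour - this is NOT parallel execution.

local function scriptA()
	print("A: start")
	task.wait(1) -- A yields here, so the scheduler is free to run B
	print("A: resumed - possibly later than 1 second if B hogged the thread")
end

local function scriptB()
	print("B: start")
	-- a long enough busy loop here delays A's resume past the 1 second it asked for
	for _ = 1, 50000000 do end
	print("B: done")
end

-- task.spawn (and coroutines generally) only schedule the functions;
-- they never run two of them at the same moment.
task.spawn(scriptA)
task.spawn(scriptB)
```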
I can imagine this update will be useful for games simulating hordes of zombies, where each individual zombie has an AI. Instead of iterating through every zombie and updating their pathfinding, hordes can now change their behaviour quickly, proportionate to the number of cores being allocated to their associated VM.
This is a very good example; parallel code works best over large swaths of objects doing simple tasks that do not interact with each other. The zombies in this case must not try to avoid other zombies; all behaviour should be based on variables outside of the parallel actors. This is because parallel code can read from the same variable without any problem, but if Actor A is reading from something while Actor B overwrites it, it will crash!
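As a rough sketch of that split (every name here is invented, and the Script is assumed to sit under an Actor inside the zombie model): read whatever you need in the parallel phase, then synchronize before you move anything.

```lua
-- Sketch of "read in parallel, write in serial" for one zombie.
-- The folder and model names are made up for the example.

local RunService = game:GetService("RunService")

local zombie = script.Parent.Parent -- the zombie model this Actor lives in
local targets = workspace:WaitForChild("PlayerRigs") -- hypothetical folder of targets

RunService.Heartbeat:ConnectParallel(function()
	-- Reading positions here is fine: many Actors can read the same data at once
	-- (assuming the properties used are marked Read Parallel in the API docs).
	local myPos = zombie.PrimaryPart.Position
	local nearest, bestDist = nil, math.huge
	for _, rig in ipairs(targets:GetChildren()) do
		local d = (rig.PrimaryPart.Position - myPos).Magnitude
		if d < bestDist then
			nearest, bestDist = rig, d
		end
	end

	-- Writing (actually moving the zombie) has to happen in the serial phase.
	task.synchronize()
	if nearest then
		zombie.Humanoid:MoveTo(nearest.PrimaryPart.Position)
	end
end)
```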
Hope this clarifies a little. As mentioned in the post, this update is still in beta - please forgive (and correct) me for any assumptions / misunderstandings I have made.
It is out of beta as of June! Release posted here!
Blimey, quite a few misunderstandings then!
Thank you for the visual example of coroutines; they’re often difficult to explain with words alone, so I appreciate the visual aid and the correction on my poor choice of words. From my use of coroutines, whilst they’re not truly parallel, they’ve often simulated multi-threaded behaviour quickly enough to be as close to parallel as needs be, hence the confusion.
I haven’t delved into detail on the new update, so I appreciate the clarification, but do actors not read from the global _G scope? I understand that there could, potentially, be issues there if actor A were to read a variable x at the same time as actor B, but couldn’t actor A cache the variable (if needs be, deep copy it) to avoid this?
Again, forgive my naivety on the update; I haven’t read as much about it as I should have, yet.
I believe I understand, thanks!
Do you think it would be worthwhile to use it for a bullet hell game?
I believe Actors can read from _G. Actor A can read the variable x at the same time as Actor B reads it. The problem is only when an actor writes while another is reading variable x. This is referred to as thread safety; as outlined in this document, API members have varying safety levels, so make sure any Roblox functions or objects you use are at least marked “Read Parallel” safe.
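On your caching idea: yes, grabbing what you need while serial and working on the copy in parallel is basically the safe pattern. A rough sketch (the attribute names are invented, and it assumes the Script is under an Actor):

```lua
-- Copy shared state while serial, work on the copy in parallel,
-- then synchronize again before writing anything back.

-- read while still in the serial phase; "WaveSize" is an invented attribute
local waveSize = workspace:GetAttribute("WaveSize") or 1000

task.desynchronize()
-- only local/cached values are used here, so no other Actor can be writing to them
local result = 0
for i = 1, waveSize do
	result += i * i
end

task.synchronize()
workspace:SetAttribute("LastWaveScore", result) -- writes go back in the serial phase
```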
Thanks for the info! Very useful.
Cheers!
How can you update the path per zombie when you can’t use PathfindingService in parallel? I can’t figure out a way to constantly calculate paths in parallel. Do you have an idea/solution?
Custom pathfinding is a solution, and it isn’t difficult to implement.
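For example, a small grid search can run entirely in the parallel phase. This is only a sketch under made-up assumptions (the grid size and obstacle test are invented, and it’s not a drop-in replacement for PathfindingService):

```lua
-- Tiny grid BFS an Actor can run while desynchronized. A real game would build
-- the grid and obstacle lookup from its own map data.

local GRID = 50 -- 50x50 walkable grid

local function isBlocked(x, z)
	-- stand-in obstacle test; replace with a lookup into your own map data
	return x % 7 == 0 and z % 5 == 0
end

local function findPath(startX, startZ, goalX, goalZ)
	local queue = { { startX, startZ } }
	local cameFrom = { [startX .. "," .. startZ] = false }
	local head = 1
	while head <= #queue do
		local x, z = queue[head][1], queue[head][2]
		head += 1
		if x == goalX and z == goalZ then
			-- walk back through cameFrom and rebuild the path as {x, z} pairs
			local path, key = {}, x .. "," .. z
			while key do
				local px, pz = key:match("(%d+),(%d+)")
				table.insert(path, 1, { x = tonumber(px), z = tonumber(pz) })
				key = cameFrom[key] or nil
			end
			return path
		end
		for _, d in ipairs({ { 1, 0 }, { -1, 0 }, { 0, 1 }, { 0, -1 } }) do
			local nx, nz = x + d[1], z + d[2]
			local nkey = nx .. "," .. nz
			if nx >= 1 and nx <= GRID and nz >= 1 and nz <= GRID
				and cameFrom[nkey] == nil and not isBlocked(nx, nz) then
				cameFrom[nkey] = x .. "," .. z
				table.insert(queue, { nx, nz })
			end
		end
	end
	return nil -- no route found
end

task.desynchronize() -- the search itself runs in the parallel phase
local path = findPath(1, 1, 40, 40)
task.synchronize() -- switch back before touching Humanoids or parts
-- ...then convert each {x, z} cell to a world position and feed it to Humanoid:MoveTo in serial code.
```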
One extra thing I’ve noticed: the cost of having more threads running in your game is quite high, even if they are sitting waiting for their turn to access something (in my case, terrain generation, where threads need to take turns updating the world after doing calculations). For that reason I would not recommend a 1:1 ratio of actors to NPCs if you have 100 or so NPCs. Instead, start an actor when you need to do heavy calculations in a short burst; this is how I was able to get good performance out of it.
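For anyone curious what that burst pattern can look like, here is a rough sketch of mine (the BindableEvents and every other name are invented; it assumes the Script sits under an Actor):

```lua
-- "Short burst" pattern: the Actor's script sits idle until asked to do work,
-- desynchronizes for the heavy part, then synchronizes to hand the result back.

assert(script:GetActor(), "this script must be a descendant of an Actor")

local request = script.Parent:WaitForChild("GenerateChunk")   -- invented BindableEvent
local finished = script.Parent:WaitForChild("ChunkFinished")  -- invented BindableEvent

request.Event:Connect(function(chunkX, chunkZ)
	task.desynchronize() -- the heavy maths runs in the parallel phase

	local heights = table.create(16 * 16)
	for x = 1, 16 do
		for z = 1, 16 do
			heights[(x - 1) * 16 + z] = math.noise(chunkX * 16 + x, chunkZ * 16 + z)
		end
	end

	task.synchronize() -- back to serial before firing events or editing terrain
	finished:Fire(chunkX, chunkZ, heights)
end)
```

The point is that the thread only does parallel work while a burst is running, instead of keeping a hundred mostly idle actors alive.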