Function Queue | Module for running functions sequentially using different scripts

OOP Queue Module for Functions

| Download Model |

Although queues in Roblox tend to be useful only for niche cases, I use this module frequently and have included it in other resources, so I thought I’d make it a standalone resource and provide documentation.

This module allows the creation of queues using any datatype as a key, and runs the queued functions back to back (hence a queue) instead of at the same time. It is useful for storing code to run at a later time, or for features in your game that you do not want to run simultaneously but do want to run eventually, where running them at once would cause unexpected behavior. Examples usually involve making batch changes to a model’s parts, such as: loading mesh part outfits on characters, modifying all joints/constraints in a model, or, in the case of my resource VoxelDestruct, processing destructive actions on the same part.

If your queue is meant to modify an instance, like a model/character/part, I recommend using the instance as the key.
If you use this for anything cool please share!

Documentation

Create a Queue
Queue.New(Object: any, MassRun: boolean?)

Create a queue with any data type as a key. Ignore the MassRun parameter as it hasn’t been implemented yet.

Queue.Fetch(Object: any)

To retrieve an existing queue elsewhere, fetch it by passing its key.

Methods
Queue:Add(func: Function, waits: boolean?, run: boolean?)

Adds a function to the queue.

waits: boolean?

Optionally yield until the added function has been run by the queue.

run: boolean?

Optionally trigger the queue to run.
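For example, a minimal sketch combining all three parameters (the queue key and module location here are placeholders):

```lua
local Queuer = require(game:GetService("ReplicatedStorage"):WaitForChild("Queue"))
local myQueue = Queuer.Fetch("Demo") or Queuer.New("Demo")

-- Add a function, yield until it has finished, and start the queue in one call.
myQueue:Add(function()
	print("Ran by the queue")
end, true, true)
```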

Queue:Run(waits: boolean?)

Starts the queue if it is not already running.

waits: boolean?

Optionally yield until the queue has completed.

Queue:Clear(RunFunctions: boolean?)

Clears the queue.

RunFunctions: boolean?

Optionally run functions in the queue before clearing.

Queue:Pause()

Pause the queue if it is currently running.

Queue:Resume()

Resume the queue if it is currently paused.
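A quick sketch of pausing around a batch of additions (the queue key and module location are placeholders):

```lua
local Queuer = require(game:GetService("ReplicatedStorage"):WaitForChild("Queue"))
local myQueue = Queuer.Fetch("Demo") or Queuer.New("Demo")

myQueue:Pause() -- a currently running function still finishes; no further ones start
myQueue:Add(function()
	print("Batched work")
end)
myQueue:Resume() -- picks the queue back up where it left off
```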

Events
.Completed()

Fires when the queue has completed.
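Assuming .Completed exposes a standard :Connect method like other Roblox signals (check the module source if yours differs), listening for it might look like this:

```lua
local Queuer = require(game:GetService("ReplicatedStorage"):WaitForChild("Queue"))
local myQueue = Queuer.Fetch("Demo") or Queuer.New("Demo")

myQueue.Completed:Connect(function()
	print("All queued functions have run")
end)
```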

Properties
Queue.Paused

Whether or not the queue is paused.

Queue.IsRunning

Whether or not the queue is active.

Example
local key = "MyQueueNameHere" -- Can be any datatype or any instance
local Queuer = require(game:GetService("ReplicatedStorage"):WaitForChild("Queue"))
local myQueue = Queuer.Fetch(key) or Queuer.New(key) -- Ensures only one queue exists for this key

myQueue:Add(function()
	print("Hello")
end)

myQueue:Add(function()
	print("World!")
end, true, true) -- Yield until this function has run, and run the queue automatically so calling :Run() is not necessary.

print("Finished.\n")

myQueue:Add(function()
	print("More features to come!")
end)
myQueue:Add(function()
	print("Comment your suggestions, please!")
end)
task.wait(3) -- Wait before running queue although functions have been added.
print("Running queue again...")
myQueue:Run() -- Run the queue.

--[[
Output:

Hello
World!
Finished.

    (3 second wait seen here)
Running queue again...
More features to come!
Comment your suggestions, please!
]]

Great Module!


The model is currently private, please undo this.

Oops, sorry about that, thanks for letting me know.
It should be public now!

No Problem!

Question: how did you use this for VoxelDestruct? (And can you teach me how to use it? There is no real documentation and it’s very confusing.)

Sure thing! I probably should have included documentation in the module itself, but all of the methods are described inside the dropdowns in this thread.

If you are unfamiliar with OOP (Object Oriented Programming), I suggest reading an article on the practice before using any of my resources; this resource, for example, creates an object that has built-in methods to control said object. I’m going to assume you’re familiar with queue data structures, which follow the FIFO rule (first in, first out). If you aren’t, imagine a queue as a checkout line at a cash register: the first person gets checked out and exits, then the next person, until the line has emptied, but no two people can be checked out at the same time. The queue objects this module lets you create work like that; the queue runs functions one after the other.
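To illustrate the FIFO idea outside of this module, here is a minimal plain-Lua sketch of a function queue that runs callbacks one after another (a simplification, not the module’s actual implementation):

```lua
local queue = {}

local function add(func)
	table.insert(queue, func) -- new entries join the back of the line
end

local function run()
	while #queue > 0 do
		local func = table.remove(queue, 1) -- take from the front (FIFO)
		func() -- finishes completely before the next function starts
	end
end

add(function() print("first") end)
add(function() print("second") end)
run() -- prints "first", then "second"
```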

As to how I used this with VoxelDestruct: that module breaks down parts in a voxelated manner, meaning it has more steps involved than just breaking down parts, such as greedy meshing to reduce part count. A function queue helps by processing destruction on the same part sequentially instead of at the same time; if I didn’t have this, the greedy meshing from an earlier destruction would be interrupted by the current destruction process. It essentially safeguards against unintended behavior if you try to break a part multiple times at once.

Here is a general overview of how this module works:

Require the module like so to start creating queues.

local Queuer = require(game:GetService("ReplicatedStorage"):WaitForChild("Queue"))

Create a queue using the .New() method; its key parameter can be any instance or datatype (parts, models, strings, numbers, etc.). The key is used to identify the queue so you can locate it from any script requiring the module. If you already have a queue made, you can fetch it using the .Fetch(key) method, which also takes a key parameter. You should always format your queue variable as shown below to ensure only one queue is created for a given key.

local Queue = Queuer.Fetch(key) or Queuer.New(key) -- Ensures only one queue exists for this key

Once you have a variable initialized with your newly created queue you can use the following methods on that queue to control it:

Queue:Add(func: Function, waits: boolean?, run: boolean?)
Queue:Run(waits: boolean?)
Queue:Clear(RunFunctions: boolean?)
Queue:Pause()
Queue:Resume()

To add a function to the queue (it will be appended to the end), use the :Add() method. The :Add() method has 3 parameters.

The first parameter func: Function is the function you want to add to the queue.

The second parameter, waits: boolean?, is optional (meaning it can be omitted or passed as nil, hence the ? in the type annotation). It is a boolean that determines whether the :Add() method will yield (code past that line will not run yet) until the function it is adding has been run by the queue and completed.

The third parameter, run: boolean?, is also optional and determines whether the queue will begin running after the function is added. This parameter is useful so you don’t need to call the :Run() method after calling :Add(); it is the same as calling :Run() right after :Add().

Queue:Add(func: Function, waits: boolean?, run: boolean?)

If you don’t pass true as the third parameter run: boolean? of the :Add() method, the queue will not begin running any of the functions inside it until you call the :Run() method.

The :Run() method has one parameter, waits: boolean?, which works exactly like the waits: boolean? parameter of the :Add() method. If this parameter is passed as true, then calling :Run() will yield until the queue has finished running all of its functions.

Queue:Run(waits: boolean?)
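For instance, a sketch that queues two steps and then yields until both have finished (the key name is a placeholder):

```lua
local Queuer = require(game:GetService("ReplicatedStorage"):WaitForChild("Queue"))
local Queue = Queuer.Fetch("Tutorial") or Queuer.New("Tutorial")

Queue:Add(function() print("Step 1") end)
Queue:Add(function() print("Step 2") end)

Queue:Run(true) -- yields here until both functions have run
print("Both steps done")
```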

You can pause and resume the queue using the Queue:Pause() and Queue:Resume() methods. Note that pausing a queue won’t pause the execution of a currently running function; it only prevents the functions after it from being run. It is also safer to pause a queue before calling the :Clear() method.

Queue:Pause()
Queue:Resume()

You can empty a queue (clear all functions inside of it) by calling the :Clear() method. This method has 1 parameter, RunFunctions: boolean?, an optional boolean. If you pass it as true, the remaining functions inside the queue will all be run at the same time; this goes against the purpose of the queue, but lets you retain the remaining functions’ execution before clearing.

Queue:Clear(RunFunctions: boolean?)
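A sketch of the two ways to clear (the key and module location are placeholders):

```lua
local Queuer = require(game:GetService("ReplicatedStorage"):WaitForChild("Queue"))
local Queue = Queuer.Fetch("Tutorial") or Queuer.New("Tutorial")

Queue:Pause() -- safer to pause first, as mentioned above
Queue:Clear() -- discard the remaining functions without running them
-- or:
Queue:Clear(true) -- run the remaining functions (all at once), then clear
```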

I forgot to mention this, but you can check whether the queue is paused or running with the following properties.

Queue.Paused
Queue.IsRunning

There’s also a single event, .Completed(), which fires when the queue has finished running all of the functions inside it. Note that this event will not fire when you use the :Pause() or :Clear() methods.

I hope you find this useful, if you have any questions please feel free to ask!

From what I understand, you want to defer the greedy meshing after all the delete operations, so you don’t do useless greedy meshing that will have to be redone right after?

From your post and this reply, the wording kind of implies that normal code can run at the same time in parallel (and that you could run into race conditions (unintended behaviour?) and such, and that the module could be for that), while that can only happen when using Parallel Luau, and Roblox doesn’t let us mess with stuff in the desynchronized phase anyway. I think that should be cleared up in your post. When running normal Lua, only one thread can run at a time.

Either way, you use this to queue all the destroys, run them, and then the greedy meshing is queued and run once all the destroys have completed, so the greedy meshing runs on the part only once? This seems like an interesting and clean way to do it, but what are the advantages over using task.defer when calling the greedy meshing function from the destroy function, to ensure other destroys happen before the greedy meshing, so that all the greedy meshing calls for the same part can be done as one?

I was implying that functions could overlap if called at the same time, not that the code executes in parallel. I can see what you mean, though, about my wording.

This is close to, but not exactly, what happens: the entire process for breaking a part (bounding box > splitting > voxelizing intersections > greedy meshing) is a function added to a queue. Let’s say I was breaking a part twice, one break right after the other. If I didn’t use a function queue, I’d be splitting parts at the same time and greedy meshing parts at the same time, or worse, there would be a delay and I’d be greedy meshing the first break while splitting meshed parts from the second break. Both scenarios would cause unintended problems, especially since I added PartCache.

Greedy meshing doesn’t actually happen once at the end of the queue; it happens at the end of every break process. That might seem unnecessary, but I’ll explain why it helps. Since greedy meshing happens after splitting, and they are queued together as part of the same function, I don’t need to use defer. Defer is useful only if both functions are within the same scope; since my module is used by multiple scripts, I needed a queue data structure within a module.

As for why it greedy meshes after every break process instead of once at the end of the queue: meshing at the end of every break reduces script activity more drastically than meshing everything together (even though I assumed it would be the other way around), because it not only reduces the number of slices necessary (6 slices per part in the bounding box) but also reduces the number of parts needing to be greedy meshed. This is more evident the more parts are involved; as the part count increases drastically, the runtime saved is exponentially greater.

In simple terms: by greedy meshing at the end of every break, the part count is kept low for every subsequent step in a sequence of breaking processes.


I’m not very experienced with Parallel Luau, but I will say that this does seem like a good module to use. I’ve tried before to use Parallel Scheduler and implement it into VoxBreaker; I used it to calculate CFrame data for voxels, but it would yield for every voxel, so it ended up being pretty slow. If this module doesn’t have that problem, then that would be awesome, and I will definitely let you know if this works for me.

Also, the code in the example script should use colons instead of periods for method calls; otherwise the script just errors.

Why did it yield for the voxels? Parallel Scheduler doesn’t “yield” until you use :Work() (and it yields for less than a frame, unless the work itself has yields).

You can reply on the parallel scheduler thread or to me directly

There might have been some misunderstanding somewhere due to my language. This module doesn’t have anything to do with Parallel Luau; in fact, I’m barely versed in its documentation. This module just allows you to chain functions together in a queue, making it easier to schedule tasks to run as soon as possible without them overlapping each other.

The reason this beats a loop is that a loop will run the same code unless you specify conditions for it to vary; this module also allows you to add code from any script while preventing it from interfering with other code.

There aren’t many uses for this aside from scenarios where running functions together would cause issues but you still want to ensure they run as soon as possible. It removes the need to validate that previous tasks have finished, and it also lets you work from any script. This was useful for my voxel module, since I didn’t want to destroy a part again until the previous destruction was finalized by greedy meshing; otherwise it would be destroying parts before they were greedy meshed. If meshing weren’t included in the steps, this module would be irrelevant.
