What’s the performance cost of task.spawn()? Some relative comparisons would be very useful. For example, how many math operations can I do in the same time, or what sort of memory allocation does it incur?
spawn and task.spawn called on a function both create a new coroutine, so there's no raw performance benefit in that regard. However, task.spawn gives you the option of passing a coroutine instead if you want to manage things more closely yourself, including potentially reusing coroutines for better performance.
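To illustrate the two calling conventions (a minimal sketch; how and whether you reuse coroutines is entirely up to you):
-- Passing a function: task.spawn creates and resumes a new coroutine for you
task.spawn(function(message)
	print(message)
end, "hello from a fresh coroutine")

-- Passing a coroutine you created yourself: task.spawn just resumes it with the arguments
local thread = coroutine.create(function(message)
	print(message)
end)
task.spawn(thread, "hello from a hand-managed coroutine")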
The task library is a low-level library; it does what you tell it to do. If you use task.spawn and/or task.wait to resume 100,000 threads in a frame, it will dutifully try to run those 100,000 threads that frame (though there's no guarantee the client or server in question can actually handle that without lagging or running out of memory).
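For example, nothing stops you from writing something like the following, and the scheduler will faithfully attempt all of it in a single frame (illustrative only; don't actually run this in a real game):
for _ = 1, 100000 do
	task.spawn(function()
		-- some per-thread work here
	end)
end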
Is it exactly equivalent in terms of performance to game:GetService("RunService").Heartbeat:Wait()?
Hey there, quick question: what does task.wait(n) look like in Lua, and what does wait(n) look like in Lua?
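For intuition only (this is not the engine's actual implementation, which lives in C++, not Lua): task.wait(n) behaves roughly like accumulating Heartbeat steps until at least n seconds have passed, along these lines:
local RunService = game:GetService("RunService")

-- Rough Lua approximation of task.wait(n); the real thing is implemented inside the engine.
local function approximateTaskWait(n)
	n = n or 0
	local elapsed = 0
	repeat
		elapsed += RunService.Heartbeat:Wait()
	until elapsed >= n
	return elapsed
end

-- Legacy wait(n) behaved similarly but was subject to throttling, so it could resume late.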
This makes things a lot simpler for programmers looking to create accurate replacements for wait(). I'm always excited to see improvement, especially in an area like this!
To my understanding, some of the built-in instances/functions use the wait and delay functions. Will these eventually use the task library?
Swap the two, you’ll get opposite results. It’s exactly the same.
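For anyone who wants to reproduce that comparison, here is a rough micro-benchmark sketch (measurement order can bias the numbers, which is presumably why swapping the two flips the result):
local RunService = game:GetService("RunService")
local iterations = 100

local t0 = os.clock()
for _ = 1, iterations do
	task.wait()
end
print("task.wait():", os.clock() - t0)

local t1 = os.clock()
for _ = 1, iterations do
	RunService.Heartbeat:Wait()
end
print("Heartbeat:Wait():", os.clock() - t1)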
That’s a great update, finally.
It would be cool if you added
function task.debounce(event, duration, callback, ...)
For example:
-- Triggered by the Touched event only after 3 seconds have passed since the last call
local function PartTouched(debounce, HitPart, RandomData)
	if RandomData then
		print("The part", HitPart)
		-- the debounce cooldown refreshes after being consumed properly
		debounce.consume()
	end
end

task.debounce(part.Touched, 3, PartTouched, RandomData)
Thank you!
It would be really helpful for beginners
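In the meantime, a rough user-land version is straightforward to build on top of the existing task library (a sketch only; the debounce helper and the part variable here are hypothetical, modeled on the example above):
local part = workspace:WaitForChild("Part") -- hypothetical part, as in the example above

-- User-land debounce: run the callback at most once per `duration` seconds of the signal firing
local function debounce(signal, duration, callback)
	local ready = true
	return signal:Connect(function(...)
		if not ready then
			return
		end
		ready = false
		callback(...)
		task.delay(duration, function()
			ready = true
		end)
	end)
end

debounce(part.Touched, 3, function(hitPart)
	print("The part", hitPart)
end)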
The latter would be faster due to task.wait() being built into the Roblox engine (and therefore running on C++ rather than Lua).
Amazing, amazing, amazing - some more detail on what's changed from the old implementations would be great! No longer having to rely on open-source modules such as Sleitnick's handy Thread module is really nice, but sadly I'm losing a little bit of built-in functionality, Thread.DelayRepeat() for instance, though I suppose that's outside the scope of what an engine should provide at a library level.
I’m very proud of you Roblox, finally we get a proper upgrade
So the first noticeable difference between coroutine.wrap and task.spawn is that one returns a value while the other doesn't:
local values = task.spawn(function()
	return "string"
end, 1, 2, 3)
print(values) -- nil

local values = coroutine.wrap(function()
	return "string"
end)(1, 2, 3)
print(values) -- string
Performance test: task.spawn causes a crash if it is used too many times.
-- R returns N copies of V as a multi-value return
local function R(N, V)
	if N ~= 0 then
		return V, R(N - 1, V)
	end
end

-- This ends up calling task.spawn with roughly 16,000 arguments
task.spawn(R(16000, task.spawn))
This should generate a C stack overflow like coroutine.resume.
Ah yes. Now we can wait faster.
Will this have any performance benefits over coroutines and spawn?
Do libraries run on C++? I thought they went through Lua.
You shouldn't be doing this anyway; it's no wonder it would crash.
I'm not 100% sure; since this is directly using the task scheduler, I'm assuming it runs on C++.
The fact that there is a crash is a problem regardless of whether or not it's good practice to be spawning a bunch of threads.
It turns out the reason is unrelated to the task.spawn calls. A few other people, Hal, and I have been discussing it, and it turns out that task.spawn crashes after a certain number of arguments, and the last few arguments start becoming nil:
task.spawn(print, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48)
Crashes 100% of the time on my machine.
task.spawn(print, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47)
Crashes 0% of the time on my machine, followed by 3 nil values.
The number of arguments that can be used before a crash seems to differ per machine, but weirdly it's remained consistent in every test I've done, and seems to be consistent for others.
It solves that issue: task.wait(delay) will dutifully unsuspend after the delay regardless of what the performance consequences may be, compared to wait(delay), which “helpfully” throttled execution for you, sometimes delaying resumption by extra frames in an attempt to keep the frame rate smooth.
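A quick way to see the difference is to compare the requested versus actual delay under load (a sketch; the exact numbers depend entirely on how busy the frame is):
local requested = 0.1

local t0 = os.clock()
wait(requested) -- legacy wait: may overshoot noticeably when the scheduler throttles
print("wait actually took:", os.clock() - t0)

local t1 = os.clock()
task.wait(requested) -- task.wait: resumes on the first Heartbeat after `requested` seconds
print("task.wait actually took:", os.clock() - t1)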
I can confirm that 44 arguments is currently 100% safe, any more must be avoided for now.
So, as long as you aren't passing arbitrarily long variable argument lists to the API, you won't run into this issue (unless you actually wrote code that takes more than 44 hand-coded arguments… I won't judge).