I did. The other link I’m too dumb to understand, sorry.
If you want to optimize your code with data-structures, I would recommend only doing so when handling a large amount of input or when your code is handling some heavy work. Otherwise, it would just lead to overcomplication and a negligible speed boost.
A custom signal module is very unlikely to have to do any heavy work internally.
Why are data structures so important in this topic? Well, it's about concurrency. If your game consists of tens of thousands of signals and you expect to connect and disconnect signals thousands of times a second, the data structure becomes worth considering. This was the case for loleris (for example).
If we're talking about negligible performance boosts, this whole library is very unnecessary. It is a copy of the second signal class from the featured reply that I attached. I know its shortcomings. You are giving up solid safety for little to no performance gain. Hence, if you are purely pursuing speed, this is how you do it. If you are trying to develop a solid signal class library? Use Bindables.
Could you elaborate more on what data-structure you used in your implementation and how it fixes the thread concurrency issues?
I didn’t use any particular data structure in my updated signal class because it uses BindableEvents. But in the unlikely event that I would use a signal class for a monster project, I would reference this script: ReplicaService/MadworkScriptSignal.lua at master · MadStudioRoblox/ReplicaService · GitHub
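For reference, a BindableEvent-backed signal wrapper looks roughly like this (a minimal sketch of the idea, not the module's actual source):

-- Hypothetical sketch of a BindableEvent-backed signal (not the actual module).
-- Note: arguments passed through a BindableEvent are deep-copied and limited to the
-- types Bindables support, which is the trade-off discussed later in this thread.
local Signal = {}
Signal.__index = Signal

function Signal.new()
    return setmetatable({ _bindable = Instance.new("BindableEvent") }, Signal)
end

function Signal:Connect(callback)
    -- The BindableEvent handles scheduling and connection bookkeeping for us
    return self._bindable.Event:Connect(callback)
end

function Signal:Fire(...)
    self._bindable:Fire(...)
end

function Signal:Wait()
    return self._bindable.Event:Wait()
end

function Signal:Destroy()
    self._bindable:Destroy()
end

return Signal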
You can try to pass Lucas’ signal library into this function (guaranteed to fail btw)
local function SignalBreaker(new_signal) --> is_all_good
    local signal = new_signal()
    local confirm = {}
    for i = 1, 10 do
        signal:Connect(function(check)
            if check ~= 1 then return end
            confirm[i] = true
        end)
    end
    local red_flag = false
    local bamboozle
    bamboozle = signal:Connect(function() -- bamboozler
        bamboozle:Disconnect()
        local special_signal
        for i = 1, 10 do
            if i == 5 then
                special_signal = signal:Connect(function()
                    red_flag = true
                end)
            else
                signal:Connect(function() end)
            end
        end
        special_signal:Disconnect()
    end)
    for i = 11, 100 do
        signal:Connect(function(check)
            if check ~= 1 then return end
            confirm[i] = true
        end)
    end
    signal:Fire(1)
    if red_flag == true then
        return false
    end
    for i = 1, 100 do
        if confirm[i] ~= true then
            return false
        end
    end
    return true
end
Simply put, it breaks because creating new indices during pairs() iteration violates its contract, which causes a single-thread concurrency issue.
A double-ended linked queue fixes this ^
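To illustrate, here's a rough sketch (my own, not Madwork's actual implementation) of connections stored as a doubly linked list; Fire walks node to node, so connecting or disconnecting inside a handler never invalidates the traversal:

-- Rough sketch of a linked-list connection store (illustrative only)
local Signal = {}
Signal.__index = Signal

function Signal.new()
    return setmetatable({ _head = nil }, Signal)
end

function Signal:Connect(fn)
    local node = { _fn = fn, _next = self._head, _prev = nil, Connected = true }
    if self._head then
        self._head._prev = node
    end
    self._head = node

    node.Disconnect = function()
        if not node.Connected then
            return
        end
        node.Connected = false
        -- Unlink the node; traversal in Fire keeps working because it already
        -- grabbed its next pointer before running the callback
        if node._prev then
            node._prev._next = node._next
        else
            self._head = node._next
        end
        if node._next then
            node._next._prev = node._prev
        end
    end
    return node
end

function Signal:Fire(...)
    local node = self._head
    while node do
        -- Grab the next node first so a Disconnect inside the callback can't break traversal
        local nextNode = node._next
        if node.Connected then
            coroutine.wrap(node._fn)(...)
        end
        node = nextNode
    end
end

return Signal

New connections made during a Fire are simply prepended to the head, so they don't run until the next Fire.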
Then should I create a deep copy of all connected signals and iterate through that instead? I didn't think that would've been a problem with coroutines.
That isn’t useful but ok.
Adding on to @Ukendio's point on why Bindables are better:
I had been using one pair of Remotes (one RemoteEvent and one RemoteFunction) for a long time, but I switched to using one RemoteEvent for each thing, or at least for each module that needs one, because it's a lot easier to understand and maintain.
Similarly, BindableEvents are a lot easier to understand and maintain because they have to actually exist, and you can use things like WaitForChild to make sure one has actually been created before using it.
There are times when FastSpawn and Signal modules are useful, but usually not for Cross Script Communication.
BindableEvents do have some downsides, like making a deep copy of your tables, which means mutations don't carry through when they're passed, but that's not difficult to work around and your code shouldn't rely on that behavior anyway…
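A quick way to see what that means in practice (a rough sketch of the behavior I'd expect; worth verifying in Studio yourself):

local bindable = Instance.new("BindableEvent")
local original = { value = 1 }

bindable.Event:Connect(function(received)
    -- received is a deep copy, so this mutation never reaches original
    received.value = 2
end)

bindable:Fire(original)
task.wait() -- let the callback run in case signal behavior is deferred
print(original.value) --> still 1, because Fire passed a copy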
As for this
I'm asking because you made a copy of those existing and well-known modules. Not that I have any problem with innovation or with the functionality the module offers, but I'm questioning why you are reinventing the wheel.
You should definitely take Ukendio's advice; they are very knowledgeable.
Got another reason to not use BindableEvents, at least not with BadgeService3.
I'm working on an update to fast signal. It might be released soon.
Thanks for pinging me! I’ll now provide constructive criticism on your event.
First off, I started off by testing the code you have on your GitHub. I was unable to get it to work the first time due to an oversight on line 77
table.insert(self, #self + 1, connection)
where it should be (at least I'm assuming):
table.insert(self._connections, #self._connections + 1, connection)
After adding this fix I got your module to work for me. It looks like it's repro'd in your Roblox model as well. Additionally, I attempted to use the Destroy method and was met with a nil value error on line 28:
connection:Disconnect()
Line 29 makes the index's value nil and I think that should be enough to satisfy what you wanna achieve out of a dispose method. Also, the self = nil line doesn't do anything because self is just a local reference to the table itself within the method's scope, but the diligence is appreciated.
At this point I've gotten your module to work, so let's see what my benchmarks look like (benchmark script at the bottom). I noticed you guys in this thread are running benchmarks on how fast ::Fire takes to execute; I'll be benchmarking the time between Fire and the callback's entry point.
21:10:28.628 bindable event - 9.22 microseconds
21:10:45.311 signal module - 8.82 microseconds
21:11:01.993 bindable event - 10.79 microseconds
21:11:18.675 signal module - 9.32 microseconds
21:11:35.358 bindable event - 9.25 microseconds
21:11:52.040 signal module - 13.11 microseconds
21:12:08.723 bindable event - 9.32 microseconds
21:12:25.406 signal module - 8.60 microseconds
21:12:42.089 bindable event - 9.26 microseconds
21:12:58.771 signal module - 8.21 microseconds
It seems like at best I save a little more than a microsecond, and at worst it costs a little under 3 more microseconds. I wanted to see if I could make this any faster, so I started poking at the Fire method and changed a few things:
- I replaced the pairs iterator with a numerical loop. This is known to be faster than both pairs and ipairs, and even next
- I removed the if statement, since (correct me if I'm wrong) it seems like the statement will always be true
function Signal:Fire(...)
    for index = 1, #self._connections do
        local thread = coroutine.create(self._connections[index].Function)
        coroutine.resume(thread, ...)
    end
end
So here we have the new Fire method; let's check the benchmarks
21:25:32.109 bindable event - 9.02 microseconds
21:25:48.789 signal module - 7.60 microseconds
21:26:05.490 bindable event - 9.41 microseconds
21:26:22.173 signal module - 8.00 microseconds
21:26:38.857 bindable event - 9.34 microseconds
21:26:55.537 signal module - 8.19 microseconds
21:27:12.221 bindable event - 9.69 microseconds
21:27:28.904 signal module - 8.04 microseconds
21:27:45.587 bindable event - 9.90 microseconds
21:28:02.269 signal module - 8.01 microseconds
Cool, it seems we're 5 for 5 here and shaving off more than a microsecond each time. I wanted to see if I could improve it further, so I poked at the ::Connect method as well; instead of creating a table for each connection and storing it in _connections, I just have it insert the function itself directly into self, a-la:
table.insert(self, #self + 1, givenFunction)
This required me to change the for loop & coroutine creation lines in ::Fire to
for index = 1, #self do
local thread = coroutine.create(self[index])
This cuts down on the amount of indexing we do. Let's see where that gets us
21:42:41.350 bindable event - 9.52 microseconds
21:42:58.033 signal module - 7.88 microseconds
21:43:14.716 bindable event - 9.59 microseconds
21:43:31.400 signal module - 7.86 microseconds
21:43:48.081 bindable event - 9.22 microseconds
21:44:04.763 signal module - 7.60 microseconds
21:44:21.446 bindable event - 8.95 microseconds
21:44:38.128 signal module - 7.92 microseconds
21:44:54.812 bindable event - 9.55 microseconds
21:45:11.494 signal module - 8.10 microseconds
yay! we’re hitting sub-8 microseconds more often than not now.
My only other criticism is to use runService.Heartbeat instead of .Stepped in your ::Wait method, as Heartbeat will have fired 3 times by the time Stepped fires once. And sorry if I come off as pedantic here; I really do want to help, so I tried explaining as much as I could. I think this is a neat idea and I hope you keep up the good work!
Benchmark.lua
local Signal = require(workspace.Signal);
local tests = 1000;
local runserv = game:GetService('RunService');
local beat = runserv.Heartbeat;

local function TestA()
    local total = 0;
    local completed = 0;
    local signal = Instance.new('BindableEvent', workspace);
    signal.Event:Connect(function(t)
        total += (os.clock() - t);
        completed += 1;
    end);
    for i = 1, tests do
        signal:Fire(os.clock());
        beat:Wait();
    end;
    repeat
        beat:Wait();
    until completed == tests;
    signal:Destroy();
    return total / tests;
end;

local function TestB()
    local total = 0;
    local completed = 0;
    local signal = Signal.new();
    signal:Connect(function(t)
        total += (os.clock() - t);
        completed += 1;
    end);
    for i = 1, tests do
        signal:Fire(os.clock());
        beat:Wait();
    end;
    repeat
        beat:Wait();
    until completed == tests;
    signal:Destroy();
    return total / tests;
end;

for i = 1, 5 do
    print( string.format('bindable event - %.2f microseconds', TestA() * 1000000) );
    print( string.format('signal module - %.2f microseconds', TestB() * 1000000) );
end;
I’ll be applying these changes to the version I got here, thanks for the tips.
As for the glitches, I didn't notice any; I believe I stayed with the same version for BadgeService3. Anyhow, thanks.
(bro i was watching you type for 1 hour )
I had to be thorough! Best of luck to you!
One thing: I think I might not be able to do :Disconnect() if I only save the function instead of a table inside the connections?
Well, I guess I can keep the index and use that…
Correct me if I'm wrong, but shouldn't something like this work?
function Signal:Connect(givenFunction)
    assert(typeof(givenFunction) == "function", "You need to give a function.")
    table.insert(self, #self + 1, givenFunction)
    return setmetatable({
        Disconnect = function()
            table.remove(self, table.find(self, givenFunction));
        end;
    }, { __metatable = 'iz siganl?' });
end
Signal:Connect will return a table whose only method is Disconnect, like an RBXScriptConnection does, and it essentially noscopes givenFunction out of whichever index of self it's stored at.
local signalMaster = Signal.new();
local signal1 = signalMaster:Connect(print);
signalMaster:Fire('hello world');
signal1:Disconnect();
It would work; the problem is that you're creating a new function every time for :Disconnect(), so I'mma keep storing the function in a table, I guess. It also allows me to have a "Connected" property. I will add the other optimizations and some others, then benchmark and see what I get.
It's worse for speed, but better for overall performance.
I came up with this solution following the Event/Signal hierarchy. It throws another class into the mix, but it prevents us from spamming new functions, like you pointed out earlier.
Code
local RunService = game:GetService('RunService');

local Signal = { };
Signal.__index = Signal;

function Signal.Disconnect(self)
    table.remove(self._event, table.find(self._event, self._callback));
end;

local Event = { };
Event.__index = Event;

function Event.Connect(self, callback)
    table.insert(self, #self + 1, callback);
    return setmetatable({ _event = self, _callback = callback }, Signal);
end;

function Event.Fire(self, ...)
    for index = 1, #self do
        local thread = coroutine.create(self[index]);
        coroutine.resume(thread, ...);
    end;
    self._lastFire = os.clock();
end;

function Event.Wait(self)
    local init = os.clock();
    repeat
        local delta = os.clock() - self._lastFire;
        local beat = (os.clock() - init) + RunService.Heartbeat:Wait();
    until delta < beat;
    return os.clock() - init;
end;

function Event.new()
    return setmetatable({ _lastFire = 0; }, Event);
end;

--

local event = Event.new();
local signal = event:Connect(print);

coroutine.wrap(function()
    event:Wait();
    print('sus');
end)();

event:Fire('amogus');
-- "amogus" is printed first
-- followed by "sus"

signal:Disconnect();
event:Fire('does not work');
-- does not run
This still seems to beat out BindableEvents each time in my benchmarks
I've got a couple of constructive feedback points on this module. Hopefully you find 'em useful. Firstly, I have a couple of criticisms on the Wait implementation:
(1) My main gripe is how it is different from the typical pattern of signals on Roblox: it oughta return the values which were sent to Fire that caused the thread to resume. So, if I have a print(signal:Wait()), then do signal:Fire("Hello, world"), I should get a nice Hello, world. Most of the time, users won't care about the total waited time (which, by the way, you could calculate not using os.clock but rather the deltaTime return value from Stepped, but I digress). At the end of the day, user code can calculate this.
(2) Second thing is using RunService.Stepped to do your waiting with a busy-loop. This is almost certainly not performant if you have a ton of threads waiting.
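Both points could be covered by parking the calling thread and letting Fire's arguments flow straight back to it, with no polling at all. A rough sketch (assuming the Connect/Disconnect API discussed in this thread, and that manually resuming the yielded thread is acceptable here):

function Signal:Wait()
    local waitingThread = coroutine.running()
    local connection
    connection = self:Connect(function(...)
        connection:Disconnect()
        -- Hand Fire's arguments straight back to the yielded thread
        coroutine.resume(waitingThread, ...)
    end)
    -- Fire(...) resumes us here; its arguments become Wait's return values
    return coroutine.yield()
end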
For Connect, I'm not keen on the returned connection object not providing a Disconnect function, a key deviation from most Signal pattern implementations. Overall, the connection object should really just use the same OOP pattern as the Signal class. Also, line 77 might have a typo?
table.insert(self, #self + 1, connection) -- did you mean self._connections?
And I guess finally, if you’re going to call it fast, you really should have some data to back that claim up! Your only evidence can’t be “BindableEvent slow therefore avoid for fastness”. You need to show your prospective users in what cases your module is faster and therefore desirable.
table.insert automatically appends the value to the end of the table when you pass it as the second argument:
-- redundant
table.insert(self._connections, #self._connections + 1, connection)
-- Ok
table.insert(self._connections, connection)
- I replaced the pairs iterator with a numerical loop. This is known to be faster than both pairs and ipairs, and even next

No, numerical loops are only faster if you don't index the value. Therefore, use ipairs, as it is much faster than a numerical loop at retrieving values.
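If in doubt, measure it; here's a throwaway comparison sketch (numbers vary by machine and Luau version, so treat it as a starting point rather than proof either way):

-- Compare indexing values via a numeric loop vs ipairs over a large array
local list = table.create(1000000, 0)

local function bench(label, fn)
    local start = os.clock()
    fn()
    print(string.format("%s: %.3f ms", label, (os.clock() - start) * 1000))
end

bench("numeric loop", function()
    local sum = 0
    for index = 1, #list do
        sum += list[index]
    end
end)

bench("ipairs", function()
    local sum = 0
    for _, value in ipairs(list) do
        sum += value
    end
end)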
What's the point of micro-optimization when it won't really prove anything meaningful?
PS: @OP:
I also have some criticism for this module. What’s the point of having a wait function for an event when you’re just going to be polling unnecessarily:
-- highly inefficient
function Signal:Wait()
    local fired = false
    local connection = self:Connect(function()
        fired = true
    end)
    local startTime = os.clock()
    repeat
        runService.Stepped:Wait()
    until fired or not connection.Connected
    connection:Disconnect()
    return os.clock() - startTime
end
Setting self to nil won't work as expected, since it's passed by reference.
function Signal:Destroy()
    for i, connection in ipairs(self._connections) do
        connection:Disconnect()
        self._connections[i] = nil
    end
    self = nil -- redundant
end
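One way Destroy could look without the redundant line (just a sketch; it snapshots the list first in case Disconnect mutates _connections while we iterate):

function Signal:Destroy()
    -- Snapshot and swap out the list so Disconnect can't disturb our iteration
    local connections = self._connections
    self._connections = {}
    for _, connection in ipairs(connections) do
        connection:Disconnect()
    end
end

Callers should drop their own reference to the signal afterwards; self = nil inside the method can't do that for them.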
Lastly, you also claim that this module is “way faster and optimized”, which is definitely not supported by the benchmarks many have run. What is your source for this?
I actually haven't updated the module yet; I was working on some other stuff recently. You can try out the edit from mew903 above, which I will probably turn into the newest version of the module.
Btw the .Stepped thing is weird, I don’t know what happened to me that day, I just kind of went with it thinking it was better because it’s supposed to run first? Idk.