Deferred Engine Events

It would be really cool if we could create custom events

6 Likes

So is backwards compatibility just… not a thing anymore? It doesn’t matter how long you wait to make a breaking change, it still breaks all of my games that depend upon the behavior unless I go through and meticulously audit each one. It also breaks games that I like to play whose developers aren’t around to maintain them anymore. Something seriously needs to be done about this because the problem is only going to compound in the coming years.

16 Likes

You can try looking up GoodSignal


@WallsAreForClimbing so this is fully released to games, right?

I have been waiting to test Deferred event behavior for a really long time.

If not, I can give you a list of place IDs to enable it for.

I struggle slightly to understand what exactly would change; if someone can explain it in simpler terms, I would really appreciate it.

Does this mean something like BasePart.Touched is going to function differently?
Does it affect timing, and to what extent?

Should I expect a simple .Touched event to fire a few frames later/earlier than normal?

2 Likes

It’s in the old post; you can read all of the details there.


Additionally, for anyone with doubts or reluctance: you should definitely try enabling this, see what breaks, and fix it over time. You don’t have to fully understand what’s happening right away as long as you’re able to fix the bugs caused by the new behavior; otherwise you’d be missing out on a lot.

6 Likes

I’m not gonna lie, chief, I think the team needs to reconsider what they should be working on and prioritizing.

5 Likes

There are likely different teams working on different things rather than just a single team.

24 Likes

There’s a problem that needs to be resolved before this feature becomes the default: I’m not able to connect cleanup functions to the .Destroying or .AncestryChanged events of a script or its ancestry, because the script’s thread is terminated and all connections are disconnected once it’s destroyed, before the deferred callback can even run. This makes self-cleaning scripts impossible to write. An example is this simple script that creates a part, keeps track of it, and then destroys it once the script or its parent model is destroyed:

-- Create a part and keep a reference to it
local part = Instance.new("Part")
part.Name = "TestPart"
part.Parent = workspace

-- Clean up the part when this script is destroyed
script.Destroying:Connect(function()
	part:Destroy()
	print("destroyed part")
end)

This works with Immediate signal behavior but breaks in Deferred mode. It is crucial that a script be able to clean up after itself once it’s destroyed; currently my only workaround is to let the script know ahead of time that it will be destroyed, wait for a bit, and then destroy it.

This issue becomes larger if you have custom classes and objects in the script that create many instances. These are all wrapped up in cleaner objects such as Maid and Janitor, but they never get a chance to run their cleanup code; it all just gets deleted instantly along with the main script thread.
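
Below is a minimal sketch of the workaround described above (everything here is illustrative, not an engine API): signal the script before destroying it so its deferred cleanup callback gets a chance to run. It assumes a BindableEvent named Cleanup has been parented to the script by whatever code owns it.

-- Inside the script that owns resources
local cleanupEvent = script:WaitForChild("Cleanup") -- assumed BindableEvent child

local part = Instance.new("Part")
part.Name = "TestPart"
part.Parent = workspace

cleanupEvent.Event:Connect(function()
	part:Destroy()
	print("destroyed part")
end)

-- In the code that removes the script:
-- script.Cleanup:Fire() -- ask the script to clean up first
-- task.wait()           -- let the deferred callback run
-- script:Destroy()      -- now it is safe to destroy the script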

26 Likes

Hm, while deferred engine events may have some benefits such as improving performance and security, it’s important to consider the potential downsides as well. One issue that could arise from using deferred events is that it may make debugging and troubleshooting more difficult, since events will not be triggered immediately and may be queued up to be processed later. This could make it harder to identify and resolve issues in the code, especially if the event queue becomes backed up.

I can see that the SignalBehavior of my games (including local files) has already been reset to Default. If I change it back to Deferred and publish that change, will it automatically go live with the push on the 12th, or should I wait until then to publish the changes?

I have 50k lines of code in my game; if I were to use this feature, I would have to dive deep into my code base and look for possible issues that may arise as a result.

I also still don’t know which events Roblox is specifically referring to. Is it every event, even property changes that trigger the .Changed event (aside from the task-scheduler-related ones)?

I imagine a lot of developers already rely on the current default behavior, and simply changing the default to Deferred is not a good change. It can also create problems for developers who expect an event to fire immediately after they change or trigger something, only for the event to fire later and cause weird issues.


I am just asking you not to make it the default, unless you are willing to switch every developer who has it set to Default over to Immediate, because that is what the setting truly means at the moment. You don’t want games to run into unexpected issues because the default changed.

The sound changes definitely broke some games (especially the ones that expected sounds to be loaded before playing them), and this one could break even more.

1 Like

It’s impossible to maintain full backwards compatibility with Roblox’s update model. There is always only one version of the engine, period.

2 Likes

I have a solution for this: make it a permanent option to switch between Deferred and Immediate at any time, for as long as Roblox exists. Also, only new experiences would have Deferred on from now on, while old experiences would keep Immediate as the default setting, which would never change. This is a logical solution, but do what you want.

4 Likes

I don’t really understand how this works; I know nothing about services/events. I assume it involves external devices to manage actions on scripts better. Could we get a more detailed explanation of how to set this up? I don’t know whether it is useful for my game or not. Thanks!

1 Like

It’s definitely not impossible. There are two obvious solutions:

  1. Never force breaking changes onto existing games. Let them remain optional forever.
  2. Version the engine.

Both of these options are challenging and expensive to engineer, but isn’t that why we give Roblox such a large cut of our revenue? To handle all the complex parts of maintaining an online game?

7 Likes

This is the right call for the future of game architecture. A bad practice I’ve commonly seen in the past is a reliance on race conditions and immediate running to achieve synchronization across systems.

It’s a really unhealthy, haphazard way of engineering games, and I would highly encourage other programmers to avoid these practices in favor of safer defensive code that doesn’t rely on strict timing. It’ll make your codebases far less brittle, and far more predictable.
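
As a minimal illustrative sketch of the difference (the objects here are hypothetical, not from any particular game): the fragile pattern assumes an event handler has already run by the next line, which only holds under Immediate behavior, while the defensive version reads from the same source of truth and doesn’t care when the handler runs.

local score = Instance.new("IntValue")
local label = Instance.new("TextLabel")

score.Changed:Connect(function(newValue)
	label.Text = tostring(newValue)
end)

-- Fragile: relies on the Changed handler having run immediately
score.Value = 10
print(label.Text) -- under Deferred behavior this may still be the old text

-- Defensive: derive what you need directly instead of relying on timing
score.Value = 10
label.Text = tostring(score.Value)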

31 Likes

That’s the whole point: the two “obvious solutions” are not an option because of how Roblox works.

You would end up with a constantly expanding pile of spaghetti code that becomes harder to maintain every update.

The engine is an integral part of the whole platform. Any legacy engine versions would have to be constantly maintained as well so that they keep working. It would be much, MUCH harder and more expensive to version the Roblox engine than something like the Unity Engine, which is not itself a platform but a tool for making games, all of which are standalone products that do not need to cooperate.

Moreover, players would be forced to download multiple versions of the engine to play various games, increasing disk space usage and time spent downloading. This gets even more complicated when we take mobile and console devices into consideration where the update mechanisms are very different. Plus, mobile users are much more sensitive to download sizes and they’re much less likely to download an app if it’s 5 GB in size.

4 Likes

I don’t really care about the implementation details, I just care about Roblox finding a solution. A lack of backwards compatibility is only going to become more and more of an issue over time as the catalog of high-quality legacy games grows. Adopting a defeatist attitude of “it’s too hard/impossible so they should just break old games” won’t help anything.

6 Likes

Most of the reason this is being changed is for cases like this:

-- Teleport the character to a fixed position as soon as it spawns
Player.CharacterAdded:Connect(function(Character)
	Character:PivotTo(CFrame.new(13, 32, 51))
end)

On paper this looks like it should work, but the problem is that with Immediate SignalBehavior the callback for CharacterAdded runs immediately after the character is added, which unfortunately means the character is most likely going to be positioned back at spawn after this runs.

Deferred SignalBehavior fixes this by running the callback after the engine’s scheduler has resumed (called the Resumption Cycle), meaning Character:PivotTo will work as intended.

Before, people would probably have tried to mitigate this by yielding for an arbitrary amount of time or (more appropriately) by wrapping it with task.defer.
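
A minimal sketch of that task.defer mitigation under Immediate behavior might look like this (reusing the example position from above):

Player.CharacterAdded:Connect(function(Character)
	-- Defer the pivot until the engine resumes threads, after the
	-- default spawn positioning has already happened
	task.defer(function()
		Character:PivotTo(CFrame.new(13, 32, 51))
	end)
end)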


A similar scenario is trying to reparent or destroy something from within an event fired by its parent changing, which would throw an error:

workspace.ChildAdded:Connect(function(Child)
	-- Under Immediate behavior this errors with:
	-- "Something unexpectedly tried to set the parent of Child to NULL"
	Child:Destroy()
end)
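
One way to work around this under Immediate behavior (a sketch along the same lines as the task.defer approach above, not something from the original post) is to defer the destruction so it runs outside the parent-change processing:

workspace.ChildAdded:Connect(function(Child)
	-- Deferring moves the Destroy out of the parent-change processing,
	-- avoiding the "tried to set the parent to NULL" error
	task.defer(function()
		Child:Destroy()
	end)
end)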

These error messages require case-by-case sanity checks inside the C code, or else the client might crash. With this change these edge cases would become non-existent, and engineers would no longer have to worry about handling them.

That’s why this change is required, even if it breaks backwards compatibility. While it might be annoying for us to fix old code, it’s going to make writing new code much easier for both developers and engineers, making things more forward-compatible in the long run.

28 Likes