Which RunService event is the fastest? My findings

TLDR: Fastest is RunService.PreRender

RunService events - an in-depth comparison.

These are my findings on the “fastest” RunService event (i.e. which one fires first). I’ll be discussing how different conditions affect the accuracy of updating an Instance according to some variables in real time - in this example, I’ll be syncing the X position of a part to the position of my right controller in VR. An important thing to keep in mind is that this approach should only be used for synchronizing something accurately in real time; other use cases, like checking whether the player is dead, should be handled by different events (in that case the Stepped event), because they fire at different times in the frame. This reply clarifies the different use cases pretty well.

Some things to keep in mind:

  • This post covers only the use case of synchronizing something in real time! Different RunService events have different use cases and you should use them accordingly. @Zomebody did a great job clarifying the use cases of some of the different events in reply 5.
  • Not all your functions need to necessarily run as soon as they can.
  • I’ve performed this test in an experience with relatively few things in it; however, it was not a clean new baseplate, nor the fully fledged game where you might want to apply this.
  • I’ve done this in about an hour and don’t actually know much about how the engine works, so take my findings with a grain of salt.
  • I’ve not used an FPS unlocker, except in the tests marked as FPS unlocked.
  • This is my first ever post like this, so I’m open to suggestions and constructive critique.
  • I didn’t test RenderStepped separately because, according to the task scheduler, BindToRenderStep callbacks are called first.
  • I never tried using the same priority for two different BindToRenderStep bindings, which might or might not mitigate some of the delay.
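For reference, BindToRenderStep priorities can be anchored to the engine’s built-in Enum.RenderPriority values. This is a generic sketch, not part of the test setup; the binding name is arbitrary:

```lua
local RunService = game:GetService("RunService")

-- Lower priority values run earlier within the render step.
-- Enum.RenderPriority provides standard anchor points:
--   First = 0, Input = 100, Camera = 200, Character = 300, Last = 2000
RunService:BindToRenderStep("after_camera", Enum.RenderPriority.Camera.Value + 1, function(deltaTime)
	-- runs every frame, right after the default camera update
end)

-- Unbind by the same name when the callback is no longer needed
RunService:UnbindFromRenderStep("after_camera")
```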

Setup

Specs

I’m doing this in VR; my headset is the Meta Quest 3 with the Touch controllers. I’m connecting to my computer via a 5 m long USB 3.0 Kiwi cable (C to A) that usually does between 1.7 Gbps and 2 Gbps, using Quest Link over the cable. My CPU is an i5-7500, my GPU is an AMD RX 750, and I have 16 GB of RAM.

Studio

My setup consists of a local script in StarterCharacterScripts that connects PreRender, PreAnimation, PreSimulation and Heartbeat (equal to PostSimulation), and uses BindToRenderStep with a priority of 2. Each of these connections gets the current UserCFrame for the right hand and sets the position of its respective part to a calculated position. I’ve scripted it so that it won’t execute unless everything is ready. It’s also important to know what we’re actually testing: I’ll be moving my right hand or my character and seeing which event is delayed the most. The top blocks (text colored green) are updated by the server and the bottom blocks (text colored cyan) are updated by the client.

Code
local VR = game:GetService("VRService")
local RUN = game:GetService("RunService")

if not VR.VREnabled then return end

local pre_render = workspace:WaitForChild("PreRender")
local pre_anim = workspace:WaitForChild("PreAnimation")
local pre_sim = workspace:WaitForChild("PreSimulation")
local stepped = workspace:WaitForChild("BindToRenderStep")
local heartbeat = workspace:WaitForChild("Heartbeat")

local cam = workspace.CurrentCamera

RUN.PreRender:Connect(function()
	local r_cf = VR:GetUserCFrame(Enum.UserCFrame.RightHand)
	pre_render.Position = Vector3.new((cam.CFrame*(r_cf.Rotation + r_cf.Position*cam.HeadScale*5)).X,pre_render.Position.Y,pre_render.Position.Z)
end)

RUN.PreAnimation:Connect(function()
	local r_cf = VR:GetUserCFrame(Enum.UserCFrame.RightHand)
	pre_anim.Position = Vector3.new((cam.CFrame*(r_cf.Rotation + r_cf.Position*cam.HeadScale*5)).X,pre_anim.Position.Y,pre_anim.Position.Z)
end)

RUN.PreSimulation:Connect(function()
	local r_cf = VR:GetUserCFrame(Enum.UserCFrame.RightHand)
	pre_sim.Position = Vector3.new((cam.CFrame*(r_cf.Rotation + r_cf.Position*cam.HeadScale*5)).X,pre_sim.Position.Y,pre_sim.Position.Z)
end)

RUN:BindToRenderStep("client_performance", 2, function()
	local r_cf = VR:GetUserCFrame(Enum.UserCFrame.RightHand)
	stepped.Position = Vector3.new((cam.CFrame*(r_cf.Rotation + r_cf.Position*cam.HeadScale*5)).X,stepped.Position.Y,stepped.Position.Z)
end)

RUN.Heartbeat:Connect(function()
	local r_cf = VR:GetUserCFrame(Enum.UserCFrame.RightHand)
	heartbeat.Position = Vector3.new((cam.CFrame*(r_cf.Rotation + r_cf.Position*cam.HeadScale*5)).X,heartbeat.Position.Y,heartbeat.Position.Z)
end)

There’s also another local script with all the same connections, with the difference that instead of setting the positions directly, it fires an UnreliableRemoteEvent to the server with its respective part and the calculated position.

Code
local VR = game:GetService("VRService")
local RUN = game:GetService("RunService")

if not VR.VREnabled then return end

local pre_render = workspace:WaitForChild("Server_PreRender")
local pre_anim = workspace:WaitForChild("Server_PreAnimation")
local pre_sim = workspace:WaitForChild("Server_PreSimulation")
local stepped = workspace:WaitForChild("Server_BindToRenderStep")
local heartbeat = workspace:WaitForChild("Server_Heartbeat")

local remote = game:GetService("ReplicatedStorage"):WaitForChild("Remotes"):WaitForChild("PerformanceTest")
local cam = workspace.CurrentCamera

RUN.PreRender:Connect(function()
	local r_cf = VR:GetUserCFrame(Enum.UserCFrame.RightHand)
	remote:FireServer(pre_render, Vector3.new((cam.CFrame*(r_cf.Rotation + r_cf.Position*cam.HeadScale*5)).X,pre_render.Position.Y,pre_render.Position.Z))
end)

RUN.PreAnimation:Connect(function()
	local r_cf = VR:GetUserCFrame(Enum.UserCFrame.RightHand)
	remote:FireServer(pre_anim, Vector3.new((cam.CFrame*(r_cf.Rotation + r_cf.Position*cam.HeadScale*5)).X,pre_anim.Position.Y,pre_anim.Position.Z))
end)

RUN.PreSimulation:Connect(function()
	local r_cf = VR:GetUserCFrame(Enum.UserCFrame.RightHand)
	remote:FireServer(pre_sim, Vector3.new((cam.CFrame*(r_cf.Rotation + r_cf.Position*cam.HeadScale*5)).X,pre_sim.Position.Y,pre_sim.Position.Z))
end)

RUN:BindToRenderStep("server_performance", 1, function()
	local r_cf = VR:GetUserCFrame(Enum.UserCFrame.RightHand)
	remote:FireServer(stepped, Vector3.new((cam.CFrame*(r_cf.Rotation + r_cf.Position*cam.HeadScale*5)).X,stepped.Position.Y,stepped.Position.Z))
end)

RUN.Heartbeat:Connect(function()
	local r_cf = VR:GetUserCFrame(Enum.UserCFrame.RightHand)
	remote:FireServer(heartbeat, Vector3.new((cam.CFrame*(r_cf.Rotation + r_cf.Position*cam.HeadScale*5)).X,heartbeat.Position.Y,heartbeat.Position.Z))
end)

There’s a server script in ServerScriptService that listens for this event and simply sets the position of the received part to the received Vector3.

Code
local remote = game:GetService("ReplicatedStorage"):WaitForChild("Remotes"):WaitForChild("PerformanceTest")

remote.OnServerEvent:Connect(function(_, obj:BasePart, pos:Vector3)
	obj.Position = pos
end)

Here’s how it looks in the explorer window:
(Don’t mind the HandsGrab and HandsUpdate events, they’re not used at all during the tests)
(image: the Explorer window of the test place)


The test

I’ve performed a series of tests all with different conditions. I’ll provide a video and my thoughts for each one.

The keywords explained

Client priority means the BindToRenderStep callback in the local script that sets the position directly is bound with priority 1 (RUN:BindToRenderStep("client_performance", 1, func)), and the one that fires the UnreliableRemoteEvent is bound with priority 2 (RUN:BindToRenderStep("server_performance", 2, func)).
Server priority means the opposite: the script that sets the position directly is bound with priority 2, and the one that fires the UnreliableRemoteEvent is bound with priority 1.
FPS locked means I will not be using the FPS unlocker program, and it will not be running on the system at the time of the test.
FPS unlocked means the FPS unlocker will be running, unlocking the frame rate of Roblox Studio. The cap will be set to none and the unlock method to Hybrid. The unlocker I’ll be using is rbxfpsunlocker v5.1 by axstin.
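To make the two priority configurations concrete, here is a minimal sketch; the callbacks `updateLocally` and `fireToServer` are placeholders standing in for the bodies of the two scripts above:

```lua
local RUN = game:GetService("RunService")

-- Placeholder callbacks standing in for the bodies of the two scripts above.
local function updateLocally() end
local function fireToServer() end

-- "Client priority": the script that sets positions directly runs first.
RUN:BindToRenderStep("client_performance", 1, updateLocally)
RUN:BindToRenderStep("server_performance", 2, fireToServer)

-- "Server priority" swaps the numbers:
-- RUN:BindToRenderStep("client_performance", 2, updateLocally)
-- RUN:BindToRenderStep("server_performance", 1, fireToServer)
```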

Server priority, FPS locked

[YouTube] Server priority, FPS locked | RunService testing
In this case, we can clearly see that at first, PreRender and BindToRenderStep are superior on the client: they are both first when I only move my hand. It doesn’t seem to matter which one is used when transferring the data to the server. However, when I move not my hand but the character itself, BindToRenderStep is only just as good as the others, and it clearly translates an even bigger delay to the server. So the clear favorite here is PreRender.

Client priority, FPS locked

[YouTube] Client priority, FPS locked | RunService testing
Interestingly, there doesn’t seem to be any difference between giving the BindToRenderStep priority to the client rather than the server. This confuses me very much, since the first time I checked, there was an enormous delay on the server side. I’m not sure what to make of this, but the favorite here seems to be PreRender, just like with server priority.

Server priority, FPS unlocked

[YouTube] Server priority, fps unlocked
The results are as expected: when you don’t limit the CPU, you get more performance. The difference between locked and unlocked on my computer is about 2x, which would correspond to ~120 FPS. Another interesting thing is that blasting the server with more requests seems to have made the synchronization better, as it also appears to update about 2x more often. I also think the PreRender event is still visibly faster than BindToRenderStep, but that may just be my brain making things up.

Client priority, FPS unlocked

[YouTube] Client priority, fps unlocked
Nothing interesting, really. The results are the same as with server priority. I’m still confused why that wasn’t the case the first time I was testing.

Conclusion

My final say is that PreRender is the “fastest” - which would make sense, because according to the Task scheduler documentation the Rendering phase comes second, right after User Input, for which RunService doesn’t have an event. Synchronizing between client and server shouldn’t be done using BindToRenderStep, because it can lag behind even more than the others. However, keep in mind that if you don’t need something to be absolutely consistent with real-time information, you can use a different event. This is useful for something like what I’m doing - adding hands to VR - or perhaps for something like a GUI in 3D space that needs to always stay in the exact same spot.

If you want the best synchronization, you should absolutely use an FPS unlocker; however, you cannot force players to use it, so others may not have as smooth an experience as you do. Also, using an unlocker in combination with firing events to the server each frame might overwhelm your server, so keep that in mind. Finally, remember that this only applies to synchronizing an instance in real time. Most use cases probably have a better-suited event, and picking the wrong one can cause subtle bugs, like a user being detected as alive while they’re already dead.
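If you do combine an FPS unlocker with per-frame remote fires, one way to avoid overwhelming the server is to cap the send rate by accumulating delta time. A sketch; `remote` and the part/position being sent are placeholders:

```lua
local RunService = game:GetService("RunService")

local SEND_INTERVAL = 1 / 60 -- cap remote fires at roughly 60 per second
local accumulated = 0

RunService.PreRender:Connect(function(deltaTime)
	accumulated += deltaTime
	if accumulated >= SEND_INTERVAL then
		accumulated -= SEND_INTERVAL
		-- remote:FireServer(part, position) -- fire at most once per interval
	end
end)
```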

Final words

Thank you for reading - this was my first ever “research” post. I’ll be happy if you correct any of my information, my grammar or just point out something wrong. Of course please keep your critique constructive.

You can try it yourself

Here’s the exact copy of the place that I used to test this.
run_service_event_testing.rbxl (108.9 KB)

7 Likes

Great research, you must have a lot of patience to do something like this :joy: Also, Heartbeat is equal to the PostSimulation event.

2 Likes

Thanks! :smile: Haha yeah, in total it took more than 4 hours, started at like 8 am. Thanks for the info! I’ll update the post :+1:

1 Like

Not to discredit any of your work because you did a lot of good research, but there are a couple very important clarifications that should be mentioned here. The post makes it sound like everyone should always be using PreRender, which is not the case.


First off, you mention using VR for your tests. It seems, however, that Roblox on VR has different rules for frame rates. PC, phone and console are currently locked at 60 FPS, but the documentation suggests - although I cannot find any concrete numbers - that VR runs at a higher frame rate:

an Auto Quality Mode setting is available on Quest which aims to maintain a minimum of 72 frames per second by automatically scaling the rendering detail based on performance data.

link

This could mean that a frame on VR is structured differently from one on other platforms, but unfortunately I cannot find any documentation on this. Nonetheless, it means that your findings may differ on other platforms.

Secondly, regarding PreRender firing first: according to the task scheduler documentation you linked, you are technically correct that PreRender fires first. However, that is only relative to the other events within a frame.

The documentation shows that this is how a frame is structured:

Depending on the kind of information you need to process quickly, you should use different RunService events. According to this image, if you need to check whether a player is still alive to prevent a Touched event from firing within the same frame, you should be using Stepped, as that is the only event between humanoid state updates and part contact updates.
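That first scenario could be sketched like this (assuming a LocalScript in StarterCharacterScripts and a part named `Lava` in the workspace; both names are placeholder assumptions):

```lua
local RunService = game:GetService("RunService")

local humanoid = script.Parent:WaitForChild("Humanoid")
local isAlive = humanoid.Health > 0

-- Stepped fires between humanoid state updates and part contact
-- updates, so this flag is current before Touched handlers run.
RunService.Stepped:Connect(function()
	isAlive = humanoid.Health > 0
end)

workspace:WaitForChild("Lava").Touched:Connect(function(hit)
	if not isAlive then
		return -- ignore contacts from an already-dead character
	end
	-- handle the touch here
end)
```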

If you want raycasts cast by clicking on the screen to ignore a player within a certain distance of a given coordinate, you should use Heartbeat (i.e. PostSimulation) to check the player’s position, because that event fires between physics updates and user input.
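A sketch of that second scenario, caching the post-physics position on Heartbeat (assuming a LocalScript and a spawned character):

```lua
local RunService = game:GetService("RunService")
local Players = game:GetService("Players")

local player = Players.LocalPlayer
local lastRootPosition = Vector3.zero

-- Heartbeat (PostSimulation) fires after the physics update, so this
-- is the freshest position available when the next input arrives.
RunService.Heartbeat:Connect(function()
	local character = player.Character
	local root = character and character:FindFirstChild("HumanoidRootPart")
	if root then
		lastRootPosition = root.Position
	end
end)

-- An input handler could then compare lastRootPosition against the
-- clicked coordinate to decide whether to ignore the player.
```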

That is all to say that yes, technically PreRender fires first, but it is not the fastest event for every scenario.

3 Likes

Thanks for all the info! :grinning: I’ll make sure to add it to the post. I actually wanted to add a clarification about how different events should be used for different things, but forgot to add it. Either way it would’ve been incomplete without your clarifications. Once again thanks. :smile:

1 Like

It reminds me of how I can’t unsee Arabic in the UAE

4 Likes