What determines how quickly a fired remote event is received

I was wondering what determines the latency of an OnServerEvent or OnClientEvent when a remote is fired. Is the number of seconds returned by Player:GetNetworkPing() the amount of time it takes for a fired remote to be received? I'm not 100% sure.

Latency mostly depends on the quality of the connection between the client and the server, including the physical distance and the available bandwidth. It also depends on the data developers send over the network, on top of everything Roblox replicates internally: both the frequency and the size of that data matter.

A good way to determine one-way latency is to send a timestamp from the client and subtract it from the receive time on the server. For that to work, both sides need a shared clock, which is what workspace:GetServerTimeNow() provides.
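A minimal sketch of that measurement in Luau, assuming a RemoteEvent named "LatencyTest" in ReplicatedStorage (the name is just for illustration); workspace:GetServerTimeNow() gives both sides roughly the same clock:

```lua
-- LocalScript (client): stamp the event with the synchronized server time.
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local latencyTest = ReplicatedStorage:WaitForChild("LatencyTest")

latencyTest:FireServer(workspace:GetServerTimeNow())
```

```lua
-- Script (server): the difference between "now" and the client's stamp
-- approximates the one-way trip time.
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local latencyTest = ReplicatedStorage:WaitForChild("LatencyTest")

latencyTest.OnServerEvent:Connect(function(player, sentAt)
	if typeof(sentAt) ~= "number" then return end -- clients can send anything
	local oneWay = workspace:GetServerTimeNow() - sentAt
	print(("%s -> server: %.0f ms"):format(player.Name, oneWay * 1000))
end)
```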

Ping time (RTT, round-trip time) divided by two can also be used, although that's more of an estimate because latency can be asymmetrical (e.g. the trip to the server may take longer than the trip back). A ping is a test in which a very small packet is sent over the network to measure the time of the trip to the other device and back.
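As a sketch of measuring the round trip yourself (again with a hypothetical RemoteEvent, here "PingTest"): the client records when it fires, the server immediately echoes back, and the client halves the elapsed time:

```lua
-- LocalScript (client): fire, wait for the echo, halve the round trip.
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local pingTest = ReplicatedStorage:WaitForChild("PingTest")

local sentAt

pingTest.OnClientEvent:Connect(function()
	local rtt = os.clock() - sentAt
	print(("RTT: %.0f ms, one-way estimate: %.0f ms"):format(rtt * 1000, rtt * 500))
end)

sentAt = os.clock()
pingTest:FireServer()
```

```lua
-- Script (server): echo the event straight back to whoever fired it.
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local pingTest = ReplicatedStorage:WaitForChild("PingTest")

pingTest.OnServerEvent:Connect(function(player)
	pingTest:FireClient(player)
end)
```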

EDIT: After testing, :GetNetworkPing() seems to return a one-way trip time.

Remote events can transmit data at a rate of up to about 60 Hz, and each empty remote event carries roughly 9 bytes of overhead. An old article on the Dev Hub used to mention a soft limit of 50 KB of data per second. If we get close to or go over that limit, Roblox should start queuing requests, which would significantly increase the latency.
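One way to stay under those limits is to batch: instead of firing a remote for every change, accumulate changes and send them at a fixed interval. A rough sketch (the "StateUpdate" remote and the 0.1 s interval are just illustrative choices):

```lua
-- Script (server): collect small updates and send them together,
-- trading a little latency for fewer, larger packets.
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local RunService = game:GetService("RunService")

local stateUpdate = ReplicatedStorage:WaitForChild("StateUpdate")

local pending = {}          -- key -> latest value since the last send
local SEND_INTERVAL = 0.1   -- 10 sends/s instead of one fire per change
local lastSend = 0

local function queueUpdate(key, value)
	pending[key] = value    -- newer values overwrite older ones for the same key
end

RunService.Heartbeat:Connect(function()
	if os.clock() - lastSend < SEND_INTERVAL or next(pending) == nil then
		return
	end
	stateUpdate:FireAllClients(pending)
	pending = {}
	lastSend = os.clock()
end)

-- e.g. queueUpdate("Coins", 120)
```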

Also, framerate could probably affect when remote signals are received and processed.

For more info on network limits and optimization, I suggest the article below.

