os.clock() time is different in Studio than it is in-game?

I have this simple line of code that tells me the time taken between remote events (the code is in a LocalScript):
[Screenshot: the code]

This is what it prints in Studio:
[Screenshot: Studio output]

but this is what it prints in-game:
[Screenshot: in-game output]

The printed time should just be the delta time between sending and receiving. Does anyone know a workaround for this?


Try using the tick() function in Studio.

Where do you assign bul.DelayServ?

Using tick() got me the same result.

It's the os.clock() value sent from the server Script:

-- server
RE:FireClient(plr, os.clock())

-- local
RE.OnClientEvent:Connect(function(DelayServ)
    local dt = os.clock() - DelayServ
end)

That's more or less the idea.

Well then, do you actually fire the event?

os.clock() returns a high-precision amount of CPU time used by Lua in seconds, intended for use in benchmarking, as found on the docs: os | Documentation - Roblox Creator Hub

The reason it works as intended in Studio is that the server and the client are running on the same CPU on your local machine.
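The point generalizes even outside Roblox. A plain-Lua sketch (the loop count is arbitrary): os.clock() only measures CPU time consumed by the current process, so values taken on two different machines are unrelated counters.

```lua
-- Plain Lua: os.clock() measures CPU time used by *this* process,
-- counted from whenever this process started running.
local t0 = os.clock()
for i = 1, 1e7 do end          -- burn some CPU time
local elapsed = os.clock() - t0
print(elapsed)                 -- small positive number, meaningful only here

-- Subtracting an os.clock() value received from another machine (as the
-- RemoteEvent code does) compares two unrelated counters; it only appears
-- to work in Studio because server and client run on the same machine.
```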

For the purpose you described, you’d want to use tick().


As previously mentioned, os.clock() reports the precise amount of time Lua has spent using the CPU.

But tick() is almost certainly not a reliable choice either, due to potential discrepancies and its likely deprecation.

What discrepancies?

(From Luau Recap: June 2020)

time() depends on when the game instance started running (varies between server and clients), while os.time (no milliseconds) is not precise enough.

On the other hand, DateTime (converted with ToUniversalTime) or workspace:GetServerTimeNow (synced on server and client) would be more suitable for your needs.
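A minimal sketch of measuring one-way remote latency with workspace:GetServerTimeNow(); the RemoteEvent's name and location ("RE" in ReplicatedStorage) are assumptions, not from the original post:

```lua
-- Server Script: stamp the event with the synchronized clock
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local Players = game:GetService("Players")
local RE = ReplicatedStorage:WaitForChild("RE") -- assumed RemoteEvent

Players.PlayerAdded:Connect(function(plr)
    RE:FireClient(plr, workspace:GetServerTimeNow())
end)

-- LocalScript: both sides read the same synchronized clock, so the
-- difference approximates the one-way server-to-client latency
local RE = game:GetService("ReplicatedStorage"):WaitForChild("RE")
RE.OnClientEvent:Connect(function(sentAt)
    local latency = workspace:GetServerTimeNow() - sentAt
    print(("one-way latency: %.1f ms"):format(latency * 1000))
end)
```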

If you are working on a ping display, take into account that:

  1. This is one-way latency. I'm pretty sure client-server and server-client communication isn't always symmetrical, so remote-event latency multiplied by two isn't necessarily an accurate representation of ping time.

Perhaps player:GetNetworkPing is what you are looking for instead.

  2. Each observed set of remote-event delays exhibits some dispersion/spread, such as occasional spikes/prolonged delays. These typically don't affect the average much, so it's better to present the average latency.
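A sketch of point 2, assuming a LocalScript and using Player:GetNetworkPing() as the latency source; the window size and poll rate are arbitrary choices:

```lua
-- LocalScript: average the last WINDOW latency samples so occasional
-- spikes don't dominate the displayed value
local Players = game:GetService("Players")
local player = Players.LocalPlayer

local WINDOW = 20
local samples = {}

local function recordSample(latency)
    table.insert(samples, latency)
    if #samples > WINDOW then
        table.remove(samples, 1) -- drop the oldest sample
    end
end

local function averageLatency()
    local sum = 0
    for _, s in ipairs(samples) do
        sum += s
    end
    return #samples > 0 and sum / #samples or 0
end

while true do
    recordSample(player:GetNetworkPing())
    print(("avg ping: %.0f ms"):format(averageLatency() * 1000))
    task.wait(1)
end
```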
