os.clock() returns the amount of CPU time used by Lua, in seconds and with high precision. It is intended for benchmarking, as noted in the docs: os | Documentation - Roblox Creator Hub
It only appears to work as intended in Studio because the server and the client share the same CPU on your local machine.
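For reference, here's a minimal sketch of os.clock()'s intended use, benchmarking CPU time; the loop body and iteration count are arbitrary placeholders:

```lua
-- Benchmark a hot loop with os.clock(), its intended purpose.
local start = os.clock()
local sum = 0
for i = 1, 1e6 do
	sum += i
end
print(("loop took %.6f seconds of CPU time"):format(os.clock() - start))
```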
For the purpose you described, you’d want to use tick().
time() is measured from when the game instance started running (so it varies between the server and each client), while os.time() returns whole seconds only, which is not precise enough.
On the other hand, DateTime (converted with ToUniversalTime()) or workspace:GetServerTimeNow() (synchronized between the server and clients) would be more suitable for your needs.
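Here's a quick sketch of those options as they'd look in a LocalScript; the APIs are real, only the variable names are illustrative:

```lua
-- Clock synchronized between the server and every client:
local serverNow = workspace:GetServerTimeNow()

-- Absolute wall-clock time with millisecond precision:
local unixMillis = DateTime.now().UnixTimestampMillis

-- UTC components, if you need a human-readable breakdown:
local utc = DateTime.now():ToUniversalTime()

print(("synced: %.6f s | unix: %d ms | %02d:%02d:%02d.%03d UTC"):format(
	serverNow, unixMillis, utc.Hour, utc.Minute, utc.Second, utc.Millisecond))
```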
If you are working on a ping display, take into account that:
- This is one-way latency. I'm pretty sure client-to-server and server-to-client communication isn't always symmetrical, so multiplying remote event latency by two isn't necessarily an accurate representation of ping.
- Measured remote event delays show some spread, including occasional spikes/prolonged delays. Individual spikes typically don't move the average much, so it's better to present an average latency over several samples (see the sketch after this list).
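Here's a minimal sketch of that approach, assuming a RemoteEvent named "PingEvent" in ReplicatedStorage; the name, sample window, and ping interval are all illustrative:

```lua
-- Server (Script): echo each ping straight back to the sender.
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local pingEvent = ReplicatedStorage:WaitForChild("PingEvent")

pingEvent.OnServerEvent:Connect(function(player)
	pingEvent:FireClient(player)
end)
```

```lua
-- Client (LocalScript): time the round trip and keep a rolling average.
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local pingEvent = ReplicatedStorage:WaitForChild("PingEvent")

local MAX_SAMPLES = 10 -- assumed rolling-window size
local samples = {}
local sentAt -- os.clock() reading when the last ping left

pingEvent.OnClientEvent:Connect(function()
	-- os.clock() is safe here: both readings come from the same machine.
	table.insert(samples, os.clock() - sentAt)
	if #samples > MAX_SAMPLES then
		table.remove(samples, 1)
	end

	-- Average the window so occasional spikes don't dominate the display.
	local sum = 0
	for _, s in ipairs(samples) do
		sum += s
	end
	print(("avg round trip: %.0f ms"):format(sum / #samples * 1000))
end)

while true do
	sentAt = os.clock()
	pingEvent:FireServer()
	task.wait(1) -- assumed ping interval
end
```

Measuring the full round trip directly also sidesteps the asymmetry problem from the first point, since you never have to multiply a one-way measurement by two.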