So I have a server-sided timer script I'm working on which counts down from 5 minutes. It works perfectly, but it doesn't seem to stay in sync with, for example, a timer on my phone.
I did a test and started the timer from 5 minutes in Roblox Studio and on my phone, and the phone timer finished roughly 3 seconds earlier than the code.
This doesn't bother me greatly, but I'd rather have it perfect and understand why it's happening, because I honestly have no idea. Any suggestions at all, even about my coding in general, would be greatly appreciated. Thank you.
local pauseTimer = false -- toggled elsewhere to pause the countdown

function timer()
    local seconds = 60
    local minutes = 4
    while true do
        if pauseTimer ~= true then
            seconds = seconds - 1
            if seconds == -1 then
                -- rolled past :00, borrow a minute
                minutes = minutes - 1
                seconds = 59
            end
            if seconds < 10 then
                print(minutes..":0"..seconds) -- pad single-digit seconds
            else
                print(minutes..":"..seconds)
            end
        end
        wait(1)
    end
end
It's because wait(1) doesn't wait exactly one second: it yields for at least a second and only resumes on the next step of the task scheduler, and the calculations you do before reaching the wait(1) take a little time of their own. Each iteration therefore runs a few milliseconds longer than a second, and over the 300 iterations of a 5-minute countdown those milliseconds add up to the roughly 3-second drift you measured.
One thing to note: this really doesn't matter if you don't need to be precise, but if you were calling wait 30 times a second, you would notice the difference much sooner, since the error from each call to wait would add up much faster.
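The usual fix is to stop counting iterations and instead derive the remaining time from a timestamp taken when the countdown started, so the scheduler's overshoot never accumulates. Here's a minimal sketch of that approach (timerDuration is an illustrative name, and I'm using tick() as the wall-clock source; any monotonic clock would work the same way):

local timerDuration = 5 * 60 -- total countdown length in seconds (illustrative)

function timer()
    local startTime = tick() -- timestamp when the countdown began
    while true do
        -- derive the remaining time from the clock instead of a counter,
        -- so any lateness in wait(1) is corrected on the next pass
        local remaining = math.max(0, timerDuration - (tick() - startTime))
        local display = math.ceil(remaining) -- 298.97s shows as 4:59, not 4:58
        print(string.format("%d:%02d", math.floor(display / 60), display % 60))
        if remaining <= 0 then
            break -- countdown finished
        end
        wait(1) -- may still overshoot slightly, but the error no longer stacks
    end
end

With this structure the printed value can occasionally skip a second if a wait(1) overshoots badly, but the countdown as a whole always finishes on time, because every iteration re-reads the clock rather than trusting the previous wait. Supporting your pauseTimer flag would need a small extension, e.g. shifting startTime forward by however long the pause lasted.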
Thank you, I understand completely. I'm still relatively average at programming, and it was more a case of understanding why this happened so I can improve, rather than getting the timer 100% accurate. Thank you as well for the additional information; it will definitely be helpful to me when coding in the future.