And I am doing that, however I still get desync issues. I.e., the client's timer will be at, say, 5 seconds and then suddenly end, or it will hit 0 and the game won't actually end for another 2-3 seconds. Another problem I'm facing is adding time to the timer. How can I do that easily? Originally I was just firing the RemoteEvent every time the timer updated.
-- Fire timer to all clients
Timer:FireAllClients(Settings.Timer)

-- Start countdown timer
for i = Settings.Timer, 0, -1 do
	Settings.Timer = i
	-- Check if 'Add Time' mutator has been applied
	local AddTime = table.find(Settings.Mutators, 'Add Time')
	if AddTime then
		-- Add time to the Settings.Timer
	end
	wait(1)
end

-- Timer has completed
TimerComplete:FireAllClients()
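For the 'Add Time' part: a numeric for loop can't be extended mid-run (changing the control variable inside the body has no effect), so one option is a while loop that decrements the timer itself. A rough sketch using the same names, with a hypothetical 10-second bonus:

```lua
-- Sketch: a while loop lets the 'Add Time' mutator extend the countdown mid-run
while Settings.Timer > 0 do
	local index = table.find(Settings.Mutators, 'Add Time')
	if index then
		Settings.Timer = Settings.Timer + 10 -- hypothetical bonus amount
		table.remove(Settings.Mutators, index) -- apply the mutator only once
	end
	wait(1)
	Settings.Timer = Settings.Timer - 1
end
```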
for count = Settings.Timer, 0, -1 do
	Settings.Timer = count
	-- code stuff
	wait(1)
end
You could also just put a NumberValue in Workspace representing the timer, and make a GUI with a TextLabel and a Script that displays the timer's Value in a loop.
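A minimal sketch of that client script, assuming the NumberValue is named `Timer` and the script sits inside the TextLabel:

```lua
-- LocalScript inside the TextLabel; 'Timer' is an assumed NumberValue in Workspace
local timerValue = workspace:WaitForChild('Timer')
local label = script.Parent

label.Text = tostring(timerValue.Value)
-- For value objects, .Changed fires with the new Value
timerValue.Changed:Connect(function(value)
	label.Text = tostring(value)
end)
```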
I suggest not firing all clients. You can instead simply loop over the players in a for loop. It may sound more complicated, but it is faster than firing/invoking every client per tick.
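That would look roughly like this on the server, reusing the `Timer` RemoteEvent from the original code:

```lua
-- Server-side sketch: fire each player individually instead of FireAllClients
local Players = game:GetService('Players')

for _, player in ipairs(Players:GetPlayers()) do
	Timer:FireClient(player, Settings.Timer)
end
```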
To be honest, you're better off using a value object to fix this problem. I know it sounds depressing, I don't like them either, but I have had to use the simpler way to fix these issues in the past. Put some values in ReplicatedStorage and use GetPropertyChangedSignal on the values for the UI. It will reduce the latency to almost nothing.
How I would do it is just have an IntValue and change the value: if the value is greater than -1, display it; when it reaches -1, hide it again. Just have it connected to the .Changed signal of the IntValue.
Simply have the server change the IntValue.Value and have a LocalScript watch for the timer to complete; then have the server restart the timer itself when necessary.
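A rough server-side sketch of that pattern, assuming an IntValue named `Timer` in ReplicatedStorage:

```lua
-- Server Script: drive the countdown through a replicated IntValue
local timerValue = game.ReplicatedStorage:WaitForChild('Timer')

local function runTimer(duration)
	timerValue.Value = duration
	while timerValue.Value > 0 do
		wait(1)
		timerValue.Value = timerValue.Value - 1
	end
	-- reached 0: the LocalScript hides its UI; restart here when necessary
end
```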
Also, using .Changed means that whenever any other (unintended) property changes (such as Visible on a GUI or Transparency on a Part), the function tied to that .Changed will be triggered, whereas :GetPropertyChangedSignal() fires only when the specified property changes, like below in both the Code Block and Example Place.
local Settings = game.ReplicatedStorage.Settings
local Timer = Settings.Timer
local Countdown = script.Parent.Countdown

Timer:GetPropertyChangedSignal('Value'):Connect(function()
	Countdown.Text = tostring(Timer.Value)
	-- Show the label while the timer is running, hide it once it hits 0
	if Timer.Value > 0 then
		script.Parent.Visible = true
	else
		script.Parent.Visible = false
	end
end)
Not ENTIRELY sure I got that right; I'm certain a better programmer will outright correct me, point you to a better method, or otherwise explain it better.
To give you the basic idea. That value would be set to Clock:GetTime() on the server.
This would of course give you the elapsed time since the countdown began which you can use to format however you want.
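As a sketch (here `Clock` stands in for whatever synced-clock module you use, and `StartTime` is an assumed NumberValue in ReplicatedStorage; `ROUND_LENGTH` and `label` are placeholders):

```lua
-- Server: record when the countdown began, using the synced clock
game.ReplicatedStorage.StartTime.Value = Clock:GetTime()

-- Client: elapsed time since the countdown began, via the same synced clock
local elapsed = Clock:GetTime() - game.ReplicatedStorage.StartTime.Value
local remaining = math.max(0, ROUND_LENGTH - elapsed)
label.Text = string.format('%d:%02d', math.floor(remaining / 60), math.floor(remaining % 60))
```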
We made our own synced clock, but Quenty has one that's pretty good. Just search up TimeSyncManager. I'd link, but I'm on mobile and it's being a pain.
You’ll never get server and clients perfectly in sync, but they should really only be off by roughly the client’s ping time, not whole seconds. One problem you have is that your timer isn’t using a timer, it’s just a loop that is relying on wait(1) to be 1 second. But wait(1) doesn’t wait for exactly 1 second, it will wait for at least one second, usually slightly longer. Having other Lua threads running, especially ones that are doing a lot of work before yielding, can make wait(1) last for much longer than 1 second.
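One way to fix that is to derive the countdown from a timestamp instead of trusting wait(1) to be accurate. A sketch using the names from the original code:

```lua
-- Drift-free countdown: the displayed value comes from tick(),
-- so wait() overshoot never accumulates across iterations
local endTime = tick() + Settings.Timer
while true do
	local remaining = math.ceil(endTime - tick())
	if remaining <= 0 then break end
	Settings.Timer = remaining
	wait(0.1) -- short wait; accuracy comes from tick(), not wait()
end
TimerComplete:FireAllClients()
```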
You should use timestamps to track the passage of time, from one of the functions available that uses the computers’ actual clocks: tick(), os.time(), or time(). Which one is best depends on the exact use case, particularly whether or not time zones and actual time matter, or just time since the server started.
You can meter out a specific amount of time by sending a message to the client to count down for, say, 10 seconds. The client can then set a variable, e.g. local endTime = tick() + 10, and check on Heartbeat to see if tick() > endTime, at which point the timer is done. When you use the actual clock functions like this, your client and server will only disagree by about your round-trip ping time, which is as good as it's going to get.
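A sketch of that client side, assuming `Timer` is the RemoteEvent from the original code and `label` is the countdown TextLabel:

```lua
-- Client LocalScript: count down a duration sent by the server
local RunService = game:GetService('RunService')

Timer.OnClientEvent:Connect(function(duration)
	local endTime = tick() + duration
	local conn
	conn = RunService.Heartbeat:Connect(function()
		local remaining = endTime - tick()
		if remaining <= 0 then
			conn:Disconnect()
			label.Text = '0' -- timer done
		else
			label.Text = tostring(math.ceil(remaining))
		end
	end)
end)
```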
Generally speaking, assuming that a server machine and client both have their real time clocks in sync is also bad. In other words, don’t generate a time stamp on the server with os.time(), and then send it to the client and compare it to the client’s os.time(). Nothing guarantees the machines’ clocks are set correctly and in sync. So only communicate relative times between client and server, and only directly compare or do arithmetic with timestamps from the same machines.
I wouldn't necessarily agree with the last bit if you use or create your own synced clock to generate the timestamp. It won't be in perfect sync all the time, but it'll be pretty dang close, much better than you'll get just by doing it completely locally. I've used synced clocks like Quenty's for a while to dictate all kinds of things (projectiles are another great use case, for instance). It's a very valuable tool.