Is it best to count down on the server and send a timestamp to the client on each tick to update a timer on the client, or is it better to send a single request to the client and have the client regulate its own countdown timer?
In the past I have chosen the first method, but there is the downside of server delay. With server-to-client delay, ticking noises can become obviously out of sync.
I am interested in knowing how other developers handle timers for all players in a server.
I am using a remote event to fire all clients, but like you said, slow connections can get out of sync. I am also curious as to the best solution to this.
Any advantages to using .Changed on a NumberValue instead of an event? Seems a little bit of a roundabout way to do it.
--Server
local remoteEvent = game:GetService("ReplicatedStorage"):WaitForChild("RemoteEvent") -- or wherever your RemoteEvent lives
local startTime = tick()
local endTime = startTime + 30
remoteEvent:FireAllClients("StartTimer", startTime, endTime)
Then account for timezones on the client and you should end up getting a pretty accurate countdown for every client. I made something like this in the past, if anyone is interested in seeing it.
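On the client, a rough sketch of the receiving side might look like this (the RemoteEvent location, and folding the timezone difference and travel time into one offset measured on receipt, are assumptions on my part, not tested code):
--Client
local remoteEvent = game:GetService("ReplicatedStorage"):WaitForChild("RemoteEvent")

remoteEvent.OnClientEvent:Connect(function(action, startTime, endTime)
	if action ~= "StartTimer" then return end

	-- the gap between the server's tick() and ours covers both the timezone difference and the one-way delay
	local offset = tick() - startTime
	local localEndTime = endTime + offset

	while tick() < localEndTime do
		print(math.ceil(localEndTime - tick())) -- replace with your countdown UI update
		wait(0.1)
	end
end)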
This isn’t good practice. It replicates every change to the string value to all clients, which is slow, uses extra data, and doesn’t give equal spacing between seconds client-side, since the communication delay varies over the 30 seconds or however long your timer is.
This is one of the most data-heavy solutions to the problem I can think of, actually.
Here’s a much better way to do this (a rough client-side sketch follows the steps):
Send a signal to clients to start the countdown, and for how long.
Clients count down locally to 0 from that point, but don’t do anything special on their own when they reach 0.
When the countdown is done server-side, the server sends another signal to all clients that the countdown completed.
When the client receives the finished event from the server:
If the counter was still running, stop it.
Proceed to do the actions that need to happen after the timer client-side (if any).
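A rough client-side sketch of those steps (the "StartTimer"/"TimerFinished" action names, the RemoteEvent location, and the print-based UI update are placeholders of mine, not anything final):
--Client
local RunService = game:GetService("RunService")
local remoteEvent = game:GetService("ReplicatedStorage"):WaitForChild("RemoteEvent")

local connection -- the currently running countdown loop, if any

local function updateCountdownUI(secondsLeft)
	print("Time left:", secondsLeft) -- replace with your actual GUI update
end

local function stopCountdown()
	if connection then
		connection:Disconnect()
		connection = nil
	end
end

remoteEvent.OnClientEvent:Connect(function(action, duration)
	if action == "StartTimer" then
		local finishAt = tick() + duration
		stopCountdown()
		connection = RunService.Heartbeat:Connect(function()
			-- count down locally, but do nothing special on our own when it hits 0
			updateCountdownUI(math.ceil(math.max(finishAt - tick(), 0)))
		end)
	elseif action == "TimerFinished" then
		stopCountdown() -- stop the counter if it was still running
		-- then do whatever needs to happen after the timer client-side (if anything)
	end
end)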
This is an interesting method, but I feel that there should be some margin for slower clients so timers aren’t stopped short more often than necessary.
For example, the server could wait for confirmation from clients whose timers have already finished after its own countdown ends, with something like a 3-second grace period for any clients still waiting to finish; once all clients have sent a finished request, or the 3 seconds are up, it would send a request to end all timers.
I’m not sure how much less efficient this would be, though, or whether it is truly necessary. It could also be confusing for some players to have to wait for the countdown UI to go away 3+ seconds after it finished, yet it could be just as confusing, if not more so, for it to go away before it was done.
Slower clients would also get the completed message later than faster clients, so this is not really an issue you should have to account for. Clients may end at 1 or be stuck a little longer on 0 than necessary, but I don’t see this being a huge deal.
PS: you wouldn’t do wait(1) loops between the seconds, if that part wasn’t obvious. You either do this on a stepped or similar loop and update based on actual time difference, or you vary the wait(x) so that it catches up on lost time.
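For what it’s worth, the "vary the wait(x)" option could look roughly like this (just a sketch of the idea, with a made-up 30-second duration):
local duration = 30
local startTime = tick()

for secondsLeft = duration - 1, 0, -1 do
	-- wait only until this tick is actually due, so any overshoot from previous waits gets caught up on
	local targetElapsed = duration - secondsLeft
	wait(targetElapsed - (tick() - startTime))
	print(secondsLeft) -- or update the countdown label here
end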
Not really a good idea IMO. It’s not mission-critical to make the timer end exactly at 0 for all clients before continuing; moreover, by doing this you leave faster clients stuck on 0 longer than necessary, and everyone has to wait for the slowest client, or for an exploiter who intercepts the sending event and makes everyone wait out the timeout.
To add to this (in case no one said it), you should probably also allow the client to query for the time remaining, so that they can sync up when they enter the game after the timer has started. (Unless the server is sending frequent time updates.)
Yes, I had thought of this. You would also want to account for server-to-client delay by sending a timestamp from the server and subtracting it from the current time on the client.
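A rough sketch of the late-joiner case, assuming a RemoteFunction named GetTimeRemaining in ReplicatedStorage (the name, and estimating the one-way delay as half the round trip instead of using a raw server timestamp, are my own choices here):
--Server
local getTimeRemaining = game:GetService("ReplicatedStorage"):WaitForChild("GetTimeRemaining")
local endTime = tick() + 30 -- set this wherever your round timer actually starts

getTimeRemaining.OnServerInvoke = function(player)
	return math.max(endTime - tick(), 0)
end

--Client (for someone joining mid-timer)
local getTimeRemaining = game:GetService("ReplicatedStorage"):WaitForChild("GetTimeRemaining")

local sentAt = tick()
local remaining = getTimeRemaining:InvokeServer()
remaining = remaining - (tick() - sentAt) / 2 -- knock off a rough one-way delay estimate
print("Joined with about", remaining, "seconds left")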
Btw I would like to thank everyone who contributed to this thread, have a nice day.
Edit: Would I do something like this? (for the server)
local Heartbeat = game:GetService("RunService").Heartbeat
local DesiredTime = 5
local StartTime = tick()
-- yield one Heartbeat at a time until the desired time has actually elapsed
repeat Heartbeat:Wait() until tick() - StartTime >= DesiredTime
Would code like this be considered too inefficient, or would it be considered necessary to get an accurate wait time?
Something like that works, yes. You could also just do a single wait client-side since this will not be affected by any delay. But if you need a timer to be running, you probably wanna update it on RenderStepped anyway, or at least with a wait(x) where x is the time to go until the next second ticks by.
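For instance, a RenderStepped-driven display could look something like this (assuming a LocalScript inside a TextLabel and a made-up 30-second finish time; in practice finishAt would come from the server’s start signal):
local RunService = game:GetService("RunService")
local label = script.Parent -- assumed TextLabel
local finishAt = tick() + 30

local connection
connection = RunService.RenderStepped:Connect(function()
	local remaining = math.max(finishAt - tick(), 0)
	label.Text = tostring(math.ceil(remaining))
	if remaining <= 0 then
		connection:Disconnect()
	end
end)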
This was an interesting read for sure, I didn’t quite catch the difference in data efficiency until I thought about it for a bit.
I believe most people tend to focus on sending as few signals between the server and the clients as possible, because they don’t want to burn through too much data too fast. However, this creates the illusion that by using a physical value on the server, one can slide past the data usage that is expended in using remotes.
While it is true that remotes are among the main contributors to lag due to network latency, using a physical value isn’t the best option either. It works in most cases but isn’t the most efficient. As you have mentioned, changing the value for each one-second step and having every client listen to it is much heavier on data cost, not to mention the unnecessary replication.
This sheds more light on data and memory efficiency in game design. Generally the difference in data expense is not visible for most games on Roblox; however, for games with high data expense this is a very important factor, as even the tiniest amount of data makes a visible difference.
Thank you thomas, this has been an educational mental journey