Recently, I thought of an issue with a game I plan to work on in the future. We plan to have an upgrade system, similar to Clash of Clans, that relies on a certain time period for the upgrade to complete. The first thing that came to mind was tick().
Who wouldn’t use tick()? You can simply subtract two values to find the time elapsed since the upgrade started. Not only is it easy to implement, but it isn’t affected by how long the player actually stays on the server (meaning the timer keeps counting while the player is offline, which is what I want).
The catch is that tick() is derived from the clock of the machine it runs on. For Roblox servers, that’s no problem. However, a client can simply change their device’s date/time to shift what tick() returns, which means they could technically complete upgrades sooner than they should be able to.
I understand that certain games, like Clash of Clans, have fixed this bug. Does anyone have any ideas on how to patch this?
os.date converts an os.time timestamp into a readable format. os.time works in whole seconds, while tick() includes millisecond precision. Both will work; it depends on your use case and how much accuracy you want.
Another difference is that os.time uses UTC, while tick() uses the current time zone of the server or client it runs on. os.time is more consistent in that regard.
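To make the difference concrete, here’s a quick comparison (the printed values are made up for illustration; the exact fractional digits of tick() will vary):

```lua
print(os.time())  --> e.g. 1700000000 (whole seconds since the Unix epoch, UTC)
print(tick())     --> e.g. 1700000000.123456 (seconds since the epoch, with a fractional part)

-- os.date formats a timestamp; the "!" prefix asks for UTC.
print(os.date("!%Y-%m-%d %H:%M:%S", os.time()))  --> e.g. "2023-11-14 22:13:20"
```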
This is really a simple question of “client or server”, and the server should be authoritative in any situation where you’re processing crucial data. I would personally use os.time as well, mostly because it returns seconds relative to a single fixed reference (UTC) regardless of machine, whereas tick() on the client uses the local time zone and on the server uses the server’s (think regions). os.date also comes in the os library, which is nice if you aren’t interested in writing a custom date library.
That being said, make sure you also use client-side prediction. Pass the finish time to the client once when they join. The server won’t process any completion requests until its current time is greater than or equal to the saved time, while the client animates the wait locally by counting the timer down.
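Here’s a minimal sketch of that pattern, assuming a RemoteFunction named GetUpgradeFinishTime in ReplicatedStorage (the name and the in-memory finishTimes table are hypothetical; a real game would persist finish times in a DataStore so they survive the player leaving):

```lua
-- Server (e.g. a Script in ServerScriptService).
-- The server is the sole authority on when an upgrade finishes; everything
-- is stored as os.time() UTC seconds, so the client's clock is irrelevant.
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local getFinishTime = ReplicatedStorage:WaitForChild("GetUpgradeFinishTime") -- RemoteFunction

local finishTimes = {} -- [player] = UTC timestamp at which their upgrade completes

local function startUpgrade(player, durationSeconds)
	finishTimes[player] = os.time() + durationSeconds
end

local function tryCompleteUpgrade(player)
	local finishAt = finishTimes[player]
	-- Refuse to complete until the server's own clock says time is up,
	-- no matter what the client claims.
	if finishAt and os.time() >= finishAt then
		finishTimes[player] = nil
		return true
	end
	return false
end

-- Hand the finish time to the client once so it can animate a countdown.
getFinishTime.OnServerInvoke = function(player)
	return finishTimes[player]
end
```

On the client, the countdown is purely cosmetic:

```lua
-- Client (e.g. a LocalScript). A player who changes their clock only
-- breaks their own display; the server never trusts this value.
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local getFinishTime = ReplicatedStorage:WaitForChild("GetUpgradeFinishTime")

local finishAt = getFinishTime:InvokeServer()
if finishAt then
	while os.time() < finishAt do
		print(("Upgrade done in %d seconds"):format(finishAt - os.time()))
		task.wait(1)
	end
	-- Time's up locally; now ask the server to actually complete the upgrade.
end
```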
No, os.time and tick() do not both use just seconds.
Yes, both use seconds (technically, I guess). However, tick() goes down to the milliseconds, while os.time() uses whole seconds. The only other difference is that os.time uses UTC, while tick() uses the current time zone.
tick exposes milliseconds while os.time does not, but both are measured in seconds; tick just carries a fractional millisecond component. Having those extra digits doesn’t mean you have to use them; you can cut them off as well.
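For example, if you only want whole seconds, the fractional part is easy to drop (values illustrative):

```lua
local preciseNow = tick()                    -- e.g. 1700000000.123456
local wholeSeconds = math.floor(preciseNow)  -- 1700000000, fractional part discarded
```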
On another note - I don’t really see what you’d need the milliseconds for, but perhaps someone out there has a use case for them. I’d just stick to os.time for OP’s use case.
Depends where you’re using it from. tick() on the client returns time relative to the machine’s time zone. tick() on the server returns time relative to the server’s time zone.