Help with client/server time

What’s the time difference in your logs?

I'll send a screen recording, give me a sec

Oh wait, I see your error: it's saying that the remote event took 15 or 16 seconds to fire

No, I'm getting -7 to -8 seconds

You could do it by firing the event twice, I guess.

The first time, you have the client send his tick(). The server then calculates the difference between server and client tick.
The second time, the client sends his tick again and the server can now convert its tick to the client tick by subtracting the difference, and use that as “client tick”.

Right?
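
Roughly something like this, as a sketch (the RemoteEvent name "SyncEvent" and where it lives are just made up for illustration, not from your code):

```lua
-- Server script (e.g. in ServerScriptService).
-- "SyncEvent" is a hypothetical RemoteEvent placed in ReplicatedStorage.
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local SyncEvent = ReplicatedStorage:WaitForChild("SyncEvent")

local clockOffsets = {} -- player -> (server tick - client tick)

SyncEvent.OnServerEvent:Connect(function(player, clientTick)
	-- First fire: store how far this client's clock is from the server's.
	clockOffsets[player] = tick() - clientTick
end)

-- On any later remote, convert a server tick into that player's "client tick".
local function toClientTick(player, serverTick)
	local offset = clockOffsets[player]
	if not offset then
		return nil -- no sync received from this player yet
	end
	return serverTick - offset
end

-- LocalScript (client): send the client's tick to the server once.
-- local SyncEvent = game:GetService("ReplicatedStorage"):WaitForChild("SyncEvent")
-- SyncEvent:FireServer(tick())
```

Keep in mind the measured offset also includes the one-way travel time of that first remote, so it's only approximate.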

:sob: :sob: :sob: I want to know what’s wrong with my code

I just tried doing it with the os.date() function and got the same errors

Ping is usually anywhere from 3 to 500 milliseconds, and time only gives whole seconds. You don't get enough accuracy from it, making it useless here.

I can easily convert to milliseconds, but I don't believe that's the issue

Convert to milliseconds? Where are you getting that data from?

I think I figured out why it's not working: the timezone difference would still affect os.time(), because it returns the number of seconds since the epoch, so this would vary across time zones. I figured out a solution though: I'm going to send over the os.date() result and then use the minutes and seconds to calculate the time difference


No, you can't, it's not that accurate. And I have to agree, I don't see the point of you doing this, because unless there is high latency it will fire in under a second anyway

Maybe you can tell me what you're doing and we can find a better way

Sorry, I have to say you guys are wrong, that doesn't make sense. Even if it fires in under a second, I can use that time to adjust

In order for a second to be converted to milliseconds it needs some decimal precision below 1 second. Remote events usually fire in under a second assuming low latency, and both of the os functions return the seconds as a whole number, with no decimals included.

I could be wrong about the API, but here’s my understanding of it:

Calling time will give you the time since the epoch, down to the second.
If you call it 10 times in one second, it will return the same value all 10 times. There will be no difference in time, according to that function. It does not track small measurements.
Then, you can call it again a single millisecond later, and it will look like an entire second has passed, because the value just ticked over to the next second.
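
If it helps, here's a quick throwaway snippet to see that resolution difference for yourself (just an illustration, not taken from your code):

```lua
-- os.time() only moves in whole seconds; tick() has fractional seconds.
local a = os.time()
local b = tick()

task.wait(0.1) -- wait a tenth of a second

print(os.time() - a) -- almost always 0, occasionally 1 if a second boundary was crossed
print(tick() - b)    -- roughly 0.1, because tick() returns decimals
```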

I am using tick(), you're the one using os. But either way, the way I am converting will produce decimals

But can you say what you're trying to use it for, so we can help better? Your description could be clearer

Is this of any use: https://github.com/Quenty/NevermoreEngine/blob/version2/Modules/Shared/Time/TimeSyncManager.lua

Seems to be related to the topic here?

I’d rather figure out why my code isn’t working just because I really want to know :triumph:

But…
I fire code from client to server to deal damage, so now I need to prevent exploiters from taking advantage of that…

My solution: on the server, I have a cooldown table and a damageallowedtime table. If the damage event is fired after damageallowedtime but before the cooldown, then it thinks it's an exploiter.
The issue is that the server and client are out of sync because of the time it takes for the remote to travel.

(I know the way I handle damage is dumb and this isn't perfect protection, but it does make it really hard to exploit, and there are other checks in place)
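
For what it's worth, here's a minimal sketch of a cooldown check that only ever looks at the server's own clock, so client/server drift never enters into it. The event name and the cooldown numbers are made up, so treat it as an illustration rather than your actual setup:

```lua
-- Server-only cooldown check. "DamageEvent", WINDUP and COOLDOWN are
-- hypothetical names/values, not from the original code.
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local DamageEvent = ReplicatedStorage:WaitForChild("DamageEvent")

local WINDUP = 0.25   -- seconds the attack takes before damage should be possible
local COOLDOWN = 1.0  -- minimum seconds between accepted attacks

local lastAccepted = {} -- player -> server tick() of their last accepted hit

DamageEvent.OnServerEvent:Connect(function(player, target)
	local now = tick() -- server clock only; the client's clock is never trusted
	local last = lastAccepted[player]

	if last and (now - last) < (WINDUP + COOLDOWN) then
		-- Fired too soon after the previous hit: treat as suspicious and ignore.
		return
	end

	lastAccepted[player] = now
	-- ...apply damage to `target` here, along with the other checks mentioned...
end)
```

Because every timestamp comes from the server's clock, there's nothing to sync; network jitter can still make legitimate hits arrive slightly closer together than they were sent, so leaving a small tolerance on the cooldown is common.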