For a long time now, Ro-Racing has been using a particular free-model timer. After countless uses of it, I’ve noticed a lot of trends. The timer records the Distributed Game Time when a car crosses the finish line. When that same car crosses the finish line again, it reads the Game Time a second time, subtracts the first reading from the second, and posts the difference to a timer board as the time it took to travel one lap.
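If I understand the timer correctly, its logic is roughly this (a hypothetical sketch in Roblox Lua; the handler name and table are my own, not from the actual free model):

```lua
-- Hypothetical sketch of the free-model timer's logic, not the actual script.
local lastCrossTime = {} -- [car] = DistributedGameTime at the last finish-line cross

local function onFinishLineTouched(car)
	local now = workspace.DistributedGameTime
	local last = lastCrossTime[car]
	if last then
		-- Second crossing minus first crossing = lap time.
		local lapTime = now - last
		print(("Lap time: %.3f seconds"):format(lapTime))
	end
	lastCrossTime[car] = now
end
```

Any rounding or coarse update rate in DistributedGameTime would show up directly in `lapTime` here, which is what the rest of this thread is about.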
In various instances, across tracks of various lengths and with various cars, the timer will often have two drivers run identical times down to the thousandth of a second. (.000)
Is this because Distributed Game Time is rounded in a certain way or is it because the scripts run at a specific time interval? Or is this created by some other issue entirely?
If you connect to certain signals under RunService such as RenderStepped, Heartbeat, and Stepped, they will run every frame. If you just yield using wait() with either no time specified or a time less than 1/30, your script will resume exactly every other frame (which is 1/30 s at 60 FPS, but longer if FPS is lower). If you do specify a time, such as wait(5), then the script resumes on the first such cycle after at least 5 seconds have passed (so it could have been slightly more than 5 seconds, too).
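A quick sketch of the difference (Roblox Lua; the prints are just for illustration):

```lua
local RunService = game:GetService("RunService")

-- Heartbeat fires every frame; dt is the real elapsed frame time,
-- ~0.016 s at 60 FPS.
RunService.Heartbeat:Connect(function(dt)
	print("frame time:", dt)
end)

-- By contrast, wait() only resumes on the legacy ~30 Hz scheduler,
-- so this loop body runs at most every ~0.033 s at 60 FPS:
while true do
	local elapsed = wait()
	print("wait() resumed after:", elapsed)
end
```

So any timer built on wait() can only resolve events to roughly 1/30 s buckets, while a Heartbeat-driven one gets per-frame resolution.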
Makes sense then why the ties we typically see are at times like .0333 and multiples of .0333
So, to clarify, the most precise timing without using RenderStepped, Heartbeat, or Stepped is 1/30th of a second? And from my understanding of what you said, using those in some way can get me down to 1/60th of a second?
You could switch to using tick(), as it’s more precise. If two cars finish in the same frame, the difference between their tick() values will be very small; basically, whichever car gets checked first is “first” according to this.
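The swap is minimal: stamp the crossing with tick() instead of Distributed Game Time (a sketch; the handler name is hypothetical):

```lua
-- Hypothetical sketch: same timer logic, but stamped with tick(),
-- which has sub-millisecond precision.
local lastTick = {} -- [car] = tick() at the last finish-line cross

local function onFinishLineTouched(car)
	local now = tick()
	if lastTick[car] then
		print(("Lap: %.6f s"):format(now - lastTick[car]))
	end
	lastTick[car] = now
end
```

Note that two cars detected in the same frame will still get nearly identical stamps, differing only by the microseconds between the two handler calls, which is the ordering effect described above.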
You could also check how far over the finish line the cars are, for that extra bit of accuracy.
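One way to do that (my own sketch, assuming the finish line lies on a known plane and using `AssemblyLinearVelocity` to get the car's speed): measure how far past the line the car already is when the touch is detected, divide by its speed, and back-date the crossing by that amount.

```lua
-- Hypothetical sketch: estimate the sub-frame crossing moment.
local FINISH_Z = 0 -- assumed: the finish line lies on the plane Z = 0

local function estimateCrossTime(car)
	local body = car.PrimaryPart
	local overshoot = body.Position.Z - FINISH_Z -- studs past the line
	local speed = body.AssemblyLinearVelocity.Z  -- studs per second
	local detected = tick()
	if speed > 0 then
		-- The car crossed roughly (overshoot / speed) seconds before
		-- the touch was detected, so back-date the stamp by that much.
		return detected - overshoot / speed
	end
	return detected
end
```

This gives sub-frame resolution even though detection itself only happens once per frame, which should break most of the .0333-multiple ties.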