When using wait() or task.wait() in a LocalScript, the timing depends on the player's frame rate, meaning someone with an FPS unlocker waits less time and someone on a first-generation iPhone waits more. I've looked everywhere on the devforum, and while some people have shared this same issue, I haven't seen a solid solution to it.
I’ve tried replacing my task.waits with this function that I found on the devforum, although it seems not to work.
local runservice = game:GetService("RunService")

local function betterWait(n)
	local dt = 0
	-- Accumulate the real time that passes between frames until n seconds elapse
	while dt < n do
		dt = dt + runservice.Heartbeat:Wait()
	end
	return dt
end
For an item the player has, there's a 10-second timer on it. While the item is in use, the timer ticks down until it reaches 0. There's no practical way for me to hand this off to the server.
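One way to make a countdown like that frame-rate-independent is to compare against a start timestamp every frame instead of subtracting a fixed step per tick. This is just a minimal sketch (the onTick/onExpired callbacks are hypothetical placeholders for your own UI/cleanup code); os.clock() measures real elapsed time regardless of frame rate:

local RunService = game:GetService("RunService")

local DURATION = 10 -- seconds

local function runCountdown(onTick, onExpired)
	local startTime = os.clock()
	local connection
	connection = RunService.Heartbeat:Connect(function()
		local remaining = DURATION - (os.clock() - startTime)
		if remaining <= 0 then
			connection:Disconnect()
			onExpired()
		else
			onTick(remaining) -- e.g. update the on-screen timer text
		end
	end)
end

With this, the frame rate only affects how often the display refreshes; the total duration is anchored to the clock, so an FPS unlocker can't shorten it.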
I mean… why wouldn't I want it to be precise? If it isn't possible, just tell me, but I'd swear it is, considering a lot of popular competitive games with guns and weapons presumably use LocalScripts. In this particular case, no, it doesn't need to be 100% precise, but the timing can differ greatly based on the framerate and it's annoying.
local runservice = game:GetService("RunService")

local function betterWait(waitTime)
	local start = tick()
	-- Resume once the real elapsed time reaches waitTime, checking each frame
	repeat
		runservice.RenderStepped:Wait()
	until tick() - start >= waitTime
end
It's ol' reliable to me. However, I don't think the timing changes THAT drastically with FPS unlockers. My entire community uses it in my hockey game, which is basically all task.wait(), and no one who unlocks their FPS has any advantage to my knowledge.
Unless their framerate is something crazy like 10 FPS (which would mean 0.1-second precision for task.wait and would also be completely unplayable), there will be no appreciable difference in the timing of the ability.