Recently I was using a PID controller that uses the RunService.Stepped delta time for its derivative term, and I noticed it's only stable at 60 FPS. After many hours I concluded that it actually gets more unstable as the frame rate goes up, so I thought, "why don't I make a while task.wait(1 / 60) loop, since it'll be forced to always update at 60 frames per second?" I also computed my own delta time inside this loop, but the results were different.
My question is: why is RunService.Stepped accurate while a while task.wait(1 / 60) loop is inaccurate, if the loop is technically doing the same thing as Stepped firing at 60 FPS?
RunService events pass the delta time to their callbacks;
not sure how to explain it better than that.
RunService.RenderStepped:Connect(function(deltatime)
print(deltatime)
end)
while true do
	local dt = task.wait(1 / 60) -- task.wait returns the time actually elapsed
	print(dt) -- usually slightly longer than 1/60
end
First, RunService.Stepped (superseded by RunService.PreSimulation) fires before every physics simulation step of a frame (not before every physics solver sub-step; those happen roughly four times per frame). If the game freezes or the frame rate changes, the time between firings changes too.
Second, you can't time things to exactly a sixtieth of a second, or any other interval for that matter. task.wait only guarantees a minimum wait; for all you know, the loop could resume seconds after the last iteration.
Instead, you should incorporate the DeltaTime into your calculations.
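To see this outside Roblox, here's a language-agnostic sketch in Python (using time.sleep in place of task.wait): the sleep only guarantees a minimum duration, so the delta you should feed into your math is the measured elapsed time, not the requested 1/60.

```python
import time

last = time.perf_counter()
total = 0.0
for _ in range(5):
    time.sleep(1 / 60)      # asks for 1/60 s, but only a minimum is guaranteed
    now = time.perf_counter()
    dt = now - last         # the delta time you should actually use
    last = now
    total += dt

# total comes out to at least roughly 5/60 s, and in practice somewhat more
```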
If you want to lerp by a factor of 0.1 every 1/60 of a second, multiply the factor by the delta time divided by the rate, and clamp the result at 1.
local function Lerp(a, b, t)
	return a + (b - a) * t
end

local Rate = 1 / 60 -- the interval the factor was tuned for
local Factor = 0.1

local Result = Lerp(
	Origin,
	Goal,
	math.min(Factor * DeltaTime / Rate, 1) -- scale the factor by how many Rate intervals elapsed
)
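To sanity-check the formula, here's a small Python simulation (a sketch, not Roblox code): stepping toward a goal for one simulated second at 60 FPS and at 30 FPS lands at nearly the same place once the factor is scaled by DeltaTime / Rate, whereas the unscaled factor drifts with the frame rate.

```python
def lerp(a, b, t):
    return a + (b - a) * t

def step(value, goal, factor, dt, rate=1/60):
    # scale the per-(1/60 s) factor by how many rate intervals elapsed
    t = min(factor * dt / rate, 1.0)
    return lerp(value, goal, t)

v60 = v30 = raw30 = 0.0
for _ in range(60):                   # one second at 60 FPS
    v60 = step(v60, 1.0, 0.1, 1/60)
for _ in range(30):                   # one second at 30 FPS
    v30 = step(v30, 1.0, 0.1, 1/30)
    raw30 = lerp(raw30, 1.0, 0.1)     # unscaled factor: frame-rate dependent

# v60 and v30 end up very close together; raw30 lags noticeably behind
```

Note that this linear scaling is only an approximation of true frame-rate independence; it stays close for small factors like 0.1.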