Weird os.clock() halting behavior on mouse clicking

Ok, this issue is rather specific and complicated. But here is an overview:

My shooter game uses a custom wait function for accurate yielding (such as weapon fire-rate pauses).

local function yield(duration, stopCondition)
	local frame = 0
	local frameToWait
	local elapsed = 0
	local t = os.clock()
	warn("======start", t)

	repeat
		local _, dt = game:GetService("RunService").Stepped:Wait()
		elapsed += dt
		frame += 1

		-- roundNumber is a custom helper that rounds to the given number of decimal places
		frameToWait = roundNumber(duration / (elapsed / frame), 0)

		warn(os.clock() - t)
	until frame >= frameToWait
		or elapsed >= duration
		or (stopCondition and stopCondition())

	warn("yield frame waited", frame)
	print("total elapsed", elapsed)
	print(os.clock() - t)

	return elapsed - duration
end

E.g. when yield(0.05) is called, the function yields for ~3 frames (at 60 FPS) and then returns, having waited ~0.05 seconds.

Here is the output of yield(0.05). The highlighted part is os.clock() - t calculated per frame, where t is the initial os.clock() value recorded. For example, the first highlighted line shows that ~0.016 seconds elapsed after the first frame, then 0.034 after the second, and 0.049 after the last.
image

The Problem:

However, if I run yield(0.05) inside a UserInputService mouse click listener (which is how it is actually used in my game), os.clock() - t reports a significantly lower amount of time than expected.

game:GetService("UserInputService").InputBegan:Connect(function(inp)
	if inp.UserInputType == Enum.UserInputType.MouseButton1 then
		yield(0.05)
	end
end)

image

And the final time calculated by os.clock() - t after waiting for 0.05 seconds is 0.034 (shown in the last line), which is inaccurate. Using tick() or time() results in the same behavior.

Why is this important to me?

The deltaTime returned by RunService.Stepped seems to be unaffected. However, I am currently trying to fire a remote 3 times, once every 0.05 seconds (e.g. firing a weapon). The server keeps track of and verifies the timing between these remotes for security purposes.

Here is a client clicking their mouse, then firing the remote 3 times with an interval of 0.038 seconds between each shot.

game:GetService("UserInputService").InputBegan:Connect(function(inp)
	if inp.UserInputType == Enum.UserInputType.MouseButton1 then
		for i = 1,3 do
			game.ReplicatedStorage.RemoteEvent:FireServer()
			yield(0.038)
		end
	end
end)

And here is the output on the server. Each number represents the delta time between consecutive remote receipts. Due to server tick rates, I assume the smallest interval the server can detect is ~0.033 seconds, which is good enough for me.
However, as seen in the output below, due to the “mouse click halting behavior”, the second remote is received only 0.015 seconds after the first one.

image
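The server-side timing check described above can be sketched in plain Lua. This is a hypothetical illustration, not the post's actual server code: `MIN_INTERVAL` and the per-player bookkeeping are assumptions, and in Roblox this logic would live inside a RemoteEvent.OnServerEvent handler with os.clock() timestamps.

```lua
-- Sketch of a server-side shot-interval check (assumed design).
-- MIN_INTERVAL is roughly one server tick, matching the ~0.033s from the post.
local MIN_INTERVAL = 0.033

local lastReceived = {}

-- Returns true if the shot timing from this player id is acceptable.
local function validateShot(playerId, now)
	local last = lastReceived[playerId]
	lastReceived[playerId] = now
	if last == nil then
		return true -- first shot, nothing to compare against
	end
	return (now - last) >= MIN_INTERVAL
end
```

With this check, a burst arriving at 0.05s intervals passes, while the 0.015s gap caused by the click behavior would be rejected, which is exactly the false positive the post is worried about.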

This inconsistent result has caused me problems with the security measures I am trying to implement, considering the value is ~50% off.

This is not a networking issue: when I ran a few tests without any mouse click input, I was able to produce consistent results.

image

The only workaround I can find is to add a task.wait() before yielding for 0.05 seconds. But this defeats the goal of having a precise waiting function, and introducing such input delay is not feasible in a shooter game.

Are there any other solutions to, or explanations of, this problem?


You might be encountering queuing across the network.
If you must keep your current method, consider trying UnreliableRemoteEvents.

However, I would advise sending only one network request with the number of shots as an argument.
Handle the yielding process solely on the server and verify the number of shots there.
If you depend on very specific numbers, handling them across the network is usually a bad idea: any network latency will make your calculations inconsistent or incorrect.
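The one-remote approach suggested above can be sketched in plain Lua. This is a hypothetical outline, not code from the thread: `SHOT_INTERVAL`, `handleBurst`, and the injected `fireShot`/`wait` arguments are all assumptions. In Roblox, `fireShot` would be the server's damage routine and `wait` would be task.wait.

```lua
-- Sketch of server-paced burst fire (assumed design): the client sends the
-- shot count once, and the server paces the shots itself instead of trusting
-- per-shot timing from the client. Dependencies are injected so the pacing
-- logic stays testable outside Roblox.
local SHOT_INTERVAL = 0.05 -- assumed fire-rate, matching the post

local function handleBurst(shotCount, fireShot, wait)
	for i = 1, shotCount do
		fireShot(i)
		if i < shotCount then
			wait(SHOT_INTERVAL) -- server-side pacing, unaffected by client clocks
		end
	end
end

-- Demo with stubbed dependencies: record which shots fired.
local fired = {}
handleBurst(3, function(i) fired[#fired + 1] = i end, function() end)
```

Because the interval check no longer depends on when client remotes arrive, the mouse-click timing quirk stops affecting the server's verification entirely.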

Instead of trying to yield, you can just check whether enough time has passed before the weapon is allowed to fire again. I use that method all the time because it is the easiest and most performant. For example:

local COOLDOWN = 3 -- Minimum delay in seconds
local lastActivated = 0

tool.Activated:Connect(function()
    local currentTime = os.clock()
    if currentTime - lastActivated < COOLDOWN then
        return
    else
        lastActivated = currentTime
    end

    doSomething()
end)

It’s basically a lazy-evaluated version of your approach.

Also, if you’re sending remote events with that small of a delay, there’s practically no reason for them to be separate. Just simulate it on the server with one remote; that will use a lot less bandwidth than the former approach.


This issue seems to exist even without networking involved: as shown in the output image above, the first os.clock() - t resulted in an extremely small value on the client.
image

I have found a temporary workaround that produces behavior close to what I expect, without altering the measured values.
I calculate the elapsed time using os.clock() directly, instead of accumulating the delta time returned by RunService.Stepped:Wait():

repeat
	local _,dt = RunService.Stepped:Wait()
	-- Not using dt anymore
	--elapsed += dt
	elapsed = os.clock() - t
	frame += 1

Previously, when calculating elapsed from the dt returned by .Stepped, calling yield(0.04) at 60 FPS would yield for 3 frames, with elapsed incremented by ~0.016s each frame. When user input was involved (the “bug”), running os.clock() - t after the function completed resulted in a significantly smaller value than 0.04.

Now, using os.clock() - t to calculate elapsed, it yields for 4 frames instead, because os.clock() - t returns an extremely small value (shown above) for the first frame. Running os.clock() - t after the function completes now returns a value close to 0.04.
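The 3-frame vs 4-frame difference can be checked with simple arithmetic in plain Lua. This is an illustrative sketch: `framesNeeded` mimics only the accumulation in the repeat loop (frame-count prediction and stopCondition omitted), and the 0.001s first delta is an assumed stand-in for the tiny post-click value.

```lua
-- Accumulate per-frame deltas until a 0.04s target is met, mimicking the
-- repeat loop's elapsed check.
local TARGET = 0.04

local function framesNeeded(deltas)
	local elapsed, frames = 0, 0
	for _, dt in ipairs(deltas) do
		elapsed = elapsed + dt
		frames = frames + 1
		if elapsed >= TARGET then
			break
		end
	end
	return frames
end

-- dt-based accumulation: three normal ~1/60s frames reach 0.04s.
local dtBased = framesNeeded({ 1 / 60, 1 / 60, 1 / 60, 1 / 60 })

-- clock-based accumulation: the first wall-clock delta after a click is
-- tiny (the "bug"), so a fourth frame is needed to cross 0.04s.
local clockBased = framesNeeded({ 0.001, 1 / 60, 1 / 60, 1 / 60 })
```

This matches the described behavior: 0.016 + 0.016 + 0.016 ≈ 0.05 crosses the target in 3 frames, while 0.001 + 0.016 + 0.016 ≈ 0.034 does not, forcing a fourth frame.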

This is such a weird problem, and it only happens when there is user input. I would like to hear a clear explanation of this someday.