Timer works in studio but not in-game

Exactly as the title says.
In studio it works perfectly fine, but in-game it’s all over the place. Here’s some of the code:
Thank you in advance.

(In Studio it’s counting up as it’s supposed to; in-game it just shows a dozen negative numbers.)

Server:

plrz.TrialTimes:FindFirstChild(v.Name).Value = os.clock()
plrz.Time.activeTimeTrial.activeColour.Value = v.Name

print("------------ OBBY TEST : VERSION 0.4 // 04:16 ------------")
print(plrz.TrialTimes:FindFirstChild(v.Name).Value, " -- server time it's adding")

wait(1)

print(v.Name)

local PickedColour = v.Name
print(PickedColour)
game.ReplicatedStorage.en.StartTime:FireClient(plrz, PickedColour)

Client:

game.ReplicatedStorage.en.StartTime.OnClientEvent:Connect(function(PickedColour)
	print(PickedColour)

	local grabColour = lplr.TrialTimes:FindFirstChild(PickedColour):GetChildren()[1].Value -- get best time
	print(grabColour, "this is grab colour!")

	if grabColour ~= "nil" then
		script.Parent.Parent.BestTimeFrame.TextLabel.Text = grabColour
	end

	repeat
		wait(0.1)

		local ftime1 = lplr.TrialTimes:FindFirstChild(PickedColour).Value -- start time stored by the server

		print(ftime1)
		print(os.clock())
		print(ftime1 - os.clock(), " -- ftime1 - os.clock() is this one")
		print(os.clock() - ftime1, " -- os.clock() - ftime1 is this one")

		local ftime = tonumber(os.clock()) - tonumber(ftime1)
		local firstnumber = string.split(ftime, ".")
		local secondNumber = string.split(firstnumber[2], "")

		print(ftime, " -- ftime this one is")
		print(ftime1, " -- ftimeONE this one is, I promise !!")

		if lplr.Time.activeTimeTrial.Value == true then
			timerz.Text = firstnumber[1] .. "." .. secondNumber[1] .. secondNumber[2]

			if grabColour == "nil" then
				script.Parent.Parent.BestTimeFrame.TextLabel.Text = firstnumber[1] .. "." .. secondNumber[1] .. secondNumber[2]
			end
		end
	until lplr.Time.activeTimeTrial.Value == false

	timerz.Text = lplr.TrialTimes:FindFirstChild(lplr.Time.activeTimeTrial.activeColour.Value).Value
end)

This could be because of the remote event. When testing in Studio, your computer acts as both the server and the client, so latency between them is nearly 0 ms. In a live game, however, your computer is only the client and the server is a Roblox server, so latency varies with your connection and the server’s location. If the debug console in the Roblox app doesn’t show any errors, I’d recommend adding a print after every if statement so you can figure out at which line it breaks down. Printing the variables you’re using also helps with debugging.
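For example, a few throwaway prints like these (reusing the plrz and v variables from your server snippet; the checkpoint labels are just placeholders) make it obvious where things stop behaving:

print("checkpoint 1: picked colour is", v.Name)

local trialValue = plrz.TrialTimes:FindFirstChild(v.Name)
if trialValue then
	print("checkpoint 2: found the TrialTimes value, it currently holds", trialValue.Value)
else
	print("checkpoint 2: no TrialTimes value named", v.Name)
end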

What I believe is causing this.

This can also be due to what os.clock() does. os Dev Docs

“Returns the amount of CPU time used by Lua in seconds. This value has high precision, about 1 microsecond, and is intended for use in benchmarking.”

Because Studio testing makes your computer both the client and the server, os.clock() returns values in the same range on both. In a live game, however, only the client runs on your computer. For me, os.clock() is around the 4k range on both client and server when testing in Studio, but in a live game my client’s os.clock() stays around 4k while the Roblox server’s is in the 400k range. You may not have accounted for this vast difference between your computer and the Roblox server, assuming the server-side value wouldn’t change that drastically. Whatever your values are, you might have to adjust them or rewrite the script with this in mind.
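If you want to see the difference yourself, here’s a minimal sketch. It assumes a RemoteEvent named “ClockCheck” in ReplicatedStorage (not something in your game): the server sends its own os.clock() to the client, and the client compares it with its own.

Server:

-- Script: send the server's os.clock() to each player who joins
local Players = game:GetService("Players")
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local clockCheck = ReplicatedStorage:WaitForChild("ClockCheck") -- hypothetical RemoteEvent

Players.PlayerAdded:Connect(function(player)
	task.wait(5) -- give the client's LocalScript a moment to connect
	clockCheck:FireClient(player, os.clock()) -- CPU time of the server process
end)

Client:

-- LocalScript: compare the server's os.clock() with the client's
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local clockCheck = ReplicatedStorage:WaitForChild("ClockCheck")

clockCheck.OnClientEvent:Connect(function(serverClock)
	print("server os.clock():", serverClock)
	print("client os.clock():", os.clock())
	print("difference:", os.clock() - serverClock) -- ~0 in Studio, a huge (usually negative) number in a live game
end)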

Examples of the difference in os.clock() between Studio testing and a live game:

[Screenshots: os.clock() printed during Studio testing, on the client in a live game, and on the server in a live game]

Ahh, I see.
I need to track milliseconds as well, which was the original reason I opted not to use os.time().
Is there any other way I’ll be able to achieve what I need?
I don’t think I’ll be able to fix it using os.clock(); I’ve tried everything around it and still can’t figure it out, sadly.

Thank you for your reply! I appreciate it.

Using os.time() instead of os.clock() will work.
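For instance, sticking with your TrialTimes value (a sketch, not your exact code): both machines agree on the Unix epoch, so an elapsed time computed from os.time() is meaningful across client and server, unlike os.clock().

-- Server: store the start of the trial as epoch seconds instead of CPU time
plrz.TrialTimes:FindFirstChild(v.Name).Value = os.time()

-- Client: both client and server count seconds from the same epoch
local elapsed = os.time() - lplr.TrialTimes:FindFirstChild(PickedColour).Value
print(elapsed .. " seconds since the trial started (whole seconds only)")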

Oh wait I might’ve worded it wrong, I need milliseconds as well

Then use os.date(); os.clock() isn’t for recording time & date.

The following is from the os library documentation page:

os.clock()
Returns the amount of CPU time used by Lua in seconds. This value has high precision, about 1 microsecond, and is intended for use in benchmarking.

I don’t think os.date() can get milliseconds either, only seconds, or am I mistaken about that?

Multiply the result by 1,000 to get milliseconds.

I’m a little bit confused by that,
Would you mind showing a little example there? Thank you

Are you attempting to determine the mechanical speed of the server vs the client? If so then keep reading. If you’re trying to determine the ms/latency of the client, also keep reading.

Determining Mechanical Speed of Client

You can do this by using runService.RenderStepped:Wait().
Since RenderStepped is dependent on FPS, it allows you to determine the speed of the user’s Roblox application. This is very useful for adjusting values to allow an equal user experience between slower computers and people using an FPS unlocker.

An example of this is determining the client FPS.

local runService = game:GetService("RunService")

local fps = 1 / runService.RenderStepped:Wait() -- seconds since the last frame, inverted to get FPS

Determining Mechanical Speed of Server

People often believe this is fixed at 1/60. However, it is tied to the server FPS, which Roblox aims to keep at 60. Servers are just powerful computers, though, and they too have speed fluctuations. Since RenderStepped can’t be used server-side, you would use runService.Stepped:Wait(). However, Stepped returns two values, which can be seen in its documentation. Documentation of Stepped

So it is slightly different. An example of determining server FPS:

local runService = game:GetService("RunService")

local runningDuration, frameTime = runService.Stepped:Wait() -- total run time, then the last frame's delta time
local fps = 1 / frameTime


Determining Latency Of Player

However, if you’re trying to determine the latency of the player, Roblox attempts to hide this for privacy reasons.

Latency is the time it takes the client and server to communicate.

Therefore we can take advantage of the fact that remote functions travel client-server-client or server-client-server: code written after invoking a remote function on the server will not execute until the server returns something after catching the event. Since this takes two trips between client and server, we can measure the time with tick() and divide it by two. Finally, we multiply by 1000 to get milliseconds, since tick() is measured in seconds (with fractional precision) and 1 second = 1000 milliseconds.

Script for determining ms on the local side (LocalScript):

local LatencyRemote = remoteFunctionLocation:WaitForChild("Latency") -- Change to the correct location.

local time = tick() -- Record the time before invoking
LatencyRemote:InvokeServer() -- Invoke the server
local ping = ((tick() - time) / 2) * 1000 -- Latency in ms; this line only executes once the server's return travels back to the client, i.e. two trips since time was taken.

Script for determining ms on the server side (Script):

local LatencyRemote = remoteFunctionLocation:WaitForChild("Latency")
LatencyRemote.OnServerInvoke = function(player) -- Fires when the client asks for its latency
    return true -- Return quickly so running extra code doesn't add a slight bit of extra latency to the measurement
end

Keep in mind an exploiter can lie about latency measured on the client. So if this is used for something that affects other players, I recommend measuring latency server-side by flipping the scripts and replacing “InvokeServer” and “OnServerInvoke” with “InvokeClient” and “OnClientInvoke”.
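As a rough sketch of that flipped setup (reusing the remoteFunctionLocation placeholder and the same “Latency” RemoteFunction from above; the wait and the pcall are just precautions, since InvokeClient yields forever if the client never binds OnClientInvoke and errors if the player leaves mid-call):

Server:

local Players = game:GetService("Players")
local LatencyRemote = remoteFunctionLocation:WaitForChild("Latency") -- Change to the correct location.

Players.PlayerAdded:Connect(function(player)
	task.wait(5) -- give the LocalScript time to bind OnClientInvoke
	local startTime = tick() -- record time before invoking
	local ok = pcall(function()
		LatencyRemote:InvokeClient(player) -- yields until the client returns
	end)
	if ok then
		local ping = ((tick() - startTime) / 2) * 1000 -- half the round trip, in milliseconds
		print(player.Name .. " latency: " .. ping .. " ms")
	end
end)

Client:

local LatencyRemote = remoteFunctionLocation:WaitForChild("Latency") -- Change to the correct location.
LatencyRemote.OnClientInvoke = function()
	return true -- return immediately so extra work doesn't inflate the measurement
end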

local time = os.time() * 1000

There you go.

os.time() is measured in seconds. os.time() * 1000 would just return the time in kiloseconds, which is still 10^3 of measurability away from milliseconds.

local startTime = os.time() * 1000
local serverTime
while task.wait() do
	local currTime = os.time() * 1000
	serverTime = currTime - startTime
	print(serverTime, "milliseconds have elapsed since the server started!")
end

I am mistaken, I got my division and multiplication mixed up. However, the documentation shows that os.time() returns seconds, not milliseconds, nor does it measure milliseconds: it returns a whole number. Say 0.5 seconds (500 milliseconds) have passed since it returned a value of “1635923817”; it will still return “1635923817” until a full second has passed.
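A quick way to see that for yourself, nothing game-specific:

local first = os.time()      -- whole seconds since the Unix epoch
task.wait(0.5)               -- half a second later...
print(first, os.time())      -- ...usually prints the same whole number twice
print(os.time() * 1000)      -- still only changes in steps of 1000, so no extra precision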

The poster is clearly trying to get millisecond resolution for accurate measuring. Converting whole seconds to milliseconds is as pointless as pouring water from a smaller cup into a bigger cup in the hope of measuring volume more precisely: the tool you measure with stays the same. Otherwise he would easily know he can convert it himself. Since he is comparing server to client, millisecond resolution most likely matters, especially in a race. He wants the time as “1.1s”, not “1000ms”; os.time() would drop that 0.1 s.

Back to the documentation for you. Here you go: os.time documentation. Please read it; it doesn’t measure milliseconds.

I just copied & pasted os.time() * 1000 as a solution from another thread (it’s still technically valid, albeit redundant). If you really need to record milliseconds, then use the following:

DateTime.now() will provide time & date with precision down to the exact millisecond.
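For example, a minimal timing sketch with DateTime (UnixTimestampMillis is an integer count of milliseconds since the Unix epoch):

local startMs = DateTime.now().UnixTimestampMillis

task.wait(1.234)

local elapsedMs = DateTime.now().UnixTimestampMillis - startMs
print(elapsedMs .. " ms elapsed")                        -- e.g. roughly 1234
print(string.format("%.2f s elapsed", elapsedMs / 1000)) -- e.g. "1.23 s elapsed"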

os.time() returns seconds, not milliseconds


Who said it returns milliseconds?

Then why do you multiply os.time() by 1000?

Why not? I like mathematics. Don’t you?

I like math, but multiplying os.time() by 1000 might be useless.

I still don’t understand what you wanted to say.