It shows that os.clock has a smaller value than tick(), but then 5 seconds later os.clock has a higher value than tick()? How is this possible? Is it something with how fast the electrons are?? lol
-- server
local run = game:GetService("RunService") -- "run" was undefined in the original snippet

local time_start = os.clock()
local servertime
local time_start2 = tick()
local servertime2

run.Stepped:Connect(function(dt)
	servertime = os.clock() - time_start
	servertime2 = tick() - time_start2
end)

spawn(function()
	while wait(5) do
		print(string.format("%.10f", servertime) .. " os.clock")
		print(string.format("%.10f", servertime2) .. " tick()")
		print()
	end
end)
What you’re seeing is the main practical difference between tick() and os.clock() (beyond the fact that tick() measures seconds since the Unix epoch while os.clock() measures CPU time used by the Lua VM, in seconds). You should expect variable results from these because of their implementations: os.clock() is designed for high-resolution timing and aims for accuracy within about 1 microsecond (one millionth of a second), whereas tick() can be off by up to a whole second, which is one reason you should avoid tick() in favor of the os library. tick() is also on track for eventual deprecation as Luau improves.
"If you need a UNIX timestamp, you should use os.time(). You get a stable baseline (from 1970’s) and 1s resolution. If you need to measure performance, you should use os.clock(). You don’t get a stable baseline, but you get ~1us resolution. If you need to do anything else, you should probably use time()" - zeuxcg
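Following that advice, here is a minimal sketch of when to reach for each function. This is plain Luau with no Roblox services assumed; the loop body is just a placeholder workload for the benchmark:

```lua
-- os.time(): stable baseline (Unix epoch), 1-second resolution.
-- Good for recording *when* something happened, e.g. a save timestamp.
local savedAt = os.time()
print("saved at unix time " .. savedAt)

-- os.clock(): no meaningful baseline, roughly microsecond resolution.
-- Good for benchmarking: only the difference between two calls matters.
local started = os.clock()
local sum = 0
for i = 1, 1e6 do
	sum = sum + i
end
local elapsed = os.clock() - started
print(string.format("loop took %.6f seconds", elapsed))
```

Note that the absolute value of os.clock() is meaningless on its own, which is exactly why comparing it directly against tick() (as in the snippet above) produces inconsistent orderings.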
The reason you see variable results between the two is these resolution differences, plus the fact that the two clocks are sampled at slightly different instants with different baselines, so you are never going to get exactly matching measurements this way.