What is the minimum delay at which I can continuously fire a RemoteEvent from client to server without causing performance degradation?

Hello

I am designing an automatic weapon with client-sided hitscan, which can fire many bullets per second, and I want to spawn tracers and bullet holes for each shot. Normally I could just fire a RemoteEvent for each shot, but I am concerned about the performance cost of doing that.

So I came up with a system where the shot data is stored in a table and the RemoteEvent is fired on a fixed, very short interval; each time the interval elapses, all of the stored data is sent to the server at once. But I can't seem to find the right length for that interval, which is why I came here. What value should I choose? I would appreciate any suggestions.
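For reference, here is a minimal sketch of the batching approach I mean, assuming a RemoteEvent named `FireBatch` in ReplicatedStorage; the shot table layout and the 0.1 s interval are just placeholders:

```lua
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local RunService = game:GetService("RunService")

-- Hypothetical RemoteEvent; create one named "FireBatch" in ReplicatedStorage.
local fireBatch = ReplicatedStorage:WaitForChild("FireBatch")

local BATCH_INTERVAL = 0.1 -- seconds between flushes (10 sends/s); tune this

local pendingShots = {}
local timeSinceFlush = 0

-- Called once per shot by the firing code; the shot fields are placeholders.
local function queueShot(origin: Vector3, direction: Vector3, hitPosition: Vector3)
	table.insert(pendingShots, {
		origin = origin,
		direction = direction,
		hitPosition = hitPosition,
		timestamp = workspace:GetServerTimeNow(),
	})
end

-- Flush the queue once per interval instead of once per shot.
RunService.Heartbeat:Connect(function(dt)
	timeSinceFlush += dt
	if timeSinceFlush >= BATCH_INTERVAL then
		timeSinceFlush = 0
		if #pendingShots > 0 then
			fireBatch:FireServer(pendingShots)
			pendingShots = {}
		end
	end
end)
```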

Look into using an UnreliableRemoteEvent; it describes the exact behavior you want.
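For cosmetic things like tracers, a dropped packet just means one missing tracer, so per-shot fire-and-forget becomes viable again. A minimal sketch (the event name is hypothetical; an UnreliableRemoteEvent shares the RemoteEvent API, but delivery is neither guaranteed nor ordered, and payloads are size-capped at roughly 900 bytes if I recall the docs correctly):

```lua
-- Assumes an UnreliableRemoteEvent named "TracerEvent" in ReplicatedStorage.
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local tracerEvent = ReplicatedStorage:WaitForChild("TracerEvent")

-- Client: fire-and-forget per shot; a dropped packet only costs
-- one cosmetic tracer, never gameplay state.
local function reportShot(origin: Vector3, direction: Vector3)
	tracerEvent:FireServer(origin, direction)
end
```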

I'm not sure about that one; a co-worker of mine who is an experienced dev didn't recommend it, since I want every event to be received, not just the most recent ones.

Depends on the size of your dataset.
RemoteEvents are always delivered in the order they were fired, and received in that same order (ordering is guaranteed).
Sending more than, say, ~15-20x a second is likely unnecessary, but I believe the cap is 60/s.

At any rate, this could lead to strange bursting effects if the shots aren't queued up properly after replicating, so make sure you handle that as well.
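Something like this on the server, a rough sketch assuming the batching setup above (the `FireBatch` event and `spawnTracer` are hypothetical names), would replay each batch spread across the interval instead of all at once:

```lua
local ReplicatedStorage = game:GetService("ReplicatedStorage")

-- Same hypothetical RemoteEvent as on the client.
local fireBatch = ReplicatedStorage:WaitForChild("FireBatch")

local BATCH_INTERVAL = 0.1 -- should match the client's flush interval

-- Placeholder: validate the shot server-side, then replicate the
-- tracer / bullet hole to the other clients.
local function spawnTracer(player: Player, shot)
end

fireBatch.OnServerEvent:Connect(function(player, shots)
	if typeof(shots) ~= "table" then
		return -- reject malformed payloads
	end
	-- Space the shots out over the interval to avoid a visual burst.
	local spacing = BATCH_INTERVAL / math.max(#shots, 1)
	task.spawn(function()
		for _, shot in ipairs(shots) do
			spawnTracer(player, shot)
			task.wait(spacing)
		end
	end)
end)
```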

Not sure if this helps, as I've not tested it, but I read somewhere that there is also a network data limit of about 50 KB/s.
I also believe the cap of 60/s applies across all remotes combined.