Sounds on server not syncing with clients

This one’s a little hard to explain. Basically, I’m working on a server-side TV streaming system that does everything on the server, sending asset IDs to a value object that’s read by all TVs on the server. However, it seems that online (it works fine in solo), the sounds start playing for a client as soon as they join the game, regardless of when the server actually called the Play method on them. I’m not sure if the devs have considered making audio play from wherever it’s actually at when a player joins the game, rather than starting from the beginning.
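For reference, the setup is roughly along these lines (a minimal sketch only; the names and hierarchy are placeholders, not my actual code):

```lua
-- Server Script: rough sketch of the setup (names/hierarchy are placeholders)
local tv = workspace.TV
local assetValue = tv.AssetId        -- IntValue that every client's TV reads
local speaker = tv.Speaker           -- Sound instance the server plays

local ASSET_ID = 160250533

assetValue.Value = ASSET_ID
speaker.SoundId = "rbxassetid://" .. ASSET_ID
speaker:Play()   -- replicates to every client, but late joiners hear it from 0:00
```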

Below are the two examples (joined into one video) of this occurring. Take note of when the IntValue changes from 0 to 160250533 in each test. You’ll also notice that the sound fails to stop as it should in the second test (it should stop at the end of the music).

Video: https://www.youtube.com/watch?v=CQbQPGbJFQg


Hmm, I would suggest making the TV a local object by putting it in workspace.CurrentCamera.

It should avoid these latency issues. Give it a shot.
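Something along these lines (just a sketch; the model name is a placeholder):

```lua
-- LocalScript: sketch of the suggestion above (model name is a placeholder)
local camera = workspace.CurrentCamera
local tv = workspace:WaitForChild("TV")

-- Reparenting from a LocalScript keeps the change on this client only,
-- and CurrentCamera's contents aren't replicated to other players.
tv.Parent = camera
```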

I’m certain that it has nothing to do with latency. I don’t believe sounds were ever made to work in such a way (where, if you join whilst one is playing, it plays from that point).

These TVs have to be synced up with one another, for every player in the game, for my game idea to work. Putting them in the player’s camera wouldn’t solve this.

You’re really stuck then.
Sounds play independently on each client. When a client joins a game, the sound obviously starts from the beginning.

Which is a problem that should be fixed.

Perhaps a read-only Sound.TimeStarted property that’s maintained internally? Sound.SecondsPlayed or Sound.Position would also work.
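Purely for illustration, a client could then do something like this (Sound.TimeStarted doesn’t actually exist; this is hypothetical and assumes the timestamp would be shared between server and clients):

```lua
-- LocalScript: hypothetical use of the proposed read-only property
local speaker = workspace.TV.Speaker      -- placeholder hierarchy

if speaker.IsPlaying then
    -- TimeStarted is the proposed (non-existent) property: when Play was called
    local elapsed = tick() - speaker.TimeStarted
    speaker.TimePosition = elapsed        -- seek to roughly where everyone else is
end
```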

Yeah, something like that would be useful to be able to read. I think more robust sound functionality would solve the issue, as we’re fairly limited in what we can do currently.
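In the meantime, one rough workaround under the current limits is to have the server stamp when it started the track and have late joiners seek locally with Sound.TimePosition. This is only a sketch: the names are placeholders, and it assumes server and client clocks agree to within a second or so via os.time().

```lua
-- Server Script: stamp the start time at the moment Play is called (placeholders throughout)
local tv = workspace.TV
tv.AssetId.Value = 160250533
tv.StartTime.Value = os.time()        -- NumberValue: UTC seconds when playback began
tv.Speaker:Play()
```

```lua
-- LocalScript: on join, seek the local copy to roughly where the server is
local tv = workspace:WaitForChild("TV")
local speaker = tv:WaitForChild("Speaker")
local startTime = tv:WaitForChild("StartTime")

if startTime.Value > 0 then
    local elapsed = os.time() - startTime.Value
    if elapsed > 0 and elapsed < speaker.TimeLength then
        speaker.TimePosition = elapsed    -- a local-only seek; other clients are unaffected
    end
end
```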

Wondering if this should stay here or be moved to the feature requests forum.

If a dev could clarify, it would be helpful.
