Allow Us to Continuously Sync TimePosition of 2+ Sound Objects

As a Roblox developer, it is currently too hard to reliably keep two Sound objects synchronized during playback.

I would like an engine feature that keeps 2 or more Sound objects in sync, so that they never drift apart or stutter during playback.

In some of my projects, I need one audio track to match another track’s playback position in real time. For example, I might have a primary music track and a secondary track (like drums or another instrument layer) that needs to stay perfectly aligned with it.

In the past, I did this by repeatedly setting the secondary track’s TimePosition to match the primary track, which worked perfectly. But early last year, this stopped working. The secondary track now stutters or glitches when updated frequently, even in older projects.

Here’s a video example. I have this cool song with two tracks, the music track and the lone drum track, playing at the same time. Around the 10-second mark, my drum track (Track2) starts glitching out. This is because there’s no reliable way to keep the two tracks synced, and the only technique I knew that used to work was continuously setting the TimePosition:

I would like to suggest a reliable way to keep two Sound objects perfectly in sync, whatever solution that may entail, without timing drift or audible glitches.

Some use cases include:

  • Music with multiple layers (drums, bass, vocals, etc.) that can fade in or out depending on gameplay intensity
  • Live concerts where the audience can interact by influencing individual instruments or performances
  • Greater precision for rhythm games, as well as dynamic variation for missed/hit notes
  • Dynamically switching or blending between alternate versions of a track

Layered music and interactive soundtracks are common in modern games, but right now they are too difficult to implement reliably on Roblox without a solid way to keep audio tracks in sync.

If this issue is addressed, it would improve my development experience because it would fix a frustrating audio glitch and unlock many possibilities for richer, more immersive sound design in Roblox experiences.


Hey @marbleycake37 – we know that synchronization is a pain point. The good news is that we are working on a new API to address this. Keep an eye on the release notes in the coming weeks – we should be able to share more soon.

With respect to your current workaround:

In the past, I used to do this by repeatedly setting the secondary track’s TimePosition to match the primary track, which worked perfectly. But earlier last year, this stopped working. The secondary track now stutters or glitches when updated frequently, even in older projects.

The engine has a few different strategies for loading audio files –

  1. For very short files, we just decompress the whole thing into RAM
  2. For medium-sized files, we keep the compressed representation in RAM, but stream/decompress asynchronously
  3. For large files, we write them to disk and stream from there

How fast setting TimePosition is depends™ on which strategy was used. For category 1, it’s pretty much instant – but 2 & 3 have to “re-buffer” around the new position before playback can begin.
Over the years, we’ve adjusted the thresholds between loading strategies, trying to strike a balance between memory usage and responsiveness.

So, instead of setting the TimePosition, a safer workaround would be to “lasso” the PlaybackSpeed.

Something like:

local threshold = 0.001 -- you might need to tweak these threshold & adjustment values
local speedAdjustment = 0.01
while music.Playing do
    if drums.TimePosition < music.TimePosition - threshold then
        -- drums are behind the music, speed up
        drums.PlaybackSpeed = 1 + speedAdjustment
    elseif drums.TimePosition > music.TimePosition + threshold then
        -- drums are ahead of the music, slow down
        drums.PlaybackSpeed = 1 - speedAdjustment
    else
        drums.PlaybackSpeed = 1
    end
    task.wait() -- yield each frame so the loop doesn't block the thread
end

This avoids the slow rebuffering, but it changes the pitch of the adjusted track – in your example, the drums are probably safe to do this with, since they are mostly atonal.
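For a slightly smoother variant of the same idea, the speed correction can be made proportional to the measured drift, so the pitch deviation shrinks as the tracks converge instead of toggling between fixed values. This is just an illustrative sketch, not an official API; `music` and `drums` are assumed to be Sound objects that are already playing, and the GAIN and MAX_ADJUSTMENT values are guesses to tune:

```lua
local RunService = game:GetService("RunService")

local MAX_ADJUSTMENT = 0.02 -- cap on how far PlaybackSpeed may stray from 1
local GAIN = 0.5            -- how aggressively measured drift is corrected

local connection
connection = RunService.Heartbeat:Connect(function()
    if not music.Playing then
        connection:Disconnect()
        drums.PlaybackSpeed = 1
        return
    end
    -- positive drift => drums are behind the music
    local drift = music.TimePosition - drums.TimePosition
    local correction = math.clamp(drift * GAIN, -MAX_ADJUSTMENT, MAX_ADJUSTMENT)
    drums.PlaybackSpeed = 1 + correction
end)
```

Driving this from Heartbeat rather than a while-loop also means the correction stops cleanly when the music ends, without leaving a stray thread running.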


Is there any news or update on this?

This was in last week’s release notes, but we’re still fixing some bugs before flipping the flags to enable it :crossed_fingers: :soon_arrow:


I’ve personally found that speed adjustment tends to cause audible distortion. Instead, periodic TimePosition syncs can be done by binding a resync function for each track to a Bindable’s Event and then firing it. Deferred signal behaviour might break this, but it seems to work pretty well for me.
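A minimal sketch of the periodic-resync approach described above, assuming `music` and `drums` are Sound objects that are already playing. The resync interval is a placeholder to tune: it should be long enough that the occasional seek (and the re-buffering it may trigger) is rarer than the drift it corrects:

```lua
-- Each track binds its own resync handler to a shared BindableEvent
local resyncEvent = Instance.new("BindableEvent")

resyncEvent.Event:Connect(function(targetTime)
    drums.TimePosition = targetTime
end)

-- Fire the event periodically with the primary track's position
task.spawn(function()
    while music.Playing do
        resyncEvent:Fire(music.TimePosition)
        task.wait(5) -- placeholder interval; tune per project
    end
end)
```

Additional secondary tracks can each connect their own handler to the same event, so a single `Fire` realigns every layer at once.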