Instance Streaming: Softer Instance Stream-In

As a Roblox developer, it is currently too hard to control the Instance Streaming model/instance stream-in rate.

When a player gets near a chunk densely packed with instances, streaming loads everything instantly or extremely fast, causing frame lag spikes whose severity depends on how packed the area near the player is. The same thing happens when the player moves away from the chunk: the loaded instances stream back out, and the lag occurs in both directions.

In this video, there’s roughly 9.2k Instances (counting Models and BaseParts) packed into that one area; each model is using the Atomic ModelStreamingMode.

The worst-case scenario is when streaming is constantly loading and unloading: a packed area with tons of instances streams out while a new area with as many or more instances streams in, so the client gets consistent frame lag across a highly packed map in a Roblox experience.

From my perspective, this issue impacts Roblox experiences with realistic large map designs while trying to maintain low memory usage for the client.


I don’t see a reason why you would have 9.2k instances in one area of your game.


Definitely support this! Games with really, really big maps would benefit from this!


Since the Model is Atomic, as the documentation states, “all of its descendants are streamed in/out together”.

Wouldn’t that mean you’re forcing everything you don’t want loaded at once to load at once? Could you try setting the Models to Nonatomic and compare the load times and “frame lag”?
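If you want to test that quickly, a one-off command-bar sketch like this flips every model under a container to Nonatomic (`workspace.City` is a placeholder here; point it at whatever folder holds your packed models):

```lua
-- Command-bar sketch: switch every Model under a container to Nonatomic.
-- "workspace.City" is a hypothetical folder; substitute your own.
local container = workspace.City

for _, descendant in ipairs(container:GetDescendants()) do
	if descendant:IsA("Model") then
		descendant.ModelStreamingMode = Enum.ModelStreamingMode.Nonatomic
	end
end
```

That makes it easy to A/B the two modes in Studio without editing each model by hand.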


I doubt it; I don’t think Roblox downloads the models onto your secondary storage device, which is why RAM usage is always so high even on a simple baseplate.


I have zero clue about the internals but I’ll trust your word


I need this feature as well; my game has many things on the surface and underground. In one spot there is a temple beneath me containing 10,459 Parts, and lower-end devices get a lot of lag as players walk through it.

I have everything set to Atomic for convenience, and I want each object fully replicated so it’s ready whenever the client needs to use it.

Even if Default is used, we will experience lag, since all those 10,459 Parts stream in as fast as they can, including the descendants of each part, 14,049 Instances in total.

I have no idea how this can be fixed on Roblox’s end, as we want everything to load in fast, but at the same time we don’t want lag…

Maybe something like PathfindingModifiers, which you insert into a part to tell PathfindingService what to do, but for Streaming, to trigger the loading of areas? For example, when a player enters a cave in my game, that area loads in, and once they leave it, it unloads.
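You can approximate the stream-in half of that today with `Player:RequestStreamAroundAsync`. A rough server-side sketch (where `CaveTrigger` and `CaveCenter` are hypothetical parts placed in your map):

```lua
-- Server Script sketch: when a player touches a trigger at the cave
-- entrance, ask the engine to stream in content around the cave interior.
-- "CaveTrigger" and "CaveCenter" are hypothetical parts in your map.
local Players = game:GetService("Players")

local trigger = workspace:WaitForChild("CaveTrigger")
local caveCenter = workspace:WaitForChild("CaveCenter")

trigger.Touched:Connect(function(hit)
	local player = Players:GetPlayerFromCharacter(hit.Parent)
	if player then
		-- This is only a hint: it requests stream-in around a point,
		-- yielding up to the timeout (in seconds). There is currently
		-- no API to force stream-out, so the unload half is missing.
		player:RequestStreamAroundAsync(caveCenter.Position, 5)
	end
end)
```

It doesn’t give you manual regions or unloading, which is exactly why a proper feature would still be welcome.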


Something like defining streaming regions manually as an override would be really useful! Caves are a good example, but interiors would benefit a lot too.

If someone is far away from a building, the inside should have a smaller streaming radius than the outside. It saves a lot of resources.


That’s true. This lag still persists on Nonatomic; there is no difference. In some of my cases, even using the default (Nonatomic), the lag effects are still present. Sometimes Nonatomic is worse than Atomic and other times it isn’t, but regardless of which one is used, stream-in and stream-out seem to happen too fast, causing frame lag.


RAM usage varies based on your own Studio configuration, such as the number of plugins you have installed; in-game, RAM would remain mostly low.


I think my game could also benefit from this feature.

My game uses a lot of MeshParts for terrain; when the player travels fast enough, the models streaming in and out cause the same lag spikes you described. I read somewhere that MeshParts can lag because the mesh data is downloaded from Roblox rather than just being replicated to the client, but I could be wrong.


In this video, there’s roughly 9.2k Instances (counting Models and BaseParts) packed into that one area; each model is using the Nonatomic ModelStreamingMode.

I made this video for those wanting to know the difference.

The point is to show the lag spike, and this lag hits experiences with high instance counts even harder.


It was solely an example. When there are more big mesh instances spread out across chunks, say 20 per chunk, it greatly impacts the client’s FPS.

My guess is that, with many varying meshes, the Roblox ContentProvider loads those meshes live while streaming them in, repeating this for every other unique mesh while streaming in other instances at the same time, resulting in much longer lag.

Here is an example of this lag, using Nonatomic in a live experience. There aren’t thousands of instances in a single area/chunk here.

Your solution would be culling, but not every Roblox developer understands how to implement this.
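For anyone curious what a basic client-side culling approach looks like, here is a naive distance-based sketch. It assumes StreamingEnabled is off, and that you’ve tagged the models you want culled with a hypothetical `"Cullable"` CollectionService tag; the snapshot at startup matters because `GetTagged` only returns instances inside the DataModel, and hidden models are parented to nil:

```lua
-- LocalScript sketch of naive distance culling (assumes StreamingEnabled off).
-- Models tagged "Cullable" (a hypothetical tag) are hidden past CULL_DISTANCE.
local CollectionService = game:GetService("CollectionService")
local Players = game:GetService("Players")

local CULL_DISTANCE = 512 -- studs; tune for your map
local models = CollectionService:GetTagged("Cullable") -- snapshot at startup
local hidden = {} -- model -> original parent, so we can restore it

local function step()
	local character = Players.LocalPlayer.Character
	local root = character and character:FindFirstChild("HumanoidRootPart")
	if not root then return end

	for _, model in ipairs(models) do
		local far = (model:GetPivot().Position - root.Position).Magnitude > CULL_DISTANCE
		if far and not hidden[model] then
			hidden[model] = model.Parent
			model.Parent = nil -- keep the Lua reference alive for restoring
		elseif not far and hidden[model] then
			model.Parent = hidden[model]
			hidden[model] = nil
		end
	end
end

task.spawn(function()
	while true do
		step()
		task.wait(0.5) -- throttle; no need to check every frame
	end
end)
```

This only changes rendering-side cost on the client, it doesn’t save the replication bandwidth or memory that engine streaming does, which is part of why a rate control on streaming itself would be better.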


Support! I was running into issues with this earlier, not exactly parts, but it was data regardless that caused Replicator & ProcessPackets labels to heavily increase.

I had a large body of water around my map, and while I later optimized it by reducing the water’s depth, my game was struggling to load it all instantly. My minimum streaming radius was quite small, but streaming tries to catch up to the target radius as fast as possible, even if that results in an unplayable experience.

The terrain itself rendered perfectly smooth after it was all replicated to my client. No clue how anyone uses Opportunistic streaming, and I’m sure those types of experiences will benefit heavily from limiting the replication speed outside of the immediate minimum radius.