This has saved me from literally hours of monotonous work grouping thousands of individual parts into models. The fact that no instance could be guaranteed to be on the client under old StreamingEnabled stopped me from using it in my game, but going forward i’ll just use StreamingEnabled for everything
Now all we need is the ability to specify streaming radius per model. Is this being worked on?
I have many models in my game that are within a close radius of each other but shouldn't all be rendered at once on the client because of memory limitations. However, there are other low-memory models that should be visible from a distance.
Will there ever be a way to update a streaming mesh after it's already created? Streaming currently has very few uses for games where destruction occurs. (The streaming meshes become inaccurate as soon as the actual structures fall apart.)
I echo both the praise and the feedback of others; this is a great update, but there are still a few improvements and tweaks that would be massively appreciated.
I was wondering if it's possible to access these streaming modes outside of models – for example, on folders?
Within my game tree there are a few folders that house a large number of subfolders and parts. These parts are incredibly small, invisible, and used for scripting purposes only, and I'd appreciate being able to mark these folders as persistent while keeping the very heavy environment streamed as normal. Currently, the only way I can see to do this is by creating these folders on the client, but that creates a few issues where the server interacts with some of the folders, requiring overly hacky workarounds.
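One possible server-side workaround, sketched below under the assumption that the folder's contents can be reparented: folders can't set a StreamingMode, but a Model can, so wrapping the script-only parts in a Model marked Persistent keeps them replicated while the heavy environment streams normally. The folder name "ScriptAnchors" and the wrapper name are hypothetical placeholders, not names from the original post.

```lua
-- Hypothetical workaround: wrap a folder of script-only parts in a
-- Persistent model so they are never streamed out.
local anchorsFolder = workspace:WaitForChild("ScriptAnchors") -- placeholder name

local wrapper = Instance.new("Model")
wrapper.Name = "ScriptAnchorsPersistent"
wrapper.ModelStreamingMode = Enum.ModelStreamingMode.Persistent

-- Move the folder's contents into the persistent wrapper model.
for _, child in ipairs(anchorsFolder:GetChildren()) do
	child.Parent = wrapper
end
wrapper.Parent = workspace
```

Note this changes instance paths, so server scripts that referenced the original folder hierarchy would need to look inside the wrapper instead.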
Hi! Before you roll this out, could you please consider changing:
"* PersistentPerPlayer - Always streamed for specified players"
to
"* PersistentPerPlayer - Always streamed EXCLUSIVELY for specified players"
This would be a much more useful feature, and it's what I thought you had originally implemented. I cannot think of a use for the current version of the feature, where other random players stream the instance in as well as the ones I whitelisted.
In simple terms:
What it does: “If model not on list, model behaves like normal”
vs
What we want: "If model not on list, model go away"
Is there a technical reason why it’s not implemented this way?
I think the behaviour you are talking about would be beyond the scope of Streaming. We’re stepping into selective replication territory here, which I feel could be an entire new feature.
I can actually think of multiple use cases for this feature in its current state, e.g. models whose physics are always owned by one player, even though that player doesn't necessarily stay near them.
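That use case might look something like the sketch below: a vehicle whose physics should always be simulated by its driver, even when the driver wanders away. PersistentPerPlayer keeps the model streamed in for that player, so the network ownership assignment stays valid. The model name "Cart" and the function names are assumptions for illustration, not part of the thread.

```lua
-- Server-side sketch (assumed names): keep a cart streamed in for its
-- driver so the driver can keep simulating its physics from any distance.
local cart = workspace:WaitForChild("Cart") -- hypothetical model with a PrimaryPart
cart.ModelStreamingMode = Enum.ModelStreamingMode.PersistentPerPlayer

local function assignDriver(player: Player)
	cart:AddPersistentPlayer(player)         -- always streamed for this player
	cart.PrimaryPart:SetNetworkOwner(player) -- this player simulates the physics
end

local function releaseDriver(player: Player)
	cart:RemovePersistentPlayer(player)
	cart.PrimaryPart:SetNetworkOwner(nil)    -- hand simulation back to the server
end
```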
This is a really cool feature. I also wanted to post a feature request calling for read-only variables, since this would really help against exploiters trying to change LocalScript variables.
This is amazing. It would be great if there were an option to only stream a specific folder/location. For instance, in my experience I need some models to be loaded in at all times so that the client scripts work as intended.
I have an important question regarding this: do Persistent models skip the 'streaming' method entirely, or are they still loaded in using the slow system?
One of my biggest problems with streaming enabled was that parts I spawn/clone directly next to the player on the server have an excessive delay before loading in. You'd think a part created right beside the player would load in immediately, but the streaming system is so slow that even at point-blank range it causes massive issues that I have so far been unable to solve.
I like this update, but the way persistency is described implies that it still takes the many-times-slower approach to 'streaming in'. Let it be perfectly clear: Persistent parts need to ignore the streaming system and go back to using Roblox's fast, default replication.
Can anyone confirm or deny how this behavior works for me? Preferably a developer/staff, since they did such a good job ignoring my feedback/questions pre-release of this update. @CorvusCoraxx
Roblox's default replication is so much faster than streaming in every circumstance that, as things currently stand, a future where Roblox is 100% streaming enabled simply isn't possible. Streaming enabled will always be a slow, clunky system unless persistency uses the default replication methods.
@CorvusCoraxx Workspace.PersistentLoaded shouldn't require Player.ReplicationFocus to be set before it can fire. This interferes with disabling Players.CharacterAutoLoads when you want to wait for persistent model streaming before spawning a character. As a workaround, I have to set Player.ReplicationFocus to a dummy object.
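A minimal sketch of the dummy-focus workaround described above, assuming the goal is to hold off spawning until persistent models have streamed in. The spawn position is a placeholder value, not from the original post.

```lua
-- Server-side sketch: with CharacterAutoLoads disabled, give each player a
-- dummy ReplicationFocus so streaming (and PersistentLoaded) can proceed,
-- then spawn the character once persistent models have arrived.
local Players = game:GetService("Players")
Players.CharacterAutoLoads = false

Players.PlayerAdded:Connect(function(player)
	-- Invisible dummy part near the intended spawn (placeholder position).
	local focus = Instance.new("Part")
	focus.Anchored = true
	focus.Transparency = 1
	focus.CanCollide = false
	focus.Position = Vector3.new(0, 5, 0)
	focus.Parent = workspace
	player.ReplicationFocus = focus

	-- Wait until PersistentLoaded fires for this specific player.
	local loadedPlayer
	repeat
		loadedPlayer = workspace.PersistentLoaded:Wait()
	until loadedPlayer == player

	player:LoadCharacter()
	player.ReplicationFocus = nil -- the character becomes the focus again
	focus:Destroy()
end)
```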
I would, however, like to express slight concern about the 'Default' StreamingMode for models potentially changing in the future. This seems more appropriate to handle with something like a per-place setting that decides the default StreamingMode for models in that place: it could remain non-atomic for existing places and be changed to atomic for new places when the time comes. Though maybe this is already what you have in mind.
Also, in regards to this:
Is there a benchmark for what constitutes “very large”? Is a model with 1,000 parts very large? Or is it closer to 10,000, or even more? Obviously I know it is not an exact thing, and probably depends on the instance types as well, but some sort of reference as to what is too much would be nice to have.
Unfortunately it's not about the raw number of Instances; it's about the amount of data. One complex MeshPart with detailed collision fidelity can contain as much data to replicate as dozens of basic parts.
If you want to get a rough idea of how this stacks up in practice, you could save the models as .rbxm files and compare how large they are. This won't be a precise measurement, due to compression and slight differences between serialization and replication, but it's an effective way to get a rough feel for it.