Ideal method for Chunks?

To elaborate: I'm currently working on a Doctor Who game. For those unfamiliar, players each have their own TARDIS (a ship of sorts, capable of travelling to different places throughout time).

Since these TARDISes are loaded server-side, loading chunks client-side creates a problem: the TARDISes will simply fall through upon arrival at their destinations.

I'm unfamiliar with chunk methods since the introduction of FilteringEnabled, as my old method was simply parenting things to the CurrentCamera.

Can anyone propose an idea for handling these chunks?

As of now, my idea is: all locations exist server-side, but are removed on the player's client upon joining and then loaded individually as needed.

Anchor the TARDISes once they arrive, then position the client near the TARDIS, wait until the terrain/objects load, and then unanchor the TARDIS.
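A minimal sketch of that arrival sequence, assuming the destination chunk is streamed to the client by some other mechanism, and assuming a RemoteEvent named `ChunkLoaded` that the client fires once the destination has replicated (all names here are illustrative, not from an actual implementation):

```lua
-- Server Script. Freeze the TARDIS on arrival, then release it once the
-- arriving player's client reports that the destination chunk has loaded.
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local chunkLoaded = ReplicatedStorage:WaitForChild("ChunkLoaded") -- RemoteEvent (assumed)

local function onTardisArrived(tardis, player, destinationCFrame)
	-- Anchor so the model can't fall while the chunk is still loading.
	tardis.PrimaryPart.Anchored = true
	tardis:SetPrimaryPartCFrame(destinationCFrame)

	local connection
	connection = chunkLoaded.OnServerEvent:Connect(function(firingPlayer)
		if firingPlayer ~= player then return end
		connection:Disconnect()
		tardis.PrimaryPart.Anchored = false
	end)
end
```

The client-side LocalScript would fire `ChunkLoaded` after it has finished building the destination locally.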

Wouldn't the TARDIS model still fall through the client-generated locations, since it's server-based?

You could try simulating a chunk on the server by loading in a transparent part that matches the shape of the chunk, so that physically something is there for the server, while putting far less strain on it than the full chunk would.
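Sketching that idea: a single invisible, anchored, collidable part sized to the chunk's footprint gives server physics something to stand on. The function name and parameters are hypothetical:

```lua
-- Create a server-side physics placeholder for a chunk that only exists
-- on the client. Invisible, but anchored and collidable on the server.
local function createChunkPlaceholder(chunkCFrame, chunkSize, parent)
	local part = Instance.new("Part")
	part.Name = "ChunkPlaceholder"
	part.Anchored = true
	part.CanCollide = true
	part.Transparency = 1 -- players never see it
	part.Size = chunkSize
	part.CFrame = chunkCFrame
	part.Parent = parent
	return part
end
```

For uneven terrain you'd need several parts (or a simplified collision mesh) rather than one box, but the principle is the same.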

I recently wrote a tutorial about how to do this. It's in the post-approval process, but I posted it to the Bulletin Board so I could link it here. Be warned, it's long:

You definitely don't want to do what you're proposing, though, by loading everything onto the client when the player joins. That will harm loading times unnecessarily.

To summarize my thread: abuse Player.PlayerGui to replicate Instances through to the client. The server parents the chunk to the target player's PlayerGui, the client copies it, the client tells the server it's been loaded, and the server destroys the original. Remember that the client's scripts may be able to access the chunk before it has fully loaded. If a player leaves, destroy all the chunks they had pending.
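The server side of that flow could look roughly like this. This is a sketch, not the tutorial's actual code: it assumes chunks are stored in a `Chunks` folder in ServerStorage and a RemoteEvent named `ChunkReceived` in ReplicatedStorage, and all names are illustrative:

```lua
-- Server Script: stream chunks to individual players via PlayerGui.
local Players = game:GetService("Players")
local ServerStorage = game:GetService("ServerStorage")
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local chunkReceived = ReplicatedStorage:WaitForChild("ChunkReceived") -- RemoteEvent (assumed)

local pending = {} -- [player] = { [chunkName] = chunk still awaiting acknowledgement }

local function sendChunk(player, chunkName)
	local chunk = ServerStorage.Chunks[chunkName]:Clone()
	chunk.Parent = player.PlayerGui -- PlayerGui contents replicate only to that player
	pending[player] = pending[player] or {}
	pending[player][chunkName] = chunk
end

-- Client acknowledges it has copied the chunk; destroy the original.
chunkReceived.OnServerEvent:Connect(function(player, chunkName)
	local chunks = pending[player]
	if chunks and chunks[chunkName] then
		chunks[chunkName]:Destroy()
		chunks[chunkName] = nil
	end
end)

-- If a player leaves, destroy everything they had pending.
Players.PlayerRemoving:Connect(function(player)
	if pending[player] then
		for _, chunk in pairs(pending[player]) do
			chunk:Destroy()
		end
		pending[player] = nil
	end
end)
```

The matching LocalScript would watch PlayerGui for arriving chunks, `Clone()` each one into the workspace, and fire `ChunkReceived` with the chunk's name.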

As hacky as it is, there's really no better solution. You could serialize Instances yourself and send that data to the client, but I see no point: it would be slower than the engine's own replication, and you can't dynamically create MeshParts, so it's just trouble.