Fix table serialization support for RemoteEvents, BindableEvents, and more

Roblox uses a table serialization algorithm that doesn’t support any table keys other than strings or sequential integer (array) indices. This leaves out many table structures which are incredibly common and useful, such as:

  • Key = player’s user ID, value = player’s data table.
  • Key = player’s house, value = player who owns the house.
  • Key = boolean, value = data to use based on the truthiness of a statement.

This algorithm shows up in (at least) the following situations:

  • RemoteEvent argument passing.
  • BindableEvent argument passing.
  • DataStore saving.
  • Inspecting tables in the Lua debugger with breakpoints.

Please add full support for all valid table key types to the table serialization algorithm.

This will only become more important once Typed Luau is out of beta, because developers will want table keys to be of their custom data type (technically, a table).

This also adds a point of confusion for new developers when it shouldn’t. Why should someone new to Lua have to learn about the intricacies of when an integer key goes into the array, and when it goes into the hashmap?
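A plain-Lua illustration of that intricacy (nothing Roblox-specific here):

```lua
-- A table that looks like an array to a newcomer...
local t = {}
t[1] = "a"
t[2] = "b"
t[4] = "d" -- gap at 3: this key lands in the hash part

-- ...but ipairs (and any serializer that only walks the array part)
-- stops at the first hole, silently skipping t[4]:
local visited = 0
for i, v in ipairs(t) do
	visited = visited + 1
end
print(visited) --> 2, even though t[4] == "d" is still in the table
```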

Many of my more complicated bugs involve “complex” structures in tables. The debugger just tells me what memory address a table is at, which is not usually useful. With a better serialization algorithm, Roblox can add full support for table inspection in the debugger, including with tables as keys inside tables.

For DataStore saving, userdata could remain unsaveable while other valid table structures are allowed.


I would like to let you know that these tables are encoded as JSON, which is why they can’t be mixed tables, and there are easy ways to get around this in the meantime. I don’t know how practical adding support would be, but it’d be nice.

I’m aware that they’re encoded in JSON. There’s no reason that the JSON encoding and decoding algorithm can’t handle mixed tables or anything else I mentioned.

Relatively easy. There are no performance concerns, and the JSON serialization shouldn’t be coupled with much else (DataStores are a unique case because you can’t save everything, but they already check for Instances as table values, so it should be doable).

fix this

Support. I’ve been trying to fix this bug in my code with ReplicaService for hours on end. I thought it had something to do with me not setting values in the table properly, but it turns out this is because remotes don’t support sending dictionaries with non-string indexes.

I’m basically going to have to change how I format my tables, and this sucks because Vector3s are the fastest keys for me to iterate over. I can’t convert them to strings and back, because that would make any server => client change take about 20x as long to process, since string formatting is far slower.

The tables are formatted like this:

Chunks = {
	[Vector3.new(0, 0, 0)] = {
		Position = Vector3.new(0, 0, 0),
		Voxels = {
			[Vector3.new(0, 0, 0)] = {
				Type = "Air"
			}
		}
	}
}

I’d likely have to format my tables like this for my system to work over remotes:

Chunks = {
	[0] = {
		[0] = {
			[0] = {
				Position = Vector3.new(0, 0, 0),
				Voxels = {
					[0] = {
						[0] = {
							[0] = {
								[Vector3.new(0, 0, 0)] = {
									Type = "Air"
								}
							}
						}
					}
				}
			}
		}
	}
}

I’m sure I’ll find some way to do this, maybe without the long-number format (client-sided metatables?). But especially seeing how Vector3 is a native type now, I really wish Roblox supported sending this data over remotes as indexes, even if it has to be sent as a string and turned back into a Vector3 once the client receives it.

It would also ease up so many restrictions that come with remotes, mixed table support being a big one. I personally don’t use mixed tables that often, but I have been confused before when some values were straight up not even being sent.


If your format is storing the key as a Vector3, you could convert it into a string, for example:

Chunks = {
	["0x0x0"] = {
		Position = Vector3.new(0, 0, 0),
		Voxels = {
			["0x0x0"] = {
				Type = "Air"
			}
		}
	}
}

You’re already storing the Vector3 Position as a property for reads - so maybe this would be an easy solution for storing the chunks as keys.
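A minimal sketch of that round trip (the helper names are mine; it operates on plain x, y, z numbers so it runs outside Roblox — in a real game you would pass v.X, v.Y, v.Z and rebuild the key with Vector3.new):

```lua
-- Hypothetical helpers for the "0x0x0" key scheme described above.
local function keyFromCoords(x, y, z)
	return x .. "x" .. y .. "x" .. z
end

local function coordsFromKey(key)
	-- Assumes integer components, which chunk coordinates usually are.
	local x, y, z = key:match("^(-?%d+)x(-?%d+)x(-?%d+)$")
	return tonumber(x), tonumber(y), tonumber(z)
end

print(keyFromCoords(0, 0, 0))   --> 0x0x0
print(coordsFromKey("4x-2x16")) --> 4  -2  16
```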


I don’t wanna get too off topic here, but unfortunately the problem for my game is that I have a chunk loading system which needs to be super quick. Indexing a table by creating a new Vector3 is much faster than doing that and also formatting a new string to index the chunk’s information, and I’m trying to keep it decently optimized.

My benchmarks (might seem overkill, but it REALLY adds up):

Vector3.new x 500K = 0.005 seconds
Vector3.new => string (concatenation) x 500K = 0.18 seconds
Vector3.new => string (string interpolation) x 500K = 0.23 seconds

I’m gonna try and see if I can get it to work with metatables. I might be able to create a fake table locally whose keys are the chunk/voxel position Vector3s and whose values are references to the tables under Replica.Data, and hopefully that’ll let me keep most of the speed gains of using Vector3.
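For what it’s worth, that metatable idea can be sketched in plain Lua like this (`vectorKeyedTable` is a hypothetical name, and {x, y, z} triples stand in for Vector3s so the sketch runs anywhere):

```lua
-- Hypothetical sketch: a proxy table whose __index/__newindex encode an
-- {x, y, z} key into a string internally, so the backing table only
-- ever has string keys (and thus survives serialization).
local function vectorKeyedTable(backing)
	backing = backing or {}
	local function encode(v)
		return v[1] .. "x" .. v[2] .. "x" .. v[3]
	end
	local proxy = setmetatable({}, {
		__index = function(_, v) return backing[encode(v)] end,
		__newindex = function(_, v, value) backing[encode(v)] = value end,
	})
	return proxy, backing
end

local chunks, raw = vectorKeyedTable()
chunks[{0, 0, 0}] = { Type = "Air" }
print(chunks[{0, 0, 0}].Type) --> Air
print(raw["0x0x0"].Type)      --> Air (string-keyed, so it serializes)
```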

Thank you though! :) I will likely have to do it your way if my idea doesn’t work.

You should just be flattening the map:

{[key] = value} → {{key, value}}

This is a relatively cheap transformation to do, and roughly the same thing which the serialization system would have to do anyways if it were made to support such a thing (if you’re clever you can avoid spending any extra memory on the deserialize side while undoing this transform)
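The suggested round trip, sketched in plain Lua (function names are mine):

```lua
-- Flatten a table with arbitrary keys into a serializable array of
-- {key, value} pairs, and rebuild the original map on the other side.
local function flatten(map)
	local list = {}
	for k, v in pairs(map) do
		list[#list + 1] = { k, v }
	end
	return list
end

local function unflatten(list)
	local map = {}
	for _, pair in ipairs(list) do
		map[pair[1]] = pair[2]
	end
	return map
end

local original = { [true] = "yes", [false] = "no" }
local roundTripped = unflatten(flatten(original))
print(roundTripped[true], roundTripped[false]) --> yes  no
```

Over a remote this works for Vector3 or boolean keys precisely because the engine can serialize those types as values inside an array; it is only their use as keys that is unsupported.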

To the original request, it’s unlikely that this feature will be implemented.

The key reason being that supporting serialization of more “exotic” tables opens up a lot of grey area where it’s unclear how the serialization should happen. E.g.: What happens when you pass a table with an array part which has nil holes in it? Should extra network bandwidth be spent to preserve the order of the keys past the nil, or should they just be sent as part of the hash part even though they were originally in the array part? And that’s just one edge case.
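A concrete version of that edge case (plain Lua; the serialized forms in the comments are illustrative, not what the engine actually emits):

```lua
-- The edge case in question: an "array" with a nil hole.
local t = { "a", nil, "c" }
-- A serializer has two defensible choices:
--   1. preserve positions, spending bandwidth on the hole: ["a", null, "c"]
--   2. demote keys past the hole to the hash part: { "1": "a", "3": "c" }
-- Plain Lua itself doesn't pick a side: the # operator may return any
-- border (1 or 3 here), so neither choice is obviously "correct".
print(t[1], t[2], t[3]) --> a  nil  c
```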

Handling exactly structs (string keys) and arrays (with no nil holes), without self-references, makes the behavior very robust and straightforward to understand both for developers consuming the APIs and engineers exposing them. Serializing more complex data structures is something better left to Luau library code which can make opinionated choices about how to handle edge cases.
