Luau Recap: August 2021

Yeah I agree, when we were planning this optimization the mental model was “setfenv breaks this but we don’t like setfenv so we will try anyway and it’s not a huge deal either way”. The examples we’re discussing definitely convinced me that we need to document this as an incompatibility (and we will note that it actually mirrors Lua 5.3 behavior somewhat), if the optimization actually stays enabled :slight_smile: Stay tuned for the next recap where we will find out!

Yeah, understood. FWIW, the future introduction of new ways to store objects will resolve this, I think, although I still wouldn’t recommend writing type-safe code without a type checker in the loop :slight_smile: but hopefully we’ll get to solving that problem earlier.

Ignoring the issue that stravant pointed out, do you ever think there will be a time when you decide to remove getfenv/setfenv entirely in favor of backporting _ENV? Removing them would create a huge backwards-compatibility issue, but they seem to be a huge brick wall preventing a lot of good optimizations.

1 Like

Oh believe me, we would LOVE to. setfenv is a cool concept, but it’s been nothing but trouble so far :-/ Unfortunately, we did a survey of the various uses of getfenv/setfenv, and there’s no clear path for us to remove these without, well, breaking a bunch of games and upsetting a bunch of developers, which would be unfortunate – and that’s even if we added _ENV support. For example, many developers [used to] use module systems where the first line in your script is

_G.InitModules()

And then you can proceed to use a bunch of globals injected through setfenv. This is not something that _ENV can replicate, so these developers would have to migrate to require, which not everyone is happy to do.
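For readers who haven’t seen this pattern, here’s a rough sketch of what such a module system might look like (the names are hypothetical and real implementations vary; some swap the whole environment with setfenv rather than mutating it):

local Modules = {} -- name -> module table, registered elsewhere

_G.InitModules = function()
	-- getfenv(2) is the environment of the script that called InitModules;
	-- writing into it makes every registered module usable as a bare global
	-- in that script, which is exactly the kind of dynamic environment
	-- manipulation that gets in the way of these optimizations.
	local callerEnv = getfenv(2)
	for name, module in pairs(Modules) do
		callerEnv[name] = module
	end
end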

So maybe this is possible, but it would be a big effort and would break a bunch of code, and honestly, since we did more or less find a way to have our cake and eat it too, we don’t want to spend time on this right now.

6 Likes

I just turn it off entirely when using the built-in editor for whatever reason. No prediction is better than being offered incorrect stuff.

1 Like

Why not just create some sort of unique reference to the same closure? If I understand it right, this might be similar to the fastcall implementation for global functions.

It might mean you still have to allocate a bit of data for the unique reference, but it could be done faster than reallocating everything and might serve as a good compromise.

I know this is a moot point, but I know that some exploits heavily rely on the ability to duplicate function closures, so perhaps it would be possible. However, countering my own point, they likely have a hacky implementation (no pun intended) or cause other issues in the engine around sandboxing or stability, since exploits are usually an abomination of hacky sandbox-breaking nightmares, so I don’t know how much that actually means.

That’s already what happens without this optimization; the goal of the optimization is to remove that allocation.

1 Like

There are a few reasons I define objects this way.

  1. Maintainability within a huge codebase.
    I need a custom module loader because it keeps dependencies explicit (for replication and compiling), and because I very much dislike require(something very tedious to type thousands of times) boilerplate. If I need to refactor a class, trying to track down every .CFrame specific to that class is very difficult and tedious. I define and use numeric indices explicitly, relying on a compile process to optimize by doing inter-module constant folding/inlining, as well as to convert {[iFoo] = 1, [iBar] = 2} to {1, 2} before publishing (a rough sketch of this pattern follows below).

  2. Fewer string constants in each module.
    There’s something very satisfying about compiling a project and having unneeded information like field names stripped. This isn’t possible with the Luau type system because types/dependencies are soft and metatables could be used unexpectedly. Producing efficient stringless code makes me happy.

  3. Object size.
    I can create more objects in less time using less memory. I think there are plans to optimize for this specifically, but it hasn’t happened. I started doing this before Luau and I have a lot of code set up this way. When your object is used for 100 bones in an animation skeleton, it really adds up.

I have a code example and further explanation in this post: Optionally provide Luau type warnings during runtime in studio
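For illustration, here’s a rough sketch of the numeric-index pattern from point 1 (iFoo/iBar and the surrounding code are made up; the linked post has the real example):

-- Field indices are declared once, in a shared definitions module.
local iFoo, iBar = 1, 2

-- Source form: every access goes through the named index constant,
-- so "find all references" on iFoo finds every use of that field.
local obj = {[iFoo] = 1, [iBar] = 2}
local foo = obj[iFoo]

-- After the compile step's constant folding, the published code is
-- effectively:
--   local obj = {1, 2}
--   local foo = obj[1]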

To be honest, I think you’re wrong with your approach here, and have prematurely optimized in what are probably the wrong ways in some places.

  1. Like you said, this is partially a legacy code thing. Under the old Lua VM, there’s no question that using arrays of fields like that was better. But that’s not the case anymore.

  2. If you have a bunch of very small objects where you need to squeeze that kind of memory difference out of them, using arrays of fields isn’t the answer anyway; using a Structure of Arrays instead of an Array of Structures is (see the sketch after this list).

  3. “There’s something very satisfying about compiling a project and having unneeded information like field names stripped.” – I mean, you’re free to think that, but I don’t think it’s a good idea from a development-process standpoint. You haven’t given any argument as to why other people should also be doing that, simply that you personally like it.
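To make the contrast in point 2 concrete, here’s a minimal sketch (the bone fields are made up):

-- Array of Structures: one small table per bone.
local bones = {}
for i = 1, 100 do
	bones[i] = {cframe = CFrame.new(), weight = 1}
end
local cf = bones[42].cframe

-- Structure of Arrays: one array per field, indexed by bone number.
-- Same data, far fewer tables, and no per-bone table overhead.
local boneCFrames, boneWeights = {}, {}
for i = 1, 100 do
	boneCFrames[i] = CFrame.new()
	boneWeights[i] = 1
end
local cf2 = boneCFrames[42]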

4 Likes

It’s still the case when custom module loaders mean I can’t use intellisense or exported types. If I remove a member from a class, code will fail silently and there’s no way for me to know; foo.CFrame is notoriously difficult to track down because searching for the \.CFrame\W regex in my codebase yields 1120 results across 240 files. Accessing fields using class[_R.Class.iCFrame] or _R.Class.GetCFrame(class) isn’t pretty and I don’t like it, but it provides a quick way to “find all references” and “go to definition”, which is immensely useful within large codebases.

This is true in a lot of cases, but doesn’t really work in this specific case because my animation system is request-based and only creates bone/joint objects when they are needed. For example a distant low-detail character doesn’t need hand joints until the player gets close and the high-detail mesh requests them, and eye-blink joints are never connected if the character is using the first-person mesh. While it could be designed that way and would work well for static assemblies, there would be lots of array shifting and added implementation complexity for my case.

Don’t get me wrong, I don’t enjoy writing object-oriented code this way. My goal is to write maintainable code that compiles to be fast and lightweight, within a module loading system that facilitates fast join times by replicating modules and their dependencies only when needed. Using number-based objects removes nearly 900 unique class-field strings from my project, which would otherwise waste space in the global string hash structure and in the constant tables of hundreds of ModuleScripts. Field names are practically debug information, and my code becomes stuck with them when using Luau types normally. With number-based objects I can justify inlining functions that would otherwise add more string constants to the ModuleScripts that use them.

Before and after my compile process (the before/after code images aren’t reproduced here):

In the “after” output, the module’s string constants are reused for auto-generated globals, which is why “ButtonR3” is there. It uses upvalues in most cases, but will use globals if the function is nested within other functions.

It’s certainly possible to achieve fast and lightweight compile results within an explicitly typed system without resorting to writing code with number-based objects; it’s just not very practical to achieve in Luau because non-explicit code is allowed. I also can’t use Luau’s type system properly, because requiring modules with require(script.Parent.Foo) / require(game:GetService("ReplicatedStorage").Utility.Bar) is the only way to get inter-module intellisense and use exported types. I really don’t like requiring this way, first because my codebase defines 5,000+ require-like dependencies across 1,500+ ModuleScripts (I could automate refactoring, but it would be a boilerplate and organization nightmare), but also because it doesn’t let me replicate modules as needed to improve join times (at least not without preprocessing every time I press play when testing).
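For reference, the require-based pattern being contrasted here looks roughly like this (the module and type names are made up):

-- ReplicatedStorage.Utility.Bar (a ModuleScript):
export type Thing = {Position: Vector3, Size: number}

local Bar = {}

function Bar.newThing(): Thing
	return {Position = Vector3.new(), Size = 1}
end

return Bar

-- In a consumer script, requiring by instance path is what lets the type
-- checker and autocomplete see Bar's exported types across modules:
local Bar = require(game:GetService("ReplicatedStorage").Utility.Bar)
local thing: Bar.Thing = Bar.newThing()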

I’m mostly content with what I’m doing. Sometimes I miss working on Visual Studio projects where I could write neat code without performance concerns affecting my coding style, but oh well. Support for numeric field names and linting support for class[Class.FieldName] would be appreciated, though. I would certainly change my mind if I could get intellisense to work with custom requirers and ensure every field access is explicitly typed; then my compile process could change field names to A, B, C / [1], [2], [3] automatically without worrying that non-explicit code might cause havoc by using an original field name.

I’d really prefer to get better performance from doing things the right way. It might be good if there were custom types that were implemented similarly to function prototypes, but with strictly typed fields/methods/interfaces, and perhaps some way to enforce using these objects explicitly so that “find all references” is possible.

1 Like

So… with your compile system, you could actually test this, right? I’m imagining it wouldn’t be that much work to change the array accesses into field accesses, since you already have all the pseudo-parsing set up to identify and modify the relevant parts of the code.

I’m suspicious that if you do test this out, replacing everything with field access, it will actually be faster, in spite of the “900 unique class-field strings from [your] project”.

(Basically compile down to .a / .b / … / .aa / .ab / etc instead of 1 / 2 / 3)

2 Likes

Well you could always introduce decorators and start a nightmare spiral of supporting developers manually turning optimizations off :wink:

I think this optimization is worth pursuing if it doesn’t break anything major; Stravant is probably one of like 5 people in the universe who knew that function refs can be passed across BindableEvents like that (I didn’t, despite knowing a lot about Roblox), let alone used it for anything.

That said, what are the odds we end up with a built-in symbol class/primitive eventually? It seems like people use them a lot (Roact uses newproxy for Symbols, which is probably not desirable haha), so it might be worth considering.

1 Like

Yes, I can definitely change it to use strings just by changing the keys from numbers to strings. I would be completely okay with using .a / .b / .c because there would only be a few dozen unique field names (compared to 900). I have a few hacky classes that reuse the table in other ways (like using the end as an array, or string keys as a lookup), so I’d need to look at it case by case.

I’ve done performance tests in the past on using the array part vs the hash part, and I remember access performance being better on string-based tables, but allocation performance and memory usage being better for array-based tables.
Here’s an allocation performance test from today. The results are pooled from the clients of a few random people who joined my profiling place over the course of an hour.

It takes 0.65x the time to create an array-based object with 8 keys compared to a name-based one. It’s not a real-world test, and pushing the VM this hard might affect the results, but that’s still a very real difference that can affect how many objects I can create before causing a frame spike. Although it’s interesting that {A = v} is slightly faster than {v}.

Source for tests
local profiles = {}

local newTest = function(create)
	return function(count, spoof)
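		-- `count` and `spoof` come from whatever runs these profiles (an
		-- assumption based on how `profiles` is returned at the end); `spoof`
		-- is an opaque function, so the compiler can't see through its calls.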
		local v = spoof(nil) -- v is always nil, but the VM doesn't know that.
		local clock0 = os.clock()
		for _ = 1, count do
			local t = create(v)
			if v then t = spoof(t, v); end
		end
		local clock1 = os.clock()
		return clock1 - clock0
	end
end

local add = function(name, create)
	table.insert(profiles, {
		LocalName = "create/" .. name,
		Test = newTest(create),
		TestControl = newTest(function(v)
			return v
		end),
	})
end

add("0", function(v)
	return {}
end)

do
	local mt = {
		__index = {A = 1}
	}
	add("0(metatable)", function(v)
		return (setmetatable({}, mt))
	end)
end

add("a1", function(v)
	return {v}
end)
add("h1", function(v)
	return {A = v}
end)

add("a2", function(v)
	return {v, v}
end)
add("h2", function(v)
	return {A = v, B = v}
end)

add("a4", function(v)
	return {v, v, v, v}
end)
add("h4", function(v)
	return {A = v, B = v, C = v, D = v}
end)

add("a8", function(v)
	return {v, v, v, v, v, v, v, v}
end)
add("h8", function(v)
	return {A = v, B = v, C = v, D = v, E = v, F = v, G = v, H = v}
end)

add("a16", function(v)
	return {
		v, v, v, v, v, v, v, v,
		v, v, v, v, v, v, v, v
	}
end)
add("h16", function(v)
	return {
		A = v, B = v, C = v, D = v, E = v, F = v, G = v, H = v,
		I = v, J = v, K = v, L = v, M = v, N = v, O = v, P = v,
	}
end)

return {ProfileList = profiles}
3 Likes

@make_life_more_difficult_for_roblox_engineers is totally a decorator we could add!

Yeah, so I’d say for now using newproxy for this is probably okay. That’s consistent with existing practice at least.

newproxy isn’t deeply problematic in any way, really. newproxy(true) is slightly annoying, because it’s the only way in a fully sandboxed Luau environment like Roblox that you can make a userdata (which traditionally is just for host-exposed types!) with a metatable, but I’m not actually sure that removing support for that (not that we could) would make anything measurably better. And we could always introduce an extra internal proxy type that newproxy maps to; there just hasn’t been a strong need.

The one caveat, of course, is that proxies don’t round-trip through reflection. However, the fact that functions do is actually Not Great: it resulted in exploits in the past, thread-safety issues, and the like. I think we patched them, but a function is the only object, if I recall correctly, that preserves its identity when being passed through reflection, which means it points directly to the relevant VM object; that’s a poor design choice and it would be nice to break it one day. Which is to say that even if we had symbols and the like, we wouldn’t want to round-trip them through reflection, so proxies are okay as they are.
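For context, the newproxy-based symbol pattern mentioned above (similar in spirit to Roact’s Symbols, though its exact implementation may differ) looks roughly like this:

local function createSymbol(name)
	-- newproxy(true) returns an empty userdata with its own metatable,
	-- which makes it usable as a unique, unforgeable key.
	local symbol = newproxy(true)
	getmetatable(symbol).__tostring = function()
		return "Symbol(" .. name .. ")"
	end
	return symbol
end

local None = createSymbol("None")
print(None) --> Symbol(None)
print(None == createSymbol("None")) --> false; every symbol is distinct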

1 Like

Exciting to see how scripts are always improving!

Some are definitely more exciting than others, but still very exciting!

Not sure what specific update this was, but recently the Luau type checker, despite not being in strict mode, has been throwing warnings about many of the type annotations I had. I resolved a lot of them, since it was mostly just a “Boolean” → “boolean” rename and the like, but there are a few, such as “Function”, that I cannot rename to “function” because that name is already taken by the keyword. Is there a way to get around this?

This topic was automatically closed 120 days after the last reply. New replies are no longer allowed.