How will this impact server-side loadstring? I still have use cases for it, and want to know if it causes similar issues to setfenv/getfenv.
Server-side loadstring continues to work; if memory serves, scripts that call loadstring behave as if they called setfenv/getfenv - some optimizations are disabled from that point on, but behavior should stay exactly as it used to be.
Yes, this is accurate. The compiler detects idiomatic iteration using pairs/ipairs and converts it to fast internal VM constructs that don’t require function calls. Direct use of next isn’t detected and isn’t advised.
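A minimal side-by-side sketch of the two iteration styles being discussed:

```lua
local t = {a = 1, b = 2}

-- Idiomatic form: the compiler recognizes pairs() here and can lower
-- the loop to fast internal VM iteration (no per-step function calls).
for k, v in pairs(t) do
    print(k, v)
end

-- Direct use of next: not recognized by the fast path, so each step
-- goes through an ordinary function call.
for k, v in next, t do
    print(k, v)
end
```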
We currently aren’t doing inlining and loop unrolling - both are definitely possible but some internal mechanisms in the compiler aren’t quite ready to support these. Although specifically loop unrolling might not be that interesting that often to justify the code size increase…
Creating attachments (using the tools provided in the model tab) doesn’t work anymore when I have this enabled.
I would like to request that we have the new VM on the test place of Phantom Forces: test place - Roblox
I am pretty confident Phantom Forces is not broken by the VM from my testing, but I don’t know for sure. So test place for a day before the main place would be a good idea.
And I am working on a new game which would benefit greatly from the VM because of all the really heavy Lua, and it’s phone only, so quite the combination for testing!: 💣💣 Merc Sim [ALPHA] - Roblox
I expect that, on phone, the mobile FPS would see a 100% or more performance gain on the new VM, which is very exciting.
game:GetService("RunService") - this is still the proper way, correct?
That’s the recommended, best-practice way of getting services, yes.
So this is the way that’ll work in the new VM, correct?
That’s the correct syntax to get a service, yes.
Once again, nothing is changing in the new VM beyond not being able to call Instances (you won’t be able to do things like game("GetService", "RunService")). Everything else should work exactly the same.
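To make that concrete, here is a sketch of what keeps working versus what stops working (Roblox-specific, so the `game` global is assumed):

```lua
-- Still works, and is the recommended form:
local RunService = game:GetService("RunService")

-- Also still works (the colon call is just sugar for this):
local SameService = game.GetService(game, "RunService")

-- No longer works in the new VM: calling the Instance itself
-- game("GetService", "RunService") -- errors
```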
Would it be difficult to optimize next too? I spent a day converting my game’s 150k-line codebase to “in next, foo do” because I had discovered “in pairs(foo) do” was ~1.45x slower on small tables in the old VM.
It wouldn’t be too hard for me to automate converting it back, but I’d like to be able to compare performance and optimally support both VM’s for now.
The addition of library functions like “table.new(int arrayLength, int hashLength)” and “table.find(table array, indexStart = 1, indexEnd = #array, int step = 1)” would greatly optimize my game.
Instructions that optimize “setmetatable({}, foo)” would be helpful, but it may be tricky to optimize cases like this:
local function foo()
local self = {}
setmetatable(self, bar)
return self
end
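For what it’s worth, the pattern above can be written so the allocation and the setmetatable call fuse into one expression, since setmetatable returns its first argument - which gives the compiler a more direct pattern to recognize. A sketch (bar is a placeholder metatable):

```lua
local bar = {} -- placeholder metatable for illustration

local function foo()
    -- setmetatable returns the table it was given, so the temporary
    -- local and the separate call collapse into a single expression
    return setmetatable({}, bar)
end
```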
Also, does the compiler optimize away constant upvalues like “local foo = true” for configurable scripts and “local setmetatable = setmetatable” for code that was optimized for the old VM?
My game (once compiled) uses setfenv to provide fast, lazy access to cross-script functions and data. To use less memory, all compiled scripts share the same environment (if ‘script’ is needed, it is localized on the first line before setfenv is indirectly called via _G.) Would it be possible to get the performance boost of using global functions like “pairs” or “string.find”, while still using a custom environment? Overwriting default library functions using setfenv is not a common use-case.
Locals are almost always preferable to globals, but there are cases where upvalues can hog memory (32 + 8 bytes per function) in scripts with nested functions that use many upvalues. Are there plans to address the memory usage of functions? Could the memory to reference the function’s environment be optimized if the function uses no globals?
This is a bit far-fetched, but would it be possible to recreate function prototypes internally with respect to constant/global upvalues in ModuleScripts, so that these upvalues don’t use 8 extra bytes per upvalue? This wouldn’t be beneficial if ModuleScripts are cloned and re-required though.
Anonymous/in-line functions that lack upvalues/globals could also be optimized. If the same function is created multiple times, the closures could in theory reference the same object in memory, although I’m pretty sure equality comparisons treat each closure as a distinct value (even if they share the same prototype), and someone’s code may rely on functions being unique by testing equality or using them as keys in a table.
There are so many common cases that can be optimized by the Lua compiler, here are just a few examples of optimizations that my custom Lua lexer/simplifier does:
This:
local function foo()
return bar
end
return {foo()}
is slower than:
local function foo()
return bar
end
return {(foo())}
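The reason the parenthesized form can be faster is that the two constructors have different semantics in standard Lua: a call in the final slot of a table constructor expands to all of its return values, while wrapping the call in parentheses truncates it to exactly one, so the VM doesn’t have to handle a variable-length result. The two forms only behave identically when the function returns a single value:

```lua
local function multi()
    return 1, 2, 3
end

print(#{multi()})   --> 3: all return values expand into the table
print(#{(multi())}) --> 1: parentheses truncate to the first value
```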
This:
local Methods = {}
function Methods:Foo()
end
is slower than:
local Methods = {
Foo = function(self)
end}
There are also cases where function calls or unused variables have no “side effects” and can be omitted from the bytecode entirely. If multi-pass optimizations are too slow for testing in Studio, perhaps they can be applied when publishing games to the site.
What do you mean by not being able to call Instances - like not being able to do this:
local function GetBricks(StartInstance)
or
local b = Instance.new("BodyGyro", seat)
You will not be able to call Instances as functions ( Object("Method", …) ), but everything else will work fine.
This is not a lot of memory, I’m not sure what it would be optimizing for or if this can even be reliably optimized. It’s hard to reach even a KB of data alone in function closures unless you’re spamming their creation.
Lua 5.3 actually already has an optimization for this, but it works on all functions by not allocating a new closure if there’s already a previous one cached that either A. has no upvalues or B. refers to the same upvalues. So sounds totally doable.
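The caching idea can be sketched roughly like this; note that the observable outcome is version-dependent (stock Lua 5.1 always allocates a fresh closure, so the comparison below is false there, while a caching implementation may hand back the same object):

```lua
local function make()
    -- This inner function has no upvalues, so a compiler that caches
    -- closures is free to return the same object on every call.
    return function() return 42 end
end

local f1, f2 = make(), make()
-- With closure caching: f1 == f2 may be true.
-- Without it (e.g. stock Lua 5.1): f1 ~= f2.
print(f1 == f2)
```

This is also why code that uses closures as unique table keys, the concern raised earlier in the thread, would be affected by such an optimization.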
Like the example I provided. I mean using Instances as functions, not defining functions or calling methods.
Alright, but will those two lines of code work in the new VM?
Given what I just said about being able to define and call functions, yes. If you still don’t understand I suggest you go reread this thread.
Here’s a visualization of how upvalues use memory:
local function bar()
end
local function foo() -- +32
return function() -- +8
return function() -- +8
return function() -- +8
bar()
end
end
end
end
I often end up with many nested functions when I’m trying to simulate complex behaviors in my game, and that +8 bytes for every closure makes globals very tempting.
I also would like to try referencing my game’s data system using either locals or globals so that I don’t need to use setfenv.
Before I publish to roblox, my game basically compiles this:
-- I set "_G.meta" before requiring modules, so I can know what modules it depends on.
-- I don't remember why I chose "meta"
local Reqs = _G.meta:Reqs{
-- I use referenceId's instead of traversing the DataModel
-- This also makes requiring modules a lazy process
-- It also only sends modules to the client when needed
SomeClass = 1234;
}
return function()
local test = Reqs.SomeClass.new()
Reqs.SomeClass.SomeMethod(test)
return test
end
into this:
_G() -- inject environment
return function()
local a = _0()
_1(a)
return a
end
I could omit the setfenv by doing something like this:
_D = _G._D -- reference data system
return function()
local a = _D[1]()
_D[2](a)
return a
end
The above example doesn’t require converting strings to referenceId’s (it only does this once, then caches the result in the environment), but it requires globals, and results in a “getglobal” instruction as well as a “gettable” instruction.
I’d like to be able to localize “_D” without adding 8 bytes to every function in my entire game that references my data system.
So as long as you’re not using the “hack”-type syntax, the code should be fine?