Wasn’t all of your data pre-calculated in that? That would probably mean the slowest bit is just looking up data and setting properties, which would explain why the change you made sped things up.
For code which does lots of calculations (e.g. inverse kinematics, physics simulation, etc.), BlueTaslem’s points matter a lot more, since they affect the speed much more.
The data is, but the actual objects like the UDim2s aren’t. About 64 are created per 1/60 of a second, so caching a local reference to UDim2.new does prevent a lot of those repeated index lookups.
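A minimal sketch of the caching being described (the `frames` array and the update loop are illustrative assumptions, not from the original posts):

```lua
local RunService = game:GetService("RunService")

-- Cache the constructor once, outside the hot loop, so each call
-- skips the repeated `UDim2` table index lookup.
local newUDim2 = UDim2.new

RunService.RenderStepped:Connect(function()
	for i = 1, 64 do
		-- `frames` is a hypothetical array of GuiObjects being animated
		frames[i].Position = newUDim2(0, i * 2, 0, 0)
	end
end)
```

This only skips the lookup of `.new` on the `UDim2` global; the 64 constructor calls themselves still happen each frame.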
Generally speaking, you don’t want to optimize for things that aren’t a problem.
The usual workflow I’ve seen is to write code/features, run a profiler and work specifically on the problematic areas.
In Roblox, the tools for this are somewhat lacking and poorly documented (something I’d like to fix in the near future), but the MicroProfiler will give you an idea of what’s taking the most time.
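For example, you can wrap a suspect section in `debug.profilebegin`/`debug.profileend` so it shows up as a named block in the MicroProfiler timeline (the handler and function name here are hypothetical):

```lua
local RunService = game:GetService("RunService")

RunService.Heartbeat:Connect(function()
	debug.profilebegin("UpdateFrames") -- appears as a labeled block in the MicroProfiler
	updateFrames() -- hypothetical hot function being measured
	debug.profileend()
end)
```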
It’s only a micro optimization if you only :Connect() once to the anonymous function.
Not to mention, anonymous functions created once when a script first runs won’t make a difference beyond a few milliseconds (the VM loading a huge script). If you’re creating anonymous functions inside an iteration, though, it can be a problem, because you’re recreating the closure every frame or step. At that point you can declare a function once and pass that instead. In most cases that’s a micro optimization; in some it’s a huge one. It really depends on what you’re trying to achieve, especially since a declared function won’t always work and you may need an anonymous function (a closure) to get the expected behavior — I’ve run into this a couple of times this past month.
Yes, not creating a new function each iteration, and instead reusing one, would be better. I’m just saying that for a single connection, defining a named function first may be overkill.
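A sketch of the difference being discussed (`parts` and the handler body are illustrative assumptions):

```lua
-- Recreates a fresh closure on every iteration -- the allocations add up
-- when this runs in a hot loop.
for _, part in ipairs(parts) do
	part.Touched:Connect(function(other)
		print(other.Name)
	end)
end

-- Declares the handler once and reuses the same function for every connection.
local function onTouched(other)
	print(other.Name)
end

for _, part in ipairs(parts) do
	part.Touched:Connect(onTouched)
end
```

For a one-off `:Connect()`, both versions allocate exactly one function, which is why it's only a micro optimization in that case.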