What this feature does
Part size is currently limited to a minimum of 0.2, and we have a feature that will lower this limit, possibly to 0.01.
The Goods
We are trying to evaluate what issues we may run into with this feature. Specifically, we want examples of how you would use it in physics-based builds. Also, what concerns do you have regarding the implementation?
If you have place files (not IDs) where you would want us to look at this feature being used, we would be happy to check them out.
Parts that are 0.2 x 0.2 x 0.2 are already sometimes quite hard to select, and I can’t imagine even smaller parts being any easier. Would the selection box stop shrinking at a certain point?
I think one of the most popular reasons is to make ParticleEmitters generate from a small point in space as opposed to a 0.2 x 0.2 x 0.2 area, which has some obvious offset. A while ago I made a test particle emitter based on a rotating point, but the reversion to the 0.2 minimum ruined the effect.
This would be useful for any case where you have an invisible adornee/emitter part that has to be as small as possible, especially for ParticleEmitters since they emit from a random point within the volume (and we may want them to act like point emitters).
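Roughly the setup I have in mind, as a minimal sketch assuming the lowered limit is live (the 0.01 size below is only legal once it ships):

```lua
-- Sketch of the point-emitter use case: shrink the invisible host part so
-- particles spawn from (nearly) a single point instead of anywhere inside
-- a 0.2 x 0.2 x 0.2 volume.
local host = Instance.new("Part")
host.Size = Vector3.new(0.01, 0.01, 0.01) -- only valid once the limit is lowered
host.Transparency = 1
host.Anchored = true
host.CanCollide = false
host.Parent = workspace

local emitter = Instance.new("ParticleEmitter")
emitter.Parent = host -- the emission volume is now effectively a point
```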
I’m very much worried that novice developers are going to mess up frequently and end up with a bunch of 0.01 parts scattered throughout their maps, or even get the measurement offsets for their places completely off scale. My biggest worry here is the potential unfriendliness towards beginning developers. Even for advanced developers, I cannot imagine how difficult it would be to maneuver such small parts.
As a builder who usually works at extremely tiny scales, I use meshes to make parts appear smaller than 0.2. I actually fixed the old minimap scripts to work with newer parts, but I still run into issues. As a result, I remain conflicted.
However, I would strongly recommend adding a CornerWedge option to SpecialMeshes. Regular parts, wedges, and so on can be scaled down arbitrarily far with the help of default meshes, but CornerWedgeParts cannot (unless you resort to some wacky union tricks). As for this issue, though, I’d probably add a boolean property to BasePart called “FreeSizingMinimum”, or something along those lines, which would be set to false by default.
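For reference, here is a minimal sketch of that workaround using a SpecialMesh; it works for bricks, wedges, and so on, but as noted above there is no equivalent for CornerWedgeParts:

```lua
-- Current workaround: the part's physical size stays at the 0.2 minimum,
-- while the SpecialMesh's Scale shrinks the rendered geometry below it.
local part = Instance.new("Part")
part.Size = Vector3.new(0.2, 0.2, 0.2)

local mesh = Instance.new("SpecialMesh")
mesh.MeshType = Enum.MeshType.Brick
mesh.Scale = Vector3.new(0.25, 0.25, 0.25) -- renders at roughly 0.05 studs
mesh.Parent = part

part.Parent = workspace
```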
Maybe a 0.1 minimum. But a 0.01 minimum just feels excessive.
This is another thing I’m concerned about. Although it’s not the best practice, I’ve found myself doing this several times. I feel that, if anything, backwards compatibility in scripting should also be taken into consideration here.
I don’t think we should limit the capabilities of the whole engine solely because 0.01-sized parts are hard or impossible to build with. There are programmatic uses as well, such as scaled models for 3D GUIs, tool viewmodels, or small detail parts from meshes or wherever. Even 0.1 is too limiting for some use cases, usually for models that appear very close to the screen.
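As a rough illustration of the close-to-camera case, here is a client-side sketch (assuming the lowered limit) that pins a tiny detail part in front of the camera; at about a stud away, even a 0.05 part covers a noticeable chunk of the screen, which is why a 0.1 floor can still be too coarse:

```lua
-- Sketch: a tiny detail part held in front of the camera, e.g. as part of a
-- 3D GUI or tool viewmodel. Run this in a LocalScript.
local RunService = game:GetService("RunService")
local camera = workspace.CurrentCamera

local detail = Instance.new("Part")
detail.Size = Vector3.new(0.05, 0.05, 0.05) -- below today's 0.2 minimum
detail.Anchored = true
detail.CanCollide = false
detail.Parent = workspace

RunService.RenderStepped:Connect(function()
	-- One stud ahead of the camera, offset down and to the right.
	detail.CFrame = camera.CFrame * CFrame.new(0.3, -0.2, -1)
end)
```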
Addressing building and selecting difficulties:
It could be set up so that, by default, it’s not possible to resize a part below 0.2 with the resize tool or the properties panel, and a setting could change the minimum part size for the build tools.
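Something along these lines, sketched from the plugin side; the setting name is made up, but plugin:GetSetting is the normal way a plugin would persist a preference like this:

```lua
-- Sketch: a build tool enforces its own minimum, defaulting to the familiar
-- 0.2, even if the engine itself accepts sizes down to 0.01.
local DEFAULT_TOOL_MINIMUM = 0.2

local function getToolMinimum(plugin)
	-- "MinimumPartSize" is a hypothetical setting name; GetSetting returns
	-- nil until the user has saved a value for it.
	return plugin:GetSetting("MinimumPartSize") or DEFAULT_TOOL_MINIMUM
end

local function clampSizeForTool(size, minimum)
	return Vector3.new(
		math.max(size.X, minimum),
		math.max(size.Y, minimum),
		math.max(size.Z, minimum)
	)
end
```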
Addressing script backwards compatibility:
Some scripts may have been setting part sizes below 0.2, expecting them to be clamped at 0.2. I don’t think this is very common, and those scripts breaking slightly seems like an acceptable consequence of improving the engine. If that consequence is judged too severe, there could be a backwards-compatibility setting that limits the ability to programmatically set part size.
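The pattern in question looks something like this (hypothetical): the script deliberately writes an undersized value and relies on the engine clamping it up to 0.2, so lowering the floor would change what the script actually produces:

```lua
-- Legacy pattern that only "works" because of the current 0.2 clamp.
local anchor = Instance.new("Part")
anchor.Size = Vector3.new(0, 0, 0) -- today: silently clamped to 0.2 on each axis
anchor.Anchored = true
anchor.Parent = workspace
-- With a 0.01 floor, this part would become nearly invisible instead of 0.2 wide.
```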
I really don’t want to see the whole engine limited because of problems that are easy to work around. I’ve been hoping for unlimited part sizes for a long time. If the tools don’t work with it, then limit the tools, not the capabilities of the engine as a whole.
I think you summed up my view really well. We have needed small-part support for a while now, and I’d rather not see it limited. Like you said, if anything, just limit the tools’ abilities, but please give us the option to change that limit so that those of us who are more experienced can use the default tools to their full potential.
That should be quick for any active developer to audit and fix, and it is a good example of clearly non-future-proof coding, as opposed to a legacy pattern we’d expect to get special-case backwards-compatible support. As I’m sure you know, it’s usually not best practice to wantonly supply out-of-bounds values to functions or properties, knowing they will fail validation and be subject to corrections you don’t control and that can change with any update. It’s perfectly within an API developer’s rights to switch from clamping the values to rejecting the property assignment altogether. That’s not going to happen here; this is just general software-development advice about why it’s best not to rely on the undefined behavior of API calls.
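If a script genuinely wants a 0.2 floor, it can enforce one itself instead of leaning on the engine’s validation; a minimal sketch:

```lua
-- Future-proof version: clamp to your own minimum explicitly, so the result
-- does not depend on how the engine treats out-of-range values.
local MY_MINIMUM = 0.2
local desired = Vector3.new(0, 0, 0)

local part = Instance.new("Part")
part.Size = Vector3.new(
	math.max(desired.X, MY_MINIMUM),
	math.max(desired.Y, MY_MINIMUM),
	math.max(desired.Z, MY_MINIMUM)
)
part.Parent = workspace
```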
Could you elaborate on this concern? Is it something that affects your personal set of tools, or something you publish? Just to be clear, a lower part-size limit would never force you to make a 0.01 part, and if you use the quantized move and scale tools, you can continue to work in 0.2, 0.5, or 1-stud increments. If you’re modifying the source code of your plugin, couldn’t you also just add your own constraint if you wish to keep 0.2 as the lower bound?
Anyone wanting to work with parts this small is likely doing so already with imported meshes, and simply accepts that you can’t click on sub-pixel things: you select them in the Explorer or with a marquee drag that catches them. Or you get really close… That isn’t unique to Roblox either.