The default format for Color3 is raw (0 - 1), yet none of Studio's color-related features support it; they all use the should-be-deprecated 8-bit (0 - 255) system instead. This really slows down my workflow, and it also encourages mixing inconsistent color formats within the same game.
Here are examples of both:
I often use Color3:lerp() to smoothly transition colors between areas. However, this is slow to work with, because :lerp returns a Color3 in the raw format, not in the 8-bit (0 - 255) format that Studio exclusively uses, so I have to manually convert each returned value back into the 8-bit format. The shortest way I've found is to run this: print(Color.r * 255, Color.g * 255, Color.b * 255)
for every color value (usually at least three). It's slow and unwieldy, though. Why can't I just copy the color in the raw format myself? Not to mention it would be much easier in general to use the raw format in the Properties tab, so I could sometimes do the lerping in my own head.
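A small helper can at least cut the repetition when copying lerped colors back out in 8-bit form. This is just a sketch (toRGBString is a hypothetical name, not part of the Roblox API), and it rounds to the nearest integer rather than truncating:

```lua
-- Hypothetical helper: format a raw (0 - 1) Color3 as the 8-bit triple
-- that Studio's color fields expect.
local function toRGBString(color)
	local function channel(c)
		-- Round to the nearest integer instead of truncating.
		return math.floor(c * 255 + 0.5)
	end
	return string.format("%d, %d, %d", channel(color.r), channel(color.g), channel(color.b))
end

local blended = Color3.new(1, 0, 0):lerp(Color3.new(0, 0, 1), 0.5)
print(toRGBString(blended)) -- "128, 0, 128"
```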
I’m a UI designer and a scripter, and my workflow is generally to pick a UI style, build it all in the editor, then script everything. The problem is that the editor exclusively uses a completely different color format than scripts do, which means I have to convert the colors, and when converting between them I either:
- Use Color3.fromRGB(), wasting an (admittedly unnoticeable) amount of performance and splitting the colors in my scripts between raw and 8-bit values: raw for quickly constructed colors like white and black made directly in the script, 8-bit for colors taken from my UI style (and everything works in raw by default anyway).
- Spend time converting the 8-bit colors to raw colors by hand, and then deal with the insane decimals that result.
I consider the first to be the lesser evil and the more ergonomic, so that’s what I use. The situation is like using both the metric and imperial (US customary) measurement systems on the same project. Fun fact: NASA’s Mars Climate Orbiter was lost largely because of a missed conversion between the two.
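Concretely, the two options from the list above look like this (the channel values are just an example palette color, and the decimals are truncated here the way I'd have to paste them):

```lua
-- Option 1: keep the 8-bit values from the Studio color picker
-- and convert at runtime with the dedicated constructor.
local accent = Color3.fromRGB(52, 101, 164)

-- Option 2: pre-convert by hand (52/255, 101/255, 164/255) and
-- paste the resulting decimals -- the "insane decimals" in question.
local accentRaw = Color3.new(0.2039215686, 0.3960784314, 0.6431372549)
```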
Ideally, we would be able to set a single color mode in Studio to whichever format we prefer; this would fix both of the problems I mentioned. I’d personally use raw, as it has the most synergy with scripting. Regardless of which mode is enabled, values entered in either format should be converted to the current mode automatically (telling them apart is simple: raw values are decimals from 0 to 1, 8-bit values are integers from 0 to 255).
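The detection rule is simple enough that a sketch fits in a few lines. toColor3 here is a hypothetical function illustrating the heuristic described above, not anything Roblox ships:

```lua
-- Hypothetical normalizer: accept a channel triple in either format
-- and always return a Color3 (which is stored as raw internally).
-- Heuristic: raw channels are decimals in 0 - 1; 8-bit channels are
-- integers in 0 - 255.
local function toColor3(r, g, b)
	if r <= 1 and g <= 1 and b <= 1 then
		-- Already raw. Note the ambiguous edge case: (1, 1, 1) is
		-- treated as raw white here, not as a near-black 8-bit color.
		return Color3.new(r, g, b)
	end
	return Color3.fromRGB(r, g, b) -- 8-bit: scale each channel by 1/255
end

print(toColor3(0.5, 0.5, 0.5)) -- raw input passes through unchanged
print(toColor3(128, 128, 128)) -- 8-bit input is converted to raw
```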
Fixing this issue would resolve the current inconsistency in development (and the confusion, especially for beginner scripters trying to construct Color3s with the default constructor using 8-bit values), and it would make the entire color system both faster to work with and easier to reason about.