Texture Blending API

As a Roblox developer, it is currently impossible to blend textures together to create a composite image. This would be very useful today. In the future, with PBR materials, it will be necessary in order to layer textures such as makeup or dirt onto skin, scratches onto cars, paint onto walls, and any other situation where an object partially changes appearance.

My proposal is a family of objects that facilitate texture blending while keeping in mind that many Roblox users are kids and need simple interfaces.

Instance -> TextureBlendingComponent

:OutputToContent(String content)

When a blending operation is ready to output, the developer calls OutputToContent. The provided content id must start with “runtime://” to prevent the developer from writing to non-runtime content. When OutputToContent is called, the game saves the generated image to that content id and updates any objects that are using it.
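For example, the simplest possible graph under this proposal (none of this API exists in the engine today) would pass an image straight through and write it out:

```lua
-- Hypothetical: pass an image through unchanged and write it to a runtime id.
local source = Instance.new('ImageContentSource')
source.Source = 'rbxgameasset://logo'

-- Anything already displaying 'runtime://logo' would update automatically.
source:OutputToContent('runtime://logo')
```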

TextureBlendingComponent -> ImageContentSource

.Source [Content]

This is how you get an image into the blending pipeline. Outputting it writes that exact image to the target content id.

TextureBlendingComponent -> OpacityModifier

.Input [TextureBlendingComponent]
.Opacity [Number]

This is the first operation-type component. Operation-type components take an input TextureBlendingComponent and apply an operation to it. You could make an image semi-transparent using this object.

TextureBlendingComponent -> HSVModifier

.Input [TextureBlendingComponent]
.Hue [Number]
.Saturation [Number]
.Value [Number]

Like the one before it, HSVModifier takes a TextureBlendingComponent and applies an operation with arguments.

TextureBlendingComponent -> TextureBlend

.Top [TextureBlendingComponent]
.Bottom [TextureBlendingComponent]
.BlendingMode [BlendingMode]

This one takes two inputs and a blend mode. It is used to layer two textures together.

TextureBlendingComponent -> AlphaMask

.Input [TextureBlendingComponent]
.Mask [TextureBlendingComponent]

This one masks the input texture using Mask as a reference. It can be used to cut textures up and make them fit together more nicely, effectively generating a new shape.
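A sketch of how AlphaMask might be used under this proposal (the asset names are made up):

```lua
-- Hypothetical: clip a brick texture to a circular mask.
local brick = Instance.new('ImageContentSource')
brick.Source = 'rbxgameasset://brick'

local circle = Instance.new('ImageContentSource')
circle.Source = 'rbxgameasset://circle_mask'

local masked = Instance.new('AlphaMask')
masked.Input = brick
masked.Mask = circle

masked:OutputToContent('runtime://round_brick')
```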

TextureBlendingComponent -> NoiseTexture

.Seed [Number]
.Scale [Number]

This one generates a noise texture using the given seed and scale.

TextureBlendingComponent -> TransformModifier

.Input [TextureBlendingComponent]
.Offset [UDim2]
.Rotate [Number]
.Scale [Vector2]
.Skew [Vector2]

This one applies an offset, rotation, scale, and skew to its input in a single step.
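As a sketch under the proposed API, shrinking and rotating a texture might look like this (the asset name and values are illustrative):

```lua
-- Hypothetical: rotate a decal 45 degrees and shrink it to half size.
local decal = Instance.new('ImageContentSource')
decal.Source = 'rbxgameasset://decal'

local moved = Instance.new('TransformModifier')
moved.Input = decal
moved.Offset = UDim2.new(0.25, 0, 0.25, 0)
moved.Rotate = 45
moved.Scale = Vector2.new(0.5, 0.5)

moved:OutputToContent('runtime://small_decal')
```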

Usage Example

To blend dirt and customizable makeup onto a character:

-- Generate skin texture by modifying its HSV
local skin_source = Instance.new('ImageContentSource')
skin_source.Source = 'rbxgameasset://skin'

local skin_hsv = Instance.new('HSVModifier')
skin_hsv.Input = skin_source
skin_hsv.Hue = data.skinHue
skin_hsv.Saturation = data.skinSat
skin_hsv.Value = data.skinVal

-- Generate dirt texture by adding random noise then making it transparent
local dirt_source = Instance.new('ImageContentSource')
dirt_source.Source = 'rbxgameasset://dirt'

local dirt_noise = Instance.new('NoiseTexture')

local dirt_blend = Instance.new('TextureBlend')
dirt_blend.Top = dirt_source
dirt_blend.Bottom = dirt_noise
dirt_blend.BlendingMode = Enum.TextureBlendMode.Multiply

local dirt_opacity = Instance.new('OpacityModifier')
dirt_opacity.Input = dirt_blend
dirt_opacity.Opacity = data.dirtiness

-- Generate makeup texture by changing colors and opacity
local makeup_source = Instance.new('ImageContentSource')
makeup_source.Source = 'rbxgameasset://' .. data.selectedMakeup

local makeup_hsv = Instance.new('HSVModifier')
makeup_hsv.Input = makeup_source
makeup_hsv.Hue = data.makeupHue
makeup_hsv.Saturation = data.makeupSat
makeup_hsv.Value = data.makeupVal

local makeup_opacity = Instance.new('OpacityModifier')
makeup_opacity.Input = makeup_hsv
makeup_opacity.Opacity = 1 - data.makeupTransparency

-- Blend those 3 final textures to create the composite character texture
local makeup_skin = Instance.new('TextureBlend')
makeup_skin.Top = makeup_opacity
makeup_skin.Bottom = skin_hsv

local with_dirt = Instance.new('TextureBlend')
with_dirt.Top = dirt_opacity
with_dirt.Bottom = makeup_skin

-- Pick a contentId and output to it
local contentId = 'runtime://' .. player.Name .. 'Skin'
with_dirt:OutputToContent(contentId)

-- And apply the final image to the character
character.Texture = contentId

There are tons of options for a comprehensive texture API, such as:

TextTexture - Generates text as a texture
Isolate - Performs a magic wand operation, returning the selected pixels
VectorNoise - Like noise but uses all 3 channels instead of black and white
GridLinesTexture - Generates grid lines
CheckerboardTexture - Generates a checkerboard pattern
... and more!
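To illustrate, one of these hypothetical generators could slot into the same pipeline as everything above (CheckerboardTexture and its behavior are pure speculation here):

```lua
-- Hypothetical: darken a base texture with a generated checkerboard.
local checker = Instance.new('CheckerboardTexture')

local base = Instance.new('ImageContentSource')
base.Source = 'rbxgameasset://floor'

local floor_blend = Instance.new('TextureBlend')
floor_blend.Top = checker
floor_blend.Bottom = base
floor_blend.BlendingMode = Enum.TextureBlendMode.Multiply

floor_blend:OutputToContent('runtime://tiled_floor')
```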

Using chained APIs like this, you could generate simple textures and patterns on the fly, or even complex textures like dynamically generated UI. If you do it on the server, there doesn’t have to be any load on the client, and the intermediate textures are never generated. Only the final output sticks around in memory, and only when you request it; it then automatically applies to everything referencing the given content id.


I would love something like this and use it all the time for car customization in my game, however I think it would be nice if there were also a way to live preview it instead of always just generating a new ID. In addition, exporting only to runtime:// might not be the best since people may want to use this in plugins or studio tools. I think it should be an option.

Regardless, it might be hard to get this past the moderation bottleneck that plagues many feature requests.

It generates to the ID you provide. Doesn’t have to be a new one each time.

Also there’s no reason runtime:// wouldn’t be able to work in studio. Runtime data, by definition, does not serialize.


Oh, I see what you mean: I could create a runtime asset and then overwrite it later by writing a new asset to the same ID. That sounds good, though I still don’t see how it could work in Studio, with people publishing their models to the website with custom runtime URLs that could conflict with other objects in the workspace when the model is inserted.

Just prefix your runtime content ids with your game name or something if you don’t want free models to collide, just like with datastores.
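For instance, a namespaced id under this proposal might look like this (the game-name prefix is just a convention, not anything the API would enforce):

```lua
-- Hypothetical: prefix runtime ids with the game name to avoid collisions.
local contentId = 'runtime://MyGame/' .. player.Name .. 'Skin'
```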


This probably can’t be an API you can use at runtime. When fancy material/texture compositing like this is done in other engines, it is made efficient through a ton of offline baking of textures and/or shaders.


This API is specifically modeled after Unreal Engine’s material nodes, which operate at the shader level. I think this could totally be doable if it’s implemented as a shader on top of a lazy texture generator. This way, one-off generations are optimized into texture files after a few frames, and newly updated content can still instantly render the modified texture as a shader.

But I don’t actually know anything about how Roblox implements rendering.


That sounds great until you want to make it work on mobile. Yes, Unreal does work on mobile, but if you actually want it to perform on mobile, you need to be extremely conservative with the features you use.


I think in general developers can be trusted to make code perform on whatever devices they’re targeting. Roblox also has its own throttling options that it can impose.

Still, holding the entire engine back because of one system is the worse decision. Either way, mid-range Android devices and most iOS devices less than two years old are powerful enough to run pretty much anything as long as it isn’t too intensive.

It is down to the game’s developer to polish the gameplay, while the engine’s creators make sure that all the features work as optimally as possible. Who says someone can’t implement a busy wait or add hundreds of sprite sheet textures?


Why let mobile weigh down the platform when not all games are set to release on mobile or are lightweight enough to handle it? Performance is ultimately in the hands of the developer regardless, so I don’t see why it couldn’t just be supported, so that those of us who do know what we’re doing aren’t forced to cater to those who don’t.
