I did that for my rasterizer, and it's actually fairly simple if I understand what you're saying: you just pass each channel into this formula and then, bam, mapped.
Fairly similar. I did the same when I made a rasterizer a few months ago for fun (I lost the place files, unfortunately), but for this I'm not passing all the channels; I'm mostly looking at the average of all channels and a weight value.
Weight algorithm:
    -- Returns -1, 0, or 1 for a strictly dominant R, G, or B channel
    -- (falls through to false on ties or an all-zero color)
    local function Weight(X, Y, Z)
        local Sum = X + Y + Z
        local WeightX = X / Sum
        local WeightY = Y / Sum
        local WeightZ = Z / Sum
        local TrueWeight = (WeightX > WeightY and (WeightX > WeightZ and -1))
            or (WeightY > WeightX and (WeightY > WeightZ and 0))
            or (WeightZ > WeightX and (WeightZ > WeightY and 1))
        return TrueWeight
    end
So if one color is RGB (1, 0, 0) and the other is RGB (0, 0, 1) or RGB (0, 1, 0), they're not considered as being within range (since they're literally entirely different colors).
And of course it's a lot more complex than just that, but yeah, that's the general idea of the code.
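A rough Python port of the dominant-channel idea above, just to make the (1, 0, 0) vs (0, 0, 1) example concrete (a sketch; `dominant_channel` is my name, and returning `None` for ties or an all-zero color is my explicit handling of the case the Lua version falls through on):

```python
def dominant_channel(r, g, b):
    """Return -1, 0, or 1 when the R, G, or B channel strictly
    dominates the other two; None on ties or an all-zero color."""
    total = r + g + b
    if total == 0:
        return None
    wr, wg, wb = r / total, g / total, b / total
    if wr > wg and wr > wb:
        return -1
    if wg > wr and wg > wb:
        return 0
    if wb > wr and wb > wg:
        return 1
    return None

# Pure red and pure blue get different weights, so they are
# never treated as "within range" of each other.
print(dominant_channel(1, 0, 0))  # -1
print(dominant_channel(0, 0, 1))  # 1
```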
I used BaseParts. 1920x1080 would be 2,073,600 parts. At 160x90, the game was able to keep up at 50fps with the video being shown at 30fps.
Oh wow, that's pretty complicated. I just floor the channels so that they can only be 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, or 1.0.
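If I'm reading that right, flooring each channel to tenths looks something like this (a minimal sketch; `quantize_channel` is my name, and clamping values below 0.1 up to the lowest step is my assumption, since plain flooring would also produce 0.0, which isn't in the list above):

```python
import math

def quantize_channel(c, levels=10):
    """Floor a 0-1 channel value to one of `levels` steps,
    clamped so the result is never below the lowest step."""
    q = math.floor(c * levels) / levels
    return max(q, 1 / levels)

print(quantize_channel(0.47))  # 0.4
print(quantize_channel(1.0))   # 1.0
print(quantize_channel(0.03))  # 0.1 (clamped up from 0.0)
```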
Well yeah, I have to do this since it's RGB; I just did simple comparisons for my grayscale rasterizer.
I'm probably overlooking something, but I just did this for my RGB-to-color thing:
    import math

    rBitDepth = 100
    gBitDepth = 100
    bBitDepth = 100

    file = open("output.txt", "w")
    file.write("{")
    precision = 1000

    for r in range(rBitDepth + 1):
        for g in range(gBitDepth + 1):
            for b in range(bBitDepth + 1):
                file.write("\n")
                # quantize each channel ratio, then scale to 0-255
                rRatio = round(r / rBitDepth * precision) / precision
                gRatio = round(g / gBitDepth * precision) / precision
                bRatio = round(b / bBitDepth * precision) / precision
                rRatio = round(rRatio * 255)
                gRatio = round(gRatio * 255)
                bRatio = round(bRatio * 255)
                # pack the three channels into one 24-bit table key
                idxAddress = (rRatio << 16) | (gRatio << 8) | bRatio
                file.write("[" + str(idxAddress) + "] = Color3.new(" + str(r / rBitDepth) + "," + str(g / gBitDepth) + "," + str(b / bBitDepth) + "),")

    file.write("}")
    file.close()
--colorPallete is a module of the output.txt file
    local i = dimensions.Y * (y - 1) + (dimensions.X - x + 1)
    local px = backbuffer[i]
    local r = round((round(px.X * 8) / 8) * 255)
    local g = round((round(px.Y * 8) / 8) * 255)
    local b = round((round(px.Z * 8) / 8) * 255)
    -- bit32.arshift with a negative displacement shifts left
    local color = colorPallete[bit32.bor(bit32.arshift(r, -16), bit32.arshift(g, -8), b)]
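For reference, the packed palette key in that lookup is just the same 24-bit integer the generator script builds, so it round-trips cleanly (a minimal Python sketch; `pack_rgb`/`unpack_rgb` are my names):

```python
def pack_rgb(r, g, b):
    """Pack three 0-255 channels into one 24-bit key,
    matching (r << 16) | (g << 8) | b from the generator."""
    return (r << 16) | (g << 8) | b

def unpack_rgb(key):
    """Split a 24-bit key back into its 0-255 channels."""
    return (key >> 16) & 0xFF, (key >> 8) & 0xFF, key & 0xFF

key = pack_rgb(255, 128, 0)
print(key)              # 16744448
print(unpack_rgb(key))  # (255, 128, 0)
```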
The weighting I'm referencing is done to optimize pixel count on the Lua front end, rather than on the backend. My backend uses run-length encoding, and all the RGB values are represented as single bytes (characters) since they're 0-255.
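Not necessarily this exact wire format, but run-length encoding over single-byte channel values can be sketched like this (a minimal example; the (count, value) pair layout is my assumption, not the author's actual backend):

```python
def rle_encode(data: bytes) -> bytes:
    """Encode as (count, value) byte pairs; runs cap at 255
    so the count also fits in a single byte."""
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        out += bytes([run, data[i]])
        i += run
    return bytes(out)

def rle_decode(data: bytes) -> bytes:
    """Invert rle_encode: expand each (count, value) pair."""
    out = bytearray()
    for i in range(0, len(data), 2):
        out += bytes([data[i + 1]]) * data[i]
    return bytes(out)

row = bytes([255, 255, 255, 0, 0, 128])
packed = rle_encode(row)
print(list(packed))  # [3, 255, 2, 0, 1, 128]
assert rle_decode(packed) == row
```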
edit: Did I mention this data is streamed in real time via a GET request to a simple JS webserver?
streamed in real-time
Is this like 30 frames per request?
Next idea: take very short noises from the marketplace and pitch them to match a target sound. E.g. I can upload an amogus sound, and then it will take amogus.length / shortSound.length, loop through it that many times, and pitch each sound to make it sound like the amogus sus meme. Seems like a fun project.
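The loop-count math for that idea is simple to sketch (all names here are made up, and expressing pitch as a playback-speed ratio with 12 semitones per octave is my assumption about how the pitching would work):

```python
import math

def plan_loops(target_len, short_len):
    """How many repeats of the short sound cover the target length."""
    return math.ceil(target_len / short_len)

def semitones_to_speed(semitones):
    """Playback-speed ratio for a pitch shift of n semitones
    (equal temperament: 12 semitones = one octave = 2x speed)."""
    return 2 ** (semitones / 12)

print(plan_loops(3.0, 0.25))             # 12 repeats
print(round(semitones_to_speed(12), 3))  # 2.0
```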
Oh dear god, that sounded misleading. No, it's a single request, but the data is computed and written (and therefore streamed to Roblox's GetAsync buffer) in real time.
Surprisingly, using BaseParts actually makes the process faster than using Frames (for me at least). Unfortunately, I can't render a 1920x1080 picture because Studio fills up the RAM completely and either crashes the entire Linux userspace or makes the kernel OOM-kill Studio.
I probably need to set up a local server like OP did, because the RGB picture data is stored in a single ModuleScript (I compile picture data with a custom Python script I wrote that turns it into Lua arrays).
Edit: 720p render
Maybe if you create a sound with a single note, you can technically create other notes depending on the pitch you change it to. Just send over a frequency table and maybe play around with it?
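One way to read the frequency-table suggestion: record one base note, then derive every other note's playback speed as a ratio of frequencies (a sketch; the base frequency, the note table, and `playback_speed` are all my illustrative assumptions, not anything from the thread):

```python
# Assumed: the uploaded sample is an A4 at 440 Hz.
BASE_FREQ = 440.0

# A few target notes and their standard frequencies in Hz.
NOTE_FREQS = {"C4": 261.63, "E4": 329.63, "A4": 440.0, "C5": 523.25}

def playback_speed(target_freq, base_freq=BASE_FREQ):
    """Speed ratio that shifts the base note to the target pitch."""
    return target_freq / base_freq

for note, freq in NOTE_FREQS.items():
    print(note, round(playback_speed(freq), 3))
```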
If you're using Python, you can probably send it over directly using Flask. You can shorten string length by using hex.
I don't know anything as advanced as hex manipulation, but thanks for the tip! I'll work on implementing it.
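The hex trick, roughly: each 0-255 channel becomes exactly two hex digits, so you drop the separators and the up-to-three-digit decimals, and the string could then be returned from a Flask route (a minimal sketch; `to_hex`/`from_hex` are my names):

```python
def to_hex(pixels):
    """Pack (r, g, b) tuples into a fixed-width hex string:
    two digits per channel, no separators needed."""
    return "".join(f"{c:02x}" for p in pixels for c in p)

def from_hex(s):
    """Invert to_hex: every 6 hex digits is one pixel."""
    b = bytes.fromhex(s)
    return [tuple(b[i:i + 3]) for i in range(0, len(b), 3)]

pixels = [(255, 0, 64), (0, 128, 255)]
s = to_hex(pixels)
print(s)  # "ff00400080ff"
assert from_hex(s) == pixels
```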
You should look into parallel luau.
Yup, I rewrote the renderer using parallel Luau, but it's still kinda slow, so I'm trying other methods in addition to that.
I've managed to pull this off with my canvas module to create super performant and decently high-quality images on a GUI or SurfaceGui from any PNG file.
Loading this 128x128 image is pretty well instant with little to no lag, since my method of storing image data just consists of lookup tables that can be generated via a plugin which reads the binary contents of a PNG file.
My module will also happily render images at 256x256; it's just that my method of storing images uses strings, which means image sizes are limited to 128x128 for now.
It also helps that this canvas has a really efficient frame compression method that uses UI gradients to store many pixels in one frame, which leads to 1,457 total frames used for the image above.
The only problem with using frames in a GUI is that if the resolution is high enough, the frames will fail to render. I only seem to experience this issue at around 300x300 with a rendered image with heaps of colours on the GUI.
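A rough way to see why gradients cut the frame count: each row of pixels collapses to its color runs, and several consecutive runs can share one frame as gradient color stops (a sketch with made-up numbers; `MAX_STOPS` is an assumed per-gradient keypoint budget, not a figure from the module above):

```python
import math

def frames_for_row(row, max_stops=8):
    """Frames needed for one row of pixel colors when each
    frame's gradient can hold up to max_stops color runs."""
    runs = 1
    for a, b in zip(row, row[1:]):
        if a != b:
            runs += 1
    return math.ceil(runs / max_stops)

# 128 pixels but only 3 color runs -> a single gradient frame.
row = ["red"] * 40 + ["blue"] * 40 + ["red"] * 48
print(frames_for_row(row))  # 1
```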
I’ve been stalking these pixel posts since like 2020 haha
Changed my mind, I never knew you had a canvas module! Now I will make my system using the module.
Question though, isn't it better to use RichText coloring to make pixels, rather than using frames?
Nah, I tried that ages ago and you just cannot get a perfect pixel, and it's very tedious to get working. Anyway, my method of displaying pixels has frame compression, which means very few frames are used to display images, with the help of UI gradients.
This also means I can display very high resolution images, thanks to the low number of frames being rendered.