That’s great to hear! But I should remind you that this is not intended for real-time applications (like games).
Then why bother optimizing it? If it’s not meant to be used in projects then what is it for?
Roblox is slow. Roblox is very slow. Walking around in 2 frames per second is unbearable. Not only are we optimizing it for the novelty of being the fastest, but we are perfectionists too.
As I said in the post, it is meant to provide a fast real-time preview of your scene, and it’s not meant to be played in games in real-time. I guess the only exception is when you’re aiming for a niche aesthetic not seen in any other game.
I will be posting how the pathtracer looks in real-time soon! I hope this clears most doubts.
Holy shit DAYUM! You cooked! None of my ray tracers are anything close to this man, dayum!
THIS IS WHAT I WANT TO SEE!! I was working on my own path tracer with a denoiser that can look alright at around 20 samples… THIS IS INSANE! I think this is a new thing. First it was Crazyblox making a realtime raytracer, now I am seeing stuff like THIS! Good job!
I found this and wanna share with you. This is made for heavy calculations like this.
ActorGroups look promising! I probably won’t use it, though, because I’ve already made my own parallel Luau module using the structure of fragment shaders as inspiration. I think ActorGroups is slightly more performant, so I plan to update my module to more closely match theirs (crediting @athar_adv, of course).
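For anyone curious what a fragment-shader-style parallel module can look like, here’s a minimal sketch using Roblox Actors. The folder name, topic string, and band split are my own assumptions, not the actual module:

```lua
-- Worker script placed under each Actor. Each worker shades one horizontal
-- band of the image in parallel, like a fragment shader over its own tile.
local actor = script:GetActor()

actor:BindToMessageParallel("ShadeTile", function(yStart, yEnd, width)
	for y = yStart, yEnd do
		for x = 1, width do
			-- per-pixel shading work goes here (ray generation, shading, etc.)
		end
	end
end)
```

```lua
-- Dispatcher: split the image into one band per Actor and fire them off.
local HEIGHT, WIDTH = 256, 256
local actors = workspace.Workers:GetChildren() -- assumed folder of Actors
local band = math.ceil(HEIGHT / #actors)

for i, actor in actors do
	local yStart = (i - 1) * band + 1
	local yEnd = math.min(i * band, HEIGHT)
	actor:SendMessage("ShadeTile", yStart, yEnd, WIDTH)
end
```

Splitting by bands keeps each worker’s memory access contiguous; a real scheduler would also rebalance bands when some tiles are more expensive than others.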
And as of now, I don’t want to delay the release anymore. Most of the to-do list is already done; it’s mostly the UI that’s a roadblock for me, and getting good environments is also a hassle. I’m also waiting on the OSGL v1.6b release, because the pathtracer currently uses the nightly build.
One thing I wanna know: which build gets the most FPS, and when does it start to drop below 10? It would also be great if you could link an example place.
I’m planning to release it as a downloadable place. The first release won’t be downloadable though; I will only open-source it after it is more or less complete. Right now I’m trying to get a demo up and running quickly for you guys to test it out. After the demo phase is over, you would be able to get a copy yourself.
FPS rarely drops to 10 or below. For extremely intensive renders where all the bad stars align (high render resolution/fidelity/settings), it might. I plan to implement two render modes: real-time and non-real-time. That should avoid any stuttery experience.
I meant a playable published game like your previous ray tracing test which is awesome.
A playable demo won’t come until later. There are still many things to implement ahead of me, and I work on this project in my free time (and final exams are coming up, so you can see how this will take a while). I’m also the only person actively maintaining this project, so it’s really moving at my own pace. Expect a playable demo anywhere from tomorrow to 6 months from now.
Hello, I’d also implement some form of actor work scheduling so as not to have too many worker tasks colliding with each other… things can get very messy with big, long computations, so it’s extremely nice to have.
I have an idea for a realtime version. I made a denoiser that is not optimized but can give a decent image at around 20 samples. Maybe the realtime version could skip PBR materials, so you don’t have to worry about blurring textures. Even better, multiply the textures back into the image like a post process. You could also add extra samples that don’t go through all of the calculations of the real samples. These are just ideas I had based on my project that may or may not work with how you built the engine, but they could make the realtime version look a little better.
I appreciate your suggestions!
However, I do not plan for a real-time version of this. This will mostly be used for getting nice-looking, eye-candy, Blender-esque renders. What method did you choose for denoising? If it’s good enough, I might use it for renders! There are already two denoisers in the current build:
- mean: fast and simple; works by averaging neighboring pixels’ colors
- nlm: anisotropic diffusion; preserves edges and reduces noise while still being relatively simple as a post-processing step
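For reference, the mean denoiser is conceptually just a box filter. A minimal sketch over a flat RGB pixel buffer (the buffer layout, names, and radius parameter are my own assumptions, not the actual implementation):

```lua
-- Mean (box) denoiser sketch: each output pixel is the average of all
-- pixels within `radius` of it. `pixels` is a flat array of {r, g, b}
-- tables in row-major order, 1-indexed.
local function meanDenoise(pixels, width, height, radius)
	local out = table.create(width * height)
	for y = 0, height - 1 do
		for x = 0, width - 1 do
			local r, g, b, n = 0, 0, 0, 0
			for dy = -radius, radius do
				for dx = -radius, radius do
					local sx, sy = x + dx, y + dy
					-- skip samples that fall outside the image
					if sx >= 0 and sx < width and sy >= 0 and sy < height then
						local p = pixels[sy * width + sx + 1]
						r, g, b, n = r + p[1], g + p[2], b + p[3], n + 1
					end
				end
			end
			out[y * width + x + 1] = { r / n, g / n, b / n }
		end
	end
	return out
end
```

The trade-off is exactly as described above: it’s cheap and simple, but it blurs edges along with the noise, which is what the edge-preserving nlm mode addresses.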
I will also make a separate, primitive ray tracing function. It will be much faster since it skips the extra calculations, and it could be used for real-time applications if you want, but I still recommend crazytracer for that. crazytracer is actually focused on being fast and real-time; that’s what differentiates ray tracers from path tracers.
Your idea of “multiply the textures back into the image like a post process” doesn’t really work, because you’d be giving up the fundamentals of PBR materials entirely: roughness, metalness, and normals. Besides, that technique was only really used in very old rasterized games.
Regardless, I appreciate your input, and you can build upon the pathtracer once I open source it!
This is neat! I love it, keep it up!
Yeah, that is a drawback of multiplying textures back in.
About my denoiser
My denoiser isn’t complicated, but if you’re curious, all I did is take samples around a pixel and set the pixel’s color to the average of those samples, then run that over all of the pixels. I did it that way because the blur was stronger. It’s good not to sample right next to the pixel, and instead leave a little bit of a gap. It’s not optimized, but it got me good images.
I almost forgot to mention the edge detection. I use depth, normals, the object’s color, and the object’s material for checking edges.
I guess the one issue with blur denoisers is that the image can look a little lumpy, and edges close together can’t blur enough to get good coverage. That’s only at 20 samples, though; theoretically, more samples should remove the problem.
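A rough sketch of the gap-based, edge-aware blur described above. Everything here is an assumption on my part (the gap, radius, threshold, and the getColor/getNormal accessors), shown with only a normal-buffer comparison; the depth, albedo, and material buffers would be compared the same way:

```lua
-- A sample only contributes to the average if its normal is close to the
-- center pixel's normal, so blurring stops at geometric edges.
local GAP = 2          -- skip pixels immediately adjacent to the center
local RADIUS = 4
local NORMAL_EPS = 0.9 -- dot-product similarity threshold; an assumption

local function denoisePixel(getColor, getNormal, x, y)
	local centerNormal = getNormal(x, y)
	local r, g, b, n = 0, 0, 0, 0
	for dy = -RADIUS, RADIUS do
		for dx = -RADIUS, RADIUS do
			-- leave a gap around the center, as described above
			if math.abs(dx) >= GAP or math.abs(dy) >= GAP then
				local sampleNormal = getNormal(x + dx, y + dy)
				if sampleNormal and centerNormal:Dot(sampleNormal) > NORMAL_EPS then
					local c = getColor(x + dx, y + dy)
					r, g, b, n = r + c.R, g + c.G, b + c.B, n + 1
				end
			end
		end
	end
	if n == 0 then
		return getColor(x, y) -- no valid neighbors; keep the noisy pixel
	end
	return Color3.new(r / n, g / n, b / n)
end
```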
I just thought of an even better idea. You could scrap all of the buffers except the normal buffer by just checking whether the object’s unique ID is the same, then using the normal to retain object edges. I have not tested it, but it could work.
I tested it. It didn’t work. WHY DID ROBLOX BLOCK SCRIPTS FROM GETTING THAT DATA!!
holy. this is peak EditableImage
I said I was going to make a video showcasing the engine, but I kind of forgot.
This is the latest build, which has most of the promised features (except bloom). But don’t expect an edited video, this is just a very crude OBS recording. I have many things to do today, so I won’t be working on bloom for now.
It renders the current scene in real time (albeit at a low resolution) with tonemapping. It uses a fast approximation of ACES while rendering, then post-processes the result with the slower, more beautiful ACES once rendering is done. It also uses a better configuration of the NLM (anisotropic diffusion) denoiser than previous releases. Hope that explains most of the new things shown here.
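For anyone wondering what a “fast ACES” can look like: a common single-curve fit (Narkowicz’s approximation) applied per color channel is shown below. I don’t know which approximation this build actually uses, so treat this as illustrative only:

```lua
-- Narkowicz's widely used ACES filmic curve fit, applied per channel.
-- Input is linear scene luminance; output is clamped to [0, 1].
local function acesFast(x)
	local a, b, c, d, e = 2.51, 0.03, 2.43, 0.59, 0.14
	return math.clamp((x * (a * x + b)) / (x * (c * x + d) + e), 0, 1)
end

local function tonemap(color)
	return Color3.new(acesFast(color.R), acesFast(color.G), acesFast(color.B))
end
```

The fit is cheap enough to run per pixel every frame, which is why it suits the live preview, while the full ACES transform is saved for the final post-process pass.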
Seriously, I’ve played this game before, back in March. And wow, it’s so fasttt! Also, can it render particles, particle-based volumetric lighting, and billboard GUIs?