I am excited to showcase something I've been working on for the past few weeks!
Research Paper
Chentanez, N., & Müller, M. (2010). Real-Time Simulation of Large Bodies of Water with Small Scale Details. In M. Otaduy & Z. Popovic (Eds.), Eurographics/ACM SIGGRAPH Symposium on Computer Animation. NVIDIA. https://matthias-research.github.io/pages/publications/hfFluid.pdf
THE PERFORMANCE RESULTS IN THE VIDEO ARE NOT ACCURATE NOW THAT IT RUNS IN PARALLEL!
It runs in real-time with around 2000 particles.
To Do
-Optimize more
-Add more features from the paper to support other liquids (slime, honey)
-Use Parallel Luau for more speed
-Find a better way to handle large-scale bodies of water (oceans)
-Use EditableMesh or shaders (PLEASE ADD A SHADER EDITOR ROBLOX) to combine particles and make foam.
I apologize for the confusion. It is not a bunch of spheres; you can tell by how the particles split apart instead of just pooling up! I forgot to reference the PDF I used.
Looks great, but do particles combine when they collide? That would let you create massive amounts of water without causing too many performance issues.
And also, definitely use Parallel Luau and Native mode to increase performance even further.
I don't know how to do that lol. I'll look into it since performance is key; it's just a little confusing. I guess I can ask here though: with actors, is it the more the merrier?
Definitely do that. That would grant a huge performance improvement. And it’s probably a must anyway if you want to expand on this system.
Well, not really. Past a certain point, increasing the number of actors has no impact on performance, since parallel throughput generally depends on the number of cores the client has. So keep it at a reasonable level, and focus on performance improvements in your own code.
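For reference, here's roughly the shape Parallel Luau expects (a minimal sketch, not your actual sim): one Script parented under each Actor, each owning a slice of the particles, doing the math in the parallel phase and only touching parts after task.synchronize(). The `particles` table, `p.Part`, and the gravity constant are placeholders I made up for the sketch.

```lua
--!native
-- Minimal sketch of the Parallel Luau pattern: a Script parented under an Actor.
-- `particles`, `p.Part`, and the gravity constant are illustrative placeholders.

local RunService = game:GetService("RunService")

assert(script:GetActor(), "This script must be parented under an Actor")

local particles = {} -- this actor's slice of the particle data

-- Runs in the parallel phase: safe for math, not for writing to instances.
RunService.Heartbeat:ConnectParallel(function(dt)
	for _, p in particles do
		p.Velocity += Vector3.new(0, -196.2, 0) * dt
		p.Position += p.Velocity * dt
	end

	-- Switch back to serial execution before touching the DataModel.
	task.synchronize()
	for _, p in particles do
		p.Part.Position = p.Position
	end
end)
```

Then you'd spawn a handful of these Actors (roughly in line with the client's core count, as mentioned above) and split the particle set between them.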
Native mode is generally used to make mathematically heavy or otherwise complex operations much faster, since instead of the code being interpreted, it gets compiled directly into machine code.
And yes, you can just put --!native at the top of a script to enable native code generation for it.
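Something like this, as a rough illustration of where the directive goes; the kernel function is made up just to show the kind of math-heavy code that benefits, it's not from your sim or the paper:

```lua
--!native

-- Illustrative only: a made-up math-heavy helper in a ModuleScript.
-- Native code generation helps most with tight numeric loops like this.
local function kernel(r: number, h: number): number
	if r >= h then
		return 0
	end
	local q = 1 - (r * r) / (h * h)
	return q * q * q
end

return kernel
```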