How to detect if the player camera is underwater


To start off, here is a list of things that I do not consider valid solutions to my problem. [color="ff0008"]Please read this before you post anything[/color]:

  • Raycasting to detect terrain material | This is not a viable solution, as terrain works in 4x4x4-stud voxels. I want this to fully take the water wave equation into account.
  • Raycasting screen pixel colour | I'm not even entirely sure if this is possible, but even if it were, I wouldn't use it: it would be extremely performance-dependent and very hacky. Players could easily exploit it with specifically coloured shirts to fool the system.
  • Player character touching water | This is for the player camera.
  • Fake water using parts | This is not a viable solution at all. I want this to work with terrain water.
  • Calculating the character's Y position to find wave height | I'm not sure this would work at all, and even if it did, it's another hacky solution I'd rather avoid.

Now that you know my basis for this, here is what I want.
Basically, I want to know whether the player's camera is underwater, taking waves into account, with water anywhere in the world, not just at one Y position. If your screen is tinted blue from having the camera in water, that's what I want to be able to detect.

- Best regards, iSyriux

There is currently no solution to this: Figuring out terrain water wave height? - Help and Feedback / Scripting Support - Roblox Developer Forum

Some very smart minds are working on it; however, nobody has figured it out yet, and we would need Roblox to expose an API to do this.

The blue tint is not controlled by an accessible API, and there are no known ways to accomplish what you want at that precision.


I suppose what you could do is treat each voxel as a bounding box and then use some modified AABB intersection math.
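For reference, the point-vs-AABB half of that idea is tiny. Here is a minimal sketch (Python rather than Luau for illustration; the voxel center and the 4-stud size are made-up example values):

```python
# Minimal point-in-AABB test: is the camera position inside a water voxel?
# The voxel center and half-size below are assumptions for illustration.

def point_in_aabb(point, center, half_size):
    """Return True if `point` lies inside the axis-aligned box centered at
    `center` with half-extent `half_size` on every axis."""
    return all(abs(p - c) <= half_size for p, c in zip(point, center))

# A hypothetical water voxel centered at (10, 2, -6), 4 studs across (half-size 2).
camera = (11.0, 3.5, -7.0)
print(point_in_aabb(camera, (10, 2, -6), 2))        # inside -> True
print(point_in_aabb((20, 2, -6), (10, 2, -6), 2))   # far away -> False
```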

Yeah, that's what I've been trying to achieve for the past year (yes, I made my first post about this topic in early 2020).
Unfortunately, I don't know how to apply the Roblox terrain water wave equation and work out the exact timing to sync up with it, nor how to account for multiple bodies of water, varying heights, and the terrain water properties changing in-game.

Terrain water wave equation? I'm pretty sure the waves don't react with other particles. If you are using the wave equation, it's actually really simple, since the wave equation returns displacement at given X and Z points. Just evaluate it on the last delta change and read the voxels (Terrain:ReadVoxels() → filter for the voxel at the position we predicted), then check whether the camera is inside its bounding box.
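The step above can be sketched like this. Everything Roblox-specific is a stand-in: the sine-based wave function is an assumption (Roblox's real wave equation is not exposed), and the occupancy dict stands in for what Terrain:ReadVoxels() would return in-game:

```python
import math

# Sketch of "predict wave height, then check the voxel the camera is in".
# The sine wave and the dict-based occupancy grid are stand-ins: Roblox's
# actual wave equation is not public, and Terrain:ReadVoxels() would supply
# the real occupancy data in-game.

VOXEL = 4  # Roblox terrain voxels are 4x4x4 studs

def wave_offset(x, z, t, amplitude=0.5, speed=2.0, scale=0.2):
    """Assumed surface displacement at (x, z) at time t."""
    return amplitude * math.sin(scale * (x + z) + speed * t)

def camera_underwater(cam, t, water_surface_y, occupancy):
    """True if the camera sits below the displaced surface AND inside a
    voxel reported as containing water."""
    surface_y = water_surface_y + wave_offset(cam[0], cam[2], t)
    if cam[1] >= surface_y:
        return False
    voxel = tuple(int(math.floor(c / VOXEL)) for c in cam)
    return occupancy.get(voxel, 0.0) > 0.0  # water occupancy in that cell

# One water voxel at grid cell (0, 0, 0), resting surface at y = 4.
occ = {(0, 0, 0): 1.0}
print(camera_underwater((2.0, 1.0, 2.0), 0.0, 4.0, occ))  # below surface -> True
print(camera_underwater((2.0, 9.0, 2.0), 0.0, 4.0, occ))  # above surface -> False
```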

I don't know calculus. Could you explain to me what delta is?
Also, I never mentioned anything about particles. All I want is to detect whether the player camera is inside the water, with wave displacement taken into account.

I am writing it rn

Delta means change. The idea is to take the difference between two values.


Unfortunately I'm in a dilemma. idk how to check all the voxels without tanking performance :frowning:

Maybe have a part attached to the camera and detect when that part is touching water?

Yet again: a hacky solution


Ok @iSyriux, I've got a brand-new approach using some math I learned from computer graphics. Basically, we will define the water as a mesh of triangles, then cast a ray at each triangle on the screen and see whether it intersects the water. Then, to confirm the hit point actually lies inside the triangle, we check that the cross products formed from the triangle's edges and the vectors from each vertex to the intersection point agree in direction with the triangle's normal.

How would you define the water as a mesh of triangles? I need it to work with any body of water in the game, at any height and any position, with the terrain water properties taken into account.

In ray tracing, the most fundamental operation is the ray intersection test. We could use a technique called DDA in 2D, but since this is 3D we'll need a different approach. The simplest ray intersection test is against spheres, but it's hard to describe a mesh with spheres, so we'll use triangles.

A triangle can be broken down into four things: three points and a normal. With those, we can define a plane. A plane can be defined implicitly by saying that the dot product of its normal with (a point on the plane minus any other point on the plane) is 0. This makes sense because orthogonal vectors have a dot product of 0, and a vector can be defined as the difference between two points.

The parametric ray equation says a point on the ray equals o+(d*t), where o is the origin, d is the direction, and t is, in simple terms, how far to travel along the ray (negative t would point behind the ray's origin, so we'll only accept positive values later). Since we know o and d, we are essentially just solving for t. Take one of the triangle's vertices as a reference point and subtract the ray equation from it; the dot product of that vector with the normal must be 0. In other words, our implicit equation is

0=normal:Dot(vertexOfTriangle - (origin+(direction*t)))

Now again, we just want t. The problem is that the dot product doesn't really have a simple inverse function. However, the dot product is linear, so we can expand it and rearrange: t comes out as a ratio of two dot products, the normal dotted with (origin minus vertex) over the negated normal dotted with the ray direction. In other words, t is

t=(n:Dot(origin - vertexOfTriangle))/(-1*n):Dot(d)
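That t-solve can be written out as a minimal sketch (Python for illustration, with plain tuples standing in for Vector3s):

```python
# Ray-plane t from the formula above:
# t = n·(origin - vertex) / ((-n)·d), equivalently n·(vertex - origin) / (n·d).

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def ray_plane_t(origin, direction, vertex, normal):
    """Distance t along the ray o + d*t to the plane through `vertex` with
    normal `normal`; returns None when the ray is parallel to the plane."""
    denom = dot(tuple(-c for c in normal), direction)
    if denom == 0:
        return None  # undefined t: n·d = 0, ray parallel to plane
    return dot(normal, tuple(o - v for o, v in zip(origin, vertex))) / denom

# Ray pointing straight down at a horizontal plane through the origin (y = 0).
t = ray_plane_t((0, 5, 0), (0, -1, 0), (0, 0, 0), (0, 1, 0))
print(t)  # 5.0 — the hit point o + d*t is (0, 0, 0)
```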

Since we only accept positive values of t: if t is negative or undefined (division by zero when the ray is parallel to the plane) for every one of the triangles, we can say we aren't looking at the water. However, we also need to check whether the ray actually hit the triangle and not just its infinite plane; otherwise an enclosed mesh would always report an intersection. To do that, for each edge of the triangle, take the cross product of the edge with the vector from that edge's start vertex to the intersection point; if any of these cross products points opposite the triangle's normal, the intersection point is outside the triangle. Now that we can tell whether we're looking at water, we need to see whether we're under it. That's pretty simple: form the vector from a point on the water surface to ourselves, then take its dot product with the water's normal (a cross product is a vector and has no sign, so the dot product is the right test here). If it is negative, we are underwater. Here is a video which goes over the ray intersection algorithm I used in a bit more depth, with code examples (not in Lua though).
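Those two sign tests can be sketched like this (Python for illustration; the flat triangle and its winding are made-up example data):

```python
# Inside-triangle test via edge cross products, plus the underwater sign test.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def point_in_triangle(p, a, b, c, normal):
    """For each edge, cross(edge, vertex->p) must agree in direction with
    the triangle normal, or p lies outside the triangle."""
    for v0, v1 in ((a, b), (b, c), (c, a)):
        if dot(cross(sub(v1, v0), sub(p, v0)), normal) < 0:
            return False
    return True

def below_surface(point, surface_point, normal):
    """Negative dot product: `point` is on the far side of the surface
    from the normal, i.e. underwater."""
    return dot(sub(point, surface_point), normal) < 0

# A triangle in the y = 0 plane, wound so its normal points up (+y).
a, b, c, n = (0, 0, 0), (0, 0, 4), (4, 0, 0), (0, 1, 0)
print(point_in_triangle((1, 0, 1), a, b, c, n))   # inside  -> True
print(point_in_triangle((5, 0, 5), a, b, c, n))   # outside -> False
print(below_surface((1, -2, 1), a, n))            # camera below water -> True
```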


  • You only need to run this test 4 times, once for each corner of the viewport
  • when I say ourselves/us/the player I mean the camera
  • Another alternative I have is taking the dot product of the water normal with the vector from a point on the surface to the player, seeing if that's negative, and confirming we are inside the water with a Region3
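Putting the thread's pieces together, a toy end-to-end check might look like this (Python; the hand-picked corner rays and the single flat water plane are stand-ins for Camera:ViewportPointToRay and a real reconstructed water mesh):

```python
# End-to-end sketch: cast a ray from each viewport corner at the water
# surface, and separately sign-test the camera against that surface.
# All the data here is a stand-in for the Roblox-side equivalents.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def ray_hits_plane(origin, direction, vertex, normal):
    """t from the thread's formula; only hits in front of the ray count."""
    denom = -dot(normal, direction)
    if denom == 0:
        return None
    t = dot(normal, sub(origin, vertex)) / denom
    return t if t > 0 else None

# One horizontal water plane at y = 4 with an upward normal.
water_point, water_normal = (0, 4, 0), (0, 1, 0)

camera = (0, 1, 0)  # below the resting surface
corner_rays = [(0.3, 1, 0), (-0.3, 1, 0), (0.3, 1, 0.3), (-0.3, 1, -0.3)]

looking_at_water = any(
    ray_hits_plane(camera, d, water_point, water_normal) is not None
    for d in corner_rays)
underwater = dot(sub(camera, water_point), water_normal) < 0

print(looking_at_water, underwater)  # True True
```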