I have restricted my camera's movement by setting minimum and maximum angles on the X and Y axes. The angles are -30 to 30 on the X axis, and -40 to 40 on the Y axis.
The Y range, or horizontal plane, is wider than the X range so that it better fits the average monitor's width and height.
(I will later calculate the clamp based on screen size.)
Right now, if I move the camera to the point where both axes are at their max/min, it hits a corner. I don’t want these corners, and I want to get rid of them by clamping the camera within an oval rather than a rectangle.
Does anyone know how I can do this with math?
EDIT:
I tried to use ellipse equations and coordinates to solve this, but I couldn’t figure it out. I worked on it for maybe two hours and crashed.
One way of doing this easily is with a naive implementation. Keep your regular X constraint with constants: if you want the ellipse's size on the X axis to be ±30 degrees, clamp X exactly as you are doing now. Then recalculate the Y constraint for every X coordinate. Do this using a little algebra: the equation of an ellipse.
For example, I could use y^2 = 300 - x^2/3 (this is the ellipse x^2/900 + y^2/300 = 1, i.e. ±30 degrees on X and about ±17.3 degrees on Y).
Now, to find the Y constraint, I just plug in the current X coordinate (angle). If I am looking 20 degrees to the right, I plug in 20: y^2 = 300 - 20^2/3, giving y ≈ 12.91. This means that my Y constraint for that frame needs to be (-12.91, 12.91). Very simple.
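Here is a minimal sketch of that clamp in Python (the actual camera/engine API is up to you). I'm assuming the angles are plain floats in degrees, and I'm using the ±30 / ±40 limits from the question rather than the example numbers above, so the ellipse is x^2/900 + y^2/1600 = 1:

```python
import math

# Half-widths of the ellipse; these are the limits from the question.
MAX_X = 30.0  # horizontal half-angle (X axis)
MAX_Y = 40.0  # vertical half-angle (Y axis)

def clamp_to_ellipse(x, y):
    """Clamp (x, y) so it stays inside (x/MAX_X)^2 + (y/MAX_Y)^2 <= 1."""
    # Clamp X to its constant range first, exactly like the existing clamp.
    x = max(-MAX_X, min(MAX_X, x))
    # Recompute the allowed Y range for this particular X:
    # y_limit = MAX_Y * sqrt(1 - (x / MAX_X)^2)
    y_limit = MAX_Y * math.sqrt(1.0 - (x / MAX_X) ** 2)
    y = max(-y_limit, min(y_limit, y))
    return x, y

# Looking 20 degrees to the right: Y gets clamped to about ±29.8 instead of ±40.
print(clamp_to_ellipse(20.0, 50.0))
```

Run this every time the camera angles change, after applying input but before applying the rotation, and the camera will follow the oval boundary instead of hitting the rectangle's corners.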