Hey, math nerds!
I’m attempting to make a game where the camera stays focused on the character’s head regardless of scale. I did some experimenting and found that the points form some sort of exponential curve downwards. To better help visualise this, here’s the data I collected.
Character Scale: 1 -- Ideal Camera Offset: 0, 0, 0
Character Scale: 0.5 -- Ideal Camera Offset: 0, -0.5, 0
Character Scale: 0.25 -- Ideal Camera Offset: 0, -1, 0
Character Scale: 0.125 -- Ideal Camera Offset: 0, -1.25, 0
Character Scale: 0.0625 -- Ideal Camera Offset: 0, -1.4, 0
Character Scale: 0.03125 -- Ideal Camera Offset: 0, -1.5, 0
Anything above a scale of 1 keeps an ideal camera offset of 0, 0, 0.
I plugged these values into Desmos for a better visualisation, but I simply can’t figure out the function that produces them.
(Edited to add the graph here)
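In case it’s useful while I look for the real formula: here’s a rough Luau sketch of the stopgap I’m using, which just linearly interpolates between the sampled points above. The table and function names are made up for this post, and this isn’t the closed-form function I’m actually after.

```lua
-- Sampled (characterScale, Y offset) pairs from the table above,
-- sorted by scale so we can interpolate between neighbours.
local samples = {
	{0.03125, -1.5},
	{0.0625, -1.4},
	{0.125, -1.25},
	{0.25, -1},
	{0.5, -0.5},
	{1, 0},
}

-- Hypothetical helper: returns an approximate ideal Y offset for any scale
-- by lerping between the two surrounding sample points.
local function idealOffsetY(scale)
	-- Scales at or above 1 keep the default offset.
	if scale >= 1 then
		return 0
	end
	-- Clamp below the smallest sampled scale.
	if scale <= samples[1][1] then
		return samples[1][2]
	end
	-- Find the surrounding pair and interpolate.
	for i = 1, #samples - 1 do
		local s0, y0 = samples[i][1], samples[i][2]
		local s1, y1 = samples[i + 1][1], samples[i + 1][2]
		if scale <= s1 then
			local t = (scale - s0) / (s1 - s0)
			return y0 + (y1 - y0) * t
		end
	end
	return 0
end

print(idealOffsetY(0.25)) -- -1, matches the table above
print(idealOffsetY(0.4))  -- -0.7, somewhere between the 0.25 and 0.5 samples
```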
I haven’t used this side of the DevForum yet, so if this post is a little misplaced, sorry! Otherwise, any help would be awesome!