Converting 3D point in ViewportFrame to 2D point on player's screen

I found these DevForum posts:

but their solution is hard to follow for me (it seems they use CFrame math instead of :WorldToViewportPoint(), which is what I’m trying to use).

I’m using this function to convert 3D points to 2D points:
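(a sketch of its shape; the key parts are the :WorldToViewportPoint() call and the inView check, and the other names are placeholders)

```lua
-- Sketch: convert a 3D world point to a 2D screen position via the given camera
local function get2DPosition(camera: Camera, worldPoint: Vector3): UDim2?
    local screenPosition, inView = camera:WorldToViewportPoint(worldPoint)
    if inView then
        -- Treat X/Y as pixel offsets on the screen
        return UDim2.fromOffset(screenPosition.X, screenPosition.Y)
    end
    return nil
end
```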

and this is how I’m calling that function:
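(a sketch; trackedPart and marker are placeholder names for the part being followed and the gui object that marks it)

```lua
local position = get2DPosition(vpCamera, trackedPart.Position)
if position then
    marker.Position = position
end
```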


where vpCamera is the ViewportFrame camera.

I tried debugging it to check whether :WorldToViewportPoint() at least reports that the points I’m trying to convert are in frame, and it does. So I think I’ve narrowed the problem down to the body of the if statement in the get2DPosition function where inView is true.
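The check looked something like this (a sketch):

```lua
-- Debug sketch: confirm the point is reported as being in frame
local screenPosition, inView = vpCamera:WorldToViewportPoint(trackedPart.Position)
print(inView, screenPosition) --> inView prints true
```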

Here’s a video showing the behavior when the object is in the game world vs when the object is in a viewport frame:

You can see in the “not working inside of viewport” segment of the video that the output is printing the return value of the get2DPosition function.

In summary, I need help modifying my function so that it can convert 3D points within a ViewportFrame to 2D points on the player’s screen. Thanks!

For anyone else who runs into problems using :WorldToViewportPoint() with ViewportFrames and their cameras, here’s what I found:

TL;DR
When the Roblox API method :WorldToViewportPoint() is called on a camera that you create, it behaves differently than when it is called on the default camera (workspace.CurrentCamera).

When called on a camera that is created (not the default camera):

  • The X & Y values of the Vector3 returned by WorldToViewportPoint should be treated like UDim2 Scale values relative to the AbsoluteSize of the ViewportFrame.
    This means this is how you calculate the 2D position of a 3D object inside a ViewportFrame:
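    A sketch of that calculation (vpCamera and worldPoint are assumed names, and the marker gui is assumed to be parented to a ScreenGui, so the frame’s AbsolutePosition is added on top of the scaled value):

    ```lua
    -- Treat the returned X/Y as scale: multiply by the ViewportFrame's
    -- AbsoluteSize, then offset by its AbsolutePosition for screen pixels
    local ScreenPosition, inView = vpCamera:WorldToViewportPoint(worldPoint)
    local position = UDim2.fromOffset(
        vpFrame.AbsolutePosition.X + ScreenPosition.X * vpFrame.AbsoluteSize.X,
        vpFrame.AbsolutePosition.Y + ScreenPosition.Y * vpFrame.AbsoluteSize.Y
    )
    ```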


    Where vpFrame is a ViewportFrame and ScreenPosition is the Vector3 returned by WorldToViewportPoint.

  • If the ScreenGui you’re using has IgnoreGuiInset set to true, you need to manually add a piece of code to take into account the gui inset. Example:
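    A sketch of that adjustment (screenGui and position are assumed names):

    ```lua
    local GuiService = game:GetService("GuiService")

    -- GetGuiInset returns the top-left and bottom-right insets as two
    -- Vector2s; the top-left one is typically (0, 36) for the top bar
    local topLeftInset = GuiService:GetGuiInset()

    if screenGui.IgnoreGuiInset then
        position += UDim2.fromOffset(topLeftInset.X, topLeftInset.Y)
    end
    ```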


From the solution to this post, I was able to understand that the 2D point you get from calling the :WorldToViewportPoint method on a created camera uses decimals, numbers that are more fitting for scale than for offset.

I then asked why this is.

From this post, I understood that the ViewportSize property (a read-only property) of the default camera is the resolution of the player’s screen, while for a created camera it is just 1×1, i.e. a ViewportSize of (1, 1). The method was returning a 2D position relative to the camera’s ViewportSize, which is why the values come out as decimals between 0 and 1 for a created camera.
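You can see the difference by printing both cameras’ ViewportSize (a quick check; vpCamera is the ViewportFrame camera from above):

```lua
print(workspace.CurrentCamera.ViewportSize) --> e.g. 1920, 1080 (the screen resolution)
print(vpCamera.ViewportSize)                --> 1, 1 for a created camera
```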

With all this in mind, when calling the :WorldToViewportPoint() method on a camera under a ViewportFrame, I had to calculate the 2D position by treating the method’s returned value (ScreenPosition) as a scale value.

At this point, the gui objects representing the 3D points in the ViewportFrame, converted into 2D points, were, well, actually on the player’s screen now. However, the points were slightly off, as shown in this image:

Note that the ScreenGui the gui objects are parented to has IgnoreGuiInset set to true. I noticed that, with my code as it was, setting IgnoreGuiInset to false made the points align as I intended, as shown in the image below:


However, I need IgnoreGuiInset to be set to true, so I added this block of code as a workaround:
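(a sketch of it in the context of the full corrected function; vpFrame and screenGui are assumed to be defined elsewhere, and the inset block near the end is the workaround)

```lua
local GuiService = game:GetService("GuiService")

-- Sketch: vpFrame is the ViewportFrame, screenGui is the ScreenGui that
-- the marker gui objects are parented to (both assumed defined elsewhere)
local function get2DPosition(camera: Camera, worldPoint: Vector3): UDim2?
    local screenPosition, inView = camera:WorldToViewportPoint(worldPoint)
    if not inView then
        return nil
    end

    -- Treat X/Y as scale relative to the ViewportFrame's on-screen size
    local x = vpFrame.AbsolutePosition.X + screenPosition.X * vpFrame.AbsoluteSize.X
    local y = vpFrame.AbsolutePosition.Y + screenPosition.Y * vpFrame.AbsoluteSize.Y

    -- Workaround: add the top-left gui inset back in when the ScreenGui ignores it
    if screenGui.IgnoreGuiInset then
        local topLeftInset = GuiService:GetGuiInset()
        x += topLeftInset.X
        y += topLeftInset.Y
    end

    return UDim2.fromOffset(x, y)
end
```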


I think this is expected behavior for the method, though, because the documentation for WorldToViewportPoint says that it does not take the gui inset into account. For some reason, when calling this method on viewport cameras, the gui inset always needs to be taken into account, meaning that if it is ignored via IgnoreGuiInset set to true, you have to add it back in, as seen in the block of code above.

API documentation really needs to get updated :sob:
