Converting OpenVR Matrices to Roblox Space

Hey guys, long story short: I’m working on extending tracking support for VR devices in Roblox using the OpenVR API as a fun side project, and I’m having issues converting matrices to Roblox space as CFrames.

What I’m attempting to achieve is proper device tracking in Roblox, just as the built-in VRService:GetUserCFrame(UserCFrame) does, except with external devices. I currently have a separate API server: the client sends each tracking device’s absolute tracking pose (from IVRSystem::GetDeviceToAbsoluteTrackingPose), and the Roblox server GETs the data for each user (if requested).
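
On the Roblox side, that fetch is just an HTTP GET. Roughly along these lines, as a minimal sketch (the endpoint URL, the fetchTrackerData name, and the response shape are placeholders, not my actual API):

local HttpService = game:GetService("HttpService")

-- Hypothetical endpoint; returns the latest tracker poses for a given user as JSON
local function fetchTrackerData(userId)
	local response = HttpService:GetAsync("https://example.com/trackers/" .. userId)
	return HttpService:JSONDecode(response)
end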

Here’s the current conversion from an HmdMatrix34_t (a standard 3x4 matrix) to a Unity-style position and quaternion (where m represents the flattened HmdMatrix34_t):

// https://steamcommunity.com/app/250820/discussions/7/1637549649113734898/
// Translation is the fourth column of the row-major 3x4 matrix; rotation uses
// trace-based quaternion extraction (assumes 1 + trace > 0). The X/Y sign flips
// follow the usual OpenVR (right-handed) to Unity (left-handed) convention.
Position = new double3(m.m3, m.m7, m.m11);
double w = Math.Sqrt(1.0f + m.m0 + m.m5 + m.m10) / 2.0f;
Rotation = new double4(
    -((m.m9 - m.m6) / (4 * w)),
    -((m.m2 - m.m8) / (4 * w)),
    ((m.m4 - m.m1) / (4 * w)),
    w);

The result is then serialized to JSON, sent to the API server, and read back by the Roblox server, which converts it to Roblox space. Here’s a snippet of the code that takes the tracking data and converts it to Roblox space (where trackerProperty is the tracker currently being handled in the iteration):

local v3 = VRLibDataConversion.double3toVector3(trackerProperty['Position'])
local qX, qY, qZ, qW = VRLibDataConversion.double4toFloats(trackerProperty['Rotation'])
-- Quaternion CFrame constructor; X and Y are flipped back here, undoing the flip above
local trackercframe = CFrame.new(v3.X, v3.Y, v3.Z, -qX, -qY, qZ, qW)
local HeadScale = workspace.CurrentCamera.HeadScale
-- https://devforum.roblox.com/t/how-do-you-get-the-exact-position-of-the-controllers-in-vr/95209/9
-- Offset the camera by the scaled tracker position, then reapply the tracker's rotation
local cframe = (workspace.CurrentCamera.CFrame * CFrame.new(trackercframe.p * HeadScale))
	* CFrame.fromEulerAnglesXYZ(trackercframe:ToEulerAnglesXYZ())
-- Current Y offset; without this, trackers end up inside the user's head
cframe = cframe - Vector3.new(0, 2, 0)

while VRLibAnimationTool.ispartbeinganimated(tracker) do task.wait(0.01) end
VRLibAnimationTool.animatepart(tracker, cframe, animspeed)

(double3 and double4 are custom networked types; think of them as containers for that many doubles)
(ispartbeinganimated and animatepart are tweening helpers; animatepart is essentially the same as setting CFrame = x)
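
For reference, here’s a minimal sketch of what those conversion helpers could look like (the x, y, z, w field names on the decoded JSON are an assumption on my part):

-- Assumed decoded shape: { x = ..., y = ..., z = ... }
function VRLibDataConversion.double3toVector3(d3)
	return Vector3.new(d3.x, d3.y, d3.z)
end

-- Assumed decoded shape: { x = ..., y = ..., z = ..., w = ... }
function VRLibDataConversion.double4toFloats(d4)
	return d4.x, d4.y, d4.z, d4.w
end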

However, the current issue with this solution is that the result isn’t perfectly aligned in Roblox space. If the user is near the center of the playspace, the trackers line up correctly (ignoring the Y offset above), but the farther you move away from center, the more the trackers drift. Below is a video demonstrating the effect. (Note that rotations seem to be fine; it’s only the position.)


(video compressed for upload limit)

I’ve also tried converting to world space with CFrame[x]:ToWorldSpace(CFrame[y]), where x is a part that follows the user’s HumanoidRootPart and y is the raw tracker CFrame (roughly as in the sketch below), but that introduced additional direction issues on top of the same continuous positional drift.
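
For clarity, that attempt looked roughly like this (anchorPart being the hypothetical part that follows the HumanoidRootPart):

-- anchorPart tracks the user's HumanoidRootPart; trackercframe is the raw tracker CFrame
local worldCFrame = anchorPart.CFrame:ToWorldSpace(trackercframe)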

I’m not the greatest at math, and I’ve never worked with matrices before, so I apologize ahead of time if I’m unable to answer certain math questions. I’m also aware that this is something Roblox should extend support for on their end (maybe with Tracker Roles :eyes:), but again, this is just a fun side project, and it would be neat to see what’s possible with extended tracking support in the meantime. If any more information is required, please let me know and I’ll provide whatever is needed. Thank you very much! :smiley_cat:


Multiplying the positional vector (v3) by pi seems to fix the problem; tested on both my playspace and a buddy’s.

local v3 = VRLibDataConversion.double3toVector3(trackerProperty['Position']) * math.pi

I also don’t need the Vector3 offset anymore.
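
For completeness, the conversion section now reads like this (same code as above, with the scale applied and the Y offset line dropped):

local v3 = VRLibDataConversion.double3toVector3(trackerProperty['Position']) * math.pi
local qX, qY, qZ, qW = VRLibDataConversion.double4toFloats(trackerProperty['Rotation'])
local trackercframe = CFrame.new(v3.X, v3.Y, v3.Z, -qX, -qY, qZ, qW)
local HeadScale = workspace.CurrentCamera.HeadScale
-- No Y offset needed anymore
local cframe = (workspace.CurrentCamera.CFrame * CFrame.new(trackercframe.p * HeadScale))
	* CFrame.fromEulerAnglesXYZ(trackercframe:ToEulerAnglesXYZ())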
