It is extremely difficult to develop a game with VR support that interacts with the 2D UI in any substantial way. Fixing this is important for two reasons:
Games that were not built from the ground up for VR may need to completely restructure or rewrite every part of their game that touches the 2D UI if they want to support it. This is good in that it encourages VR-focused design, but bad in that it greatly increases the cost of supporting VR (which is currently a very small portion of the market). In my case, when deciding which course of action to take, I expected the 2D UI to be usable and properly supported, as it is on every other platform Roblox supports, but this is clearly not the case.
It gets really confusing for players when multiple different UIs are visible at once. We developers want to avoid this, but that is hard when we can't use the Roblox UI, can't detect whether it is visible or open, and can't control whether it is visible or open. It would be really nice if we could use the Roblox UI for our own systems where viable, or at least make our UIs and Roblox's UI mutually exclusive (or give them other kinds of relationships). Roblox should be making it easy to prevent users from being overwhelmed or confused by this split between our game's systems and the main Roblox UI.
Elaborating on the first point: to make using the default 2D UI viable (instead of building our own custom UIs), we need the following things:
- Reliable access to the CFrame and Size of the 2D UI, so that we can project things onto it or at least know where the UI is in the world
- Hit location of the VR pointer on the 2D UI (preferably in the SurfaceGui’s space in pixels, but anything works as long as it accurately represents the hit point)
- Better integration with input systems, so that e.g. InputObject.Position accurately represents the location of the VR pointer
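To illustrate the kind of computation we currently have to replicate from the core scripts, here is a minimal Luau sketch that projects a pointer ray onto the Front face of a part hosting a SurfaceGui and converts the hit into pixel coordinates. Everything here is a hypothetical reconstruction, not the engine's actual implementation: the function name is made up, the SurfaceGui is assumed to sit on the Front face, 50 pixels per stud is assumed as the default density, and the per-face axis directions in particular should be verified against your own setup.

```lua
-- Hypothetical sketch: project a world-space ray (e.g. the VR pointer)
-- onto the Front face of `panelPart` and return SurfaceGui-style pixel
-- coordinates, or nil if the ray misses the face.
local function rayToSurfacePixels(panelPart, rayOrigin, rayDirection, pixelsPerStud)
	pixelsPerStud = pixelsPerStud or 50 -- assumed SurfaceGui default
	local cf, size = panelPart.CFrame, panelPart.Size

	-- Work in the part's object space, where the Front face is the plane z = -size.Z / 2.
	local localOrigin = cf:PointToObjectSpace(rayOrigin)
	local localDir = cf:VectorToObjectSpace(rayDirection)
	if math.abs(localDir.Z) < 1e-6 then
		return nil -- ray is parallel to the face
	end

	local t = (-size.Z / 2 - localOrigin.Z) / localDir.Z
	if t < 0 then
		return nil -- face is behind the ray origin
	end
	local hit = localOrigin + localDir * t

	-- Map the hit onto the face with a top-left pixel origin.
	-- The axis flips below are an assumption about face orientation; check them.
	local x = (size.X / 2 - hit.X) * pixelsPerStud
	local y = (size.Y / 2 - hit.Y) * pixelsPerStud
	if x < 0 or y < 0 or x > size.X * pixelsPerStud or y > size.Y * pixelsPerStud then
		return nil -- hit point lies outside the face
	end
	return Vector2.new(x, y)
end
```

If the engine simply exposed the pointer's hit location on the 2D UI, this entire fragile reconstruction would be unnecessary.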
In my current project, making use of the 2D UI means digging through the core scripts and manually computing this information. Whenever a VR update changes how these things behave, all of my code breaks and I have to start from scratch, re-reading the core scripts to understand and replicate what they are doing. (This game was not built from the ground up to support VR, which is why I decided to use the 2D UI in the first place. Obviously not a good long-term decision, in hindsight!)