[Full Release] Build Cross-Platform UI with the ViewportDisplaySize API

Wow, after a lot of testing, I'd say this one is very important to players; phone users will benefit from this feature the most.

-- To be fair, the new version is just a more straightforward way of accomplishing this. Using TouchEnabled to detect mobile is not good at all, though; it will false-positive for any laptop with a touchscreen, or even for PCs with certain drawing tablets connected. It's much better to just use the screen size.

I completely agree, it’s terrible. It can create false positives.

What is the reliable, surefire way to find out if it is mobile?
The issue is we don’t have one.

TouchEnabled + Check ScreenSize?
What if it’s a touch laptop and the screen is small?
Looks like they’re on Mobile I guess.

When making UI/UX, layouts, button behaviour, etc.
I build very differently for a mobile device than a desktop.
The issue is, I need a 100% sure way to know what device the Player is on.


For resizing UI elements and positioning our objects with this API - I get it, it'll do the job.
But I believe we need a layer deeper for Device Detection.
The additions of Input Action Service and PreferredInput are great, but we still don't have the full picture.

Hi, thanks for looking into all these options! For small-screen devices (such as mobile, laptops, or consoles) with touch input, would you treat them as ‘mobile-like’ and use the mobile UI when PreferredInput is Touch? If so, relying on PreferredInput and ViewportDisplaySize should be sufficient. If not, what specific distinctions would you like to make between these device types? Typically, we don’t need to rely on device type itself to design cross-platform UI—as long as we have the input and output device info. But I’d like to understand what additional information you feel might be missing. Thanks!

We do have ScreenGui.AbsoluteSize to get screen logical pixels, will this work for your case?

You should be designing your user interfaces to be reactive to the client’s setup, not their platform type. There’s no useful information you could get from knowing if the client is on ‘Mobile’, because that’s too broad of a category. You can’t accurately determine what input devices or screen size the player is using, because they could have connected a Bluetooth keyboard to their iPad or mirrored their screen to a TV. You can’t just assume someone on ‘Mobile’ needs only a touchscreen and a small screen size.

What I personally vouch to do instead for input management is connect callbacks for every possible input device the player could use, e.g. KeyCode.E for keyboard, KeyCode.ButtonX for gamepad, and a UIButton for touchscreen. What this does is make sure that no matter what wacky setup the client has, using any device will work. You could even use a keyboard and controller at the same time, if you’d like.
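As a rough sketch of this "bind every device" approach (assuming a LocalScript; the action name, KeyCodes, and handler are purely illustrative), ContextActionService can register one handler for keyboard, gamepad, and an auto-generated touch button at once:

```lua
-- Bind one action to keyboard, gamepad, and touchscreen simultaneously
local ContextActionService = game:GetService("ContextActionService")

local function onInteract(actionName, inputState, inputObject)
	-- Fires for whichever device the player actually used
	if inputState == Enum.UserInputState.Begin then
		print("Interact triggered via", inputObject.UserInputType)
	end
	return Enum.ContextActionResult.Sink
end

-- 'true' asks the engine to also create an on-screen touch button
ContextActionService:BindAction("Interact", onInteract, true,
	Enum.KeyCode.E, Enum.KeyCode.ButtonX)
```

With all three bound, a player who has both a keyboard and a controller connected can use either interchangeably, which is exactly the "wacky setup" resilience described above.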

PreferredInput is meant to determine what input device the player prefers, regardless of what they have plugged in and could use. You should only be using this for visual things like displaying UI buttons when on touchscreen, or changing text that tells the player what button to press; you shouldn't use it to determine what input device the user will actually use, despite what the Roblox docs say. If a touchscreen laptop has a trackpad, it shouldn't need to show any touchscreen buttons. If a desktop has a controller connected, the game should tell the player to press the gamepad buttons, etc.

ViewportDisplaySize is meant to determine what kind of physical screen size the client has for its viewport. You’re supposed to use this to determine what UI layout you should use. I won’t say much else, because I think you get the idea.


Now the real reason why these APIs aren’t exactly that useful is that they gloss over a bunch of cases, such as:

  1. PreferredInput can’t distinguish between having a keyboard and mouse and just a keyboard with a touchscreen. You either forfeit the touchscreen button that you’d need because you don’t have a mouse or get unnecessary buttons that you don’t need because you have a keyboard. Three measly enums aren’t enough for the general case.

  2. PreferredInput has wacky precedence regarding input devices. I haven’t fully tested what every setup results in, but I’ve gotten weird stuff like laptops picking touchscreen despite having a keyboard and trackpad.

  3. ViewportDisplaySize also has too few and too broad enums for it to be scalable. Physical viewport size is something we could actually measure, but Roblox abstracts this data into black-box choices that aren’t ideal. Laptop and desktop screens are a big range, but they get categorized the same.

  4. Since ViewportDisplaySize is based on the diagonal screen size, wide but short viewports might give weird results (although I can’t test this just yet since it’s currently broken).

If Roblox really wants to give developers better APIs for us to build cross-compatible user interfaces, they need to expose more information about the types of input devices and screen size the client has, such as:

  1. A way to get every input device currently plugged in, as well as a way to listen to changes when a device is added/removed. This way, we can decide what the client’s preferred device is for our own game’s needs.

  2. A way to read the physical viewport size dimensions, both with and without CoreGui insets/phone notches. This way, we can determine what thresholds our UI layouts should adapt to.

TLDR: The main issue is these APIs are designed with a one-size-fits-all mindset, when in reality different games are going to have different thresholds for how our interfaces should behave on each supported device. Exposing more informative APIs with less hand holding would be the better solution.


I have already been using AbsoluteSize for several years for many different use cases. I’m asking about a feature similar to what was announced in this beta, but one that just lets us directly read the physical dimensions of the device (preferably in either cm or inches) so we can be more specific about how our UIs scale on big screens vs. small screens.

For example, I’d like to take advantage of high-resolution desktop monitors (1440p, 2160p etc) by showing more buttons on a page compared to a mobile device which will likely have at least a 1080p-ish resolution but on a much smaller physical display in the real world. I’m doing this with my inventory menus in my current project via frame.AbsoluteSize and rounding but it’d be nice if I could also get more in-depth with this beta feature itself.

Besides that, this probably isn’t needed as the enum provided to us in this beta seems to be simple enough for anyone to understand. I just like having more options in my toolkit.

I initially forgot to mention that, from the looks of things, UI on mobile devices is rendered at a much lower resolution (from what I’ve read, a quarter of the screen resolution) to help improve performance, and I feel that would be a useful thing for developers to be able to read in order to adapt our UIs further.

-- There is no reliable fool-proof way, but that doesn’t mean every option is created equal. You chose the least reliable, most easily fooled way of doing it.

For the first question, yes. With our current detection methodology, any small-screen device with Touch will be treated as ‘mobile-like’.

Secondly, no, we cannot rely on PreferredInput and ViewportDisplaySize, as these are both dynamic. At game start, we determine the player’s device and set up the UI accordingly. So if a player is on mobile but has a controller connected and their PreferredInput is not Touch, they will not be treated as ‘mobile-like’ even though they’re on a mobile device.

I don’t think I can trust that a mobile/tablet will always be ViewportDisplaySize.Small. Especially with foldables now becoming a thing.

The frustration is: just tell me what the device is. Stop making us guess; I can figure the rest out from there. Dynamically changing gameplay buttons between touch, gamepad, and keyboard based on PreferredInput is easy - that’s already been done. We can handle that, and it’s even easier thanks to IAS.

A mobile is a mobile,
A tablet is a tablet,
A Console is a console,
A VR is a VR,
A Computer is a Computer.

If a player is on a device, I want to build for that device and give them the experience for that device.

A player’s methodology for interacting with their device is what can change, and we have the tools to deal with that now, thanks to IAS and PreferredInput.

I just wrote a post explaining this in further detail.
Device Detection - API & Enums - Feature Requests / Engine Features - Developer Forum | Roblox

The environment is also dynamic: a user can attach/detach a mouse, keyboard, and even touch screens. They can also dock their device, e.g. to display their phone’s output on a TV. In the same way, nothing stops you from using your console like a PC (keyboard and mouse on a monitor), and many players do so.

Should we still be using IsTenFootInterface to detect players who are playing further from their screen or should we consider Enum.DisplaySize.Large to indicate the same thing?

Enum.DisplaySize.Large indicates that the player is experiencing the game on a large screen, which corresponds to most television displays. This includes console setups where a gamepad is used with a TV screen!


So, from what I understand from all this.

  • PreferredInput - use this to reflect how the user is interacting with their device.
    Show gameplay buttons, layouts, positions relative to this dynamic variable

Is there any point in me checking UIS.TouchEnabled, UIS.GamepadEnabled, or UIS.KeyboardEnabled?


  • ViewportDisplaySize - position, scale, re-parent UI elements relative to this.

Thus, these values are the player’s “truth”, and I can rely on these variables being correct.

Currently I have treated the environment as static, based on TouchEnabled or not, and built our UI around this at run time - with the exception of the gameplay buttons, which change relative to PreferredInput.

So instead of TouchEnabled, in this case the check would be Enum.DisplaySize.Small?
Further, connecting to ViewportDisplaySize changing and updating accordingly.

I think my overall issue with this is that I’ve been building in a specific way for 3+ years now and have to accept the change, plus the task of refactoring the framework to support this correctly.

  • PreferredInput - use this to reflect how the user is interacting with their device.
    Show gameplay buttons, layouts, positions relative to this dynamic variable

Is there any point in me checking UIS.TouchEnabled, UIS.GamepadEnabled, or UIS.KeyboardEnabled?

Your understanding of PreferredInput is correct! For UI related purposes, you shouldn’t need to check TouchEnabled, GamepadEnabled, or KeyboardEnabled directly anymore – PreferredInput should cover those scenarios dynamically.

  • ViewportDisplaySize - position, scale, re-parent UI elements relative to this.

Yes, you’re also right here. ViewportDisplaySize is what you’ll want to rely on when adapting UI to screen dimensions.

So instead of TouchEnabled, in this case the check would be Enum.DisplaySize.Small? Further, connecting to ViewportDisplaySize changing and updating accordingly.

Yes, this is correct! It’s important to decouple screen size from input method. A device might have a small screen and still use a gamepad or keyboard, so TouchEnabled doesn’t necessarily mean a small display. ViewportDisplaySize gives you a more accurate picture of the visual environment.
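To illustrate that decoupling, here is a sketch with two independent listeners (setLayout and setPrompts are hypothetical stubs; this again assumes ViewportDisplaySize is exposed on the current Camera):

```lua
local UserInputService = game:GetService("UserInputService")
local camera = workspace.CurrentCamera

local function setLayout(displaySize) --[[ adapt UI layout here ]] end
local function setPrompts(preferred) --[[ swap button prompts here ]] end

-- Screen size drives layout only
setLayout(camera.ViewportDisplaySize)
camera:GetPropertyChangedSignal("ViewportDisplaySize"):Connect(function()
	setLayout(camera.ViewportDisplaySize)
end)

-- Preferred input drives prompts only
setPrompts(UserInputService.PreferredInput)
UserInputService:GetPropertyChangedSignal("PreferredInput"):Connect(function()
	setPrompts(UserInputService.PreferredInput)
end)
```

Because the two signals never touch each other's state, a small-screen device using a gamepad gets the small layout with gamepad prompts, with no 'mobile' assumption anywhere.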

Let me know if there’s anything else I can help clarify! Our cross-platform development documentation also provides more information on how many of our new APIs work together.


Wonderful, thank you for confirming these points.

For testing in Roblox Studio, will we have the ability to switch the active viewport between the enum values at run time, with real viewport sizes, the same way we have Device Emulation?
I think this would be a necessity for testing with this.

Further, I’d love to have a working framework that implements these changes ASAP. My only concern is that this is currently a Studio Beta - do we have an ETA for when the Client Beta will start?

Any estimate on when we can use this in live games? I want to follow the recommended methods of handling UI and input. The recent releases of IAS and UI Styling are great, but without this it’s hard to go through with overhauling my UI, as I want to take advantage of this in the process.


When is this out on client?

I wanna update my GUI only when this is out


Yeah, that’s cool, but I think we need screen data combined with platform data so users can do anything with this. Still, this is useful too.

There are a bunch of folding phones now which are almost the physical size of a tablet, so it’s not inaccurate - you’re just wrong on that aspect. There’s even a phone that can basically unfold twice into a tablet-sized screen.

When is this out on the client?