How do you read what quality settings a player has set if they have automatic on?

To get a player’s currently set graphics quality, you use:

UserSettings():GetService("UserGameSettings").SavedQualityLevel

This returns an Enum.SavedQualitySetting value, e.g. Enum.SavedQualitySetting.QualityLevel1, and so on.

However, it can also return Enum.SavedQualitySetting.Automatic, which doesn't help when making scripts scale with the player's current graphics level, because there seems to be no way to see what level Automatic is actually applying. Is there any way to find out what graphics level Automatic is forcing on a client if they have it turned on?
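For reference, reading the setting from a LocalScript looks something like this (a minimal sketch; the Automatic case is exactly the one that can't be resolved any further):

```lua
-- Minimal sketch: read the saved quality level on the client.
local UserGameSettings = UserSettings():GetService("UserGameSettings")

local saved = UserGameSettings.SavedQualityLevel

if saved == Enum.SavedQualitySetting.Automatic then
	-- No API exposes the level Automatic is actually applying.
	print("Player is on Automatic; the effective level is not exposed")
else
	-- QualityLevel1 .. QualityLevel10 correspond to .Value 1 .. 10
	print("Player's manual quality level:", saved.Value)
end
```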


You can’t.

Sounds like a good feature request


Pretty sure automatic graphics adjusts the level based solely on framerate. I remember @Maximum_ADHD made a short video demonstrating how automatic graphics would throttle down to level 1 when the window was unfocused (because Roblox throttles the framerate to 15 FPS when out of focus). Is this still the case, Clone? If so, it would be pretty easy to experiment with throttling the framerate in-game, figure out what Automatic graphics' target framerate is, and from there write some code that could accurately predict what Automatic has set the graphics level to.

But yeah this would be simpler as a direct API function lol.
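Something like this could be a starting point for that experiment (a rough sketch; the target FPS and the "throttled" heuristic are assumptions you'd have to calibrate yourself, not anything Automatic actually exposes):

```lua
-- Rough sketch of the "measure framerate and guess" approach described above.
local RunService = game:GetService("RunService")

local samples = {}
local SAMPLE_COUNT = 60

-- Keep a rolling window of recent frame times on the client.
RunService.RenderStepped:Connect(function(dt)
	table.insert(samples, dt)
	if #samples > SAMPLE_COUNT then
		table.remove(samples, 1)
	end
end)

local function getAverageFPS()
	if #samples == 0 then
		return 60
	end
	local total = 0
	for _, dt in ipairs(samples) do
		total += dt
	end
	return #samples / total
end

-- Hypothetical threshold: if the client can't hold the assumed target FPS,
-- Automatic has probably throttled the quality down.
local ASSUMED_TARGET_FPS = 30

local function isProbablyThrottled()
	return getAverageFPS() < ASSUMED_TARGET_FPS
end
```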

it does


I'm not sure how this would work :thinking: Documentation - Roblox Creator Hub

An event bound to quality level changes. It's unknown whether it fires on activation/deactivation of Automatic.
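If the event in question is UserGameSettings.GraphicsQualityChangeRequest, observing it would look something like this (a minimal sketch; whether it also fires when Automatic adjusts the level on its own is exactly the open question):

```lua
-- Sketch, assuming the event being referenced is GraphicsQualityChangeRequest.
-- It fires when the user requests a quality increase or decrease via hotkeys.
local UserGameSettings = UserSettings():GetService("UserGameSettings")

UserGameSettings.GraphicsQualityChangeRequest:Connect(function(betterQuality)
	if betterQuality then
		print("Quality increase requested")
	else
		print("Quality decrease requested")
	end
end)
```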