I can get a user’s graphics level on any platform with UserSettings().GameSettings.SavedQualityLevel, but it returns Enum.SavedQualityLevel.Automatic instead of the user’s actual graphics level whenever their graphics are set to auto. This is a problem because mobile clients are forced onto auto, so it’s impossible to tell what quality level they’re actually running at and adjust our game accordingly (e.g. swap in lower-poly weapons).
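For reference, this is the behavior in a minimal LocalScript:

```lua
-- LocalScript: read the user's saved graphics quality.
local savedQuality = UserSettings().GameSettings.SavedQualityLevel

if savedQuality == Enum.SavedQualityLevel.Automatic then
	-- On auto (which all mobile clients are forced to), the real level is hidden.
	print("Graphics are on Automatic; actual quality level is unknown")
else
	-- .Value maps QualityLevel1..QualityLevel10 to the integers 1..10.
	print("User's graphics level:", savedQuality.Value)
end
```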
I could add a custom graphics slider to my game, but I’d like configuration to stay simple, with just a single graphics slider (the ROBLOX one). Is there any way aside from SavedQualityLevel to determine the user’s graphics level, even when they’re on Automatic?
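The closest thing I can think of is a heuristic, not a real answer: when the property reports Automatic, estimate a tier from the client’s frame times. A rough sketch follows; the sample window and the fps-to-tier mapping are arbitrary placeholders, and frame rate reflects device performance rather than the quality level the engine actually picked:

```lua
local RunService = game:GetService("RunService")

-- Hypothetical fallback: guess a 1-10 tier from average frame time.
-- Nothing here is an official API for reading the auto-selected level.
local function estimateQualityTier(callback)
	local samples, total = 0, 0
	local connection
	connection = RunService.RenderStepped:Connect(function(deltaTime)
		samples += 1
		total += deltaTime
		if samples >= 120 then -- ~2 seconds of frames at 60 fps
			connection:Disconnect()
			local averageFps = samples / total
			-- Crude, made-up mapping from fps to an assumed tier.
			callback(math.clamp(math.floor(averageFps / 6), 1, 10))
		end
	end)
end

local saved = UserSettings().GameSettings.SavedQualityLevel
if saved == Enum.SavedQualityLevel.Automatic then
	estimateQualityTier(function(tier)
		print("Estimated quality tier:", tier)
	end)
else
	print("Actual quality level:", saved.Value)
end
```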
This property should never return Automatic. Knowing that a player’s graphics level is set to Automatic is 100% useless to developers. I wish this property would always return their TRUE graphics level, i.e. the level the engine is actually rendering at.