As a developer of high-fidelity Roblox experiences, it’s currently very difficult to determine what the default settings for new users should be. Obviously defaults should be a kind of middle ground, but that still leaves some users experiencing lag and other users disappointed in the game’s quality when it could technically be higher.
For a project I’m currently working on, I’d like to sort users into different versions of the game depending on their hardware capability. But unless their hardware is particularly bad, it’s really hard to predict how well they’ll handle the gameplay elements.
It would be extremely helpful if developers could detect user hardware capability and automatically tweak the gameplay experience on a per-user basis. We wouldn’t even necessarily need to detect hardware specifics, just a broader idea of what their machine can handle.
This kind of functionality would be especially valuable for games whose users play with an FPS unlocker: technically they aren’t lagging, but any tweak to game systems can be the difference between 100 FPS and 150 FPS.
The problem with this is that hardware varies enormously even between machines with the same chipsets and models. There are too many factors for a detection API to be useful, and it wouldn’t account for things beyond software control, such as the efficiency of the user’s cooling.
If you want to adjust quality in your game, consider providing users with a settings menu with all the different toggles and effects so that they can tailor the experience themselves.
Well yes, I do this. But there still need to be default settings, and from what I’ve seen, most new users don’t visit the settings menu at all. That leaves a lot of users assuming the game looks awful, and others assuming the game is laggy.
I know about the variance between chipsets. I’m proposing not necessarily detecting the hardware itself, but something more like a mini stress test on each component that would return values indicating what the user’s machine is capable of on average, optimizing the user experience without requiring their specific input.
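To illustrate the idea, here is a hedged sketch of what such a mini stress test might look like today with nothing but `os.clock()`. The workload size and tier thresholds are invented for illustration and would need tuning against real devices; this only exercises the CPU from Luau, not the GPU or memory bandwidth, which is part of why a proper engine-level API would be preferable.

```lua
-- Hypothetical client-side micro-benchmark (LocalScript).
-- Times a fixed arithmetic workload and buckets the result into a
-- rough capability tier. All thresholds below are placeholder guesses.
local function benchmarkCpu()
	local start = os.clock()
	local acc = 0
	for i = 1, 2_000_000 do
		acc += math.sqrt(i) * math.sin(i)
	end
	local elapsed = os.clock() - start

	-- Placeholder tier cutoffs; real values would come from analytics.
	if elapsed < 0.05 then
		return "high"
	elseif elapsed < 0.15 then
		return "medium"
	else
		return "low"
	end
end

print("CPU tier:", benchmarkCpu())
```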
Roblox kind of already does this to some degree if the user has graphics on auto, so I’d like the ability to hook into that a little bit.
Lighting technology, mostly. Users on console, tablet, and mobile would be sorted into servers using Voxel, while users on much better systems would be sorted into ShadowMap and Future servers. I know that a Roblox graphics setting of 5 or lower automatically switches Future down to Voxel, but it removes lots of other effects with it. I also don’t want users to need to change their settings in order to play my game specifically; I think that would negatively impact retention.
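For what it’s worth, a rough version of this sorting can be approximated today. Since `Lighting.Technology` cannot be changed by scripts at runtime, each tier would be a separate place with its Technology set in Studio. A hedged sketch, where the place IDs and the `ReportPlatform` RemoteEvent are assumptions:

```lua
-- LocalScript: report a coarse platform guess to the server.
local UserInputService = game:GetService("UserInputService")
local GuiService = game:GetService("GuiService")
local ReplicatedStorage = game:GetService("ReplicatedStorage")

local isLowTier = GuiService:IsTenFootInterface() -- console
	or UserInputService.TouchEnabled              -- mobile/tablet

ReplicatedStorage:WaitForChild("ReportPlatform"):FireServer(isLowTier)

-- Server Script: teleport the player to the matching place.
local TeleportService = game:GetService("TeleportService")
local VOXEL_PLACE_ID = 0   -- placeholder
local FUTURE_PLACE_ID = 0  -- placeholder

ReplicatedStorage.ReportPlatform.OnServerEvent:Connect(function(player, isLowTier)
	local target = isLowTier and VOXEL_PLACE_ID or FUTURE_PLACE_ID
	TeleportService:TeleportAsync(target, { player })
end)
```

This only buckets by platform class, though, which is exactly the coarse check criticized elsewhere in this thread; it says nothing about how capable a given PC actually is.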
The Roblox engine should just handle this for you (with respect to the lighting technology you mentioned above), so that we as developers don’t need to maintain a list of performance variations per platform. Imagine keeping a roster of all possible devices and then, through trial and error with analytics (assuming you don’t want to buy a lot of devices), figuring out the best configuration of settings for each one. That seems like a major pain, and something better handled by Roblox engineers who have far more data and devices available.
You can already check the player’s device. Wouldn’t it be easier to check that instead of introducing something new to the API? Your example of putting people into servers based on hardware can already be done with that check. Another option is framerate tracking: if the user’s framerate measured via RenderStepped is low, you can switch settings to accommodate them and increase their performance. I do this in my tycoon games to switch ore droppers between client and server physics ownership, so that low-end devices don’t have to compute physics.
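The framerate-tracking approach described above might be sketched like this. The `LowPerformance` RemoteEvent, the five-second window, and the 45 FPS threshold are all assumptions for illustration, not a definitive implementation:

```lua
-- LocalScript: average RenderStepped delta times over a window and
-- notify the server (via an assumed RemoteEvent) when sustained FPS
-- drops below a threshold, so it can take physics ownership back.
local RunService = game:GetService("RunService")
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local lowPerfEvent = ReplicatedStorage:WaitForChild("LowPerformance")

local frames, elapsed = 0, 0
RunService.RenderStepped:Connect(function(dt)
	frames += 1
	elapsed += dt
	if elapsed >= 5 then -- evaluate roughly every 5 seconds
		local avgFps = frames / elapsed
		if avgFps < 45 then
			lowPerfEvent:FireServer(avgFps)
		end
		frames, elapsed = 0, 0
	end
end)
```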
Well, this only works for users with exceptionally bad hardware. If the user can reach 60 FPS, I have no way to determine whether their hardware is capable of more. This is a problem both for users running an FPS unlocker and for cases where the user gets 60 FPS on the menu but much less in gameplay. I need to know, before they enter the game, what gameplay elements they could potentially handle.
I’m sorry for bumping a one-year-old topic, but it was the closest one I could find to what I’m encountering. I’ve worded this as best I can to be a contribution to the topic, but if you have any feedback, please DM me and I will attempt to either rectify the error or request that this post be deleted.
A possible additional use case for hardware spec detection is improved part ownership systems. I’ll try to demonstrate this in the video below, with gameplay of the very physically intensive Destroy a City. It’s not a top-notch game but it demonstrates perfectly what I encounter:
In the video, I have a bunch of parts simulated, slowing down physics by a lot. The slow parts are all being server-simulated, seemingly for no good reason, because the system this was played on is easily capable of simulating far more than the few parts the server allocates to me. The few parts that I am simulating are buttery smooth, the difference being especially evident around the 35-second mark where I destroy a fence. The fence parts are being throttled (if that’s the correct word) until I walk up to them, claiming network ownership of them, at which point they simulate much faster.
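For context, the ownership handoff described above can already be done manually on the server with `BasePart:SetNetworkOwner()`. A hedged sketch, where the debris folder, tagging scheme, and radius are assumptions; the missing piece this feature request asks for is knowing *which* clients can afford to simulate more:

```lua
-- Server Script: hand network ownership of nearby unanchored debris
-- parts to a client we believe is capable of simulating them.
local function assignDebrisTo(player, debrisFolder, radius)
	local character = player.Character
	local root = character and character:FindFirstChild("HumanoidRootPart")
	if not root then
		return
	end
	for _, part in debrisFolder:GetChildren() do
		if part:IsA("BasePart") and not part.Anchored then
			if (part.Position - root.Position).Magnitude <= radius then
				part:SetNetworkOwner(player) -- this client now simulates it
			end
		end
	end
end
```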
For reference, the system’s specs are as follows:
Intel Core i5-12400F @ 2.5 GHz
Nvidia GTX 1650, 4 GB GDDR5
1920×1080 @ 75 Hz
~170 Mbps connection (other details removed for privacy)
It should easily be able to handle more than the dozen or so parts I estimate it was handling at any one time in the clip. If Roblox implements this feature, I could build systems that maximize the potential of users’ hardware.
Not exactly; to some extent, many users share hardware specs. The os.clock trick logs CPU timing, which is actually a reliable fingerprinting method, but just knowing the make and model of a user’s hardware wouldn’t be unique enough for that purpose.