I'm thinking about using a loop to fill the client's memory for a second, then reading how much memory Roblox is using, assuming that result is roughly the client's total RAM, and adjusting the graphics to match.
How reliable do you think this is?
There is no function that lets you retrieve the client's maximum RAM. I would suggest letting the client control this on their own; the engine already performs a lot of memory-culling operations to support different devices.
Yeah, there's no real way to get the client's actual max RAM from Lua. Deliberately filling memory to estimate it isn't reliable either: different devices handle memory pressure differently, and the client could crash or start throttling before you get a useful number.
Your best option is to let Roblox handle it. The engine already adjusts memory usage and streaming depending on the client. If you want to adjust graphics manually, it's better to key off device type or performance readings from the `Stats` service.
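A minimal sketch of that kind of heuristic, assuming a LocalScript (the tier names and the 1500 MB threshold are made-up examples, not Roblox recommendations):

```lua
-- LocalScript: pick a rough graphics tier without trying to probe total RAM.
local Stats = game:GetService("Stats")
local UserInputService = game:GetService("UserInputService")

local function pickGraphicsTier()
	-- Touch-only devices with no keyboard are most likely phones/tablets.
	local isMobile = UserInputService.TouchEnabled
		and not UserInputService.KeyboardEnabled

	-- Memory the Roblox client is currently using (MB) --
	-- NOT the device's total RAM.
	local usedMb = Stats:GetTotalMemoryUsageMb()

	if isMobile or usedMb > 1500 then
		return "Low"  -- e.g. disable particles, shrink render distances
	else
		return "High"
	end
end

print(pickGraphicsTier())
```

Note that scripts can't move the player's own graphics-quality slider; you'd apply a tier like this by toggling your own effects (particle emitters, decorative models, streaming radii, and so on).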
ram usage doesn’t affect fps.
filling vram past its limit does affect fps, since the GPU has to fall back to slower shared system memory, but i doubt most Roblox games could push vram that high even on lower-end devices, and i bet Roblox already has things in place to handle it if it does occur.
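If you did want to watch graphics memory specifically rather than total RAM, a hedged sketch using the per-tag memory stats (the 900 MB budget and the reaction are arbitrary examples):

```lua
-- LocalScript: periodically check texture memory and react if it climbs.
local Stats = game:GetService("Stats")

local TEXTURE_BUDGET_MB = 900 -- arbitrary example threshold, tune per game

task.spawn(function()
	while true do
		-- Memory attributed to loaded textures, in MB.
		local textureMb = Stats:GetMemoryUsageMbForTag(
			Enum.DeveloperMemoryTag.GraphicsTexture
		)
		if textureMb > TEXTURE_BUDGET_MB then
			-- e.g. swap to lower-resolution textures or cull decorations here
			warn(("Texture memory high: %.0f MB"):format(textureMb))
		end
		task.wait(5)
	end
end)
```

In practice, though, as noted above, the engine's own memory management usually makes this kind of manual monitoring unnecessary.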