It would be cool for ROBLOX to make a blog post or something on how they manage their servers. For example, Discord made a blog post on how they handle their Cassandra database clusters, and it was a really good read. I’m sure curious ROBLOX developers would love to read something similar.
All I know is that whatever specs they currently have are not enough to let me use my ported, pure-Lua BigNum (BigInt) library to perform elliptic curve calculations in a reasonable amount of time. I’m talking about multiplying 64-bit integers thousands of times a second, and it still ends up taking dozens of seconds at 100% CPU to complete.
What I really need is a BigNum C-extension to rbxlua, which would allow me to perform the same calculations in just a fraction of a second. I am sure there are MANY more real-world applications for a BigNum library than just elliptic curve calculations. Of course, I would make a feature request for this, but I do not have the requisite permissions, seeing how I am a trial user. Please add this!
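For anyone curious why pure Lua struggles here: a big number has to be stored as a table of small limbs, and every multiplication is a nested loop over those limbs. This is just an illustrative sketch of that idea (not my actual library), using 24-bit limbs so intermediate products stay exact in Lua’s number type:

    local BASE = 2^24 -- limb size; a product of two limbs stays well under 2^53

    -- Schoolbook multiplication of two big numbers stored as arrays of limbs,
    -- least significant limb first. Cost is O(#a * #b) interpreted Lua operations.
    local function bigmul(a, b)
        local result = {}
        for i = 1, #a + #b do
            result[i] = 0
        end
        for i = 1, #a do
            local carry = 0
            for j = 1, #b do
                local t = result[i + j - 1] + a[i] * b[j] + carry
                result[i + j - 1] = t % BASE
                carry = math.floor(t / BASE)
            end
            result[i + #b] = result[i + #b] + carry
        end
        return result
    end

An elliptic curve operation chains thousands of multiplications like this, so every limb loop that a C extension would run as native code is instead interpreted Lua, which is where the dozens of seconds go.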
PM a Top Contributor as per rule 15.1.
The Task Manager on Windows does not show the specifications of the server itself, only those of the local machine you are playing Roblox on.
It would be cool to be able to pay to increase the RAM of the server for memory-intensive games!
I think having a free option would be better than paying; a price would act as a wall for those who don’t have a lot of Robux but want to make a good showcase/experience.
If this were to actually happen, I think Roblox would more likely put it behind a paywall, since they would obviously want to make money off of it and the extra RAM would cost them.
It’s probably an operating system Roblox wrote themselves.
They are using the Linux operating system.
“Roblox/Linux”? Ah yeah, that HTTP header means nothing, eh?
I thought Roblox doesn’t like Linux…
Yo, dead topic, but I just thought I’d share a benchmark. Currently the server CPU runs slower than my i3-10100 (4 cores, 8 threads, at 3.6 GHz). The benchmark was a simple nested for loop doing additions:
local Clock = os.clock()
local x = 3
-- 1000 * 100 * 10000 = 1 billion additions in total
for i = 1, 1000 do
    for a = 1, 100 do
        for b = 1, 10000 do
            x = x + 3
        end
    end
end
print("Server " .. (os.clock() - Clock))
The results were pretty bad: my CPU took ~6.9 seconds, while ROBLOX’s server took about 7.6 seconds. I guess this doesn’t mean much, and it still helps to calculate this on the server, because if it’s done on the client, all the resources go to it and the screen freezes… On that note, it’s possible that Roblox’s server CPUs only dedicate a certain percentage of their processing power to each thread so the server doesn’t freeze… Anyway, those are my benchmark results.
Edit: I did NOT understand CPUs at this time… it’s called having multiple cores… But single-threaded performance suggests a Xeon with ~7th-gen architecture or something? Not great.
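If anyone wants to redo this more fairly, the same workload can live in one ModuleScript required by both a server Script and a LocalScript, so the client and server timings come from identical code. A rough sketch (the module name and its spot in ReplicatedStorage are just my assumptions):

    -- ModuleScript "Benchmark" in ReplicatedStorage (name and location assumed)
    local RunService = game:GetService("RunService")

    local Benchmark = {}

    function Benchmark.run()
        local started = os.clock()
        local x = 3
        for i = 1, 1e9 do -- same 10^9 additions as the nested loops above
            x = x + 3
        end
        local where = RunService:IsServer() and "Server" or "Client"
        print(where .. " " .. (os.clock() - started))
    end

    return Benchmark

Then a Script and a LocalScript each just call require(game.ReplicatedStorage.Benchmark).run() and print directly comparable timings.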
There was a page on the Roblox Fandom wiki stating that all servers now run Ubuntu, which is kind of awkward to think about given that Roblox doesn’t support Linux devices…
Another thing to consider, though: their server CPUs are likely faster than a 4-core CPU because of the number of games they have to distribute the load over. Not to mention that when you join a game you are never connected to the server directly but instead through a Lua VM, making it harder for hackers to pull your IP (smart move Roblox, but please fix your moderation bots; they are not good and often still issue false bans).
I guess:
RAM: 1 TB DDR4 ECC, 3200 MHz.
CPU: AMD EPYC 7763, 64 cores / 128 threads, 2.45 GHz base clock.
SSD storage: 20 TB NVMe, PCIe Gen4, RAID 1 (for the operating system and critical data).
HDD storage: 40 TB large-capacity, 7200 RPM, SATA (for game data and backups).
Switch: Cisco Nexus 9300 series, 48-port 10GbE, data-center-grade.
Firewall: Palo Alto Networks PA-220 (hardware-based).
Load balancer: F5 Networks BIG-IP LTM VE (Virtual Edition), dedicated.
Power supply: redundant 750 W, 80 Plus Platinum-certified.
UPS: APC Smart-UPS 1500VA, pure sine wave.
Motherboard: Supermicro H12SSL-NT (enterprise-grade).
OS: Ubuntu Server 22.04 LTS, headless setup.
Really? They go down a lot! It must be a Pentium! Here’s what the server specs should actually be:
CPU: Intel Xeon processors
Graphics: RTX 4070 or even 4090 (if Roblox can afford it, or GTX if they are broke)
RAM: 32GB DDR5
Storage: NVMe SSDs
LOL BURN! Single core Xeons? That’s a good roast!
Roblox is a massive platform hosting millions of games, and it provides free data storage that would normally cost money. Server scripts run on Roblox’s servers, not on your PC. With that many scripts running simultaneously, Roblox servers sometimes face outages because they can hit their limits.
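On the free data storage point, this is the kind of persistence DataStoreService gives every experience at no extra cost; the store name and key below are made-up examples:

    local DataStoreService = game:GetService("DataStoreService")
    local coinStore = DataStoreService:GetDataStore("PlayerCoins") -- store name is made up

    -- DataStore calls go over the network and can fail, so wrap them in pcall
    local ok, err = pcall(function()
        coinStore:SetAsync("player_12345", 100) -- key is a placeholder
    end)
    if not ok then
        warn("Save failed: " .. tostring(err))
    end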
With your specs, Roblox couldn’t even run that many games at once. Roblox has a safety feature that shuts down servers if memory usage exceeds 4 GB, which is why the player limit is typically 100; anything higher can push the servers to their limits. (The 4 GB figure might not be correct.)
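Either way, developers can watch server memory themselves. A rough sketch using Stats:GetTotalMemoryUsageMb(), with the budget taken from the unconfirmed 4 GB figure above rather than any documented limit:

    local Stats = game:GetService("Stats")

    local MEMORY_BUDGET_MB = 4096 -- assumed budget, based on the (unconfirmed) 4 GB figure above

    task.spawn(function()
        while true do
            local usedMb = Stats:GetTotalMemoryUsageMb()
            if usedMb > MEMORY_BUDGET_MB * 0.9 then
                warn(("Server memory at %.0f MB, nearing the assumed %d MB budget"):format(usedMb, MEMORY_BUDGET_MB))
            end
            task.wait(30) -- check every 30 seconds
        end
    end)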
Roblox servers also handle huge data traffic, with hundreds of assets being uploaded every minute. When millions of players are active, there are limits to what the servers can handle. It’s not fair to mock Roblox’s servers; they are powerful and often more advanced than you might think. Making fun of them harms Roblox’s reputation and is simply inaccurate. They’re not weak; they’re more like NASA-grade computers.
Why do they go down a lot then? They’re old, outdated hardware!
I understand why you might think Roblox’s servers could be outdated, but the reality is a bit different. Roblox operates on an incredibly large scale, and the platform handles massive amounts of traffic daily, with millions of active users across millions of games. The servers are designed to handle this vast amount of data and support real-time gaming experiences, which is no small feat.
When Roblox servers go down or experience issues, it’s usually due to server overload or system limitations during peak usage, not because the hardware is outdated. They deal with billions of requests, file uploads, game data, and real-time interactions all at once. Even the most advanced systems can only handle so much, and when usage spikes, it can cause temporary outages or slowdowns.