Unfortunately, there’s no way this system is working; the amount of low-quality content I see is still constant.
There’s a game where, every day, players comment that they were scammed or never received their purchases. It’s a re-upload of a game that was literally placed into Pending Review over its use of Minecraft textures, and it was simply re-uploaded to avoid going through the review process. This has led to thousands of users being scammed out of thousands of real dollars, and despite constant reporting (both before and after this “low quality content” update) it has not been dealt with.
Not only that, but the game itself is named and set up almost identically to the game that was placed under content review. If this system doesn’t immediately catch re-uploads of moderated games, even when they’re reported, what can it actually catch? The game in question still has HUNDREDS of active players!
I’ve also noticed shovelware games with poor AI-generated thumbnails and very low-quality content being uploaded, and I get the feeling their visibility isn’t being reduced either.
But that’s nothing compared to a developer that scammed its audience out of who-knows-how-much Robux/money years ago, and is still up, unpunished.
This feels like a step in the right direction, but I sincerely doubt this system will be prominent enough to deal with any actual low-quality content so long as Roblox is profiting from it.
Edit:
While we’re at it, why isn’t low-quality group wall content covered here? Development groups making thousands of dollars on Roblox HAVE to moderate their walls, as Roblox is clearly not capable of doing it.
Naturally, the group wall of the game mentioned above is 99% spam: thousands of off-site links and bots. The group makes plenty of money and should be able to reduce the spam, but refuses to do so, putting Roblox players in jeopardy.
There is no way to report an unmoderated group wall; it just fills up with spam and scams. And yet the games in the group remain favored by the algorithm. This must be addressed. Groups with large quantities of unaddressed spam on their wall should be visibility-limited until they show they are actively moderating their wall, or they should disable it entirely.
As a developer, I see other developers with horrible practices pass me by every day, and I am left to just… deal with it. It does not matter how many rules they appear to have violated, how much money they have extracted from players who will get nothing in return, or how unmoderated and vile their group pages are. Reports go unheard, and systems like this don’t seem to achieve anything.
I love Roblox. And I love developing for it. While I do not want to cause any trouble, and think Roblox has genuinely good intentions most of the time, the moderation system has been a point of critique for a significant amount of time, and an Algorithm-powered Visibility Limiter for “Low Quality” content will never catch the actual problems on Roblox.

This game isn’t “low quality content” enough on its own to be visibility-reduced. I get that; it probably meets some minimum bar. But the visibility limiter system should treat games from groups that contain other moderated, un-appealed games as likely bad candidates for featuring, along with games from groups with poor group-wall spam ratios.
That, in my honest opinion, would capture more “low quality” and “scam” content than anything Roblox’s current algorithm is doing.
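To make the proposal concrete, here’s a minimal sketch of what such a group-level signal could look like. Everything here (the class, field names, and the 50% spam threshold) is my own illustrative assumption, not anything Roblox actually exposes:

```python
# Hypothetical sketch of the proposed group-level visibility heuristic.
# All names, fields, and thresholds are illustrative assumptions only.

from dataclasses import dataclass


@dataclass
class Group:
    name: str
    has_moderated_unappealed_game: bool  # group contains a moderated, un-appealed game
    wall_spam_ratio: float               # fraction of recent wall posts that are spam/scams


def should_visibility_limit(group: Group, spam_threshold: float = 0.5) -> bool:
    """Limit visibility of a group's games if the group hosts a moderated,
    un-appealed game, or if its wall is mostly unaddressed spam."""
    if group.has_moderated_unappealed_game:
        return True
    return group.wall_spam_ratio >= spam_threshold


# Example: a well-run group vs. the kind of re-uploader described above.
clean = Group("GoodDev", has_moderated_unappealed_game=False, wall_spam_ratio=0.05)
shady = Group("ReUploader", has_moderated_unappealed_game=True, wall_spam_ratio=0.99)
print(should_visibility_limit(clean))  # False
print(should_visibility_limit(shady))  # True
```

The point of the sketch is that both signals already exist in Roblox’s own data; neither requires new content analysis, just looking at the group behind the game.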
A lot of these are games that were bought up by large corporate entities.
They shovel just enough ad budget in to keep these games operational, then completely abandon the associated groups, leaving them full of unmoderated spam and scam links. And when their poorly managed purchased products end up breaking the rules, they employ shady measures like re-uploading without appealing. Did they fix the original rule-breaking occurrence? Maybe, who knows! But they didn’t go through the correct process, and despite what is likely hundreds of reports from hundreds of players, if not more, games like this are completely untouched.
If the “low quality” content detection cannot deal with this, then it can deal with nothing; that is my current opinion. The groups that produce and allow this are what need to be carefully watched, so that low-quality brands can’t establish trust-eroding branding.
@DevRelationsTeam What are we - What am I, a developer who calls this platform home - supposed to do about this sort of thing, when all attempts are completely ignored?