New interface for bans and a new warning system

As a Roblox PLAYER, the moderation page is currently too hard to understand, and using the same page for warnings doesn’t make sense either.
The page looks like it’s from the 2007 era, and it doesn’t work with site themes, which turns it blue.

Let’s talk about the ban interface.

Example formatting: [Pardon the typos in the image]

What it needs:

  • I really think moderators should have to mark what’s wrong with the content. For audio, that would be the fragment containing the rule-breaking moment; for images or text, an animated red box. Simple?
  • Images shouldn’t just appear as “Content Deleted”… I want to know what was in them…
  • A countdown timer would help, because the platform is international now and we don’t know exactly which timezone the ban’s end time refers to.
  • An additional explanation could be the kind of thing you get in appeal mails, for example “This person is known for this and this activity, and their portrayal is banned from our platform.”
  • Past rules should be taken into consideration: if the content was uploaded a long time ago under different rules, a ban should be avoided. Unless it’s swearing or nudity; those were always against the rules. Removal of such content should be communicated through the warning system I’ve proposed.
  • Displaying the exact ToS rule that was broken would clarify what happened. It would also prevent “pewdie”-like bans.
  • Maybe including the asset’s ID would help too?

While banned, we should still be able to browse the site, just blocked from updating, posting, or changing anything.
I want to check whether I have more models that could break the same rule… Maybe even remove them…
The offsale update only makes it harder to find my own stuff without using my own account…
Oh, and DevForum login too.

Now about warnings.

Warnings should appear in the inbox like system messages, but with a :warning: symbol in our notifications so we see them in the topbar area. You shouldn’t be able to turn that off.
Using the ban interface for warnings only gives us a mini heart attack.
I think this would be better.

Warnings also need similar explaining.

If staff is able to address this issue, it would be a real quality-of-life change.
Especially since the platform is aiming to be international.
I’ve had multiple short bans for things I just wasn’t aware of, like the infamous “red bandana” ban, because apparently the US has some gang associated with those (and I don’t even live there, how am I supposed to know?).

I recommend checking other moderation-related threads.
Once combined, it would prevent many misunderstandings and unwanted bloodshed.


While I agree with the great majority of this, two things…

If warnings simply appear in the inbox (and notifications), what would prevent users from simply never checking their inbox and later claiming ignorance? With the current warning system, users have to acknowledge that they understand the reasoning for the warning (whether they agree with it or not). This ensures that no one can say they “didn’t see the warning” later when appealing a ban.

Additionally, being able to browse the site during a ban would defeat the purpose of a ban, which is to forcibly remove a user from the site. After a ban (assuming it’s not permanent), you could then log in and check your assets to, as you stated, ensure that they are not also violating the rules.


Yeah, but I’m pretty anxious about that. Stacking bans can happen.
I don’t want to get one right when I’m free.

Warnings would display a special icon, like I said. I think that makes them visible enough.


Agreed completely. The current moderation interface was barely adequate in 2010, and now almost a decade later there’s no excuse for it to be like it is. If Roblox ever wants to be taken seriously as a platform it needs to start improving the moderation, and this would be a step in the right direction.

Even just displaying which asset was moderated (if there is one) would be a step up, because right now it just displays the deleted-content icon, which is beyond useless.

For images, if the offense is just small text that can’t be read, it shouldn’t even lead to this page. When I attended RDC last summer, one of the talks claimed that such a ‘warning’ is really just feedback. In that case, a DM should be sent to the user explaining why the image was moderated, similar to when an audio gets rejected for unplayability, etc. Of course, this would again only apply to unreadable text or other very minor issues; other things, like uploading NSFW content, would still lead to the punishment page.