Create a community-driven post approval system to moderate topics


Recently, the state of the entire DevForum has declined drastically. Previously, closed-off areas of the forum were only accessible through post approval. However, over the past few months, categories that were once partially accessible have either been opened completely or locked down entirely.

Having a small post approval team was effective when Roblox had a small DevForum. With the recent explosive growth, it is necessary to create a more scalable solution. Thus, creating a decentralized system of community moderation would limit low-quality content without restricting access to more commonly abused categories.


* implies that this value may be subject to change
In order for members to post a topic, 1 credit must be used. 1 credit can be earned by reviewing 5* topics (1/5* credit per topic). When a post is sent to approval, it will be reviewed by 5* different people. To start, everyone* will receive 1* credit, and the maximum number of credits any one user can hold at a time is 5* (which prevents people from hoarding credits to spam). Credits are private and serve only as a utility to control the flow of posts, not as a cosmetic forum stat.
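The credit arithmetic above (spend 1 credit to post, earn 1/5* credit per review, 5* credit cap) can be sketched as a simple ledger. This is only an illustration of the proposed economy, not an actual implementation; the class name and constants are hypothetical, with the `*` values from the proposal used as defaults:

```python
from fractions import Fraction

# Tunable values marked with * in the proposal (illustrative defaults).
REVIEWS_PER_CREDIT = 5          # 5 reviews earn 1 full credit
POST_COST = Fraction(1)         # posting a topic costs 1 credit
MAX_CREDITS = Fraction(5)       # hoarding cap
STARTING_CREDITS = Fraction(1)  # everyone starts with 1 credit

class CreditLedger:
    def __init__(self):
        self.balance = STARTING_CREDITS

    def earn_review(self):
        """Award 1/5 credit for one completed review, up to the cap."""
        self.balance = min(self.balance + Fraction(1, REVIEWS_PER_CREDIT),
                           MAX_CREDITS)

    def try_post(self):
        """Spend 1 credit to submit a topic; return False if out of credits."""
        if self.balance < POST_COST:
            return False
        self.balance -= POST_COST
        return True

ledger = CreditLedger()
assert ledger.try_post()               # spend the starting credit
assert not ledger.try_post()           # no credit left
for _ in range(REVIEWS_PER_CREDIT):
    ledger.earn_review()               # five reviews earn one credit back
assert ledger.try_post()
```

Using `Fraction` keeps the 1/5-credit increments exact, which is why five reviews always add up to precisely one posting credit.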

For accountability, every user who reviews a topic and approves it is liable for any moderation action inflicted upon the original poster. This ensures that reviewers don’t just approve every topic to get posting credits. In addition, the queue would be strictly first-in-first-out to stop people from using alts or friends to falsely approve posts. To prevent changing a topic immediately after approval, editing will be restricted for 30-60* minutes depending on the category (e.g. #development-discussion would be shorter; #bug-reports would be longer).
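The strict FIFO queue and the post-approval edit lockout described above could look like the following minimal sketch. The class, the category keys, and the exact durations are illustrative assumptions, not an actual DevForum or Discourse API:

```python
from collections import deque
from datetime import datetime, timedelta

# Per-category edit lockout after approval (the proposal suggests 30-60* min;
# these exact values are illustrative).
EDIT_LOCKOUT = {
    "development-discussion": timedelta(minutes=30),
    "bug-reports": timedelta(minutes=60),
}

class ApprovalQueue:
    """Strictly first-in-first-out: reviewers always get the oldest topic."""

    def __init__(self):
        self._pending = deque()

    def submit(self, topic):
        self._pending.append(topic)

    def next_for_review(self):
        # No skipping or searching, so friends/alts cannot cherry-pick
        # a specific topic to approve.
        return self._pending.popleft() if self._pending else None

def editable(category, approved_at, now):
    """An approved topic can be edited only after its lockout window passes."""
    lockout = EDIT_LOCKOUT.get(category, timedelta(minutes=30))
    return now - approved_at >= lockout
```

The `deque` makes the anti-collusion property concrete: the only operation a reviewer has is "take the oldest pending topic".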

Depending on the outcome of the approval process, the poster will receive either a message stating the topic has been approved or the anonymous feedback of the community members who reviewed it. Regardless of the outcome, the posting credit is still consumed (i.e. you don’t get a refund if your topic fails). Also, as seen above, flagging an inappropriate unposted topic would be quick and easy, and doing so preemptively prevents the bad content from spreading to the wider Roblox community.

After a certain number of correctly reviewed topics and successful posts, a member can be promoted to regular and no longer be subject to any posting limits. This framework gives members a structure to police their own posts as well as a path to advance to regular status and graduate from the system.

This system would be made possible by a custom plugin, similar to the bug report wizard or the old community post approval system, but with the addition of credits.


This section will be updated as replies are added to the topic.


  • Opens up formerly closed sections of the forum
  • Provides a new system for promoting members to regulars
  • Creates a sustainable cycle that discourages posting for the sake of posting
  • Decreases workload for DevRel of moderating the current deluge of low-quality topics
  • Increases the quality of posts by preventing bad content before it can be posted as opposed to trying to retroactively flag the bad content
  • Spreads the workload among thousands of users rather than a small group of volunteers


  • Requires significant work to set up
    • Counter: DevRel has already shown they are willing to start large projects to improve forum quality. If they are in the research stages of such an undertaking, now is the time to suggest it before plans become more solidified.
  • Depends on members being able to understand the rules
    • Counter: Any reviewer who incorrectly approves or declines a post that is successfully appealed will receive the appropriate moderation action (i.e. feedback → temp suspension → perm suspension).
    • Counter: Posters who submit low-quality content will still receive feedback, but now from the community before the topic is posted rather than from the moderators after the topic is flagged.
    • Counter: The number of reviewers can be increased for greater accuracy if necessary. However, at 5-10, a majority should still have some grasp of the rules. In addition, learning by reviewing is much less destructive than learning by doing.
  • Creates a new credit system that could be confusing.
    • Counter: The new system will be implemented in such a way that it will be obvious how to use posting credits. Here’s what it could look like:
    • Counter: The tutorial could be updated to include information on posting. As players and now developers on Roblox, members should be familiar with virtual currency systems.
  • Friends could approve each other’s posts to bypass the system
    • Counter: Whoever incorrectly approves a post that gets taken down will receive moderation action in addition to the OP.
    • Counter: It is unlikely that you would have 5* friends able to refresh through the queue to find and approve your post. With an average of 1.2 topics posted every minute, there is only a window of around 15 seconds during which a topic has not yet been pulled by anyone. The more legitimate users pull topics from the queue for review, the lower the saturation of false approvals.
    • Counter: Measures could be added to prevent this sort of abuse, such as limiting users to one topic pulled for review every 2-5* minutes. This also has the added benefit of decreasing the incentive to speed through reviews, because you would have to wait for the next one anyway.
  • May expose normal users to offensive or inappropriate content.
    • Counter: Malicious content already appears on the DevForum with the current system. Implementing this system would decrease its reach by limiting the audience to 5* reviewers. This may not completely eliminate harmful content but will represent an improvement.
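One of the counters above proposes limiting each user to pulling one topic for review every 2-5* minutes, which is an ordinary per-user cooldown. A minimal sketch of that rate limit, with an injectable clock so the behavior can be tested (the class and constant names are hypothetical):

```python
import time

# Hypothetical cooldown between queue pulls (the proposal suggests 2-5* min).
REVIEW_COOLDOWN_SECONDS = 180

class ReviewThrottle:
    """Limits how often a single user can pull a topic from the queue."""

    def __init__(self, cooldown=REVIEW_COOLDOWN_SECONDS, clock=time.monotonic):
        self._cooldown = cooldown
        self._clock = clock          # injectable for testing
        self._last_pull = {}         # user id -> time of that user's last pull

    def can_pull(self, user):
        last = self._last_pull.get(user)
        return last is None or self._clock() - last >= self._cooldown

    def record_pull(self, user):
        self._last_pull[user] = self._clock()
```

A monotonic clock is used so the cooldown is immune to system clock changes; the same check would also slow down alt accounts cycling through the queue.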

Common questions:

What about regulars?

Regulars would not be subject to the post-approval system. Members could advance to regular through repeated successful posting in the system.

Why credits?

A credit system can operate on its own without having to burden a small team of volunteers, and has the benefit of being more sustainable.

Is there any particular reason behind the suggested numbers for reviewers/credits awarded?

Yes! By making the two values inverses (i.e. 5 reviewers → 1/5 credit each), the 1 credit spent to pay for approval outputs exactly one collective credit to the reviewers.

Wouldn't this be some completely new system that should be suggested to Discourse?

First, the bug report wizard shows that DevRel can add new plugins to augment posting topics.

Second, this forum is very large and encompasses a very wide range of topics. Discourse is unlikely to add a native feature this large unless there is significant demand. At the moment, a plugin would likely arrive faster and be easier to tailor to the DevForum’s needs than a Discourse feature.

However, depending on the feedback on this topic, I may consider filing a separate Discourse feature request if the community supports the concept of a posting credit system, in order to get the process started.

What about replies?

Replying would not be affected by this system, only the creation of topics.

Which categories would be governed by the new systems?

This is up to DevRel to decide. However, all of the categories have been decreasing in quality lately.

Is this supposed to be a replacement for the flagging system?

No! This new system is meant to better control the incoming flow of topics and supplement the flagging system by creating preventative measures to decrease low-quality content.


I mean that’d be cool if Post Approval wasn’t already disabled since September of last year. Besides, this could be abused with friends approving posts for each other.


This feature request is intended to create a new system to approve posts, not reactivate the old post approval system (which was slow for posters and unfair to volunteers). I’ve edited the title to make it clearer that this topic is addressing the creation of a new system.

See this section:

Anyone who approved a topic that gets taken down will receive moderation action along with the original poster. So if Friend A approves Friend B’s spam topic, both Friend A and Friend B would receive the appropriate moderation action for a spam topic (i.e. feedback first time, temporary suspension with repeat offenders, etc.).

In addition, trying to use friends to cheat the new post approval system would require having FIVE friends who:

  • Are willing to sift through the approval queue to get to your topic.
  • Are able to find your topic in the queue before anyone else.
  • Understand that they will receive moderation action against their DevForum accounts for approving a low-quality post.

I really like this idea, but it should not be put into place in #help-and-feedback but definitely in #development-discussion and #resources.


I like the idea of this system a lot. However I feel some things about it could be worked on.

I don’t believe the credits should do anything, at least not in small amounts; a few hundred/thousand correct approvals/denials should be required before they carry any form of reward, to stop people from reviewing just for the points and potentially brushing over posts to approve in hopes of getting free stuff. At low point totals, incorrect votes shouldn’t carry moderation action (some people might still be learning the finer details of the rules, so a warning would do fine). However, if a lot of incorrect votes happen, or the user passes a certain amount of points at any point, incorrect votes should start having action taken, because at that point the user should know full well what they’re doing.

Town of Salem has a similar system for their ingame report system, where the points do nothing, and users do reports exclusively out of their own desire to clean up the game. 10 users vote on a report, and if it gets a majority guilty vote out of those 10 people it goes to a smaller, more professional team who gives the final judgement on whether it’s guilty or not (99% of the time it’s accurate, so that team doesn’t have to do much and can thus brush through reports in seconds). It’s proven itself to be VERY effective, so I’m 200% for this idea being implemented for post approvals on the devforum.

I think the number of people that vote on post approvals should be a decent amount higher than 10, to fight off any potential alt abuse to approve/disapprove specific posts as well as to get a potentially more accurate result. There are a lot of members on the devforum, so filling those spots shouldn’t be difficult at all.

Overall, I’d love to see this happen. Community driven post approval (and heck, community driven moderation when implemented correctly) is definitely an amazing idea, and would make the devforum a way better place to be.

1 Like

I could go both ways on this. Help and Feedback is a category where topics are often single-use (in the sense that each topic is used once, then forgotten), which means that moderating quality topics may not be as important as in other categories. However, this category suffers from many repeat (particularly with scripting) and off-topic (such as Blender questions) help requests, which makes it harder for responders to find legitimate help requests. I’ll leave the final decision up to DevRel.

This part might have been unclear. The credits will (likely*) be on a 1 credit → 1 post ratio. At any one time, the maximum number of credits you can have is 5*. The credits will not be publicly visible and only serve to control the flow of incoming topics.

I 100% agree that moderation should follow a scale based on prior offenses. I could definitely see how some newer members might not fully understand the meanings of some categories.

Yep, this is pretty much what I am envisioning, aside from a few key differences:

  • This system would be proactive and preventative (before problematic posts) rather than reactive (after problematic posts).
  • DevRel would only be involved with appealing community decisions. With almost 12,000 topics posted every week, it would add a huge strain for the team to have to arbitrate all of the failed posts.
  • As you mentioned, reviewing posts in the new post approval system would reward the reviewer with the ability to post themselves, while in ToS the system exists solely to clean up the game.

Ideally, I agree that the number should be around 10. However, this would also decrease the reward for every reviewer (from 1/5 to 1/10, unless the credit input/output ratio is changed). In addition, it could complicate moderation, because rather than moderating the 2-3 people who incorrectly approved/denied, DevRel would have to deal with 4-5 people.

With regard to abusing alternate accounts, my previous post explains why this system naturally counters cheating:

In addition, trying to use alts to cheat the new post approval system would require having FIVE alts on which you:

  • Are willing to sift through the approval queue to get to your topic.
  • Are able to find your topic in the queue before anyone else.
  • Understand that they will receive moderation action against their DevForum accounts for approving a low-quality post.

In addition, this is not a replacement for the flagging system, but rather a supplement. Using alts to incorrectly approve your topic would still result in your topic getting flagged, removed, and your original account getting moderated.

Wouldn’t this potentially expose normal users to gore or NSFW content when they’re just trying to make a topic? I’m not sure if I support having to be a “moderator” in order to post.


I personally agree with this idea. I am a member and it is very frustrating not being able to report daily bugs I experience or request features to improve my development. In fact, I feel a little left out since Roblox removed the opportunity for members to speak their minds. A solution like this would help me and many other developers.

As for this solution, I think it could use some polishing before being implemented to the forum. To restrict users from mass approving posts, they should be limited to approving 1 topic per day. If any of these topics stay inactive for roughly a week, they should be instantly deleted (unless they gather up the minimum amount of approvals).

1 Like

Oooooo this system actually seems kinda cool

1 Like

Everyone here (should) be 13+. I’ve seen inappropriate content posted online many times before and I’m just 15. Not too bad (or then it’s just me who doesn’t get disturbed by bad content). Not our responsibility if someone is a little kid and uses an alt to come here and sees inappropriate content. NSFW is also rare here, I’ve not seen it on DevForum.


So in short: basically the original Post Approval program but with more steps and potentially exposing many users to inappropriate content from malicious posters. No support from me.

This still suffers from the same subjectivity that single-person Post Approval did, but opens it up on a broader scale, which enables further problems like malicious behaviour when reviewing topics and system gaming. I would trust single-person PA more because their members are carefully selected from a pool of active contributors with good standing.

Additionally, you should focus on problems rather than solutions. It seems tempting to post a proposed way to fix a problem but the solution to a problem is ultimately up to Roblox’s teams. Your only role is to raise a problem you’re having as a developer or user of the service.

I think I should mention that post approval was never a form of community moderation, only discussion control. Sages and Post Approval were still required to flag problem threads for real moderators to deal with. No community member has moderation capabilities.

I cannot, in good faith, support something that creates more problems than it resolves. The counters to the documented cons aren’t very convincing either.


I mean, this is definitely better than nothing. Of course, the launch will be painful because it will take time for users to discover the feature and learn the rules, but later on when people learn the rules the system will be stable and will work as intended.
But unfortunately, I highly doubt that this will be implemented (like ~70% of other requests), and it will probably be ignored like 100% of our cries about the absence of a PA replacement.

1 Like

Not sure why being 13 makes it okay to show them 18+ content before the mods can even stop it. There’s a reason it’s called 18+, not 13+. Thirteen year olds are still kids.

This would essentially place every kind of burden that moderators are paid to face onto literal children. Most of the forum is 13-17! I thought it sounded kind of neat at first glance, but no support.


I mean I’ve moderated a Discord server before. I literally had to ban people for 18+ content and I’m just 15 right now. I’ve not been disturbed by those 18+ images ever. So if you’re just moderating, it’s not that bad. Just moderate the poster and forget about it.

1 Like

Currently, users can be exposed to offensive content anywhere on the forum, particularly in development discussion with the rise of trolling attacks. This isolates the exposure and creates barriers to spamming and a clear path to sending the post to DevRel without it ever being sent to the wider community.

Hopefully, having this system in place would end up decreasing the incidence of NSFW content posted on the forum by trolls, because by limiting the audience to 5* reviewers, their goal of getting attention would be almost impossible to achieve.

Yep, the goal of this system is to create a sustainable method of ensuring quality content while still giving everyone the opportunity to post.

This phenomenon is already regulated by a few things:

  • Limit of 5* credits maximum in any user’s account.
  • DevForum topic posting limits (already in place)
  • Shared moderation action for false approvers and bad posters.

Limiting users to reviewing 1 topic per day would mean that, at 1/5* credit per review, everyone would only be able to earn enough credit to post once every five days.

While it definitely is not a requirement, implementing a self-moderating system can make the public categories more polished and organized for a <13 kid visiting the forums to learn to develop (as a logged-out visitor, which is possible).

I wasn’t too familiar with the later post approval system (the one with the fancy plugin and badge), but I can definitely agree that there is some fundamental overlap. However, there are also a few very important differences that make this system more effective than the last:

  • Sustainable: in order for a member to post a topic, they have to review a small portion of the queue.
  • Self-regulating: this time, (almost) the whole community is involved, rather than a small group of forum veterans.
  • Efficient: As more people want to post topics, more topics get reviewed, which ensures the topics are reviewed swiftly.

As said earlier, the DevForum already has inappropriate troll posts popping up. But with this system, the audience would be much smaller (5*), which decreases the potential attention for trolls, while also situating that audience next to a report button. I’ll update the original post to clarify that there would be a well-situated reporting system for offensive content.

The subjectivity is an aspect of all moderation. This, in its very simplest form, is a preemptive flagging system. Rather than allowing low-quality content to get posted, only to be flagged and removed hours later, it makes sure that low-quality content is never posted to begin with.

If the content is deemed malicious, obscene, or otherwise offensive, the post will be sent to DevRel.

If the content is simply low-quality or inappropriately categorized, the OP will receive feedback in order to make changes to their topic as needed.

Malicious behavior with regard to failing topics is countered with a traditional admin-based moderation, through an appeal from the poster. Similar to the false-approval punishment discussed earlier, if an appeal is successful, the users who maliciously failed a topic will be punished. I’ll update the original topic to better explain this (I only included it in one of the design mock-ups).

System gaming has already been extensively discussed. Could you be more specific about how you think the system could be abused beyond alt/friend approval?

Also, after seeing this feedback, I’ve decided to remove the skip button from the design mock-up. Now, rather than being able to skip to your topic, you will be forced to review whatever is at the top of the queue. Although you could spend 30-50 minutes refreshing 5 alts, by the time your post reaches the top of the queue, it will likely already be reviewed by someone else.

(Note that it could be 2-3 hours before I have time to update the image)

Just like how it is necessary to identify issues with Roblox’s team, it’s also important to identify potential changes. That’s why programs like the community sage group existed and why Roblox implemented a community feedback program. Not just to see what’s wrong, but how the community believes that it should be fixed. Please see this topic for more info:

Good to know! I haven’t been in post approval before, so I wasn’t sure where exactly the line is drawn between feedback from post approval and DevRel’s moderation capabilities. This new system should function in a similar way, where malicious topics are handled by DevRel while low quality topics receive feedback from the community.

The goal of this proposition is quite the opposite. By building a system of self-moderation, the community can have the benefits of the old post approval system without the unscalability of the legacy system.

I’ll elaborate on those more. As more feedback comes in, I’ll try to update the original topic to reflect the pros and cons that have been raised.

That’s the goal! Even though it doesn’t necessarily resolve all of the issues, it would represent an improvement for both sides. Both those requesting greater accessibility and those seeking higher quality would get what they want (rather than the current situation, where categories are still locked down and the forum is filled with low quality threads).

This is one of the biggest improvements promised by the system. Right now, users learn the rules by posting bad content and getting feedback. This clutters the forum and leads to low-quality content proliferating as a linearly growing moderation team tries to control an exponentially growing forum. With the new system, bad topics would never be posted to begin with, which stops low-quality content before it can start.

A recent post has received a reply from Roblox staff that DevRel is currently researching a major project to overhaul content discovery on the DevForum. This shows that they are open to starting major efforts to improve the Developer Forum. By creating this feature request and describing it in detail, there is hope that progress can be made.

Under the current system, mods can’t do anything to stop offensive content before it is posted anyways (outside of a few measures like word filters). This already occurs in places like #development-discussion where trolls will spam offensive content. Right now, the community is limited to flagging the post (which still requires viewing the offensive content) and hoping it is swiftly removed before others find it. In contrast, this new system would limit the incentive for trolls by decreasing the size of the audience (and thus the attention) and give an option for community reviewers to flag the post immediately before it is broadcasted to the wider community.

Moderators are paid to review hundreds, if not thousands, of posts per day, and on this forum in particular, few of the flagged posts are actually 18+/NSFW (in my experience). For me, the top things I flag are:

  1. Off-topic
  2. Specific rule violations (e.g. no posting on behalf of others).
  3. Unsubstantive/spam
  4. Insults/ad hominem attacks

I’ve only had to flag maybe 3-4 posts for offensive content out of the 11.3k topics and 115k replies that I have read (but this could be different for other users).

While a few 13-14 year olds may not be mature enough to understand and take appropriate action against 18+ content, it isn’t possible to create a sustainable cycle of post approval unless everyone is required to moderate to post.

Thank you for your feedback everyone! Later today I’ll try to update the original post to clarify some of the concerns you brought up and add in the pros and cons that have been discussed.


All forms of post approval were the exact same. The plugin and badge were only meant to automate certain tasks rather than do them by hand as well as allow Post Approval members to directly accept a thread into a category.

In regards to your differences points, no, those don’t make it better. “Sustainable” goes against DevRel’s principle of accessibility; “self-regulating” isn’t convincing given the current nature of the forum; and it’s not efficient either, because I’m sure a lot of people share the sentiment of not wanting to review topics just to get an answer to a brief or important question, which also touches on the negative side of accessibility to the forum.

The point here is that you should be focusing on the problem not the solution. This is applicable for regular feature requests as much as it is for forum feature requests. Roblox will resolve the problem in the way they best see fit. I’m not saying don’t propose a potential solution but your post’s focus should be clearly describing the problem not the solution.

Not agreeing with any community-driven “moderation” solutions. Developers should be focused on developing, moderation should be left to moderation.


The system proposed, with credits and forced community moderation, is just not fair to the community. I see in a reply that you stated people should just “forget” about inappropriate content. You have to keep in mind that some people are more disturbed by certain topics than you might be. The point of the DevForum is to discuss development-related topics, not to moderate a forum about development. DevRel has already removed the earlier community moderation systems because they are unrealistic for an ever-growing community.

Buildthomas has already created a post elaborating on a more proactive method of moderation that is better suited for the growth of the platform and for future scalability. While I do not agree with everything in it, I totally recommend giving it a quick read as it presents great information on how prevention can be better than moderation.