Add community discussion boards (Iron Legion's Hack Week Project)

As a developer, it is currently impossible to have open two-way communication with players on Roblox.

Bots spam private messages, comments, and group walls, drowning out any legitimate conversation.

Developers are forced to resort to third-party services such as Twitter or Discord just to have basic interaction with their communities. Millions of hours of user attention are spent off-platform because there is no way to interact with players who have similar interests on the site.

Iron Legion’s Hack Week project offers a perfect solution: game discussion boards with developer-set permissions and moderation. By limiting posting privileges to players who have played for a certain amount of time or earned a certain badge, bots are effectively eliminated.
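
As a rough illustration, the playtime/badge posting gate described above might look something like this (all names and thresholds here are hypothetical, not taken from the actual Hack Week project):

```python
# Hypothetical sketch of a developer-set posting gate: a player may post
# only after enough playtime OR after earning a designated badge.
from dataclasses import dataclass, field
from typing import Optional, Set

@dataclass
class Player:
    play_minutes: int
    badges: Set[str] = field(default_factory=set)

@dataclass
class BoardPolicy:
    min_play_minutes: int = 0
    required_badge: Optional[str] = None

def can_post(player: Player, policy: BoardPolicy) -> bool:
    """Qualify by playtime, or by holding the board's required badge."""
    if player.play_minutes >= policy.min_play_minutes:
        return True
    return policy.required_badge is not None and policy.required_badge in player.badges

# A throwaway bot account fails the gate; real players pass.
policy = BoardPolicy(min_play_minutes=60, required_badge="Veteran")
print(can_post(Player(play_minutes=3), policy))                       # False
print(can_post(Player(play_minutes=90), policy))                      # True
print(can_post(Player(play_minutes=3, badges={"Veteran"}), policy))   # True
```

The point of the gate is that spam bots never accumulate playtime or badges, so they never clear the threshold.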

  • Developers would be able to collect crucial bug reports and feature requests from the community, and gauge demand for them.

  • Developers would be able to make announcements and interact with users, answering questions in an open forum for others to see.

  • Players would be able to interact with other players who have similar interests.

Millions of hours of engagement would be brought back to the Roblox website.

Please make this project a reality.


I think that this raises concerns as to the scalability of moderation. Roblox struggled, and in many cases completely failed, to moderate the old forum with a dozen or so sections. The report button hardly ever worked even on the old forums. Moderators basically only took action on threads that they came across themselves. What this feature proposes is millions of forum sections to moderate, with tens (hundreds) of millions of posts across millions of groups.

I feel like loads of inappropriate content, harassment, bullying, personal information, and so on, would be facilitated by this feature. There’s little that Roblox could do to combat this due to the sheer volume of posts.



Google has reCaptcha, so if they don’t want to serve images, they can serve a fancy blue checkbox.


The project was designed to handle this by giving the discussion owners the tools they need to manage their own discussions. Discussion owners (and possibly other users they designate) would be able to clean up spam and remove all posts by certain users, as well as set restrictions on who can post. Inappropriate content would still be reported to Roblox moderators, but discussion owners would be able to keep things clean by themselves.


Well, the report system is a failure in itself, but back to the OP: I do support this. Then again, on moderation, they canned the forums because of the failure of moderation there. At the same time, this could be a really good step toward more mature discussion with age limits (doubt it’ll happen), and it would be nice to chat without getting

“##### ###### #### yes ### hi #####”

From most users.

The OP remains correct and this idea should be supported and encouraged, but with Hack Week projects the possibility of implementation is extremely low. There’s no way on ROBLOX the platform to engage and interact with anyone or your players, forcing you to slyly send kids offsite to another service (which we all know about) in an indirect but obvious way. This might help reduce that and keep kids in-house, where moderators can hopefully take action. It gives developers that missing link to interact and boost player<>developer relations.


As Iron said, the burden of moderation would primarily fall to the community owners. Boards that are particularly toxic or that fail to properly moderate could always be dealt with on a case-by-case basis. The overall moderation burden is much lower than with the old public Roblox forums.


This is a great solution because it takes Roblox’s strength of user-generated content and puts the burden of community well-being versus toxicity on the community leader, making moderation itself user-generated.


Even if the thought of your group / community discussion board potentially being seen by admins, with warnings issued if it’s “too toxic”, doesn’t scare some community runners, put down some incentives.

Make a rating system for community well-being, sorted by an algorithm such as how many reports are filed per member, mixed with the per-member occurrence of name-calling or sharing of personal info. If community moderators are on top of things, those posts will not be up for very long, which would feed into some kind of semi-permanent rolling average rating that directly affects the community’s future influence.
Then have it so privacy settings like <13 impose a safety-rating cutoff that limits which communities those players can join and participate in.
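
The rolling-average rating idea could be sketched roughly like this (the formula, constants, and names below are all made up for illustration, not any real Roblox metric):

```python
# Hypothetical sketch of a community safety rating: an exponentially
# weighted rolling average of reports filed per active member, mapped
# onto a 0-100 score. Constants (alpha, the 0.05 threshold) are
# illustrative only.

def update_rating(prev_rating, reports_this_week, active_members, alpha=0.3):
    """Blend this week's reports-per-member into a rolling rating.

    Low reports-per-member pulls the rating toward 100; sustained
    report spikes drag it down, and it recovers only gradually.
    """
    if active_members == 0:
        return prev_rating
    reports_per_member = reports_this_week / active_members
    # Map to a 0-100 "weekly health" score; 0.05 reports per member
    # (1 report per 20 members) or worse scores 0 for the week.
    weekly_score = max(0.0, 100.0 * (1 - reports_per_member / 0.05))
    return (1 - alpha) * prev_rating + alpha * weekly_score

rating = 100.0
for reports in [2, 40, 50, 3]:   # weekly report counts for a 1000-member group
    rating = update_rating(rating, reports, active_members=1000)
print(round(rating, 1))          # spike weeks drag the rating down; it slowly recovers
```

An under-13 privacy setting could then simply hide any community whose rating sits below some cutoff.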

This could be applied to the wall as well, so I’m not sure a community discussion board is a bot countermeasure, but it’s invaluable for game feedback and discussion.

Captchas aren’t plastered over the site because of their implementation difficulty – it’s because they negatively impact legitimate users as well. Having to select cars on a street every time you post on a wall, join a game, rate a game, or send a chat message would be just as bad as when bots were rampant.


But what if they choose not to report things to moderators? What if the discussion owners don’t clean up their forums? Even groups with the best intentions could fail to:

  • Have sufficient moderators to manage their forum. Remember that Roblox is full of impulsive, immature children, and if groups are not careful, the people they choose as moderators will abuse or misuse the role. As such, they won’t have as many moderators as you think.
  • Have moderators who understand the Roblox rules as well as Roblox moderators do. This applies to just about everyone. You can’t expect users to enforce rules that they don’t understand.
  • Enforce rules that are widely disagreed with. Offsite links, personal information, mild or implied swearing, and so on, would likely run rampant even in the best groups.
  • Protect users they don’t like. If a group’s administration doesn’t like certain individuals, it is unlikely that they will remove content harassing, bullying, or even attacking those users. How quickly will Roblox moderators be able to act in their place?

You can’t say that groups will be punished for failing, because almost every group that’s not run by mature adults is going to face these problems. Taking away the privilege from these groups will simply result in the majority of groups having lost this privilege. Moderators are flawed. Child “moderators” are abhorrently flawed. How much damage will occur before the moderators take action on the groups? How many users will be impacted before the moderators catch the problem? There are groups with over a million members. Some of their games reach 100,000 concurrent players. How many (child) moderators will it take to (accurately) moderate a single group’s forum that potentially has over 100,000 active posters? Will large groups always end up losing this privilege because of the size of their communities that they can’t contain? That defeats the purpose.

The question at the end of the day is: does (or can) Roblox have the moderation resources to manage this? How quickly will they be able to act? Will they be able to discover content if it’s not reported? What is the exact nature of the incentive for groups to enforce the rules?

I’d love to see this feature, but from a business perspective I don’t see it working out.

Conceptually, this isn’t a difficult problem to solve: moderators just need a tool that shows a list of every thread in one place. We already know the solution: assign an engineer to create that tool. Not being able to see all the threads at once is a challenge we already have the answer to, and I doubt it’s going to have any significant impact on community discussion boards.

The Roblox rules are still enforced by the Roblox moderators. Once again, Report Abuse and the incredibly strict filter are still going to be applied. The key difference is that in most cases, board moderators are going to be able to act much more swiftly to ensure a certain level of quality.

The filter still applies. None of these things are going to be able to make it through.

The main people who will be running these boards will be game developers. It will be in their best interest to maintain certain quality on their board.

The difference between the Roblox forums and these boards is that while the Roblox forums were complete anarchy where anyone could participate, with no one in control but the moderators, user-run boards will have an intermediary to take much of the load off of the moderators. And if there is a board that is as toxic as C&G was, with absolutely no moderation and questionable standards, then that board can just be removed.

You’re talking as if large board owners are going to hand moderator powers to random kids who ask for them. This is a silly argument. Myself and many other developers with large Discord servers already have a network of trusted moderators.

@EchoReaper (tagging since I can only reply to one person)

Captchas are already plastered over the site. A reCaptcha occurs when you join groups, follow a user, make an account, and other similar things that are commonly botted. It is not the normal reCaptcha, though. Instead, the button you are clicking to follow/join/create is the Captcha button and in the bottom right you will see the little reCaptcha symbol to let you know you are being checked.

(edit: not saying these work well, but they do stop some stolen bot scripts)

It’s not just developers that create massive groups. The ones that aren’t made by developers are run by… random children.

I’m pretty sure that the vast majority of groups are not run by successful game developers. Most groups are some sort of social organization. But since neither of us have any quick way to prove either way, we’ll just agree to disagree on that. I can say with absolute certainty, though, that the majority of groups are created and run by children.

If this were true we wouldn’t have moderators in the first place. Bypassing the filter is easy enough in game, and extremely easy on the website. Even now, children are able to post numbers in games. Scammers are able to post entire links on the website. Users find ways to bypass filters on swear words all the time. People could also post inappropriate Twitter and Youtube content if Roblox moderators weren’t on point in moderating all of these posts.

I thought about this as well, and it’s not a bad idea. The only question is: how quickly are posts going to be created? Will they always be able to read them all? Can they read them faster than they are created? Will they be able to detect things like harassment and bullying that don’t use swear words? Will they be able to check Twitter and YouTube links?

My main point in all of this isn’t that the idea of creating this feature is bad. It’s that it requires massive additional moderation manpower. This idea doesn’t really change that. Even with this solution, they’d need a large team of moderators 24/7 to catch some of the content. Once again, I’d love to see this feature, but I think that it’s unlikely that Roblox is going to increase their personnel to this degree to implement it.

Groups are an old and outdated feature that most users don’t touch. If you pay attention to the feature request, you will see it includes boards that are specific to a game. These will by far be the biggest boards, as the average front page game sees more daily visitors than the biggest groups have member counts.

The various tiny group discussion boards that may spring up might as well be party chats for the amount of moderation impact they will have.

Yes, this is because the only authorities in these fields are the filter and site moderators. Moving from chaotic discussions like this to self-policing communities is a positive change. The boards with the most traffic on them (which will be boards run by developers with active games, not the 100k member groups that barely have a thousand active users) will also tend to have the most effective self-policing.

Post rate limits, posting prerequisites, and the ability to ban a user from a board will make it very easy to control this. As someone with a dedicated team of moderators, I can tell you with confidence that given the right tools, moderation will not be an issue.
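
Assuming those tools amount to something like a per-user posting cooldown plus a board-level ban list, a toy sketch (everything here is hypothetical, not the actual project design) could look like:

```python
# Hypothetical sketch of two of the controls mentioned above: a per-user
# post cooldown and a board ban list. Times are in seconds.
import time

class Board:
    def __init__(self, min_seconds_between_posts=30):
        self.min_gap = min_seconds_between_posts
        self.banned = set()
        self.last_post = {}   # user id -> timestamp of that user's last post

    def try_post(self, user_id, now=None):
        """Return True and record the post if the user may post right now."""
        now = time.time() if now is None else now
        if user_id in self.banned:
            return False
        last = self.last_post.get(user_id)
        if last is not None and now - last < self.min_gap:
            return False   # posting too fast: still inside the cooldown window
        self.last_post[user_id] = now
        return True

board = Board(min_seconds_between_posts=30)
print(board.try_post("alice", now=0))     # True: first post
print(board.try_post("alice", now=10))    # False: within the 30s cooldown
print(board.try_post("alice", now=45))    # True: cooldown elapsed
board.banned.add("bob")
print(board.try_post("bob", now=100))     # False: banned from this board
```

Even a crude cooldown like this caps how much damage a single account can do between moderator sweeps.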


I want to add on to this.

Site moderators do not need to know the context of a reported post in order to review it. They do not need to physically visit that part of a thread in order to determine if it breaks the site rules. They just need to see the raw text of the reported post.

This means that while adding boards may increase the raw amount of content that moderators have to sift through, it won’t make it any harder to go through the content. Therefore, there isn’t really a scalability problem with regard to moderation.


The first and foremost scalability problem is that they will have to hire an extra team of moderators to implement this.

They don’t have “teams” of moderators though. I’m pretty sure they are hired through some third party. I know for a fact they don’t work in-house and are not full-time employees (aside from a couple heads). It’s really easy and fairly cheap to just hire new moderators as demand goes up, so not really a scalability problem.


Although I’d much rather have Discord links be allowed, this is a reasonable and pretty cool solution, and groups need to be overhauled. Support :yum:


I was gonna write a super organized multi header document but I have other stuff to get to.

Try to shoot this idea down:

  • Either start a new Discourse site for Roblox forums or rebrand the DevForum Discourse site.
  • The flagging system is already in place and works well.
  • DevForum members could act as lite mods.
  • Integrate group walls with DMs (each Roblox group created gets a DevForum group created, with all its members added to it; a massive DM could then be opened with all the members of said group).

I have no idea if any of this is making sense because I’m about to pass out now bai